Sample records for quantification results obtained

  1. Direct qPCR quantification using the Quantifiler(®) Trio DNA quantification kit.

    PubMed

    Liu, Jason Yingjie

    2014-11-01

    The effectiveness of a direct quantification assay is essential to the adoption of the combined direct quantification/direct STR workflow. In this paper, the feasibility of using the Quantifiler(®) Trio DNA quantification kit for the direct quantification of forensic casework samples was investigated. Both low-level touch DNA samples and blood samples were collected on PE swabs and quantified directly. The increased sensitivity of the Quantifiler(®) Trio kit enabled the detection of less than 10 pg of DNA in unprocessed touch samples and also minimized the stochastic effect experienced by different targets in the same sample. The DNA quantity information obtained from a direct quantification assay using the Quantifiler(®) Trio kit can also be used to accurately estimate the optimal input DNA quantity for a direct STR amplification reaction. The correlation between the direct quantification results (Quantifiler(®) Trio kit) and the direct STR results (GlobalFiler™ PCR amplification kit(*)) for low-level touch DNA samples indicates that direct quantification using the Quantifiler(®) Trio DNA quantification kit is more reliable than the Quantifiler(®) Duo DNA quantification kit for predicting the STR results of unprocessed touch DNA samples containing less than 10 pg of DNA. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  2. An Innovative Method for Obtaining Consistent Images and Quantification of Histochemically Stained Specimens

    PubMed Central

    Sedgewick, Gerald J.; Ericson, Marna

    2015-01-01

    Obtaining digital images of color brightfield microscopy is an important aspect of biomedical research and the clinical practice of diagnostic pathology. Although the field of digital pathology has had tremendous advances in whole-slide imaging systems, little effort has been directed toward standardizing color brightfield digital imaging to maintain image-to-image consistency and tonal linearity. Using a single camera and microscope to obtain digital images of three stains, we show that microscope and camera systems inherently produce image-to-image variation. Moreover, we demonstrate that post-processing with a widely used raster graphics editor software program does not completely correct for session-to-session inconsistency. We introduce a reliable method for creating consistent images with a hardware/software solution (ChromaCal™; Datacolor Inc., NJ) along with its features for creating color standardization, preserving linear tonal levels, providing automated white balancing and setting automated brightness to consistent levels. The resulting image consistency using this method will also streamline mean density and morphometry measurements, as images are easily segmented and single thresholds can be used. We suggest that this is a superior method for color brightfield imaging, which can be used for quantification and can be readily incorporated into workflows. PMID:25575568

  3. Detection and quantification of benzodiazepines in hair by ToF-SIMS: preliminary results

    NASA Astrophysics Data System (ADS)

    Audinot, J.-N.; Yegles, M.; Labarthe, A.; Ruch, D.; Wennig, R.; Migeon, H.-N.

    2003-01-01

    Successful results have been obtained in the detection and quantification of buprenorphine in urine and hemolysed blood by time of flight-secondary ion mass spectrometry (ToF-SIMS). The present work is focused on four molecules of the benzodiazepine family: nordiazepam, aminoflunitrazepam, diazepam and oxazepam. These drugs remain difficult to analyse in routine clinical and forensic toxicology because of their thermal instability and low therapeutic range (0.5-5 ng/ml). Internal standards are prepared by means of deuterated molecules. The benzodiazepines and their deuterated forms (nordiazepam-D5, aminoflunitrazepam-D3, diazepam-D5 and oxazepam-D5) were added, at known concentrations, to urine. These molecules were then extracted with several methods (pH, solvent, etc.) and, after adsorption on a noble metal, analysed by ToF-SIMS. The paper focuses, for the different molecules, on the comparison of the different preparation procedures, the optimisation of the SIMS conditions, the limits of detection and the limits of quantification.

  4. New methods and results for quantification of lightning-aircraft electrodynamics

    NASA Technical Reports Server (NTRS)

    Pitts, Felix L.; Lee, Larry D.; Perala, Rodney A.; Rudolph, Terence H.

    1987-01-01

    The NASA F-106 collected data on the rates of change of electromagnetic parameters on the aircraft surface during over 700 direct lightning strikes while penetrating thunderstorms at altitudes from 15,000 to 40,000 ft (4,570 to 12,190 m). These in situ measurements provided the basis for the first statistical quantification of the lightning electromagnetic threat to aircraft appropriate for determining indirect lightning effects on aircraft. These data are used to update previous lightning criteria and standards developed over the years from ground-based measurements. The proposed standards will be the first which reflect actual aircraft responses measured at flight altitudes. Nonparametric maximum likelihood estimates of the distribution of the peak electromagnetic rates of change for consideration in the new standards are obtained based on peak recorder data for multiple-strike flights. The linear and nonlinear modeling techniques developed provide means to interpret and understand the direct-strike electromagnetic data acquired on the F-106. The reasonable results obtained with the models, compared with measured responses, provide increased confidence that the models may be credibly applied to other aircraft.

  5. Preliminary Results on Uncertainty Quantification for Pattern Analytics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Stracuzzi, David John; Brost, Randolph; Chen, Maximillian Gene

    2015-09-01

    This report summarizes preliminary research into uncertainty quantification for pattern analytics within the context of the Pattern Analytics to Support High-Performance Exploitation and Reasoning (PANTHER) project. The primary focus of PANTHER was to make large quantities of remote sensing data searchable by analysts. The work described in this report adds nuance to both the initial data preparation steps and the search process. Search queries are transformed from "does the specified pattern exist in the data?" to "how certain is the system that the returned results match the query?" We show example results for both data processing and search, and discuss a number of possible improvements for each.

  6. Legionella in water samples: how can you interpret the results obtained by quantitative PCR?

    PubMed

    Ditommaso, Savina; Ricciardi, Elisa; Giacomuzzi, Monica; Arauco Rivera, Susan R; Zotti, Carla M

    2015-02-01

    Evaluation of the potential risk associated with Legionella has traditionally been determined from culture-based methods. Quantitative polymerase chain reaction (qPCR) is an alternative tool that offers rapid, sensitive and specific detection of Legionella in environmental water samples. In this study we compare the results obtained by conventional qPCR (iQ-Check™ Quanti Legionella spp.; Bio-Rad) and by the culture method on artificial samples prepared in Page's saline by addition of Legionella pneumophila serogroup 1 (ATCC 33152), and we analyse the selective quantification of viable Legionella cells by the qPCR-PMA method. The amount of Legionella DNA (GU) determined by qPCR was 28-fold higher than the load detected by culture (CFU). Applying qPCR combined with PMA treatment, we obtained a reduction of 98.5% of the qPCR signal from dead cells. We observed a dissimilarity in the ability of PMA to suppress the PCR signal in samples with different amounts of bacteria: the effective elimination of detection signals by PMA depended on the concentration of GU, and increasing amounts of cells resulted in higher values of reduction. Using the results from this study we created an algorithm to facilitate the interpretation of viable cell level estimation with qPCR-PMA. Copyright © 2014 Elsevier Ltd. All rights reserved.

  7. Protein quantification using a cleavable reporter peptide.

    PubMed

    Duriez, Elodie; Trevisiol, Stephane; Domon, Bruno

    2015-02-06

    Peptide and protein quantification based on isotope dilution and mass spectrometry analysis are widely employed for the measurement of biomarkers and in system biology applications. The accuracy and reliability of such quantitative assays depend on the quality of the stable-isotope labeled standards. Although the quantification using stable-isotope labeled peptides is precise, the accuracy of the results can be severely biased by the purity of the internal standards, their stability and formulation, and the determination of their concentration. Here we describe a rapid and cost-efficient method to recalibrate stable isotope labeled peptides in a single LC-MS analysis. The method is based on the equimolar release of a protein reference peptide (used as surrogate for the protein of interest) and a universal reporter peptide during the trypsinization of a concatenated polypeptide standard. The quality and accuracy of data generated with such concatenated polypeptide standards are highlighted by the quantification of two clinically important proteins in urine samples and compared with results obtained with conventional stable isotope labeled reference peptides. Furthermore, the application of the UCRP standards in complex samples is described.

  8. Normal Databases for the Relative Quantification of Myocardial Perfusion

    PubMed Central

    Rubeaux, Mathieu; Xu, Yuan; Germano, Guido; Berman, Daniel S.; Slomka, Piotr J.

    2016-01-01

    Purpose of review Myocardial perfusion imaging (MPI) with SPECT is performed clinically worldwide to detect and monitor coronary artery disease (CAD). MPI allows an objective quantification of myocardial perfusion at stress and rest. This established technique relies on normal databases to compare patient scans against reference normal limits. In this review, we aim to introduce the process of MPI quantification with normal databases and describe the associated perfusion quantitative measures that are used. Recent findings New equipment and new software reconstruction algorithms have been introduced which require the development of new normal limits. The appearance and regional count variations of normal MPI scans may differ between these new scanners and standard Anger cameras. Therefore, these new systems may require the determination of new normal limits to achieve optimal accuracy in relative myocardial perfusion quantification. Accurate diagnostic and prognostic results rivaling those obtained by expert readers can be obtained by this widely used technique. Summary Throughout this review, we emphasize the importance of the different normal databases and the need for specific databases relative to distinct imaging procedures. Use of appropriate normal limits allows optimal quantification of MPI by taking into account subtle image differences due to the hardware and software used, and the population studied. PMID:28138354

  9. Extension of the International Atomic Energy Agency phantom study in image quantification: results of multicentre evaluation in Croatia.

    PubMed

    Grošev, Darko; Gregov, Marin; Wolfl, Miroslava Radić; Krstonošić, Branislav; Debeljuh, Dea Dundara

    2018-06-07

    To make quantitative methods of nuclear medicine more available, four centres in Croatia participated in a national intercomparison study, following the materials and methods used in the previous international study organized by the International Atomic Energy Agency (IAEA). The study task was to calculate the activities of four (133)Ba sources (T1/2=10.54 years; Eγ=356 keV) using planar and single-photon emission computed tomography (SPECT) or SPECT/CT acquisitions of the sources inside a water-filled cylindrical phantom. The sources had previously been calibrated by the US National Institute of Standards and Technology (NIST). The triple-energy window method was utilized for scatter correction. Planar studies were corrected for attenuation (AC) using the conjugate-view method. For SPECT/CT studies, data from X-ray computed tomography were used for attenuation correction (CT-AC), whereas for SPECT-only acquisitions, the Chang-AC method was applied. Using the lessons learned from the IAEA study, data were acquired according to the harmonized data acquisition protocol, and the acquired images were then processed using centralized data analysis. The accuracy of the activity quantification was evaluated as the ratio R between the calculated activity and the NIST value. For planar studies, R=1.06±0.08; for the SPECT/CT study using CT-AC, R=1.00±0.08; and for Chang-AC, R=0.89±0.12. The results are in accordance with those obtained within the larger IAEA study and confirm that the SPECT/CT method is the most appropriate for accurate activity quantification.
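
    For orientation, the conjugate-view planar method referred to above is conventionally written as below. This is the textbook form of the correction, stated with generic symbols (anterior and posterior counts, transmission factor, camera calibration factor); it is not a reproduction of the study's exact processing chain.

```latex
% Standard conjugate-view activity estimate (generic notation)
A \;=\; \frac{1}{C}\,\sqrt{\frac{I_{A}\, I_{P}}{T}},
\qquad T \;=\; e^{-\mu L}
```

    Here $I_{A}$ and $I_{P}$ are the scatter- and background-corrected anterior and posterior counts, $T$ is the transmission through the phantom thickness $L$ with linear attenuation coefficient $\mu$ at 356 keV, and $C$ is the camera sensitivity (counts per unit activity).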

  10. Sulfur-based absolute quantification of proteins using isotope dilution inductively coupled plasma mass spectrometry

    NASA Astrophysics Data System (ADS)

    Lee, Hyun-Seok; Heun Kim, Sook; Jeong, Ji-Seon; Lee, Yong-Moon; Yim, Yong-Hyeon

    2015-10-01

    An element-based reductive approach provides an effective means of realizing International System of Units (SI) traceability for high-purity biological standards. Here, we develop an absolute protein quantification method using double isotope dilution (ID) inductively coupled plasma mass spectrometry (ICP-MS) combined with microwave-assisted acid digestion for the first time. We validated the method and applied it to certify the candidate protein certified reference material (CRM) of human growth hormone (hGH). The concentration of hGH was determined by analysing the total amount of sulfur in hGH. Next, size-exclusion chromatography was used with ICP-MS to characterize and quantify sulfur-containing impurities. By subtracting the contribution of sulfur-containing impurities from the total sulfur content in the hGH CRM, we obtained an SI-traceable certification value. The quantification result obtained with the present method based on sulfur analysis was in excellent agreement with the result determined via a well-established protein quantification method based on amino acid analysis using conventional acid hydrolysis combined with ID liquid chromatography-tandem mass spectrometry. The element-based protein quantification method developed here can be generally used for SI-traceable absolute quantification of proteins, especially pure-protein standards.

  11. Rapid and Easy Protocol for Quantification of Next-Generation Sequencing Libraries.

    PubMed

    Hawkins, Steve F C; Guest, Paul C

    2018-01-01

    The emergence of next-generation sequencing (NGS) over the last 10 years has increased the efficiency of DNA sequencing in terms of speed, ease, and price. However, the exact quantification of an NGS library is crucial in order to obtain good data on sequencing platforms developed by the current market leader Illumina. Different approaches for DNA quantification are currently available, and the most commonly used are based on analysis of the physical properties of the DNA through spectrophotometric or fluorometric methods. Although these methods are technically simple, they do not allow exact quantification as can be achieved using a real-time quantitative PCR (qPCR) approach. A qPCR protocol for DNA quantification with applications in NGS library preparation studies is presented here. This can be applied in various fields of study, such as medical disorders resulting from nutritional programming disturbances.
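
    As a concrete illustration of the standard-curve arithmetic behind such qPCR library quantification, the short sketch below fits Cq against log10 concentration and interpolates an unknown. The dilution series, Cq values and function names are hypothetical and are not taken from this protocol.

```python
import numpy as np

# Illustrative standard-curve quantification for a qPCR library assay
# (generic sketch; dilution values and Cq data are hypothetical).
standard_conc = np.array([10.0, 1.0, 0.1, 0.01, 0.001])   # pM of DNA standard
standard_cq = np.array([12.1, 15.5, 18.9, 22.3, 25.8])     # measured Cq values

# Linear fit of Cq versus log10(concentration): Cq = slope*log10(C) + intercept
slope, intercept = np.polyfit(np.log10(standard_conc), standard_cq, 1)

# Amplification efficiency implied by the curve: E = 10^(-1/slope) - 1
efficiency = 10 ** (-1.0 / slope) - 1.0

def quantify(sample_cq, dilution_factor=1.0):
    """Interpolate an unknown library concentration from its Cq."""
    return dilution_factor * 10 ** ((sample_cq - intercept) / slope)

print(f"slope={slope:.2f}, efficiency={efficiency:.1%}")
print(f"library concentration ~ {quantify(16.7, dilution_factor=1000):.2f} pM")
```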

  12. Critical points of DNA quantification by real-time PCR – effects of DNA extraction method and sample matrix on quantification of genetically modified organisms

    PubMed Central

    Cankar, Katarina; Štebih, Dejan; Dreo, Tanja; Žel, Jana; Gruden, Kristina

    2006-01-01

    Background Real-time PCR is the technique of choice for nucleic acid quantification. In the field of detection of genetically modified organisms (GMOs) quantification of biotech products may be required to fulfil legislative requirements. However, successful quantification depends crucially on the quality of the sample DNA analyzed. Methods for GMO detection are generally validated on certified reference materials that are in the form of powdered grain material, while detection in routine laboratories must be performed on a wide variety of sample matrixes. Due to food processing, the DNA in sample matrixes can be present in low amounts and also degraded. In addition, molecules of plant origin or from other sources that affect PCR amplification of samples will influence the reliability of the quantification. Further, the wide variety of sample matrixes presents a challenge for detection laboratories. The extraction method must ensure high yield and quality of the DNA obtained and must be carefully selected, since even components of DNA extraction solutions can influence PCR reactions. GMO quantification is based on a standard curve, therefore similarity of PCR efficiency for the sample and standard reference material is a prerequisite for exact quantification. Little information on the performance of real-time PCR on samples of different matrixes is available. Results Five commonly used DNA extraction techniques were compared and their suitability for quantitative analysis was assessed. The effect of sample matrix on nucleic acid quantification was assessed by comparing 4 maize and 4 soybean matrixes. In addition 205 maize and soybean samples from routine analysis were analyzed for PCR efficiency to assess variability of PCR performance within each sample matrix. Together with the amount of DNA needed for reliable quantification, PCR efficiency is the crucial parameter determining the reliability of quantitative results, therefore it was chosen as the primary
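
    The abstract's requirement that sample and standard share the same PCR efficiency can be made explicit with the usual exponential amplification model. The relations below are the standard textbook expressions (slope-derived efficiency and the resulting bias factor), not formulas taken from this paper.

```latex
% Efficiency from the slope of a standard curve (C_q versus \log_{10} N_0):
E = 10^{-1/\mathrm{slope}} - 1
% With exponential amplification N_n = N_0 (1+E)^n, quantifying a sample whose
% true efficiency differs from the standard's by \Delta E biases the estimate:
\frac{\hat{N}_0}{N_0} = \left(\frac{1+E+\Delta E}{1+E}\right)^{C_q}
```

    Because the bias compounds over the 20-40 cycles of a typical run, even a small efficiency mismatch between sample matrix and reference material can translate into a large quantification error, which is why matrix effects on PCR efficiency matter so much here.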

  13. Protein, enzyme and carbohydrate quantification using smartphone through colorimetric digitization technique.

    PubMed

    Dutta, Sibasish; Saikia, Gunjan Prasad; Sarma, Dhruva Jyoti; Gupta, Kuldeep; Das, Priyanka; Nath, Pabitra

    2017-05-01

    In this paper, the utilization of a smartphone as a detection platform for colorimetric quantification of biological macromolecules has been demonstrated. Using the V-channel of the HSV color space, the quantification of BSA protein, catalase enzyme and carbohydrate (using D-glucose) has been successfully investigated. A custom-designed Android application has been developed for estimating the total concentration of biological macromolecules. The results have been compared with those of a standard spectrophotometer, which is generally used for colorimetric quantification in laboratory settings by measuring absorbance at a specific wavelength. The results obtained with the designed sensor are found to be similar to the spectrophotometer data. The designed sensor is low cost and robust, and we envision that it could promote diverse fields of bio-analytical investigation. Schematic illustration of the smartphone sensing mechanism for colorimetric analysis of biomolecular samples. © 2017 Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.
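
    As an illustration of the V-channel arithmetic mentioned above (V is simply the per-pixel maximum of R, G and B), the sketch below averages V over a region of interest and maps it to concentration with a linear calibration. The calibration points, ROI handling and the assumption of linearity are hypothetical and are not taken from the paper or its app.

```python
import numpy as np

def v_channel(rgb_image):
    """V channel of HSV: per-pixel maximum of the R, G, B components
    (image given as an HxWx3 array with values in 0-255)."""
    return rgb_image.max(axis=2).astype(float)

def mean_v(rgb_image, roi):
    """Average V value inside a rectangular region of interest."""
    top, bottom, left, right = roi
    return v_channel(rgb_image)[top:bottom, left:right].mean()

# Hypothetical calibration: mean V values measured for known BSA standards.
standards_mg_ml = np.array([0.0, 0.25, 0.5, 1.0, 2.0])
standards_v = np.array([250.0, 231.0, 214.0, 183.0, 128.0])
slope, intercept = np.polyfit(standards_v, standards_mg_ml, 1)

def concentration(rgb_image, roi):
    """Estimate analyte concentration (mg/ml) from the mean V in the ROI."""
    return slope * mean_v(rgb_image, roi) + intercept
```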

  14. Capillary electrophoresis with contactless conductivity detection for the quantification of fluoride in lithium ion battery electrolytes and in ionic liquids-A comparison to the results gained with a fluoride ion-selective electrode.

    PubMed

    Pyschik, Marcelina; Klein-Hitpaß, Marcel; Girod, Sabrina; Winter, Martin; Nowak, Sascha

    2017-02-01

    In this study, an optimized method using capillary electrophoresis (CE) with a direct contactless conductivity detector (C4D) is presented for a new application field: the quantification of fluoride in commonly used lithium ion battery (LIB) electrolytes based on LiPF6 in organic carbonate solvents, and in ionic liquids (ILs) after contact with Li metal. The method development for finding the right buffer and the suitable CE conditions for the quantification of fluoride was carried out. The fluoride concentrations determined in different LIB electrolyte samples were compared to the results from the ion-selective electrode (ISE). The relative standard deviations (RSDs) and recovery rates for fluoride were obtained with very high accuracy for both methods, and the fluoride concentrations in the LIB electrolytes were in very good agreement for both methods. In addition, the limit of detection (LOD) and limit of quantification (LOQ) values were determined for the CE method. The CE method was also applied to the quantification of fluoride in ILs. In the fresh IL sample, the concentration of fluoride was below the LOD. Another sample of the IL mixed with Li metal was investigated as well, and it was possible to quantify the fluoride concentration in this sample. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
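
    The abstract does not state how the LOD and LOQ were computed; for reference, the commonly used calibration-based definitions (as in ICH-style method validation) are:

```latex
\mathrm{LOD} = \frac{3.3\,\sigma}{S}, \qquad \mathrm{LOQ} = \frac{10\,\sigma}{S}
```

    where $\sigma$ is the standard deviation of the response (blank replicates or calibration residuals) and $S$ is the slope of the calibration curve; signal-to-noise criteria of roughly 3:1 and 10:1 are an equivalent rule of thumb. Whether this particular study used these conventions is an assumption.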

  15. A phase quantification method based on EBSD data for a continuously cooled microalloyed steel

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhao, H.; Wynne, B.P.; Palmiere, E.J.

    2017-01-15

    Mechanical properties of steels depend on the phase constitutions of the final microstructures, which can be related to the processing parameters. Therefore, accurate quantification of different phases is necessary to investigate the relationships between processing parameters, final microstructures and mechanical properties. Point counting on micrographs observed by optical or scanning electron microscopy is widely used as a phase quantification method, and different phases are discriminated according to their morphological characteristics. However, it is difficult to differentiate some of the phase constituents with similar morphology. In contrast, for EBSD-based phase quantification methods, besides morphological characteristics, other parameters derived from the orientation information can also be used for discrimination. In this research, a phase quantification method based on EBSD data in the unit of grains was proposed to identify and quantify the complex phase constitutions of a microalloyed steel subjected to accelerated cooling. Characteristics of polygonal ferrite/quasi-polygonal ferrite, acicular ferrite and bainitic ferrite in terms of grain-averaged misorientation angles, aspect ratios, high-angle grain boundary fractions and grain sizes were analysed and used to develop the identification criteria for each phase. Comparing the results obtained by this EBSD-based method and point counting, it was found that the EBSD-based method can provide accurate and reliable phase quantification results for microstructures with relatively slow cooling rates. - Highlights: •A phase quantification method based on EBSD data in the unit of grains was proposed. •The critical grain area above which GAM angles are valid parameters was obtained. •Grain size and grain boundary misorientation were used to identify acicular ferrite. •High cooling rates deteriorate the accuracy of this EBSD-based method.

  16. Automated lobar quantification of emphysema in patients with severe COPD.

    PubMed

    Revel, Marie-Pierre; Faivre, Jean-Baptiste; Remy-Jardin, Martine; Deken, Valérie; Duhamel, Alain; Marquette, Charles-Hugo; Tacelli, Nunzia; Bakai, Anne-Marie; Remy, Jacques

    2008-12-01

    Automated lobar quantification of emphysema has not yet been evaluated. Unenhanced 64-slice MDCT was performed in 47 patients evaluated before bronchoscopic lung-volume reduction. CT images reconstructed with a standard (B20) and high-frequency (B50) kernel were analyzed using a dedicated prototype software (MevisPULMO) allowing lobar quantification of emphysema extent. Lobar quantification was obtained following (a) a fully automatic delineation of the lobar limits by the software and (b) a semiautomatic delineation with manual correction of the lobar limits when necessary, and was compared with the visual scoring of emphysema severity per lobe. No statistically significant difference existed between automated and semiautomated lobar quantification (p > 0.05 in the five lobes), with differences ranging from 0.4 to 3.9%. The agreement between the two methods (intraclass correlation coefficient, ICC) was excellent for the left upper lobe (ICC = 0.94), left lower lobe (ICC = 0.98), and right lower lobe (ICC = 0.80). The agreement was good for the right upper lobe (ICC = 0.68) and moderate for the middle lobe (ICC = 0.53). The Bland-Altman plots confirmed these results. A good agreement was observed between the software and visually assessed lobar predominance of emphysema (kappa 0.78; 95% CI 0.64-0.92). Automated and semiautomated lobar quantifications of emphysema are concordant and show good agreement with visual scoring.

  17. Improving microstructural quantification in FIB/SEM nanotomography.

    PubMed

    Taillon, Joshua A; Pellegrinelli, Christopher; Huang, Yi-Lin; Wachsman, Eric D; Salamanca-Riba, Lourdes G

    2018-01-01

    FIB/SEM nanotomography (FIB-nt) is a powerful technique for the determination and quantification of the three-dimensional microstructure in subsurface features. Often times, the microstructure of a sample is the ultimate determiner of the overall performance of a system, and a detailed understanding of its properties is crucial in advancing the materials engineering of a resulting device. While the FIB-nt technique has developed significantly in the 15 years since its introduction, advanced nanotomographic analysis is still far from routine, and a number of challenges remain in data acquisition and post-processing. In this work, we present a number of techniques to improve the quality of the acquired data, together with easy-to-implement methods to obtain "advanced" microstructural quantifications. The techniques are applied to a solid oxide fuel cell cathode of interest to the electrochemistry community, but the methodologies are easily adaptable to a wide range of material systems. Finally, results from an analyzed sample are presented as a practical example of how these techniques can be implemented. Copyright © 2017 Elsevier B.V. All rights reserved.

  18. Standardless quantification by parameter optimization in electron probe microanalysis

    NASA Astrophysics Data System (ADS)

    Limandri, Silvina P.; Bonetto, Rita D.; Josa, Víctor Galván; Carreras, Alejo C.; Trincavelli, Jorge C.

    2012-11-01

    A method for standardless quantification by parameter optimization in electron probe microanalysis is presented. The method consists in minimizing the quadratic differences between an experimental spectrum and an analytical function proposed to describe it, by optimizing the parameters involved in the analytical prediction. This algorithm, implemented in the software POEMA (Parameter Optimization in Electron Probe Microanalysis), allows the determination of the elemental concentrations, along with their uncertainties. The method was tested on a set of 159 elemental constituents corresponding to 36 spectra of standards (mostly minerals) that include trace elements. The results were compared with those obtained with the commercial software GENESIS Spectrum® for standardless quantification. The quantifications performed with the method proposed here are better in 74% of the cases studied. In addition, the performance of the proposed method is compared with the first-principles standardless analysis procedure DTSA for a different data set, which excludes trace elements. The relative deviations with respect to the nominal concentrations are lower than 0.04, 0.08 and 0.35 for 66% of the cases for POEMA, GENESIS and DTSA, respectively.
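
    To make the idea of quantification by parameter optimization concrete, the toy sketch below fits a parametric spectrum model to a measured spectrum by least squares. The model (Gaussian characteristic lines on a linear background) and all numbers are placeholders chosen for illustration; POEMA's actual analytical function, which predicts the whole spectrum from the elemental concentrations and physical parameters, is considerably more elaborate.

```python
import numpy as np
from scipy.optimize import least_squares

# Toy illustration of "quantification by parameter optimization": fit a
# parametric spectrum model to an experimental spectrum by least squares.
energies = np.linspace(0.5, 10.0, 500)                 # keV axis (hypothetical)
line_positions = [1.74, 3.69, 6.40]                    # Si, Ca, Fe K-alpha (keV)

def model(params, e):
    b0, b1, *areas = params
    spectrum = b0 + b1 * e                              # linear background
    for area, pos in zip(areas, line_positions):
        spectrum += area * np.exp(-0.5 * ((e - pos) / 0.08) ** 2)
    return spectrum

def residuals(params, e, measured):
    return model(params, e) - measured

# 'measured' would be the experimental spectrum; simulated here for the demo.
rng = np.random.default_rng(1)
true_params = [50.0, -3.0, 400.0, 250.0, 120.0]
measured = model(true_params, energies) + rng.normal(0, 5, energies.size)

fit = least_squares(residuals, x0=[10.0, 0.0, 100.0, 100.0, 100.0],
                    args=(energies, measured))
print("fitted line areas (proxies for composition):", fit.x[2:])
```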

  19. Quantification by SEM-EDS in uncoated non-conducting samples

    NASA Astrophysics Data System (ADS)

    Galván Josa, V.; Castellano, G.; Bertolino, S. R.

    2013-07-01

    An approach to perform elemental quantitative analysis in a conventional scanning electron microscope with an energy dispersive spectrometer has been developed for non-conductive samples in which the conductive coating should be avoided. Charge accumulation effects, which basically decrease the energy of the primary beam, were taken into account by means of the Duane-Hunt limit. This value represents the maximum energy of the continuum X-ray spectrum, and is related to the effective energy of the incident electron beam. To validate the results obtained by this procedure, a non-conductive sample of known composition was quantified without conductive coating. Complementarily, changes in the X-ray spectrum due to charge accumulation effects were studied by Monte Carlo simulations, comparing relative characteristic intensities as a function of the incident energy. This methodology is exemplified here to obtain the chemical composition of white and reddish archaeological pigments belonging to the Ambato style of "Aguada" culture (Catamarca, Argentina 500-1100 AD). The results obtained in this work show that the quantification procedure taking into account the Duane-Hunt limit is suitable for this kind of samples. This approach may be recommended for the quantification of samples for which coating is not desirable, such as ancient artwork, forensic or archaeological samples, or when the coating element is also present in the sample.
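
    A minimal sketch of how the Duane-Hunt limit can be read off a measured spectrum is given below; the simple "last channel above noise" criterion and the threshold value are illustrative assumptions, since the abstract does not spell out the extraction procedure used.

```python
import numpy as np

def duane_hunt_limit(energy_kev, counts, noise_level=3.0):
    """Estimate the Duane-Hunt limit as the highest-energy channel whose
    counts still exceed a noise threshold (an illustrative criterion; more
    robust estimators fit the high-energy end of the continuum)."""
    above = counts > noise_level
    if not above.any():
        raise ValueError("no channel above the noise level")
    return energy_kev[np.where(above)[0].max()]

# The returned value approximates the effective incident-electron energy on a
# charging, uncoated sample and would replace the nominal beam energy in the
# quantification model.
```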

  20. Simultaneous quantification of adalimumab and infliximab in human plasma by liquid chromatography-tandem mass spectrometry.

    PubMed

    Jourdil, Jean-François; Némoz, Benjamin; Gautier-Veyret, Elodie; Romero, Charlotte; Stanke-Labesque, Françoise

    2018-03-30

    Adalimumab (ADA) and infliximab (IFX) are therapeutic monoclonal antibodies (TMabs) targeting tumor necrosis factor-alpha (TNFα). They are used to treat inflammatory diseases. Clinical trials have suggested that therapeutic drug monitoring for ADA or IFX could improve treatment response and cost-effectiveness. However, ADA and IFX were quantified by ELISA in all these studies, and the discrepancies between the results obtained raise questions about their reliability. We describe here the validation of a liquid chromatography-tandem mass spectrometry (LC-MS/MS) method for the simultaneous quantification of ADA and IFX in human samples. Full-length antibodies labeled with stable isotopes were added to plasma samples as an internal standard. Samples were then prepared using Mass Spectrometry Immuno Assay (MSIA) followed by trypsin digestion prior to ADA and IFX quantification by LC-MS/MS. ADA and IFX were quantified in serum from patients treated with ADA (n=21) or IFX (n=22), and the concentrations obtained were compared with those obtained with a commercial ELISA kit. The chromatography run lasted 8.6 minutes and the quantification range was 1 to 26 mg/L. The method was reproducible, repeatable and accurate. For both levels of internal quality control, the inter- and intra-day coefficients of variation and accuracies for ADA and IFX were all within 15%, in accordance with FDA recommendations. No significant cross-contamination effect was noted. Good agreement was found between LC-MS/MS and ELISA results, for both ADA and IFX. This LC-MS/MS method can be used for the quantification of ADA and IFX in a single analytical run and for the optimization of LC-MS/MS resource use in clinical pharmacology laboratories.

  1. An alternative method for irones quantification in iris rhizomes using headspace solid-phase microextraction.

    PubMed

    Roger, B; Fernandez, X; Jeannot, V; Chahboun, J

    2010-01-01

    The essential oil obtained from iris rhizomes is one of the most precious raw materials for the perfume industry. Its fragrance is due to irones that are gradually formed by oxidative degradation of iridals during rhizome ageing. The objective was the development of an alternative method allowing irone quantification in iris rhizomes using HS-SPME-GC. The development of the HS-SPME-GC method was achieved using the results obtained from a conventional method, i.e. a solid-liquid extraction (SLE) followed by irone quantification by GC. Among several calibration methods tested, internal calibration gave the best results and was the least sensitive to the matrix effect. The proposed method using HS-SPME-GC is as accurate and reproducible as the conventional one using SLE. These two methods were used to monitor and compare irone concentrations in iris rhizomes that had been stored for 6 months to 9 years. Irone quantification in iris rhizomes can be achieved using HS-SPME-GC. This method can thus be used for the quality control of iris rhizomes. It offers the advantage of combining extraction and analysis with an automated device and thus allows a large number of rhizome batches to be analysed and compared in a limited amount of time. Copyright © 2010 John Wiley & Sons, Ltd.

  2. Development and validation of an open source quantification tool for DSC-MRI studies.

    PubMed

    Gordaliza, P M; Mateos-Pérez, J M; Montesinos, P; Guzmán-de-Villoria, J A; Desco, M; Vaquero, J J

    2015-03-01

    This work presents the development of an open source tool for the quantification of dynamic susceptibility-weighted contrast-enhanced (DSC) perfusion studies. The development of this tool is motivated by the lack of open source tools implemented on open platforms to allow external developers to implement their own quantification methods easily and without the need of paying for a development license. This quantification tool was developed as a plugin for the ImageJ image analysis platform using the Java programming language. A modular approach was used in the implementation of the components, in such a way that the addition of new methods can be done without breaking any of the existing functionalities. For the validation process, images from seven patients with brain tumors were acquired and quantified with the presented tool and with a widely used clinical software package. The resulting perfusion parameters were then compared. Perfusion parameters and the corresponding parametric images were obtained. When no gamma-fitting is used, an excellent agreement with the tool used as a gold-standard was obtained (R(2)>0.8 and values are within 95% CI limits in Bland-Altman plots). An open source tool that performs quantification of perfusion studies using magnetic resonance imaging has been developed and validated using a clinical software package. It works as an ImageJ plugin and the source code has been published with an open source license. Copyright © 2015 Elsevier Ltd. All rights reserved.

  3. Quaternary ammonium isobaric tag for a relative and absolute quantification of peptides.

    PubMed

    Setner, Bartosz; Stefanowicz, Piotr; Szewczuk, Zbigniew

    2018-02-01

    Isobaric labeling quantification of peptides has become a method of choice for mass spectrometry-based proteomics studies. However, despite the wide variety of commercially available isobaric tags, none of the currently available methods offers a significant improvement in detection sensitivity during the MS experiment. Recently, many strategies have been applied to increase the ionization efficiency of peptides, involving chemical modifications that introduce a quaternary ammonium fixed charge. Here, we present a novel quaternary ammonium-based isobaric tag for relative and absolute quantification of peptides (QAS-iTRAQ 2-plex). Upon collisional activation, the new stable benzylic-type cationic reporter ion is liberated from the tag. Deuterium atoms were used to offset the differential masses of a reporter group. We tested the applicability of the QAS-iTRAQ 2-plex reagent on a series of model peptides as well as a bovine serum albumin tryptic digest. The obtained results suggest the usefulness of this isobaric ionization tag for relative and absolute quantification of peptides. Copyright © 2017 John Wiley & Sons, Ltd.

  4. Quantification of biofilm in microtiter plates: overview of testing conditions and practical recommendations for assessment of biofilm production by staphylococci.

    PubMed

    Stepanović, Srdjan; Vuković, Dragana; Hola, Veronika; Di Bonaventura, Giovanni; Djukić, Slobodanka; Cirković, Ivana; Ruzicka, Filip

    2007-08-01

    The details of all steps involved in the quantification of biofilm formation in microtiter plates are described. The presented protocol incorporates information on the assessment of biofilm production by staphylococci, gained both by direct experience and by analysis of methods for assaying biofilm production. The obtained results should simplify the quantification of biofilm formation in microtiter plates, and make it more reliable and comparable among different laboratories.
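
    A minimal sketch of the cut-off based classification commonly cited together with this protocol is given below (negative-control cut-off ODc = mean + 3 SD, with weak/moderate/strong categories at multiples of ODc). The function names and example readings are hypothetical, and the protocol itself should be consulted for the exact definitions and incubation/staining conditions.

```python
import statistics

def biofilm_category(sample_ods, negative_control_ods):
    """Classify biofilm production from microtiter-plate optical densities
    using the cut-off scheme commonly cited from this protocol:
    ODc = mean(negative control) + 3 * SD(negative control)."""
    odc = (statistics.mean(negative_control_ods)
           + 3 * statistics.stdev(negative_control_ods))
    od = statistics.mean(sample_ods)
    if od <= odc:
        return "non-producer"
    if od <= 2 * odc:
        return "weak"
    if od <= 4 * odc:
        return "moderate"
    return "strong"

# Example with hypothetical OD readings from triplicate wells:
print(biofilm_category([0.41, 0.38, 0.44], [0.09, 0.08, 0.10]))
```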

  5. Quantification of Wine Mixtures with an Electronic Nose and a Human Panel

    PubMed Central

    Aleixandre, Manuel; Cabellos, Juan M.; Arroyo, Teresa; Horrillo, M. C.

    2018-01-01

    In this work, an electronic nose and a human panel were used for the quantification of wines formed by binary mixtures of four white grape varieties and two varieties of red wines at different percentages (from 0 to 100% in 10% steps for the electronic nose and from 0 to 100% in 25% steps for the human panel). The wines were prepared using the traditional method with commercial yeasts. Both techniques were able to quantify the mixtures tested, but it is important to note that the technology of the electronic nose is faster, simpler, and more objective than the human panel. In addition, better results of quantification were also obtained using the electronic nose. PMID:29484296

  6. Quantification of Wine Mixtures with an Electronic Nose and a Human Panel.

    PubMed

    Aleixandre, Manuel; Cabellos, Juan M; Arroyo, Teresa; Horrillo, M C

    2018-01-01

    In this work, an electronic nose and a human panel were used for the quantification of wines formed by binary mixtures of four white grape varieties and two varieties of red wines at different percentages (from 0 to 100% in 10% steps for the electronic nose and from 0 to 100% in 25% steps for the human panel). The wines were prepared using the traditional method with commercial yeasts. Both techniques were able to quantify the mixtures tested, but it is important to note that the technology of the electronic nose is faster, simpler, and more objective than the human panel. In addition, better results of quantification were also obtained using the electronic nose.

  7. Convex geometry of quantum resource quantification

    NASA Astrophysics Data System (ADS)

    Regula, Bartosz

    2018-01-01

    We introduce a framework unifying the mathematical characterisation of different measures of general quantum resources and allowing for a systematic way to define a variety of faithful quantifiers for any given convex quantum resource theory. The approach allows us to describe many commonly used measures such as matrix norm-based quantifiers, robustness measures, convex roof-based measures, and witness-based quantifiers together in a common formalism based on the convex geometry of the underlying sets of resource-free states. We establish easily verifiable criteria for a measure to possess desirable properties such as faithfulness and strong monotonicity under relevant free operations, and show that many quantifiers obtained in this framework indeed satisfy them for any considered quantum resource. We derive various bounds and relations between the measures, generalising and providing significantly simplified proofs of results found in the resource theories of quantum entanglement and coherence. We also prove that the quantification of resources in this framework simplifies for pure states, allowing us to obtain more easily computable forms of the considered measures, and show that many of them are in fact equal on pure states. Further, we investigate the dual formulation of resource quantifiers, which provide a characterisation of the sets of resource witnesses. We present an explicit application of the results to the resource theories of multi-level coherence, entanglement of Schmidt number k, multipartite entanglement, as well as magic states, providing insight into the quantification of the four resources by establishing novel quantitative relations and introducing new quantifiers, such as a measure of entanglement of Schmidt number k which generalises the convex roof-extended negativity, a measure of k-coherence which generalises the \
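
    As a concrete instance of the robustness-type quantifiers discussed above, the generalized robustness of a state $\rho$ with respect to a closed convex set of free states $\mathcal{F}$ is conventionally defined as below; this is the standard definition stated in generic notation, not the paper's own formulation.

```latex
R_{\mathcal{F}}(\rho) \;=\; \min\Bigl\{\, s \ge 0 \;:\; \exists\ \text{state } \sigma
\ \text{such that}\ \tfrac{\rho + s\,\sigma}{1+s} \in \mathcal{F} \Bigr\}
```

    Its value is the minimal amount of admixed noise needed to destroy the resource, and its dual formulation is what connects such measures to resource witnesses.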

  8. Quantification of concentrated Chinese medicine granules by quantitative polymerase chain reaction.

    PubMed

    Lo, Yat-Tung; Shaw, Pang-Chui

    2017-10-25

    Determination of the amount of each constituent in a multi-herb product is important for quality control. In concentrated Chinese medicine granules (CCMG), no dregs are left after dissolution of the CCMG. This study is the first to examine the feasibility of using quantitative polymerase chain reaction (qPCR) to find the amount of CCMG in solution form. DNA was extracted from Hirudo and Zaocys CCMG mixed at different ratios and amplified in qPCR using species-specific primers. The threshold cycle (CT) obtained was compared with the respective standard curves. Results showed that reproducible quantification results could be obtained (1) for 5-50 mg of CCMG using a modified DNA extraction protocol, (2) amongst DNA extracted from the same batch of CCMG and (3) amongst different batches of CCMG from the same company. This study demonstrated that the constituent amounts of CCMG in a mixture could be determined using qPCR. This work has extended the application of DNA techniques to the quantification of herbal products, and this approach may be developed for quality assurance in the CCMG industry. Copyright © 2017 Elsevier B.V. All rights reserved.

  9. Quantification of prebiotics in commercial infant formulas.

    PubMed

    Sabater, Carlos; Prodanov, Marin; Olano, Agustín; Corzo, Nieves; Montilla, Antonia

    2016-03-01

    Since breastfeeding is not always possible, infant formulas (IFs) are supplemented with prebiotic oligosaccharides, such as galactooligosaccharides (GOS) and/or fructooligosaccharides (FOS), to exert effects similar to those of breast milk. Nowadays, a great number of infant formulas enriched with prebiotics are available on the market; however, there are scarce data about their composition. In this study, two chromatographic methods (GC-FID and HPLC-RID) have been used in combination for the quantification of the carbohydrates present in commercial infant formulas. According to the results obtained by GC-FID for products containing prebiotics, the content of FOS, GOS and GOS/FOS was in the ranges of 1.6-5.0, 1.7-3.2, and 0.08-0.25/2.3-3.8 g/100 g of product, respectively. HPLC-RID analysis allowed quantification of maltodextrins with a degree of polymerization (DP) up to 19. The methodology proposed here may be used for routine quality control of infant formula and other food ingredients containing prebiotics. Copyright © 2015 Elsevier Ltd. All rights reserved.

  10. LC-MS/MS quantification of next-generation biotherapeutics: a case study for an IgE binding Nanobody in cynomolgus monkey plasma.

    PubMed

    Sandra, Koen; Mortier, Kjell; Jorge, Lucie; Perez, Luis C; Sandra, Pat; Priem, Sofie; Poelmans, Sofie; Bouche, Marie-Paule

    2014-05-01

    Nanobodies(®) are therapeutic proteins derived from the smallest functional fragments of heavy chain-only antibodies. The development and validation of an LC-MS/MS-based method for the quantification of an IgE binding Nanobody in cynomolgus monkey plasma is presented. Nanobody quantification was performed making use of a proteotypic tryptic peptide chromatographically enriched prior to LC-MS/MS analysis. The validated LLOQ at 36 ng/ml was measured with an intra- and inter-assay precision and accuracy <20%. The required sensitivity could be obtained based on the selectivity of 2D LC combined with MS/MS. No analyte specific tools for affinity purification were used. Plasma samples originating from a PK/PD study were analyzed and compared with the results obtained with a traditional ligand-binding assay. Excellent correlations between the two techniques were obtained, and similar PK parameters were estimated. A 2D LC-MS/MS method was successfully developed and validated for the quantification of a next generation biotherapeutic.

  11. Critical aspects of data analysis for quantification in laser-induced breakdown spectroscopy

    NASA Astrophysics Data System (ADS)

    Motto-Ros, V.; Syvilay, D.; Bassel, L.; Negre, E.; Trichard, F.; Pelascini, F.; El Haddad, J.; Harhira, A.; Moncayo, S.; Picard, J.; Devismes, D.; Bousquet, B.

    2018-02-01

    In this study, a collaborative contest focused on LIBS data processing has been conducted in an original way: the participants did not analyse shared samples on their own LIBS instruments, but instead all worked on the same set of LIBS spectra obtained from a single experiment. Each participant was asked to provide the predicted concentrations of several elements for two glass samples. The analytical contest revealed a wide diversity of results among participants, even when the same spectral lines were considered for the analysis. A parametric study was then conducted to investigate the influence of each step of the data processing. This study was based on several analytical figures of merit such as the determination coefficient, uncertainty, limit of quantification and prediction ability (i.e., trueness). It was then possible to interpret the results provided by the participants, emphasizing the fact that the type of data extraction, baseline modeling as well as the calibration model play key roles in the quantification performance of the technique. This work provides a set of recommendations based on a systematic evaluation of the quantification procedure with the aim of optimizing the methodological steps toward the standardization of LIBS.

  12. Plasma cell quantification in bone marrow by computer-assisted image analysis.

    PubMed

    Went, P; Mayer, S; Oberholzer, M; Dirnhofer, S

    2006-09-01

    Minor and major criteria for the diagnosis of multiple myeloma according to the definition of the WHO classification include different categories of the bone marrow plasma cell count: a shift from the 10-30% group to the > 30% group equals a shift from a minor to a major criterion, while the < 10% group does not contribute to the diagnosis. The plasma cell fraction in the bone marrow is therefore critical for the classification and optimal clinical management of patients with plasma cell dyscrasias. The aim of this study was (i) to establish a digital image analysis system able to quantify bone marrow plasma cells and (ii) to evaluate two quantification techniques in bone marrow trephines, i.e. computer-assisted digital image analysis and conventional light-microscopic evaluation. The results were compared with regard to inter-observer variation. Eighty-seven patients, 28 with multiple myeloma, 29 with monoclonal gammopathy of undetermined significance, and 30 with reactive plasmocytosis were included in the study. Plasma cells in H&E- and CD138-stained slides were quantified by two investigators using light-microscopic estimation and computer-assisted digital analysis. The sets of results were correlated with rank correlation coefficients. Patients were categorized according to WHO criteria addressing the plasma cell content of the bone marrow (group 1: 0-10%, group 2: 11-30%, group 3: > 30%), and the results compared by kappa statistics. The degree of agreement in CD138-stained slides was higher for results obtained using the computer-assisted image analysis system compared to light microscopic evaluation (corr.coeff. = 0.782), as was seen in the intra- (corr.coeff. = 0.960) and inter-individual results correlations (corr.coeff. = 0.899). Inter-observer agreement for categorized results (SM/PW: kappa 0.833) was in a high range. Computer-assisted image analysis demonstrated a higher reproducibility of bone marrow plasma cell quantification. This might

  13. Virus detection and quantification using electrical parameters

    NASA Astrophysics Data System (ADS)

    Ahmad, Mahmoud Al; Mustafa, Farah; Ali, Lizna M.; Rizvi, Tahir A.

    2014-10-01

    Here we identify and quantitate two similar viruses, human and feline immunodeficiency viruses (HIV and FIV), suspended in a liquid medium without labeling, using a semiconductor technique. The virus count was estimated by calculating the impurities inside a defined volume by observing the change in electrical parameters. Empirically, the virus count was similar to the absolute value of the ratio of the change of the virus suspension dopant concentration relative to the mock dopant over the change in virus suspension Debye volume relative to mock Debye volume. The virus type was identified by constructing a concentration-mobility relationship which is unique for each kind of virus, allowing for a fast (within minutes) and label-free virus quantification and identification. For validation, the HIV and FIV virus preparations were further quantified by a biochemical technique and the results obtained by both approaches corroborated well. We further demonstrate that the electrical technique could be applied to accurately measure and characterize silica nanoparticles that resemble the virus particles in size. Based on these results, we anticipate our present approach to be a starting point towards establishing the foundation for label-free electrical-based identification and quantification of an unlimited number of viruses and other nano-sized particles.
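
    Written out symbolically, one reading of the empirical relation described in this abstract (taking the "changes relative to mock" as simple differences) is:

```latex
N_{\mathrm{virus}} \;\approx\;
\left| \frac{n_{\mathrm{virus}} - n_{\mathrm{mock}}}
            {V_{D,\mathrm{virus}} - V_{D,\mathrm{mock}}} \right|
```

    where $n$ denotes the dopant concentration extracted from the electrical measurement and $V_D$ the corresponding Debye volume. This is only a transcription of the abstract's wording, not a restatement of the authors' full derivation.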

  14. Statistical image quantification toward optimal scan fusion and change quantification

    NASA Astrophysics Data System (ADS)

    Potesil, Vaclav; Zhou, Xiang Sean

    2007-03-01

    Recent advance of imaging technology has brought new challenges and opportunities for automatic and quantitative analysis of medical images. With broader accessibility of more imaging modalities for more patients, fusion of modalities/scans from one time point and longitudinal analysis of changes across time points have become the two most critical differentiators to support more informed, more reliable and more reproducible diagnosis and therapy decisions. Unfortunately, scan fusion and longitudinal analysis are both inherently plagued with increased levels of statistical errors. A lack of comprehensive analysis by imaging scientists and a lack of full awareness by physicians pose potential risks in clinical practice. In this paper, we discuss several key error factors affecting imaging quantification, studying their interactions, and introducing a simulation strategy to establish general error bounds for change quantification across time. We quantitatively show that image resolution, voxel anisotropy, lesion size, eccentricity, and orientation are all contributing factors to quantification error; and there is an intricate relationship between voxel anisotropy and lesion shape in affecting quantification error. Specifically, when two or more scans are to be fused at feature level, optimal linear fusion analysis reveals that scans with voxel anisotropy aligned with lesion elongation should receive a higher weight than other scans. As a result of such optimal linear fusion, we will achieve a lower variance than naïve averaging. Simulated experiments are used to validate theoretical predictions. Future work based on the proposed simulation methods may lead to general guidelines and error lower bounds for quantitative image analysis and change detection.
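
    The optimal linear fusion argument can be summarized with the standard inverse-variance weighting result, which is consistent with the abstract's claim that the scan whose voxel anisotropy aligns with the lesion elongation (and therefore measures it with smaller error) should receive the larger weight; the notation here is generic rather than the paper's.

```latex
\hat{x} \;=\; \sum_i w_i\, x_i,
\qquad
w_i \;=\; \frac{1/\sigma_i^{2}}{\sum_j 1/\sigma_j^{2}},
\qquad
\operatorname{Var}(\hat{x}) \;=\; \frac{1}{\sum_j 1/\sigma_j^{2}} \;\le\; \min_i \sigma_i^{2}
```

    Whenever the per-scan variances $\sigma_i^2$ differ, this weighted combination therefore has lower variance than the naive equal-weight average.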

  15. Modeling qRT-PCR dynamics with application to cancer biomarker quantification.

    PubMed

    Chervoneva, Inna; Freydin, Boris; Hyslop, Terry; Waldman, Scott A

    2017-01-01

    Quantitative reverse transcription polymerase chain reaction (qRT-PCR) is widely used for molecular diagnostics and evaluating prognosis in cancer. The utility of mRNA expression biomarkers relies heavily on the accuracy and precision of quantification, which is still challenging for low abundance transcripts. The critical step for quantification is accurate estimation of efficiency needed for computing a relative qRT-PCR expression. We propose a new approach to estimating qRT-PCR efficiency based on modeling dynamics of polymerase chain reaction amplification. In contrast, only models for fluorescence intensity as a function of polymerase chain reaction cycle have been used so far for quantification. The dynamics of qRT-PCR efficiency is modeled using an ordinary differential equation model, and the fitted ordinary differential equation model is used to obtain effective polymerase chain reaction efficiency estimates needed for efficiency-adjusted quantification. The proposed new qRT-PCR efficiency estimates were used to quantify GUCY2C (Guanylate Cyclase 2C) mRNA expression in the blood of colorectal cancer patients. Time to recurrence and GUCY2C expression ratios were analyzed in a joint model for survival and longitudinal outcomes. The joint model with GUCY2C quantified using the proposed polymerase chain reaction efficiency estimates provided clinically meaningful results for association between time to recurrence and longitudinal trends in GUCY2C expression.
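
    For context, such efficiency estimates feed into the usual efficiency-adjusted relative expression ratio shown below. This is the conventional formula with a generic reference gene; the authors' contribution is the ODE-based way of estimating the effective efficiencies, not this ratio itself.

```latex
\text{ratio} \;=\;
\frac{(1+E_{\mathrm{target}})^{\,\Delta C_{q,\mathrm{target}}\,(\mathrm{control}-\mathrm{sample})}}
     {(1+E_{\mathrm{ref}})^{\,\Delta C_{q,\mathrm{ref}}\,(\mathrm{control}-\mathrm{sample})}}
```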

  16. Quantification of the activity of biomolecules in microarrays obtained by direct laser transfer.

    PubMed

    Dinca, V; Ranella, A; Farsari, M; Kafetzopoulos, D; Dinescu, M; Popescu, A; Fotakis, C

    2008-10-01

    The direct-writing technique laser-induced forward transfer has been employed for the micro-array printing of liquid solutions of the enzyme horseradish peroxidase and the protein Titin on nitrocellulose solid surfaces. The effect of two UV laser pulse lengths, femtosecond and nanosecond, has been studied in relation to maintaining the activity of the transferred biomolecules. The quantification of the active biomolecules after transfer has been carried out using the Bradford assay, a quantitative colorimetric enzymatic assay and fluorescence techniques. Spectrophotometric measurements of the HRP and Titin activity as well as chromogenic and fluorescence assay studies have revealed a connection between the properties of the deposited, biologically active biomolecules, the experimental conditions and the target composition. The bioassays have shown that up to 78% of the biomolecules remained active after femtosecond laser transfer, while this value reduced to 54% after nanosecond laser transfer. The addition of glycerol at a percentage of up to 70% in the solution to be transferred has contributed to the stabilization of the micro-array patterns and the increase of their resolution.

  17. A new analytical method for quantification of olive and palm oil in blends with other vegetable edible oils based on the chromatographic fingerprints from the methyl-transesterified fraction.

    PubMed

    Jiménez-Carvelo, Ana M; González-Casado, Antonio; Cuadros-Rodríguez, Luis

    2017-03-01

    A new analytical method for the quantification of olive oil and palm oil in blends with other vegetable edible oils (canola, safflower, corn, peanut, seeds, grapeseed, linseed, sesame and soybean) using normal-phase liquid chromatography and applying chemometric tools was developed. The procedure for obtaining the chromatographic fingerprint of the methyl-transesterified fraction of each blend is described. The multivariate quantification methods used were Partial Least Squares Regression (PLS-R) and Support Vector Regression (SVR). The quantification results were evaluated by several parameters such as the Root Mean Square Error of Validation (RMSEV), Mean Absolute Error of Validation (MAEV) and Median Absolute Error of Validation (MdAEV). It has to be highlighted that with the newly proposed analytical method the chromatographic analysis takes only eight minutes, and the results obtained showed the potential of this method and allowed quantification of mixtures of olive oil and palm oil with other vegetable oils. Copyright © 2016 Elsevier B.V. All rights reserved.
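
    A minimal chemometric calibration sketch in the spirit of this abstract is shown below, fitting PLS regression and support vector regression to fingerprint-style data and reporting RMSEV-, MAEV- and MdAEV-type validation errors. The data are random placeholders and the model settings are arbitrary, so this only illustrates the workflow, not the paper's tuned models.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.svm import SVR
from sklearn.model_selection import train_test_split
from sklearn.metrics import mean_squared_error, mean_absolute_error, median_absolute_error

# Illustrative chemometric calibration on chromatographic fingerprints.
# X holds one fingerprint (signal sampled along retention time) per blend and
# y the olive-oil percentage of that blend; both are random placeholders here.
rng = np.random.default_rng(0)
X = rng.random((60, 1500))            # 60 blends x 1500 retention-time points
y = rng.uniform(0, 100, 60)           # % olive oil in each blend

X_cal, X_val, y_cal, y_val = train_test_split(X, y, test_size=0.3, random_state=0)

pls = PLSRegression(n_components=10).fit(X_cal, y_cal)
svr = SVR(kernel="rbf", C=10.0, epsilon=1.0).fit(X_cal, y_cal)

for name, model in (("PLS-R", pls), ("SVR", svr)):
    pred = np.ravel(model.predict(X_val))
    print(f"{name}: RMSEV={mean_squared_error(y_val, pred) ** 0.5:.2f}  "
          f"MAEV={mean_absolute_error(y_val, pred):.2f}  "
          f"MdAEV={median_absolute_error(y_val, pred):.2f}")
```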

  18. Quantification of sterol lipids in plants by quadrupole time-of-flight mass spectrometry

    PubMed Central

    Wewer, Vera; Dombrink, Isabel; vom Dorp, Katharina; Dörmann, Peter

    2011-01-01

    Glycerolipids, sphingolipids, and sterol lipids constitute the major lipid classes in plants. Sterol lipids are composed of free and conjugated sterols, i.e., sterol esters, sterol glycosides, and acylated sterol glycosides. Sterol lipids play crucial roles during adaptation to abiotic stresses and plant-pathogen interactions. Presently, no comprehensive method for sterol lipid quantification in plants is available. We used nanospray ionization quadrupole-time-of-flight mass spectrometry (Q-TOF MS) to resolve and identify the molecular species of all four sterol lipid classes from Arabidopsis thaliana. Free sterols were derivatized with chlorobetainyl chloride. Sterol esters, sterol glycosides, and acylated sterol glycosides were ionized as ammonium adducts. Quantification of molecular species was achieved in the positive mode after fragmentation in the presence of internal standards. The amounts of sterol lipids quantified by Q-TOF MS/MS were validated by comparison with results obtained with TLC/GC. Quantification of sterol lipids from leaves and roots of phosphate-deprived A. thaliana plants revealed changes in the amounts and molecular species composition. The Q-TOF method is far more sensitive than GC or HPLC. Therefore, Q-TOF MS/MS provides a comprehensive strategy for sterol lipid quantification that can be adapted to other tandem mass spectrometers. PMID:21382968

  19. Integrated protocol for reliable and fast quantification and documentation of electrophoresis gels.

    PubMed

    Rehbein, Peter; Schwalbe, Harald

    2015-06-01

    Quantitative analysis of electrophoresis gels is an important part of molecular cloning, as well as of protein expression and purification. Parallel quantifications of yield and purity can be most conveniently obtained from densitometric analysis. This communication reports a comprehensive, reliable and simple protocol for gel quantification and documentation, applicable to single samples and with special features for protein expression screens. As a major component of the protocol, the fully annotated code of a proprietary open source computer program for semi-automatic densitometric quantification of digitized electrophoresis gels is disclosed. The program ("GelQuant") is implemented for the C-based macro-language of the widespread integrated development environment of IGOR Pro. Copyright © 2014 Elsevier Inc. All rights reserved.

  20. A new background subtraction method for Western blot densitometry band quantification through image analysis software.

    PubMed

    Gallo-Oller, Gabriel; Ordoñez, Raquel; Dotor, Javier

    2018-06-01

    Since its first description, Western blot has been widely used in molecular labs. It constitutes a multistep method that allows the detection and/or quantification of proteins from simple to complex protein mixtures. The quantification step of a Western blot is critical for obtaining accurate and reproducible results. Because of the technical knowledge required for densitometry analysis and the limited availability of resources, standard office scanners are often used for the image acquisition of developed Western blot films. Furthermore, the use of semi-quantitative software such as ImageJ (Java-based image-processing and analysis software) is clearly increasing in different scientific fields. In this work, we describe the use of an office scanner coupled with the ImageJ software, together with a new image background subtraction method, for accurate Western blot quantification. The proposed method represents an affordable, accurate and reproducible approach that can be used when resources are limited. Copyright © 2018 Elsevier B.V. All rights reserved.
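
    For context, a generic densitometry workflow with rolling-ball background subtraction (similar in spirit to ImageJ's tool, but not the subtraction method proposed in this paper) might look like the sketch below; the synthetic "scanned blot" and band coordinates are placeholders.

        # Hedged sketch: rolling-ball background subtraction and band densitometry
        # on a synthetic scanned blot. This is NOT the paper's proposed method.
        import numpy as np
        from skimage import restoration

        rng = np.random.default_rng(0)
        # Synthetic scan: light background with a shading gradient and two dark bands.
        blot = 0.9 - 0.2 * np.linspace(0, 1, 400)[None, :] * np.ones((200, 400))
        blot[80:110, 60:140] -= 0.40
        blot[80:110, 220:300] -= 0.25
        blot += rng.normal(0, 0.01, blot.shape)

        signal = 1.0 - blot                                   # bands become bright peaks
        background = restoration.rolling_ball(signal, radius=50)
        corrected = signal - background

        band1 = corrected[80:110, 60:140].sum()               # integrated band densities
        band2 = corrected[80:110, 220:300].sum()
        print(f"band ratio (lane 2 / lane 1): {band2 / band1:.2f}")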

  1. Processing and domain selection: Quantificational variability effects

    PubMed Central

    Harris, Jesse A.; Clifton, Charles; Frazier, Lyn

    2014-01-01

    Three studies investigated how readers interpret sentences with variable quantificational domains, e.g., The army was mostly in the capital, where mostly may quantify over individuals or parts (Most of the army was in the capital) or over times (The army was in the capital most of the time). It is proposed that a general conceptual economy principle, No Extra Times (Majewski 2006, in preparation), discourages the postulation of potentially unnecessary times, and thus favors the interpretation quantifying over parts. Disambiguating an ambiguously quantified sentence to a quantification over times interpretation was rated as less natural than disambiguating it to a quantification over parts interpretation (Experiment 1). In an interpretation questionnaire, sentences with similar quantificational variability were constructed so that both interpretations of the sentence would require postulating multiple times; this resulted in the elimination of the preference for a quantification over parts interpretation, suggesting the parts preference observed in Experiment 1 is not reducible to a lexical bias of the adverb mostly (Experiment 2). An eye movement recording study showed that, in the absence of prior evidence for multiple times, readers exhibit greater difficulty when reading material that forces a quantification over times interpretation than when reading material that allows a quantification over parts interpretation (Experiment 3). These experiments contribute to understanding readers’ default assumptions about the temporal properties of sentences, which is essential for understanding the selection of a domain for adverbial quantifiers and, more generally, for understanding how situational constraints influence sentence processing. PMID:25328262

  2. quantGenius: implementation of a decision support system for qPCR-based gene quantification.

    PubMed

    Baebler, Špela; Svalina, Miha; Petek, Marko; Stare, Katja; Rotter, Ana; Pompe-Novak, Maruša; Gruden, Kristina

    2017-05-25

    Quantitative molecular biology remains a challenge for researchers due to inconsistent approaches to the control of errors in the final results. Because several factors can influence the final result, quantitative analysis and interpretation of qPCR data are still not trivial. Together with the development of high-throughput qPCR platforms, there is a need for a tool allowing robust, reliable and fast nucleic acid quantification. We have developed "quantGenius" ( http://quantgenius.nib.si ), an open-access web application for reliable qPCR-based quantification of nucleic acids. The quantGenius workflow interactively guides the user through data import, quality control (QC) and calculation steps. The input is machine- and chemistry-independent. Quantification is performed using the standard curve approach, with normalization to one or several reference genes. The special feature of the application is the implementation of a user-guided, QC-based decision support system, based on qPCR standards, which takes into account pipetting errors, assay amplification efficiencies, the limits of detection and quantification of the assays, as well as the control of PCR inhibition in individual samples. The intermediate calculations and final results are exportable in a data matrix suitable for further statistical analysis or visualization. We additionally compare the most important features of quantGenius with similar advanced software tools and illustrate the importance of a proper QC system in the analysis of qPCR data in two use cases. To our knowledge, quantGenius is the only qPCR data analysis tool that integrates QC-based decision support and will help scientists to obtain reliable results, which are the basis for biologically meaningful data interpretation.
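
    A minimal sketch of the underlying standard-curve calculation with reference-gene normalization is given below; it is not the quantGenius code, and all Cq values, standard quantities and efficiencies are hypothetical.

        # Hedged sketch: standard-curve quantification of a target assay with
        # normalization to a reference assay. Efficiency is derived from the
        # standard-curve slope as E = 10**(-1/slope) - 1.
        import numpy as np

        def fit_standard_curve(log10_copies, cq):
            slope, intercept = np.polyfit(log10_copies, cq, 1)
            efficiency = 10 ** (-1.0 / slope) - 1.0
            return slope, intercept, efficiency

        def quantify(cq, slope, intercept):
            return 10 ** ((cq - intercept) / slope)

        std_copies = np.log10([1e6, 1e5, 1e4, 1e3, 1e2])
        target_curve = fit_standard_curve(std_copies, [17.1, 20.5, 23.9, 27.3, 30.8])
        ref_curve = fit_standard_curve(std_copies, [16.8, 20.2, 23.6, 27.0, 30.4])

        target_qty = quantify(24.6, *target_curve[:2])
        ref_qty = quantify(21.9, *ref_curve[:2])
        print("efficiency (target):", round(target_curve[2], 3))
        print("normalized quantity:", target_qty / ref_qty)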

  3. Quantification of 11C-Laniquidar Kinetics in the Brain.

    PubMed

    Froklage, Femke E; Boellaard, Ronald; Bakker, Esther; Hendrikse, N Harry; Reijneveld, Jaap C; Schuit, Robert C; Windhorst, Albert D; Schober, Patrick; van Berckel, Bart N M; Lammertsma, Adriaan A; Postnov, Andrey

    2015-11-01

    Overexpression of the multidrug efflux transport P-glycoprotein may play an important role in pharmacoresistance. (11)C-laniquidar is a newly developed tracer of P-glycoprotein expression. The aim of this study was to develop a pharmacokinetic model for quantification of (11)C-laniquidar uptake and to assess its test-retest variability. Two (test-retest) dynamic (11)C-laniquidar PET scans were obtained in 8 healthy subjects. Plasma input functions were obtained using online arterial blood sampling with metabolite corrections derived from manual samples. Coregistered T1 MR images were used for region-of-interest definition. Time-activity curves were analyzed using various plasma input compartmental models. (11)C-laniquidar was metabolized rapidly, with a parent plasma fraction of 50% at 10 min after tracer injection. In addition, the first-pass extraction of (11)C-laniquidar was low. (11)C-laniquidar time-activity curves were best fitted to an irreversible single-tissue compartment (1T1K) model using conventional models. Nevertheless, significantly better fits were obtained using 2 parallel single-tissue compartments, one for parent tracer and the other for labeled metabolites (dual-input model). Robust K1 results were also obtained by fitting the first 5 min of PET data to the 1T1K model, at least when 60-min plasma input data were used. For both models, the test-retest variability of (11)C-laniquidar rate constant for transfer from arterial plasma to tissue (K1) was approximately 19%. The accurate quantification of (11)C-laniquidar kinetics in the brain is hampered by its fast metabolism and the likelihood that labeled metabolites enter the brain. Best fits for the entire 60 min of data were obtained using a dual-input model, accounting for uptake of (11)C-laniquidar and its labeled metabolites. Alternatively, K1 could be obtained from a 5-min scan using a standard 1T1K model. In both cases, the test-retest variability of K1 was approximately 19%. © 2015 by the
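
    To make the 1T1K idea concrete, the sketch below fits K1 for an irreversible single-tissue-compartment model, where the tissue curve equals K1 times the running integral of the plasma input. The data are synthetic placeholders, and plasma metabolite correction and the dual-input model described above are deliberately omitted.

        # Hedged sketch: K1 estimation with an irreversible single-tissue (1T1K) model
        # on synthetic data; not the study's kinetic analysis.
        import numpy as np

        t = np.linspace(0, 5, 31)                      # minutes (first 5 min of data)
        cp = 100 * t * np.exp(-1.5 * t)                # hypothetical plasma input
        # Trapezoidal running integral of the plasma input
        cumulative_cp = np.concatenate(
            ([0.0], np.cumsum(np.diff(t) * 0.5 * (cp[1:] + cp[:-1]))))

        k1_true = 0.12
        tac = k1_true * cumulative_cp + np.random.default_rng(1).normal(0, 0.2, t.size)

        # Linear least-squares estimate of K1 (units illustrative)
        k1_hat = np.dot(cumulative_cp, tac) / np.dot(cumulative_cp, cumulative_cp)
        print("estimated K1:", round(k1_hat, 3))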

  4. Quantification of key long-term risks at CO₂ sequestration sites: Latest results from US DOE's National Risk Assessment Partnership (NRAP) Project

    DOE PAGES

    Pawar, Rajesh; Bromhal, Grant; Carroll, Susan; ...

    2014-12-31

    Risk assessment for geologic CO₂ storage, including quantification of risks, is an area of active investigation. The National Risk Assessment Partnership (NRAP) is a US Department of Energy (US-DOE) effort focused on developing a defensible, science-based methodology and platform for quantifying risk profiles at geologic CO₂ sequestration sites. NRAP has been developing a methodology centered on an integrated assessment model (IAM) that uses a system-modeling approach to quantify risks and risk profiles. The IAM has been used to calculate risk profiles for a few key potential impacts due to potential CO₂ and brine leakage. The simulation results are also used to determine long-term storage security relationships and to compare long-term storage effectiveness against the IPCC storage permanence goal. Additionally, we demonstrate application of the IAM for uncertainty quantification in order to determine the parameters to which the uncertainty in model results is most sensitive.

  5. Comparison of Anaerobic Susceptibility Results Obtained by Different Methods

    PubMed Central

    Rosenblatt, J. E.; Murray, P. R.; Sonnenwirth, A. C.; Joyce, J. L.

    1979-01-01

    Susceptibility tests using 7 antimicrobial agents (carbenicillin, chloramphenicol, clindamycin, penicillin, cephalothin, metronidazole, and tetracycline) were run against 35 anaerobes including Bacteroides fragilis (17), other gram-negative bacilli (7), clostridia (5), peptococci (4), and eubacteria (2). Results in triplicate obtained by the microbroth dilution method and the aerobic modification of the broth disk method were compared with those obtained with an agar dilution method using Wilkins-Chalgren agar. Media used in the microbroth dilution method included Wilkins-Chalgren broth, brain heart infusion broth, brucella broth, tryptic soy broth, thioglycolate broth, and Schaedler's broth. A result differing by more than one dilution from the Wilkins-Chalgren agar result was considered a discrepancy, and when there was a change in susceptibility status this was termed a significant discrepancy. The microbroth dilution method using Wilkins-Chalgren broth and thioglycolate broth produced the fewest total discrepancies (22 and 24, respectively), and Wilkins-Chalgren broth, thioglycolate, and Schaedler's broth had the fewest significant discrepancies (6, 5, and 5, respectively). With the broth disk method, there were 15 significant discrepancies, although half of these were with tetracycline, which was the antimicrobial agent associated with the highest number of significant discrepancies (33), considering all of the test methods and media. PMID:464560

  6. Quantification of skin wrinkles using low coherence interferometry

    NASA Astrophysics Data System (ADS)

    Oh, Jung-Taek; Kim, Beop-Min; Son, Sang-Ryoon; Lee, Sang-Won; Kim, Dong-Yoon; Kim, Youn-Soo

    2004-07-01

    We measure the skin wrinkle topology by means of low coherence interferometry (LCI), which forms the basis of optical coherence tomography (OCT). The skin topology obtained using LCI and the corresponding 2-D fast Fourier transform allow quantification of skin wrinkles. It took approximately 2 minutes to obtain a 2.1 mm x 2.1 mm topological image with 4 µm and 16 µm resolutions in the axial and transverse directions, respectively. Measurement examples show the particular case of skin contour change after anti-wrinkle cosmeceutical treatments and in atopic dermatitis.

  7. Advanced Modeling and Uncertainty Quantification for Flight Dynamics; Interim Results and Challenges

    NASA Technical Reports Server (NTRS)

    Hyde, David C.; Shweyk, Kamal M.; Brown, Frank; Shah, Gautam

    2014-01-01

    As part of the NASA Vehicle Systems Safety Technologies (VSST), Assuring Safe and Effective Aircraft Control Under Hazardous Conditions (Technical Challenge #3), an effort is underway within Boeing Research and Technology (BR&T) to address Advanced Modeling and Uncertainty Quantification for Flight Dynamics (VSST1-7). The scope of the effort is to develop and evaluate advanced multidisciplinary flight dynamics modeling techniques, including integrated uncertainties, to facilitate higher fidelity response characterization of current and future aircraft configurations approaching and during loss-of-control conditions. This approach is to incorporate multiple flight dynamics modeling methods for aerodynamics, structures, and propulsion, including experimental, computational, and analytical. Also to be included are techniques for data integration and uncertainty characterization and quantification. This research shall introduce new and updated multidisciplinary modeling and simulation technologies designed to improve the ability to characterize airplane response in off-nominal flight conditions. The research shall also introduce new techniques for uncertainty modeling that will provide a unified database model comprised of multiple sources, as well as an uncertainty bounds database for each data source such that a full vehicle uncertainty analysis is possible even when approaching or beyond Loss of Control boundaries. Methodologies developed as part of this research shall be instrumental in predicting and mitigating loss of control precursors and events directly linked to causal and contributing factors, such as stall, failures, damage, or icing. The tasks will include utilizing the BR&T Water Tunnel to collect static and dynamic data to be compared to the GTM extended WT database, characterizing flight dynamics in off-nominal conditions, developing tools for structural load estimation under dynamic conditions, devising methods for integrating various modeling elements

  8. Automated Quantification of Hematopoietic Cell – Stromal Cell Interactions in Histological Images of Undecalcified Bone

    PubMed Central

    Zehentmeier, Sandra; Cseresnyes, Zoltan; Escribano Navarro, Juan; Niesner, Raluca A.; Hauser, Anja E.

    2015-01-01

    Confocal microscopy is the method of choice for the analysis of localization of multiple cell types within complex tissues such as the bone marrow. However, the analysis and quantification of cellular localization is difficult, as in many cases it relies on manual counting, thus bearing the risk of introducing a rater-dependent bias and reducing interrater reliability. Moreover, it is often difficult to judge whether the co-localization between two cells results from random positioning, especially when cell types differ strongly in the frequency of their occurrence. Here, a method for unbiased quantification of cellular co-localization in the bone marrow is introduced. The protocol describes the sample preparation used to obtain histological sections of whole murine long bones including the bone marrow, as well as the staining protocol and the acquisition of high-resolution images. An analysis workflow spanning from the recognition of hematopoietic and non-hematopoietic cell types in 2-dimensional (2D) bone marrow images to the quantification of the direct contacts between those cells is presented. This also includes a neighborhood analysis, to obtain information about the cellular microenvironment surrounding a certain cell type. In order to evaluate whether co-localization of two cell types is the mere result of random cell positioning or reflects preferential associations between the cells, a simulation tool which is suitable for testing this hypothesis in the case of hematopoietic as well as stromal cells, is used. This approach is not limited to the bone marrow, and can be extended to other tissues to permit reproducible, quantitative analysis of histological data. PMID:25938636
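
    In the same spirit as the simulation tool described above (though not its actual implementation), a random-placement test for cell-cell contacts can be sketched as follows; the cell coordinates, field size and contact radius are synthetic placeholders.

        # Hedged sketch: Monte Carlo test of whether observed cell-cell contacts
        # exceed what random positioning would produce. Cells are reduced to 2D points.
        import numpy as np
        from scipy.spatial import cKDTree

        rng = np.random.default_rng(0)
        field = (1000.0, 1000.0)                      # image size in pixels
        contact_radius = 15.0                         # pixels; hypothetical

        cells_a = rng.uniform(0, field, size=(200, 2))    # e.g. hematopoietic cells
        cells_b = rng.uniform(0, field, size=(50, 2))     # e.g. stromal cells

        def contacts(a, b, r):
            return cKDTree(a).query_ball_tree(cKDTree(b), r)

        observed = sum(len(hits) > 0 for hits in contacts(cells_a, cells_b, contact_radius))

        # Null distribution: reposition cell type A at random and recount contacts
        null = []
        for _ in range(1000):
            shuffled = rng.uniform(0, field, size=cells_a.shape)
            null.append(sum(len(h) > 0 for h in contacts(shuffled, cells_b, contact_radius)))

        p_value = (np.sum(np.array(null) >= observed) + 1) / (len(null) + 1)
        print(f"observed contacts: {observed}, p = {p_value:.3f}")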

  9. New approach for the quantification of processed animal proteins in feed using light microscopy.

    PubMed

    Veys, P; Baeten, V

    2010-07-01

    A revision of European Union's total feed ban on animal proteins in feed will need robust quantification methods, especially for control analyses, if tolerance levels are to be introduced, as for fishmeal in ruminant feed. In 2006, a study conducted by the Community Reference Laboratory for Animal Proteins in feedstuffs (CRL-AP) demonstrated the deficiency of the official quantification method based on light microscopy. The study concluded that the method had to be revised. This paper puts forward an improved quantification method based on three elements: (1) the preparation of permanent slides with an optical adhesive preserving all morphological markers of bones necessary for accurate identification and precision counting; (2) the use of a counting grid eyepiece reticle; and (3) new definitions for correction factors for the estimated portions of animal particles in the sediment. This revised quantification method was tested on feeds adulterated at different levels with bovine meat and bone meal (MBM) and fishmeal, and it proved to be effortless to apply. The results obtained were very close to the expected values of contamination levels for both types of adulteration (MBM or fishmeal). Calculated values were not only replicable, but also reproducible. The advantages of the new approach, including the benefits of the optical adhesive used for permanent slide mounting and the experimental conditions that need to be met to implement the new method correctly, are discussed.

  10. WE-AB-204-05: Harmonizing PET/CT Quantification in Multicenter Studies: A Case Study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Marques da Silva, A; Fischer, A

    2015-06-15

    Purpose: To present the implementation of a strategy to harmonize FDG PET/CT quantification (SUV) performed with different scanner models and manufacturers. Methods: The strategy was based on Boellaard (2011) and the EARL FDG-PET/CT accreditation program, which propose quality control measurements for harmonizing scanner performance. A NEMA IEC Body phantom study was performed using four different devices: PHP-1 (Gemini TF Base, Philips); PHP-2 (Gemini GXL, Philips); GEH (Discovery 600, General Electric); SMS (Biograph Hi-Rez 16, Siemens). The SUV Recovery Coefficient (RC) was calculated using the clinical protocol and other clinically relevant reconstruction parameters. The most appropriate reconstruction parameters (MARP) for SUV harmonization, in each scanner, are those which achieve the EARL harmonizing standards. They were identified using the lowest root mean square errors (RMSE). To evaluate the strategy’s effectiveness, the Maximum Differences (MD) between the clinical and MARP RC values were calculated. Results: The reconstruction parameters that obtained the lowest RMSE are: FBP 5mm (PHP-1); LOR-RAMLA 2i0.008l (PHP-2); VuePointHD 2i32s10mm (GEH); and FORE+OSEM 4i8s6mm (SMS). Thus, to ensure that quantitative PET image measurements are interchangeable between these sites, images must be reconstructed with the above-mentioned parameters. However, a decoupling between the best image for PET/CT qualitative analysis and the best image for quantification studies was observed. The MD showed that the strategy was effective in reducing the variability of SUV quantification for small structures (<17mm). Conclusion: The harmonization strategy of the SUV quantification implemented with these devices was effective in reducing the variability of small structures quantification, minimizing the inter-scanner and inter-institution differences in quantification. However, it is essential that, in addition to the harmonization of quantification, the standardization of

  11. Motion-aware stroke volume quantification in 4D PC-MRI data of the human aorta.

    PubMed

    Köhler, Benjamin; Preim, Uta; Grothoff, Matthias; Gutberlet, Matthias; Fischbach, Katharina; Preim, Bernhard

    2016-02-01

    4D PC-MRI enables the noninvasive measurement of time-resolved, three-dimensional blood flow data that allow quantification of the hemodynamics. Stroke volumes are essential to assess the cardiac function and evolution of different cardiovascular diseases. The calculation depends on the wall position and vessel orientation, which both change during the cardiac cycle due to the heart muscle contraction and the pumped blood. However, current systems for the quantitative 4D PC-MRI data analysis neglect the dynamic character and instead employ a static 3D vessel approximation. We quantify differences between stroke volumes in the aorta obtained with and without consideration of its dynamics. We describe a method that uses the approximating 3D segmentation to automatically initialize segmentation algorithms that require regions inside and outside the vessel for each temporal position. This enables the use of graph cuts to obtain 4D segmentations, extract vessel surfaces including centerlines for each temporal position and derive motion information. The stroke volume quantification is compared using measuring planes in static (3D) vessels, planes with fixed angulation inside dynamic vessels (this corresponds to the common 2D PC-MRI) and moving planes inside dynamic vessels. Seven datasets with different pathologies such as aneurysms and coarctations were evaluated in close collaboration with radiologists. Compared to the experts' manual stroke volume estimations, motion-aware quantification performs, on average, 1.57% better than calculations without motion consideration. The mean difference between stroke volumes obtained with the different methods is 7.82%. Automatically obtained 4D segmentations overlap by 85.75% with manually generated ones. Incorporating motion information in the stroke volume quantification yields slight but not statistically significant improvements. The presented method is feasible for the clinical routine, since computation times are low and
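
    For readers unfamiliar with the underlying calculation, the sketch below integrates through-plane velocities over a (hypothetical) lumen mask and over the cardiac cycle to obtain a stroke volume; it is a simplified stand-in, not the motion-aware method evaluated here.

        # Hedged sketch: stroke volume from through-plane velocities in a measuring
        # plane. velocity[t, y, x] is the normal velocity component (cm/s) and
        # mask[t, y, x] marks lumen pixels per temporal position; values are synthetic.
        import numpy as np

        n_phases, ny, nx = 20, 32, 32
        pixel_area_cm2 = 0.02                 # in-plane pixel area
        dt_s = 0.045                          # temporal resolution of the cardiac cycle

        rng = np.random.default_rng(0)
        velocity = rng.uniform(0, 40, size=(n_phases, ny, nx))
        mask = np.zeros_like(velocity, dtype=bool)
        mask[:, 8:24, 8:24] = True            # hypothetical (moving) lumen segmentation

        flow_ml_per_s = (velocity * mask).sum(axis=(1, 2)) * pixel_area_cm2  # cm^3/s
        stroke_volume_ml = flow_ml_per_s.sum() * dt_s
        print(f"stroke volume: {stroke_volume_ml:.1f} mL")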

  12. Cues, quantification, and agreement in language comprehension.

    PubMed

    Tanner, Darren; Bulkes, Nyssa Z

    2015-12-01

    We investigated factors that affect the comprehension of subject-verb agreement in English, using quantification as a window into the relationship between morphosyntactic processes in language production and comprehension. Event-related brain potentials (ERPs) were recorded while participants read sentences with grammatical and ungrammatical verbs, in which the plurality of the subject noun phrase was either doubly marked (via overt plural quantification and morphological marking on the noun) or singly marked (via only plural morphology on the noun). Both acceptability judgments and the ERP data showed heightened sensitivity to agreement violations when quantification provided an additional cue to the grammatical number of the subject noun phrase, over and above plural morphology. This is consistent with models of grammatical comprehension that emphasize feature prediction in tandem with cue-based memory retrieval. Our results additionally contrast with those of prior studies that showed no effects of plural quantification on agreement in language production. These findings therefore highlight some nontrivial divergences in the cues and mechanisms supporting morphosyntactic processing in language production and comprehension.

  13. Quantification of Cannabinoid Content in Cannabis

    NASA Astrophysics Data System (ADS)

    Tian, Y.; Zhang, F.; Jia, K.; Wen, M.; Yuan, Ch.

    2015-09-01

    Cannabis is an economically important plant that is used in many fields, in addition to being the most commonly consumed illicit drug worldwide. Monitoring the spatial distribution of cannabis cultivation and judging whether it is drug- or fiber-type cannabis is critical for governments and international communities to understand the scale of the illegal drug trade. The aim of this study was to investigate whether the cannabinoids content in cannabis could be spectrally quantified using a spectrometer and to identify the optimal wavebands for quantifying the cannabinoid content. Spectral reflectance data of dried cannabis leaf samples and the cannabis canopy were measured in the laboratory and in the field, respectively. Correlation analysis and the stepwise multivariate regression method were used to select the optimal wavebands for cannabinoid content quantification based on the laboratory-measured spectral data. The results indicated that the delta-9-tetrahydrocannabinol (THC) content in cannabis leaves could be quantified using laboratory-measured spectral reflectance data and that the 695 nm band is the optimal band for THC content quantification. This study provides prerequisite information for designing spectral equipment to enable immediate quantification of THC content in cannabis and to discriminate drug- from fiber-type cannabis based on THC content quantification in the field.

  14. Automated Quantification of Pneumothorax in CT

    PubMed Central

    Do, Synho; Salvaggio, Kristen; Gupta, Supriya; Kalra, Mannudeep; Ali, Nabeel U.; Pien, Homer

    2012-01-01

    An automated, computer-aided diagnosis (CAD) algorithm for the quantification of pneumothoraces from Multidetector Computed Tomography (MDCT) images has been developed. Algorithm performance was evaluated through comparison to manual segmentation by expert radiologists. A combination of two-dimensional and three-dimensional processing techniques was incorporated to reduce required processing time by two-thirds (as compared to similar techniques). Volumetric measurements on relative pneumothorax size were obtained and the overall performance of the automated method shows an average error of just below 1%. PMID:23082091

  15. Reproducibility study of whole-brain 1H spectroscopic imaging with automated quantification.

    PubMed

    Gu, Meng; Kim, Dong-Hyun; Mayer, Dirk; Sullivan, Edith V; Pfefferbaum, Adolf; Spielman, Daniel M

    2008-09-01

    A reproducibility study of proton MR spectroscopic imaging ((1)H-MRSI) of the human brain was conducted to evaluate the reliability of an automated 3D in vivo spectroscopic imaging acquisition and associated quantification algorithm. A PRESS-based pulse sequence was implemented using dualband spectral-spatial RF pulses designed to fully excite the singlet resonances of choline (Cho), creatine (Cre), and N-acetyl aspartate (NAA) while simultaneously suppressing water and lipids; 1% of the water signal was left to be used as a reference signal for robust data processing, and additional lipid suppression was obtained using adiabatic inversion recovery. Spiral k-space trajectories were used for fast spectral and spatial encoding yielding high-quality spectra from 1 cc voxels throughout the brain with a 13-min acquisition time. Data were acquired with an 8-channel phased-array coil and optimal signal-to-noise ratio (SNR) for the combined signals was achieved using a weighting based on the residual water signal. Automated quantification of the spectrum of each voxel was performed using LCModel. The complete study consisted of eight healthy adult subjects to assess intersubject variations and two subjects scanned six times each to assess intrasubject variations. The results demonstrate that reproducible whole-brain (1)H-MRSI data can be robustly obtained with the proposed methods.

  16. Microplastics in Baltic bottom sediments: Quantification procedures and first results.

    PubMed

    Zobkov, M; Esiukova, E

    2017-01-30

    Microplastics in the marine environment are known as a global ecological problem, but there are still no standardized analysis procedures for their quantification. The first breakthrough in this direction was the NOAA Laboratory Methods for quantifying synthetic particles in water and sediments, but fiber counts have been found to be underestimated with this approach. We propose modifications of these methods that allow the analysis of microplastics in bottom sediments, including small fibers. Addition of an internal standard to sediment samples and occasional empty runs are advised for analysis quality control. The microplastics extraction efficiency using the proposed modifications is 92±7%. The distribution of microplastics in bottom sediments of the Russian part of the Baltic Sea is presented. Microplastic particles were found in all of the samples, with an average concentration of 34±10 items/kg DW, of the same order of magnitude as values reported in neighboring studies. Copyright © 2016 Elsevier Ltd. All rights reserved.

  17. Quantification of heterogeneity observed in medical images.

    PubMed

    Brooks, Frank J; Grigsby, Perry W

    2013-03-02

    There has been much recent interest in the quantification of visually evident heterogeneity within functional grayscale medical images, such as those obtained via magnetic resonance or positron emission tomography. In the case of images of cancerous tumors, variations in grayscale intensity imply variations in crucial tumor biology. Despite these considerable clinical implications, there is as yet no standardized method for measuring the heterogeneity observed via these imaging modalities. In this work, we motivate and derive a statistical measure of image heterogeneity. This statistic measures the distance-dependent average deviation from the smoothest intensity gradation feasible. We show how this statistic may be used to automatically rank images of in vivo human tumors in order of increasing heterogeneity. We test this method against the current practice of ranking images via expert visual inspection. We find that this statistic provides a means of heterogeneity quantification beyond that given by other statistics traditionally used for the same purpose. We demonstrate the effect of tumor shape upon our ranking method and find the method applicable to a wide variety of clinically relevant tumor images. We find that the automated heterogeneity rankings agree very closely with those performed visually by experts. These results indicate that our automated method may be used reliably to rank, in order of increasing heterogeneity, tumor images whether or not object shape is considered to contribute to that heterogeneity. Automated heterogeneity ranking yields objective results which are more consistent than visual rankings. Reducing variability in image interpretation will enable more researchers to better study potential clinical implications of observed tumor heterogeneity.

  18. Sensitivity Analysis and Uncertainty Quantification for the LAMMPS Molecular Dynamics Simulation Code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Picard, Richard Roy; Bhat, Kabekode Ghanasham

    2017-07-18

    We examine sensitivity analysis and uncertainty quantification for molecular dynamics simulation. Extreme (large or small) output values for the LAMMPS code often occur at the boundaries of input regions, and uncertainties in those boundary values are overlooked by common SA methods. Similarly, input values for which code outputs are consistent with calibration data can also occur near boundaries. Upon applying approaches in the literature for imprecise probabilities (IPs), much more realistic results are obtained than for the complacent application of standard SA and code calibration.

  19. Nucleic Acid-Based Cross-Linking Assay for Detection and Quantification of Hepatitis B Virus DNA

    PubMed Central

    Lai, Vicky C. H.; Guan, Richard; Wood, Michael L.; Lo, Su Kong; Yuen, Man-Fung; Lai, Ching-Lung

    1999-01-01

    A nucleic acid photo-cross-linking technology was used to develop a direct assay for the quantification of hepatitis B virus (HBV) DNA levels in serum. Cross-linker-modified DNA probes complementary to the viral genomes of the major HBV subtypes were synthesized and used in an assay that could be completed in less than 6 h. The quantification range of the assay, as determined by testing serial dilutions of Eurohep HBV reference standards and cloned HBV DNA, was 5 × 105 to 3 × 109 molecules of HBV DNA/ml of serum. Within-run and between-run coefficients of variation (CVs) for the assay were 4.3 and 4.0%, respectively. The assay was used to determine HBV DNA levels in 302 serum samples, and the results were compared to those obtained after testing the same samples with the Chiron branched-DNA (bDNA) assay for HBV DNA. Of the samples tested, 218 were positive for HBV DNA by both assays and 72 gave results below the cutoff for both assays. Of the remaining 12 samples, 10 were positive for HBV DNA by the cross-linking assay only; the 2 other samples were positive by the bDNA assay only. Twenty-eight samples had to be retested by the bDNA assay (CV, >20% between the results obtained from the testing of each sample in duplicate), whereas only three samples required retesting by the cross-linking assay. The correlation between the HBV DNA levels, as measured by the two tests, was very high (r = 0.902; P = 0.01). We conclude that the cross-linking assay is a sensitive and reproducible method for the detection and quantification of HBV DNA levels in serum. PMID:9854083

  20. Quantitative Proteomics via High Resolution MS Quantification: Capabilities and Limitations

    PubMed Central

    Higgs, Richard E.; Butler, Jon P.; Han, Bomie; Knierman, Michael D.

    2013-01-01

    Recent improvements in the mass accuracy and resolution of mass spectrometers have led to renewed interest in label-free quantification using data from the primary mass spectrum (MS1) acquired from data-dependent proteomics experiments. The capacity for higher specificity quantification of peptides from samples enriched for proteins of biological interest offers distinct advantages for hypothesis generating experiments relative to immunoassay detection methods or prespecified peptide ions measured by multiple reaction monitoring (MRM) approaches. Here we describe an evaluation of different methods to post-process peptide level quantification information to support protein level inference. We characterize the methods by examining their ability to recover a known dilution of a standard protein in background matrices of varying complexity. Additionally, the MS1 quantification results are compared to a standard, targeted, MRM approach on the same samples under equivalent instrument conditions. We show the existence of multiple peptides with MS1 quantification sensitivity similar to the best MRM peptides for each of the background matrices studied. Based on these results we provide recommendations on preferred approaches to leveraging quantitative measurements of multiple peptides to improve protein level inference. PMID:23710359

  1. Targeted quantification of low ng/mL level proteins in human serum without immunoaffinity depletion

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shi, Tujin; Sun, Xuefei; Gao, Yuqian

    2013-07-05

    We recently reported an antibody-free targeted protein quantification strategy, termed high-pressure, high-resolution separations with intelligent selection and multiplexing (PRISM) for achieving significantly enhanced sensitivity using selected reaction monitoring (SRM) mass spectrometry. Integrating PRISM with front-end IgY14 immunoaffinity depletion, sensitive detection of targeted proteins at 50-100 pg/mL levels in human blood plasma/serum was demonstrated. However, immunoaffinity depletion is often associated with undesired losses of target proteins of interest. Herein we report further evaluation of PRISM-SRM quantification of low-abundance serum proteins without immunoaffinity depletion and the multiplexing potential of this technique. Limits of quantification (LOQs) at low ng/mL levels with a median CV of ~12% were achieved for proteins spiked into human female serum using as little as 2 µL serum. PRISM-SRM provided up to ~1000-fold improvement in the LOQ when compared to conventional SRM measurements. Multiplexing capability of PRISM-SRM was also evaluated by two sets of serum samples with 6 and 21 target peptides spiked at the low attomole/µL levels. The results from SRM measurements for pooled or post-concatenated samples were comparable to those obtained from individual peptide fractions in terms of signal-to-noise ratios and SRM peak area ratios of light to heavy peptides. PRISM-SRM was applied to measure several ng/mL-level endogenous plasma proteins, including prostate-specific antigen, in clinical patient sera where correlation coefficients > 0.99 were observed between the results from PRISM-SRM and ELISA assays. Our results demonstrate that PRISM-SRM can be successfully used for quantification of low-abundance endogenous proteins in highly complex samples. Moderate throughput (50 samples/week) can be achieved by applying the post-concatenation or fraction multiplexing strategies. We anticipate broad applications for targeted PRISM

  2. Simplified quantification and whole-body distribution of [18F]FE-PE2I in nonhuman primates: prediction for human studies.

    PubMed

    Varrone, Andrea; Gulyás, Balázs; Takano, Akihiro; Stabin, Michael G; Jonsson, Cathrine; Halldin, Christer

    2012-02-01

    [(18)F]FE-PE2I is a promising dopamine transporter (DAT) radioligand. In nonhuman primates, we examined the accuracy of simplified quantification methods and the estimates of radiation dose of [(18)F]FE-PE2I. In the quantification study, binding potential (BP(ND)) values previously reported in three rhesus monkeys using kinetic and graphical analyses of [(18)F]FE-PE2I were used for comparison. BP(ND) using the cerebellum as reference region was obtained with four reference tissue methods applied to the [(18)F]FE-PE2I data that were compared with the kinetic and graphical analyses. In the whole-body study, estimates of adsorbed radiation were obtained in two cynomolgus monkeys. All reference tissue methods provided BP(ND) values within 5% of the values obtained with the kinetic and graphical analyses. The shortest imaging time for stable BP(ND) estimation was 54 min. The average effective dose of [(18)F]FE-PE2I was 0.021 mSv/MBq, similar to 2-deoxy-2-[(18)F]fluoro-d-glucose. The results in nonhuman primates suggest that [(18)F]FE-PE2I is suitable for accurate and stable DAT quantification, and its radiation dose estimates would allow for a maximal administered radioactivity of 476 MBq in human subjects. Copyright © 2012 Elsevier Inc. All rights reserved.

  3. Simple and accurate quantification of BTEX in ambient air by SPME and GC-MS.

    PubMed

    Baimatova, Nassiba; Kenessov, Bulat; Koziel, Jacek A; Carlsen, Lars; Bektassov, Marat; Demyanenko, Olga P

    2016-07-01

    Benzene, toluene, ethylbenzene and xylenes (BTEX) comprise one of the most ubiquitous and hazardous groups of ambient air pollutants of concern. Application of standard analytical methods for quantification of BTEX is limited by the complexity of sampling and sample preparation equipment, and by budget requirements. Methods based on SPME represent a simpler alternative, but still require complex calibration procedures. The objective of this research was to develop a simpler, low-budget and accurate method for quantification of BTEX in ambient air based on SPME and GC-MS. Standard 20-mL headspace vials were used for field air sampling and calibration. To avoid the challenges of obtaining and working with 'zero' air, slope factors for external standard calibration were determined using standard addition and inherently polluted lab air. For the polydimethylsiloxane (PDMS) fiber, differences between the slope factors of calibration plots obtained using lab and outdoor air were below 14%. The PDMS fiber provided higher precision during calibration, while the use of the Carboxen/PDMS fiber resulted in lower detection limits for benzene and toluene. To provide sufficient accuracy, the use of 20-mL vials requires triplicate sampling and analysis. The method was successfully applied to the analysis of 108 ambient air samples from Almaty, Kazakhstan. Average concentrations of benzene, toluene, ethylbenzene and o-xylene were 53, 57, 11 and 14 µg m(-3), respectively. The developed method can be modified for further quantification of a wider range of volatile organic compounds in air. In addition, the new method is amenable to automation. Copyright © 2016 Elsevier B.V. All rights reserved.
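
    The standard-addition idea behind the slope-factor determination can be sketched as follows; the added masses and peak areas are hypothetical, and the native amount is read from the x-axis intercept of the regression line.

        # Hedged sketch: classic standard-addition calibration, the general idea behind
        # calibrating with inherently polluted lab air. All numbers are hypothetical.
        import numpy as np

        added_ng = np.array([0.0, 5.0, 10.0, 20.0, 40.0])       # analyte added to the vial
        peak_area = np.array([1.20e5, 1.75e5, 2.31e5, 3.40e5, 5.62e5])

        slope, intercept = np.polyfit(added_ng, peak_area, 1)
        native_ng = intercept / slope          # amount already present in the sampled air
        print(f"slope factor: {slope:.3g} area/ng, native analyte: {native_ng:.1f} ng")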

  4. How to Make Data a Blessing to Parametric Uncertainty Quantification and Reduction?

    NASA Astrophysics Data System (ADS)

    Ye, M.; Shi, X.; Curtis, G. P.; Kohler, M.; Wu, J.

    2013-12-01

    From a Bayesian point of view, the probabilities of model parameters and predictions are conditioned on the data used for parameter inference and prediction analysis. It is critical to use appropriate data for quantifying parametric uncertainty and its propagation to model predictions. However, data are always limited and imperfect. When a dataset cannot properly constrain the model parameters, it may lead to inaccurate uncertainty quantification. While in this case data appear to be a curse to uncertainty quantification, a comprehensive modeling analysis may help to understand the cause and characteristics of parametric uncertainty and thus turn data into a blessing. In this study, we illustrate the impacts of data on uncertainty quantification and reduction using the example of a surface complexation model (SCM) developed to simulate uranyl (U(VI)) adsorption. The model includes two adsorption sites, referred to as strong and weak sites. The amount of uranium adsorption on these sites determines both the mean arrival time and the long tail of the breakthrough curves. There is one reaction on the weak site but two reactions on the strong site. The unknown parameters include the fractions of the total surface site density of the two sites and the surface complex formation constants of the three reactions. A total of seven experiments were conducted under different geochemical conditions to estimate these parameters. The experiments with low initial concentration of U(VI) result in a large amount of parametric uncertainty. A modeling analysis shows that this is because the experiments cannot distinguish the relative adsorption affinity of the strong and weak sites for uranium. Therefore, the experiments with high initial concentration of U(VI) are needed, because in these experiments the strong site is nearly saturated and the weak site can be determined. The experiments with high initial concentration of U(VI) are a blessing to uncertainty quantification, and the experiments with low initial

  5. Quantification of chitinase and thaumatin-like proteins in grape juices and wines.

    PubMed

    Le Bourse, D; Conreux, A; Villaume, S; Lameiras, P; Nuzillard, J-M; Jeandet, P

    2011-09-01

    Chitinases and thaumatin-like proteins are important grape proteins as they have a great influence on wine quality. The quantification of these proteins in grape juices and wines, along with their purification, is therefore crucial to study their intrinsic characteristics and the exact role they play in wines. The main isoforms of these two proteins from Chardonnay grape juice were thus purified by liquid chromatography. Two fast protein liquid chromatography (FPLC) steps allowed the fractionation and purification of the juice proteins, using cation exchange and hydrophobic interaction media. A further high-performance liquid chromatography (HPLC) step was used to achieve higher purity levels. Fraction assessment was achieved by mass spectrometry. Fraction purity was determined by HPLC to detect the presence of protein contaminants, and by nuclear magnetic resonance (NMR) spectroscopy to detect the presence of organic contaminants. Once pure fractions of lyophilized chitinase and thaumatin-like protein were obtained, ultra-HPLC (UHPLC) and enzyme-linked immunosorbent assay (ELISA) calibration curves were constructed. The quantification of these proteins in different grape juice and wine samples was thus achieved for the first time with both techniques through comparison with the purified protein calibration curve. UHPLC and ELISA showed very consistent results (less than 16% deviation for both proteins) and either could be considered to provide an accurate and reliable quantification of proteins in the oenology field.

  6. Quantification of triglyceride content in oleaginous materials using thermo-gravimetry

    DOE PAGES

    Maddi, Balakrishna; Vadlamani, Agasteswar; Viamajala, Sridhar; ...

    2017-10-16

    Laboratory analytical methods for quantification of triglyceride content in oleaginous biomass samples, especially microalgae, require toxic chemicals and/or organic solvents and involve multiple steps. We describe a simple triglyceride quantification method that uses thermo-gravimetry. This method is based on the observation that triglycerides undergo near-complete volatilization/degradation over a narrow temperature interval with a derivative weight loss peak at 420 °C when heated in an inert atmosphere. Degradation of the other constituents of oleaginous biomass (protein and carbohydrates) is largely complete after prolonged exposure of samples at 320 °C. Based on these observations, the triglyceride content of oleaginous biomass was estimated by using the following two-step process. In Step 1, samples were heated to 320 °C and kept isothermal at this temperature for 15 min. In Step 2, samples were heated from 320 °C to 420 °C and then kept isothermal at 420 °C for 15 min. The results show that mass loss in Step 2 correlated well with triglyceride content estimates obtained from conventional techniques for diverse microalgae and oilseed samples.
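
    A minimal sketch of the two-step mass-loss calculation is given below; the TGA weight trace and hold end points are synthetic placeholders chosen only to illustrate how the Step 2 loss maps to triglyceride content.

        # Hedged sketch: triglyceride content from the Step 2 mass loss of a
        # thermo-gravimetric run, following the two-step scheme described above.
        import numpy as np

        time_min = np.linspace(0, 120, 601)
        weight_mg = np.interp(time_min, [0, 40, 55, 80, 95, 120],
                              [10.0, 8.2, 8.0, 6.1, 6.0, 6.0])   # synthetic trace

        end_step1 = 55.0    # minute at which the 320 C isothermal hold ends
        end_step2 = 95.0    # minute at which the 420 C isothermal hold ends

        w1 = np.interp(end_step1, time_min, weight_mg)
        w2 = np.interp(end_step2, time_min, weight_mg)
        tag_percent = 100.0 * (w1 - w2) / weight_mg[0]
        print(f"estimated triglyceride content: {tag_percent:.1f} wt%")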

  8. Quantification of immobilized Candida antarctica lipase B (CALB) using ICP-AES combined with Bradford method.

    PubMed

    Nicolás, Paula; Lassalle, Verónica L; Ferreira, María L

    2017-02-01

    The aim of this manuscript was to study the application of a new method of protein quantification to commercial Candida antarctica lipase B solutions. Error sources associated with the traditional Bradford technique were demonstrated. Eight biocatalysts based on C. antarctica lipase B (CALB) immobilized onto magnetite nanoparticles were used. Magnetite nanoparticles were coated with chitosan (CHIT) and modified with glutaraldehyde (GLUT) and aminopropyltriethoxysilane (APTS). CALB was then adsorbed onto the modified support. The proposed novel protein quantification method included the determination of sulfur (from protein in the CALB solution) by means of Atomic Emission by Inductive Coupling Plasma (AE-ICP). Four different protocols combining AE-ICP and classical Bradford assays, as well as carbon, hydrogen and nitrogen (CHN) analysis, were applied. The calculated error in protein content using the "classic" Bradford method with bovine serum albumin as standard ranged from 400 to 1200% when protein in the CALB solution was quantified. These errors were calculated by taking the amount of immobilized protein obtained with the improved method as the "true" protein content. The optimum quantification procedure involved the combination of the Bradford method, AE-ICP and CHN analysis. Copyright © 2016 Elsevier Inc. All rights reserved.

  9. Intramyocellular lipid quantification: repeatability with 1H MR spectroscopy.

    PubMed

    Torriani, Martin; Thomas, Bijoy J; Halpern, Elkan F; Jensen, Megan E; Rosenthal, Daniel I; Palmer, William E

    2005-08-01

    To prospectively determine the repeatability and variability of tibialis anterior intramyocellular lipid (IMCL) quantifications performed by using 1.5-T hydrogen 1 (1H) magnetic resonance (MR) spectroscopy in healthy subjects. Institutional review board approval and written informed consent were obtained for this Health Insurance Portability and Accountability Act-compliant study. The authors examined the anterior tibial muscles of 27 healthy subjects aged 19-48 years (12 men, 15 women; mean age, 25 years) by using single-voxel short-echo-time point-resolved 1H MR spectroscopy. During a first visit, the subjects underwent 1H MR spectroscopy before and after being repositioned in the magnet bore, with voxels carefully placed on the basis of osseous landmarks. Measurements were repeated after a mean interval of 12 days. All spectra were fitted by using Java-based MR user interface (jMRUI) and LCModel software, and lipid peaks were scaled to the unsuppressed water peak (at 4.7 ppm) and the total creatine peak (at approximately 3.0 ppm). A one-way random-effects variance components model was used to determine intraday and intervisit coefficients of variation (CVs). A power analysis was performed to determine the detectable percentage change in lipid measurements for two subject sample sizes. Measurements of the IMCL methylene protons peak at a resonance of 1.3 ppm scaled to the unsuppressed water peak (IMCL(W)) that were obtained by using jMRUI software yielded the lowest CVs overall (intraday and intervisit CVs, 13.4% and 14.4%, respectively). The random-effects variance components model revealed that nonbiologic factors (equipment and repositioning) accounted for 50% of the total variability in IMCL quantifications. Power analysis for a sample size of 20 subjects revealed that changes in IMCL(W) of greater than 15% could be confidently detected between 1H MR spectroscopic measurements obtained on different days. 1H MR spectroscopy is feasible for repeatable

  10. Uncertainty Quantification for Robust Control of Wind Turbines using Sliding Mode Observer

    NASA Astrophysics Data System (ADS)

    Schulte, Horst

    2016-09-01

    A new quantification method of uncertain models for robust wind turbine control using sliding-mode techniques is presented with the objective to improve active load mitigation. This approach is based on the so-called equivalent output injection signal, which corresponds to the average behavior of the discontinuous switching term, establishing and maintaining a motion on a so-called sliding surface. The injection signal is directly evaluated to obtain estimates of the uncertainty bounds of external disturbances and parameter uncertainties. The applicability of the proposed method is illustrated by the quantification of a four degree-of-freedom model of the NREL 5MW reference turbine containing uncertainties.

  11. Evaluation of digital PCR for absolute RNA quantification.

    PubMed

    Sanders, Rebecca; Mason, Deborah J; Foy, Carole A; Huggett, Jim F

    2013-01-01

    Gene expression measurements detailing mRNA quantities are widely employed in molecular biology and are increasingly important in diagnostic fields. Reverse transcription (RT), necessary for generating complementary DNA, can be both inefficient and imprecise, but remains a quintessential RNA analysis tool using qPCR. This study developed a Transcriptomic Calibration Material and assessed the RT reaction using digital (d)PCR for RNA measurement. While many studies characterise dPCR capabilities for DNA quantification, less work has been performed investigating similar parameters using RT-dPCR for RNA analysis. RT-dPCR measurement using three, one-step RT-qPCR kits was evaluated using single and multiplex formats when measuring endogenous and synthetic RNAs. The best performing kit was compared to UV quantification and sensitivity and technical reproducibility investigated. Our results demonstrate assay and kit dependent RT-dPCR measurements differed significantly compared to UV quantification. Different values were reported by different kits for each target, despite evaluation of identical samples using the same instrument. RT-dPCR did not display the strong inter-assay agreement previously described when analysing DNA. This study demonstrates that, as with DNA measurement, RT-dPCR is capable of accurate quantification of low copy RNA targets, but the results are both kit and target dependent supporting the need for calibration controls.
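
    For reference, the standard Poisson calculation that underlies absolute dPCR quantification (independent of any particular kit or of the RT step evaluated here) is sketched below with hypothetical partition counts.

        # Hedged sketch: absolute quantification from digital PCR partition counts
        # via Poisson correction. Numbers are hypothetical.
        import math

        positive = 4230          # partitions with amplification
        total = 20000            # analysed partitions
        partition_volume_ul = 0.00085

        copies_per_partition = -math.log(1.0 - positive / total)
        copies_per_ul = copies_per_partition / partition_volume_ul
        print(f"{copies_per_ul:.0f} copies per microlitre of reaction")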

  12. Exploiting multicompartment effects in triple-echo steady-state T2 mapping for fat fraction quantification.

    PubMed

    Liu, Dian; Steingoetter, Andreas; Curcic, Jelena; Kozerke, Sebastian

    2018-01-01

    To investigate and exploit the effect of intravoxel off-resonance compartments in the triple-echo steady-state (TESS) sequence without fat suppression for T2 mapping and to leverage the results for fat fraction quantification. In multicompartment tissue, where at least one compartment is excited off-resonance, the total signal exhibits periodic modulations as a function of echo time (TE). Simulated multicompartment TESS signals were synthesized at various TEs. Fat emulsion phantoms were prepared and scanned at the same TE combinations using TESS. In vivo knee data were obtained with TESS to validate the simulations. The multicompartment effect was exploited for fat fraction quantification in the stomach by acquiring TESS signals at two TE combinations. Simulated and measured multicompartment signal intensities were in good agreement. Multicompartment effects caused erroneous T2 offsets, even at low water-fat ratios. The choice of TE caused T2 variations of as much as 28% in cartilage. The feasibility of fat fraction quantification to monitor the decrease of fat content in the stomach during digestion is demonstrated. Intravoxel off-resonance compartments are a confounding factor for T2 quantification using TESS, causing errors that are dependent on the TE. At the same time, off-resonance effects may allow for efficient fat fraction mapping using steady-state imaging. Magn Reson Med 79:423-429, 2018. © 2017 International Society for Magnetic Resonance in Medicine.
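
    The periodic signal modulation described above can be illustrated with a two-compartment toy model, sketched below; the water and fat fractions and the chemical-shift offset are illustrative, and the full TESS signal model is not reproduced.

        # Hedged sketch: net signal magnitude of a water + off-resonance fat voxel
        # oscillating as a function of echo time.
        import numpy as np

        te_ms = np.linspace(2.0, 12.0, 200)
        water, fat = 0.8, 0.2
        delta_f_hz = 440.0                    # approx. water-fat shift at 3 T

        signal = np.abs(water + fat * np.exp(2j * np.pi * delta_f_hz * te_ms * 1e-3))
        print("modulation depth:", round(signal.max() - signal.min(), 3))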

  13. A quantification model for the structure of clay materials.

    PubMed

    Tang, Liansheng; Sang, Haitao; Chen, Haokun; Sun, Yinlei; Zhang, Longjian

    2016-07-04

    In this paper, the quantification of clay structure is explicitly explained, and the approach and goals of quantification are discussed. The authors consider the purpose of quantifying clay structure to be the determination of parameters that can quantitatively characterize the impact of clay structure on macro-mechanical behaviour. Based on system theory and the law of energy conservation, a quantification model for the structural characteristics of clay materials is established, three quantitative parameters (deformation structure potential, strength structure potential and comprehensive structure potential) are proposed, and the corresponding tests are conducted. The experimental results show that these quantitative parameters accurately reflect the influence of clay structure on deformation behaviour, on strength behaviour and on the relative magnitude of the structural influence on these two behaviours, respectively. These quantitative parameters have explicit mechanical meanings and can be used to characterize the structural influence of clay on its mechanical behaviour.

  14. Multiple headspace-solid-phase microextraction: an application to quantification of mushroom volatiles.

    PubMed

    Costa, Rosaria; Tedone, Laura; De Grazia, Selenia; Dugo, Paola; Mondello, Luigi

    2013-04-03

    Multiple headspace-solid phase microextraction (MHS-SPME) followed by gas chromatography/mass spectrometry (GC-MS) and flame ionization detection (GC-FID) was applied to the identification and quantification of volatiles released by the mushroom Agaricus bisporus, also known as champignon. MHS-SPME allows quantitative analysis of volatiles from solid matrices to be performed free of matrix interferences. The samples analyzed were fresh mushrooms (chopped and homogenized) and mushroom-containing food dressings. 1-Octen-3-ol, 3-octanol, 3-octanone, 1-octen-3-one and benzaldehyde were common constituents of the samples analyzed. Method performance was tested through evaluation of the limit of detection (LoD, range 0.033-0.078 ng), limit of quantification (LoQ, range 0.111-0.259 ng) and analyte recovery (92.3-108.5%). The results showed quantitative differences among the samples, which can be attributed to critical factors, such as the degree of cell damage during sample preparation, that are discussed here. Considerations on mushroom biochemistry and on the basic principles of MHS analysis are also presented. Copyright © 2013 Elsevier B.V. All rights reserved.

  15. Artifacts Quantification of Metal Implants in MRI

    NASA Astrophysics Data System (ADS)

    Vrachnis, I. N.; Vlachopoulos, G. F.; Maris, T. G.; Costaridou, L. I.

    2017-11-01

    The presence of materials with different magnetic properties, such as metal implants, locally distorts the magnetic field, resulting in signal voids and pile-ups, i.e. susceptibility artifacts, in MRI. Quantitative and unbiased measurement of the artifact is a prerequisite for optimization of acquisition parameters. In this study, an image-gradient-based segmentation method is proposed for susceptibility artifact quantification. The method captures abrupt signal alterations by calculating the image gradient; the artifact is then quantified in terms of its extent, as an image area percentage, by an automated cross-entropy thresholding method. The proposed method was tested in phantoms containing two orthopedic implants with significantly different magnetic permeabilities. It was compared against a method from the literature, taken as a reference, demonstrating moderate to good correlation (Spearman’s rho = 0.62 and 0.802 for the titanium and stainless steel implants, respectively). The automated character of the proposed quantification method appears promising for MRI acquisition parameter optimization.
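
    The core of the approach (gradient magnitude, automatic thresholding, area percentage) can be sketched in a few lines. The snippet below uses scikit-image's minimum cross-entropy threshold (threshold_li) as a stand-in for the cross-entropy method referenced in the abstract, and a synthetic image in place of real phantom data.

```python
import numpy as np
from skimage import filters

# Sketch of gradient-based artifact quantification: abrupt signal changes are
# captured by the gradient magnitude, an automatic threshold segments the
# high-gradient (artifact) pixels, and the artifact is reported as an area
# percentage. threshold_li (minimum cross-entropy) is used as a stand-in for the
# cross-entropy method in the abstract; the synthetic image is hypothetical.
rng = np.random.default_rng(0)
img = rng.normal(100.0, 5.0, (256, 256))
img[100:140, 100:140] = 0.0          # crude "signal void" mimicking a susceptibility artifact

grad = filters.sobel(img)            # gradient magnitude
thr = filters.threshold_li(grad)     # automatic cross-entropy-style threshold
artifact_mask = grad > thr

artifact_pct = 100.0 * artifact_mask.sum() / artifact_mask.size
print(f"artifact extent: {artifact_pct:.2f}% of image area")
```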

  16. Fluorescent quantification of melanin.

    PubMed

    Fernandes, Bruno; Matamá, Teresa; Guimarães, Diana; Gomes, Andreia; Cavaco-Paulo, Artur

    2016-11-01

    Melanin quantification is reportedly performed by absorption spectroscopy, commonly at 405 nm. Here, we propose the implementation of fluorescence spectroscopy for melanin assessment. In a typical in vitro assay to assess melanin production in response to an external stimulus, absorption spectroscopy clearly overvalues melanin content. This method is also incapable of distinguishing non-melanotic/amelanotic control cells from those that are actually capable of performing melanogenesis. Therefore, fluorescence spectroscopy is the best method for melanin quantification as it proved to be highly specific and accurate, detecting even small variations in the synthesis of melanin. This method can also be applied to the quantification of melanin in more complex biological matrices like zebrafish embryos and human hair. © 2016 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  17. Simultaneous quantification of flavonoids and triterpenoids in licorice using HPLC.

    PubMed

    Wang, Yuan-Chuen; Yang, Yi-Shan

    2007-05-01

    Numerous bioactive compounds are present in licorice (Glycyrrhizae Radix), including flavonoids and triterpenoids. In this study, a reversed-phase high-performance liquid chromatography (HPLC) method for the simultaneous quantification of three flavonoids (liquiritin, liquiritigenin and isoliquiritigenin) and four triterpenoids (glycyrrhizin, 18alpha-glycyrrhetinic acid, 18beta-glycyrrhetinic acid and 18beta-glycyrrhetinic acid methyl ester) from licorice was developed and then used to quantify these seven compounds in 20 different licorice samples. Specifically, the reversed-phase HPLC was performed with a gradient mobile phase composed of 25 mM phosphate buffer (pH 2.5)-acetonitrile, with gradient elution steps as follows: 0 min, 100:0; 10 min, 80:20; 50 min, 70:30; 73 min, 50:50; 110 min, 50:50; 125 min, 20:80; 140 min, 20:80, and peaks were detected at 254 nm. With this technique, rather good specificity was obtained with regard to the separation of the seven compounds. The regression coefficients of the linear calibration equations for the seven compounds lay between 0.9978 and 0.9992. The limits of detection and quantification lay in the ranges 0.044-0.084 and 0.13-0.25 microg/ml, respectively. The relative recovery rates for the seven compounds lay between 96.63+/-2.43% and 103.55+/-2.77%. Coefficients of variation for intra-day and inter-day precision lay in the ranges 0.20-1.84 and 0.28-1.86%, respectively. Based on our validation results, this analytical technique is a convenient method for the simultaneous quantification of numerous bioactive compounds derived from licorice, featuring good quantification parameters, accuracy and precision.

  18. Preclinical Biokinetic Modelling of Tc-99m Radiophamaceuticals Obtained from Semi-Automatic Image Processing.

    PubMed

    Cornejo-Aragón, Luz G; Santos-Cuevas, Clara L; Ocampo-García, Blanca E; Chairez-Oria, Isaac; Diaz-Nieto, Lorenza; García-Quiroz, Janice

    2017-01-01

    The aim of this study was to develop a semi-automatic image processing algorithm (AIPA) based on the simultaneous information provided by X-ray and radioisotopic images to determine the biokinetic models of Tc-99m radiopharmaceuticals from quantification of image radiation activity in murine models. The radioisotopic images were obtained by a CCD (charge-coupled device) camera coupled to an ultrathin phosphorous screen in a preclinical multimodal imaging system (Xtreme, Bruker). The AIPA consisted of different image processing methods for background, scattering and attenuation correction in the activity quantification. A set of parametric identification algorithms was used to obtain the biokinetic models that characterize the interaction between different tissues and the radiopharmaceuticals considered in the study. The resulting biokinetic models corresponded to the Tc-99m biodistribution observed in different ex vivo studies, confirming the contribution of the semi-automatic image processing technique developed in this study.

  19. Instantaneous Wavenumber Estimation for Damage Quantification in Layered Plate Structures

    NASA Technical Reports Server (NTRS)

    Mesnil, Olivier; Leckey, Cara A. C.; Ruzzene, Massimo

    2014-01-01

    This paper illustrates the application of instantaneous and local wavenumber damage quantification techniques for high frequency guided wave interrogation. The proposed methodologies can be considered as first steps towards a hybrid structural health monitoring/nondestructive evaluation (SHM/NDE) approach for damage assessment in composites. The challenges and opportunities related to the considered type of interrogation and signal processing are explored through the analysis of numerical data obtained via EFIT simulations of damage in CFRP plates. Realistic damage configurations are modeled from x-ray CT scan data of plates subjected to actual impacts, in order to accurately predict wave-damage interactions in terms of scattering and mode conversions. Simulation data is utilized to enhance the information provided by instantaneous and local wavenumbers and mitigate the complexity related to the multi-modal content of the plate response. Signal processing strategies considered for this purpose include modal decoupling through filtering in the frequency/wavenumber domain, the combination of displacement components, and the exploitation of polarization information for the various modes as evaluated through the dispersion analysis of the considered laminate lay-up sequence. The results presented assess the effectiveness of the proposed wavefield processing techniques as a hybrid SHM/NDE technique for damage detection and quantification in composite, plate-like structures.

  20. Estimation of the quantification uncertainty from flow injection and liquid chromatography transient signals in inductively coupled plasma mass spectrometry

    NASA Astrophysics Data System (ADS)

    Laborda, Francisco; Medrano, Jesús; Castillo, Juan R.

    2004-06-01

    The quality of the quantitative results obtained from transient signals in high-performance liquid chromatography-inductively coupled plasma mass spectrometry (HPLC-ICPMS) and flow injection-inductively coupled plasma mass spectrometry (FI-ICPMS) was investigated under multielement conditions. Quantification methods were based on multiple-point calibration by simple and weighted linear regression, and on double-point calibration (measurement of the baseline and one standard). An uncertainty model, which includes the main sources of uncertainty in FI-ICPMS and HPLC-ICPMS (signal measurement, sample flow rate and injection volume), was developed to estimate peak area uncertainties and the statistical weights used in weighted linear regression. The behaviour of the ICPMS instrument was characterized so that it could be accounted for in the model, leading to the conclusion that the instrument works as a concentration detector when used to monitor transient signals from flow injection or chromatographic separations. Proper quantification by the three calibration methods was achieved when compared to reference materials, and the double-point calibration gave results of the same quality as the multiple-point calibration while shortening the calibration time. Relative expanded uncertainties ranged from 10-20% for concentrations around the LOQ to 5% for concentrations higher than 100 times the LOQ.
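
    As a rough illustration of the two calibration schemes compared in the study, the sketch below fits a weighted linear calibration (weights derived from modelled peak-area uncertainties) and a double-point calibration (baseline plus one standard) to made-up peak areas; none of the numbers come from the paper.

```python
import numpy as np

# Weighted multiple-point calibration versus double-point calibration for
# transient-signal peak areas. Concentrations, areas and the uncertainty model
# below are hypothetical illustration values.
conc = np.array([0.0, 1.0, 2.0, 5.0, 10.0, 20.0])                    # standards (ug/L)
area = np.array([40.0, 530.0, 1010.0, 2540.0, 5060.0, 10090.0])      # peak areas (counts)
u_area = 0.02 * area + 20.0                                          # modelled area uncertainty

w = 1.0 / u_area                        # np.polyfit expects weights proportional to 1/sigma
slope, intercept = np.polyfit(conc, area, 1, w=w)

def conc_from_area(a):
    """Concentration from the weighted multiple-point calibration."""
    return (a - intercept) / slope

# Double-point calibration: baseline plus a single standard.
slope2 = (area[4] - area[0]) / (conc[4] - conc[0])
def conc_double_point(a):
    return (a - area[0]) / slope2

sample_area = 3200.0
print("weighted regression:", conc_from_area(sample_area))
print("double-point       :", conc_double_point(sample_area))
```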

  1. Application of synchrotron radiation computed microtomography for quantification of bone microstructure in human and rat bones

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Parreiras Nogueira, Liebert; Barroso, Regina Cely; Pereira de Almeida, Andre

    2012-05-17

    This work aims to evaluate histomorphometric quantification by synchrotron radiation computed microtomography in bones of human and rat specimens. Bone specimens are classified as normal or pathological (for the human samples) and as irradiated or non-irradiated (for the rat ones). The human bones are specimens that were, or were not, affected by some injury; the rat bones are specimens that were, or were not, irradiated to simulate radiotherapy procedures. Images were obtained on the SYRMEP beamline at the Elettra Synchrotron Laboratory in Trieste, Italy. The system generated 14 μm tomographic images. The quantification of bone structures was performed directly on the 3D rendered images using home-made software. The resolution obtained was excellent, which facilitated the quantification of bone microstructures.

  2. Evaluation of the reliability of maize reference assays for GMO quantification.

    PubMed

    Papazova, Nina; Zhang, David; Gruden, Kristina; Vojvoda, Jana; Yang, Litao; Buh Gasparic, Meti; Blejec, Andrej; Fouilloux, Stephane; De Loose, Marc; Taverniers, Isabel

    2010-03-01

    A reliable PCR reference assay for relative genetically modified organism (GMO) quantification must be specific for the target taxon and amplify uniformly along the commercialised varieties within the considered taxon. Different reference assays for maize (Zea mays L.) are used in official methods for GMO quantification. In this study, we evaluated the reliability of eight existing maize reference assays, four of which are used in combination with an event-specific polymerase chain reaction (PCR) assay validated and published by the Community Reference Laboratory (CRL). We analysed the nucleotide sequence variation in the target genomic regions in a broad range of transgenic and conventional varieties and lines: MON 810 varieties cultivated in Spain and conventional varieties from various geographical origins and breeding history. In addition, the reliability of the assays was evaluated based on their PCR amplification performance. A single base pair substitution, corresponding to a single nucleotide polymorphism (SNP) reported in an earlier study, was observed in the forward primer of one of the studied alcohol dehydrogenase 1 (Adh1) (70) assays in a large number of varieties. The SNP presence is consistent with a poor PCR performance observed for this assay along the tested varieties. The obtained data show that the Adh1 (70) assay used in the official CRL NK603 assay is unreliable. Based on our results from both the nucleotide stability study and the PCR performance test, we can conclude that the Adh1 (136) reference assay (T25 and Bt11 assays) as well as the tested high mobility group protein gene assay, which also form parts of CRL methods for quantification, are highly reliable. Despite the observed uniformity in the nucleotide sequence of the invertase gene assay, the PCR performance test reveals that this target sequence might occur in more than one copy. Finally, although currently not forming a part of official quantification methods, zein and SSIIb

  3. EasyLCMS: an asynchronous web application for the automated quantification of LC-MS data

    PubMed Central

    2012-01-01

    Background Downstream applications in metabolomics, as well as mathematical modelling, require data in a quantitative format, which may also necessitate the automated and simultaneous quantification of numerous metabolites. Although numerous applications have been previously developed for metabolomics data handling, automated calibration and calculation of the concentrations in terms of μmol have not been carried out. Moreover, most of the metabolomics applications are designed for GC-MS, and would not be suitable for LC-MS, since in LC, the deviation in the retention time is not linear, which is not taken into account in these applications. In addition, only a few are web-based applications, which could improve stand-alone software in terms of compatibility, sharing capabilities and hardware requirements, although high bandwidth is required. Furthermore, none of these incorporate asynchronous communication to allow real-time interaction with pre-processed results. Findings Here, we present EasyLCMS (http://www.easylcms.es/), a new application for automated quantification, which was validated against manual operation using more than 1000 concentration comparisons in real samples. The results showed that only 1% of the quantifications presented a relative error higher than 15%. Using clustering analysis, the metabolites with the highest relative error distributions were identified and studied to solve recurrent mistakes. Conclusions EasyLCMS is a new web application designed to quantify numerous metabolites, simultaneously integrating LC distortions and asynchronous web technology to present a visual interface with dynamic interaction which allows checking and correction of LC-MS raw data pre-processing results. Moreover, quantified data obtained with EasyLCMS are fully compatible with numerous downstream applications, as well as with mathematical modelling in the systems biology field. PMID:22884039

  4. On the Confounding Effect of Temperature on Chemical Shift-Encoded Fat Quantification

    PubMed Central

    Hernando, Diego; Sharma, Samir D.; Kramer, Harald; Reeder, Scott B.

    2014-01-01

    Purpose To characterize the confounding effect of temperature on chemical shift-encoded (CSE) fat quantification. Methods The proton resonance frequency of water, unlike triglycerides, depends on temperature. This leads to a temperature dependence of the spectral models of fat (relative to water) that are commonly used by CSE-MRI methods. Simulation analysis was performed for 1.5 Tesla CSE fat–water signals at various temperatures and echo time combinations. Oil–water phantoms were constructed and scanned at temperatures between 0 and 40°C using spectroscopy and CSE imaging at three echo time combinations. An explanted human liver, rejected for transplantation due to steatosis, was scanned using spectroscopy and CSE imaging. Fat–water reconstructions were performed using four different techniques: magnitude and complex fitting, with standard or temperature-corrected signal modeling. Results In all experiments, magnitude fitting with standard signal modeling resulted in large fat quantification errors. Errors were largest for echo time combinations near TEinit ≈ 1.3 ms, ΔTE ≈ 2.2 ms. Errors in fat quantification caused by temperature-related frequency shifts were smaller with complex fitting, and were avoided using a temperature-corrected signal model. Conclusion Temperature is a confounding factor for fat quantification. If not accounted for, it can result in large errors in fat quantifications in phantom and ex vivo acquisitions. PMID:24123362
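
    A compact, illustrative signal model shows why temperature confounds CSE fitting: the water resonance shifts by roughly -0.01 ppm per degree Celsius while the fat peaks do not, so water-fat frequency offsets calibrated at 37°C no longer describe a cold phantom. The fat spectrum and all numbers below are generic illustrative values, not the exact model used by the authors.

```python
import numpy as np

# Illustrative 1.5 T chemical shift-encoded (CSE) water/fat signal showing the
# temperature confounder: the water resonance moves by roughly -0.01 ppm per
# degC while the triglyceride peaks do not. Fat peak locations/amplitudes and
# all numerical values are illustrative, not the exact model from the paper.
f0 = 63.87e6                                                       # proton frequency at 1.5 T (Hz)
fat_ppm = np.array([-3.80, -3.40, -2.60, -1.94, -0.39, 0.60])      # offsets from water at 37 degC
fat_amp = np.array([0.087, 0.693, 0.128, 0.004, 0.039, 0.048])     # normalised peak amplitudes

def cse_signal(te, fat_fraction, temp_c):
    """Complex signal at echo times te (s) for a water/fat voxel at temp_c (degC)."""
    te = np.asarray(te, dtype=float)
    water_shift_ppm = -0.01 * (temp_c - 37.0)          # water moves with temperature, fat does not
    f_water = water_shift_ppm * 1e-6 * f0
    f_fat = fat_ppm * 1e-6 * f0                        # fixed, referenced to water at 37 degC
    water = (1.0 - fat_fraction) * np.exp(2j * np.pi * f_water * te)
    fat = fat_fraction * (np.exp(2j * np.pi * np.outer(te, f_fat)) @ fat_amp)
    return water + fat

te = np.array([1.3e-3, 3.5e-3, 5.7e-3])    # echo times near the worst case quoted in the abstract
print(np.abs(cse_signal(te, 0.10, 37.0)))  # 10% fat voxel at body temperature
print(np.abs(cse_signal(te, 0.10, 5.0)))   # same voxel in a cold phantom: different echo amplitudes
```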

  5. Quantification and characterization of Si in Pinus Insignis Dougl by TXRF

    NASA Astrophysics Data System (ADS)

    Navarro, Henry; Bennun, Leonardo; Marcó, Lué M.

    2015-03-01

    A simple quantification of silicon in woods such as Pinus Insignis Dougl, obtained from the 8th region of Bío-Bío (37°15″ South, 73°19″ West), Chile, is described. The samples were prepared through fractional calcination, and the ashes were directly analyzed by the total reflection X-ray fluorescence (TXRF) technique. The analysis of 16 calcined samples is presented. The samples were weighed on plastic reflectors in a microbalance with a sensitivity of 0.1 µg and then irradiated in a TXRF PICOFOX spectrometer for 350 and 700 s. Cobalt was added to each sample as an internal standard. Silicon concentrations above 1 % were observed in each sample, and a self-absorption effect on the quantification was observed for masses higher than 100 μg.

  6. MaxReport: An Enhanced Proteomic Result Reporting Tool for MaxQuant.

    PubMed

    Zhou, Tao; Li, Chuyu; Zhao, Wene; Wang, Xinru; Wang, Fuqiang; Sha, Jiahao

    2016-01-01

    MaxQuant is proteomic software widely used for large-scale tandem mass spectrometry data. We have designed and developed an enhanced result reporting tool for MaxQuant, named MaxReport. This tool can optimize the results of MaxQuant and provides additional functions for result interpretation. MaxReport can generate report tables for protein N-terminal modifications. It also supports isobaric-labelling-based relative quantification at the protein, peptide or site level. To provide an overview of the results, MaxReport performs general descriptive statistical analyses for both identification and quantification results. The output of MaxReport is well organized and therefore helps proteomic users to better understand and share their data. MaxReport, which is freely available at http://websdoor.net/bioinfo/maxreport/, is written in Python and is compatible with multiple systems, including Windows and Linux.

  7. Comparative quantification of human intestinal bacteria based on cPCR and LDR/LCR

    PubMed Central

    Tang, Zhou-Rui; Li, Kai; Zhou, Yu-Xun; Xiao, Zhen-Xian; Xiao, Jun-Hua; Huang, Rui; Gu, Guo-Hao

    2012-01-01

    AIM: To establish a multiple detection method based on comparative polymerase chain reaction (cPCR) and ligase detection reaction (LDR)/ligase chain reaction (LCR) to quantify intestinal bacterial components. METHODS: Comparative quantification of 16S rDNAs from different intestinal bacterial components was used to quantify multiple intestinal bacteria. The 16S rDNAs of the different bacteria were amplified simultaneously by cPCR, and LDR/LCR was examined to carry out the genotyping and quantification. Two beneficial bacteria (Bifidobacterium, Lactobacillus) and three conditionally pathogenic bacteria (Enterococcus, Enterobacterium and Eubacterium) were used in this detection. With cloned standard bacterial 16S rDNAs, standard curves were prepared to validate the quantitative relations between the ratio of the original concentrations of two templates and the ratio of the fluorescence signals of their final ligation products. Internal controls were added to monitor the whole detection workflow, and the quantity ratio between pairs of bacteria was tested. RESULTS: cPCR and LDR revealed obvious linear correlations with the standard DNAs, whereas cPCR and LCR did not. In the sample test, the distributions of the quantity ratio between each pair of bacterial species were obtained. There were significant differences among these distributions in the total samples, but the distributions remained stable across groups divided by age or sex. CONCLUSION: The detection method established in this study can be used for multiple intestinal bacteria genotyping and quantification, and to monitor human intestinal health status. PMID:22294830

  8. Automatic Quantification of Radiographic Wrist Joint Space Width of Patients With Rheumatoid Arthritis.

    PubMed

    Huo, Yinghe; Vincken, Koen L; van der Heijde, Desiree; de Hair, Maria J H; Lafeber, Floris P; Viergever, Max A

    2017-11-01

    Objective: Wrist joint space narrowing is a main radiographic outcome of rheumatoid arthritis (RA). Yet, automatic radiographic wrist joint space width (JSW) quantification for RA patients has not been widely investigated. The aim of this paper is to present an automatic method to quantify the JSW of three wrist joints that are least affected by bone overlapping and are frequently involved in RA. These joints are located around the scaphoid bone, viz. the multangular-navicular, capitate-navicular-lunate, and radiocarpal joints. Methods: The joint space around the scaphoid bone is detected by using consecutive searches of separate path segments, where each segment location aids in constraining the subsequent one. For joint margin delineation, first the boundary not affected by X-ray projection is extracted, followed by a backtrace process to obtain the actual joint margin. The accuracy of the quantified JSW is evaluated by comparison with the manually obtained ground truth. Results: Two of the 50 radiographs used for evaluation of the method did not yield a correct path through all three wrist joints. The delineated joint margins of the remaining 48 radiographs were used for JSW quantification. It was found that 90% of the joints had a JSW deviating less than 20% from the mean JSW of manual indications, with the mean JSW error less than 10%. Conclusion: The proposed method is able to automatically quantify the JSW of radiographic wrist joints reliably. The proposed method may aid clinical researchers to study the progression of wrist joint damage in RA studies.

  9. Automated quantification of pancreatic β-cell mass

    PubMed Central

    Golson, Maria L.; Bush, William S.

    2014-01-01

    β-Cell mass is a parameter commonly measured in studies of islet biology and diabetes. However, the rigorous quantification of pancreatic β-cell mass using conventional histological methods is a time-consuming process. Rapidly evolving virtual slide technology with high-resolution slide scanners and newly developed image analysis tools has the potential to transform β-cell mass measurement. To test the effectiveness and accuracy of this new approach, we assessed pancreata from normal C57Bl/6J mice and from mouse models of β-cell ablation (streptozotocin-treated mice) and β-cell hyperplasia (leptin-deficient mice), using a standardized systematic sampling of pancreatic specimens. Our data indicate that automated analysis of virtual pancreatic slides is highly reliable and yields results consistent with those obtained by conventional morphometric analysis. This new methodology will allow investigators to dramatically reduce the time required for β-cell mass measurement by automating high-resolution image capture and analysis of entire pancreatic sections. PMID:24760991

  10. Phylogenetic Quantification of Intra-tumour Heterogeneity

    PubMed Central

    Schwarz, Roland F.; Trinh, Anne; Sipos, Botond; Brenton, James D.; Goldman, Nick; Markowetz, Florian

    2014-01-01

    Intra-tumour genetic heterogeneity is the result of ongoing evolutionary change within each cancer. The expansion of genetically distinct sub-clonal populations may explain the emergence of drug resistance, and if so, would have prognostic and predictive utility. However, methods for objectively quantifying tumour heterogeneity have been missing and are particularly difficult to establish in cancers where predominant copy number variation prevents accurate phylogenetic reconstruction owing to horizontal dependencies caused by long and cascading genomic rearrangements. To address these challenges, we present MEDICC, a method for phylogenetic reconstruction and heterogeneity quantification based on a Minimum Event Distance for Intra-tumour Copy-number Comparisons. Using a transducer-based pairwise comparison function, we determine optimal phasing of major and minor alleles, as well as evolutionary distances between samples, and are able to reconstruct ancestral genomes. Rigorous simulations and an extensive clinical study show the power of our method, which outperforms state-of-the-art competitors in reconstruction accuracy, and additionally allows unbiased numerical quantification of tumour heterogeneity. Accurate quantification and evolutionary inference are essential to understand the functional consequences of tumour heterogeneity. The MEDICC algorithms are independent of the experimental techniques used and are applicable to both next-generation sequencing and array CGH data. PMID:24743184

  11. Ultra-performance liquid chromatography/tandem mass spectrometric quantification of structurally diverse drug mixtures using an ESI-APCI multimode ionization source.

    PubMed

    Yu, Kate; Di, Li; Kerns, Edward; Li, Susan Q; Alden, Peter; Plumb, Robert S

    2007-01-01

    We report in this paper an ultra-performance liquid chromatography/tandem mass spectrometric (UPLC(R)/MS/MS) method utilizing an ESI-APCI multimode ionization source to quantify structurally diverse analytes. Eight commercial drugs were used as test compounds. Each LC injection was completed in 1 min using a UPLC system coupled with MS/MS multiple reaction monitoring (MRM) detection. Results from three separate sets of experiments are reported. In the first set of experiments, the eight test compounds were analyzed as a single mixture. The mass spectrometer was switching rapidly among four ionization modes (ESI+, ESI-, APCI-, and APCI+) during an LC run. Approximately 8-10 data points were collected across each LC peak. This was insufficient for a quantitative analysis. In the second set of experiments, four compounds were analyzed as a single mixture. The mass spectrometer was switching rapidly among four ionization modes during an LC run. Approximately 15 data points were obtained for each LC peak. Quantification results were obtained with a limit of detection (LOD) as low as 0.01 ng/mL. For the third set of experiments, the eight test compounds were analyzed as a batch. During each LC injection, a single compound was analyzed. The mass spectrometer was detecting at a particular ionization mode during each LC injection. More than 20 data points were obtained for each LC peak. Quantification results were also obtained. This single-compound analytical method was applied to a microsomal stability test. Compared with a typical HPLC method currently used for the microsomal stability test, the injection-to-injection cycle time was reduced to 1.5 min (UPLC method) from 3.5 min (HPLC method). The microsome stability results were comparable with those obtained by traditional HPLC/MS/MS.

  12. Determination of Oversulphated Chondroitin Sulphate and Dermatan Sulphate in unfractionated heparin by (1)H-NMR - Collaborative study for quantification and analytical determination of LoD.

    PubMed

    McEwen, I; Mulloy, B; Hellwig, E; Kozerski, L; Beyer, T; Holzgrabe, U; Wanko, R; Spieser, J-M; Rodomonte, A

    2008-12-01

    Oversulphated Chondroitin Sulphate (OSCS) and Dermatan Sulphate (DS) in unfractionated heparins can be identified by nuclear magnetic resonance spectrometry (NMR). The limit of detection (LoD) of OSCS is 0.1% relative to the heparin content. This LoD is obtained at a signal-to-noise ratio (S/N) of 2000:1 of the heparin methyl signal. Quantification is best obtained by comparing peak heights of the OSCS and heparin methyl signals. Reproducibility of less than 10% relative standard deviation (RSD) has been obtained. The accuracy of quantification was good.
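
    The peak-height comparison described above amounts to simple ratio arithmetic against a spiked calibration sample; the sketch below uses hypothetical peak heights and an assumed 1.0% calibration spike.

```python
# Minimal sketch of quantification by peak-height comparison: the OSCS content is
# read from the ratio of the OSCS and heparin methyl peak heights against a
# single spiked calibration standard. All peak heights and the 1.0% spike level
# are hypothetical.
def oscs_percent(h_oscs, h_heparin, h_oscs_cal, h_heparin_cal, cal_percent):
    """Relative OSCS content (%) from sample and calibration peak heights."""
    return cal_percent * (h_oscs / h_heparin) / (h_oscs_cal / h_heparin_cal)

# Assumed 1.0% spiked calibration sample, then an unknown heparin batch:
print(oscs_percent(h_oscs=1.3e4, h_heparin=1.0e6,
                   h_oscs_cal=2.6e4, h_heparin_cal=1.0e6, cal_percent=1.0))  # -> 0.5 %
```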

  13. Automated quantification of renal interstitial fibrosis for computer-aided diagnosis: A comprehensive tissue structure segmentation method.

    PubMed

    Tey, Wei Keat; Kuang, Ye Chow; Ooi, Melanie Po-Leen; Khoo, Joon Joon

    2018-03-01

    Interstitial fibrosis in renal biopsy samples is a scarring tissue structure that may be visually quantified by pathologists as an indicator to the presence and extent of chronic kidney disease. The standard method of quantification by visual evaluation presents reproducibility issues in the diagnoses. This study proposes an automated quantification system for measuring the amount of interstitial fibrosis in renal biopsy images as a consistent basis of comparison among pathologists. The system extracts and segments the renal tissue structures based on colour information and structural assumptions of the tissue structures. The regions in the biopsy representing the interstitial fibrosis are deduced through the elimination of non-interstitial fibrosis structures from the biopsy area and quantified as a percentage of the total area of the biopsy sample. A ground truth image dataset has been manually prepared by consulting an experienced pathologist for the validation of the segmentation algorithms. The results from experiments involving experienced pathologists have demonstrated a good correlation in quantification result between the automated system and the pathologists' visual evaluation. Experiments investigating the variability in pathologists also proved the automated quantification error rate to be on par with the average intra-observer variability in pathologists' quantification.

  14. RNA-Skim: a rapid method for RNA-Seq quantification at transcript level

    PubMed Central

    Zhang, Zhaojun; Wang, Wei

    2014-01-01

    Motivation: The RNA-Seq technique has been demonstrated as a revolutionary means for exploring the transcriptome because it provides deep coverage and base pair-level resolution. RNA-Seq quantification has proven to be an efficient alternative to the microarray technique in gene expression studies, and it is a critical component of RNA-Seq differential expression analysis. Most existing RNA-Seq quantification tools require the alignment of fragments to either a genome or a transcriptome, entailing a time-consuming and intricate alignment step. To improve the performance of RNA-Seq quantification, an alignment-free method, Sailfish, has recently been proposed to quantify transcript abundances using all k-mers in the transcriptome, demonstrating the feasibility of designing an efficient alignment-free method for transcriptome quantification. Even though Sailfish is substantially faster than alignment-dependent alternatives such as Cufflinks, using all k-mers in the transcriptome quantification impedes the scalability of the method. Results: We propose a novel RNA-Seq quantification method, RNA-Skim, which partitions the transcriptome into disjoint transcript clusters based on sequence similarity and introduces the notion of sig-mers, a special type of k-mer uniquely associated with each cluster. We demonstrate that the sig-mer counts within a cluster are sufficient for estimating transcript abundances with accuracy comparable with any state-of-the-art method. This enables RNA-Skim to perform transcript quantification on each cluster independently, reducing a complex optimization problem into smaller optimization tasks that can be run in parallel. As a result, RNA-Skim uses <4% of the k-mers and <10% of the CPU time required by Sailfish. It is able to finish transcriptome quantification in <10 min per sample by using just a single thread on a commodity computer, which represents a >100× speedup over the state-of-the-art alignment-based methods, while delivering
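
    The sig-mer idea can be illustrated with a toy example: k-mers occurring in exactly one transcript cluster are kept, and only those are counted in the reads; the per-cluster counts then feed the abundance estimation (not shown). The sequences, k value and cluster assignments below are toy values, not RNA-Skim defaults.

```python
from collections import Counter, defaultdict

# Toy sketch of sig-mer selection and counting: a k-mer is a sig-mer if it
# occurs in exactly one transcript cluster; only sig-mers are counted in reads.
# Sequences, k and cluster assignments are invented toy values.
k = 5
clusters = {
    "clusterA": ["ACGTACGTGGA", "ACGTACGTTTA"],
    "clusterB": ["TTTTGCGCAAC"],
}

def kmers(seq, k):
    return {seq[i:i + k] for i in range(len(seq) - k + 1)}

kmer_owner = defaultdict(set)
for cname, transcripts in clusters.items():
    for t in transcripts:
        for km in kmers(t, k):
            kmer_owner[km].add(cname)

sig_mers = {km: next(iter(owners)) for km, owners in kmer_owner.items() if len(owners) == 1}

reads = ["ACGTACGTGGA", "TTTTGCGCAACT", "ACGTACGTTTAC"]
counts = Counter()
for r in reads:
    for km in kmers(r, k):
        if km in sig_mers:
            counts[sig_mers[km]] += 1

print(counts)   # per-cluster sig-mer counts used downstream for abundance estimation
```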

  15. Uncertainty quantification and validation of 3D lattice scaffolds for computer-aided biomedical applications.

    PubMed

    Gorguluarslan, Recep M; Choi, Seung-Kyum; Saldana, Christopher J

    2017-07-01

    A methodology is proposed for uncertainty quantification and validation to accurately predict the mechanical response of lattice structures used in the design of scaffolds. Effective structural properties of the scaffolds are characterized using a developed multi-level stochastic upscaling process that propagates the quantified uncertainties at strut level to the lattice structure level. To obtain realistic simulation models for the stochastic upscaling process and minimize the experimental cost, high-resolution finite element models of individual struts were reconstructed from the micro-CT scan images of lattice structures which are fabricated by selective laser melting. The upscaling method facilitates the process of determining homogenized strut properties to reduce the computational cost of the detailed simulation model for the scaffold. Bayesian Information Criterion is utilized to quantify the uncertainties with parametric distributions based on the statistical data obtained from the reconstructed strut models. A systematic validation approach that can minimize the experimental cost is also developed to assess the predictive capability of the stochastic upscaling method used at the strut level and lattice structure level. In comparison with physical compression test results, the proposed methodology of linking the uncertainty quantification with the multi-level stochastic upscaling method enabled an accurate prediction of the elastic behavior of the lattice structure with minimal experimental cost by accounting for the uncertainties induced by the additive manufacturing process. Copyright © 2017 Elsevier Ltd. All rights reserved.

  16. Assessment of a 1H high-resolution magic angle spinning NMR spectroscopy procedure for free sugars quantification in intact plant tissue.

    PubMed

    Delgado-Goñi, Teresa; Campo, Sonia; Martín-Sitjar, Juana; Cabañas, Miquel E; San Segundo, Blanca; Arús, Carles

    2013-08-01

    In most plants, sucrose is the primary product of photosynthesis, the transport form of assimilated carbon, and also one of the main factors determining sweetness in fresh fruits. Traditional methods for sugar quantification (mainly sucrose, glucose and fructose) require obtaining crude plant extracts, which sometimes involve substantial sample manipulation, making the process time-consuming and increasing the risk of sample degradation. Here, we describe and validate a fast method to determine sugar content in intact plant tissue by using high-resolution magic angle spinning nuclear magnetic resonance spectroscopy (HR-MAS NMR). The HR-MAS NMR method was used for quantifying sucrose, glucose and fructose in mesocarp tissues from melon fruits (Cucumis melo var. reticulatus and Cucumis melo var. cantalupensis). The resulting sugar content varied among individual melons, ranging from 1.4 to 7.3 g of sucrose, 0.4-2.5 g of glucose; and 0.73-2.83 g of fructose (values per 100 g fw). These values were in agreement with those described in the literature for melon fruit tissue, and no significant differences were found when comparing them with those obtained using the traditional, enzymatic procedure, on melon tissue extracts. The HR-MAS NMR method offers a fast (usually <30 min) and sensitive method for sugar quantification in intact plant tissues, it requires a small amount of tissue (typically 50 mg fw) and avoids the interferences and risks associated with obtaining plant extracts. Furthermore, this method might also allow the quantification of additional metabolites detectable in the plant tissue NMR spectrum.

  17. Quantification of Fibrosis and Osteosclerosis in Myeloproliferative Neoplasms: A Computer-Assisted Image Study

    PubMed Central

    Teman, Carolin J.; Wilson, Andrew R.; Perkins, Sherrie L.; Hickman, Kimberly; Prchal, Josef T.; Salama, Mohamed E.

    2010-01-01

    Evaluation of bone marrow fibrosis and osteosclerosis in myeloproliferative neoplasms (MPN) is subject to interobserver inconsistency. Performance data for currently utilized fibrosis grading systems are lacking, and classification scales for osteosclerosis do not exist. Digital imaging can serve as a quantification method for fibrosis and osteosclerosis. We used digital imaging techniques for trabecular area assessment and reticulin-fiber quantification. Patients with all Philadelphia negative MPN subtypes had higher trabecular volume than controls (p ≤0.0015). Results suggest that the degree of osteosclerosis helps differentiate primary myelofibrosis from other MPN. Numerical quantification of fibrosis highly correlated with subjective scores, and interobserver correlation was satisfactory. Digital imaging provides accurate quantification for osteosclerosis and fibrosis. PMID:20122729

  18. Flow Quantification by Nuclear Magnetic Resonance Imaging

    NASA Astrophysics Data System (ADS)

    Vu, Anthony Tienhuan

    1994-01-01

    In this dissertation, a robust method for the measurement and visualization of the flow field in laminar, complex and turbulent flows by Nuclear Magnetic Resonance Imaging, utilizing the flow-induced Adiabatic Fast Passage (AFP) principle, is presented. The dissertation focuses on the application of AFP in vessels of spatially resolvable size. We first review two main flow effects in NMR: time-of-flight and phase dispersion. A discussion of NMR flow imaging applications, namely flow measurement and NMR angiography, is given. The theoretical framework of adiabatic passage is discussed in order to explain the principle of flow-induced adiabatic passage tagging for flow imaging applications. Building on the basic flow-induced adiabatic passage principle, we propose a multi-zone AFP excitation scheme to deal with flow in curved tubes, branches and constrictions, i.e. complex and turbulent flow regimes. The technique provides a quick and simple way to acquire flow profiles simultaneously at several locations and arbitrary orientations inside the field of view. The flow profile is the time-averaged evolution of the labeled flowing material. Results obtained using carotid bifurcation and circular jet phantoms are similar to previous experimental studies employing laser Doppler anemometry and other flow visualization techniques. In addition, preliminary results obtained with a human volunteer support the feasibility of the technique for in vivo flow quantification. Finally, a quantitative comparison of flow measurements by the proposed technique with the more established Phase Contrast MRA was performed. The results show excellent correlation between the two methods and with the standard volumetric flow rate measurement, indicating that the flow measurements obtained using this technique are reliable and accurate under various flow regimes.

  19. A rapid and accurate quantification method for real-time dynamic analysis of cellular lipids during microalgal fermentation processes in Chlorella protothecoides with low field nuclear magnetic resonance.

    PubMed

    Wang, Tao; Liu, Tingting; Wang, Zejian; Tian, Xiwei; Yang, Yi; Guo, Meijin; Chu, Ju; Zhuang, Yingping

    2016-05-01

    Rapid, real-time lipid determination can provide valuable information for process regulation and optimization in algal lipid mass production. In this study, a rapid, accurate and precise quantification method for the in vivo cellular lipids of Chlorella protothecoides using low field nuclear magnetic resonance (LF-NMR) was newly developed. LF-NMR was extremely sensitive to the algal lipids, with limits of detection (LOD) of 0.0026 g and 0.32 g/L in dry lipid samples and algal broth, respectively, and limits of quantification (LOQ) of 0.0093 g and 1.18 g/L. Moreover, the LF-NMR signal was specifically proportional to the cellular lipids of C. protothecoides; thus, excellent regression curves over a wide detection range, from 0.02 to 0.42 g for dry lipids and from 1.12 to 8.97 g/L of lipid concentration for in vivo lipid quantification, were obtained, all with R(2) higher than 0.99, irrespective of variations in lipid content and fatty acid profile. The accuracy of this novel method was further verified by comparing the lipid quantification results to those obtained by GC-MS, and the relative standard deviation (RSD) of the LF-NMR results was smaller than 2%, demonstrating the precision of the method. Finally, the method was successfully used for on-line lipid monitoring during algal lipid fermentation processes, making it possible to better understand the lipid accumulation mechanism and to achieve dynamic bioprocess control. Copyright © 2016 Elsevier B.V. All rights reserved.

  20. AVQS: attack route-based vulnerability quantification scheme for smart grid.

    PubMed

    Ko, Jongbin; Lim, Hyunwoo; Lee, Seokjun; Shon, Taeshik

    2014-01-01

    A smart grid is a large, consolidated electrical grid system that includes heterogeneous networks and systems. Based on the data, a smart grid system has a potential security threat in its network connectivity. To solve this problem, we develop and apply a novel scheme to measure the vulnerability in a smart grid domain. Vulnerability quantification can be the first step in security analysis because it can help prioritize the security problems. However, existing vulnerability quantification schemes are not suitable for smart grid because they do not consider network vulnerabilities. We propose a novel attack route-based vulnerability quantification scheme using a network vulnerability score and an end-to-end security score, depending on the specific smart grid network environment to calculate the vulnerability score for a particular attack route. To evaluate the proposed approach, we derive several attack scenarios from the advanced metering infrastructure domain. The experimental results of the proposed approach and the existing common vulnerability scoring system clearly show that we need to consider network connectivity for more optimized vulnerability quantification.
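
    The abstract does not give the scoring formula, so the sketch below is only a schematic of the route-based idea: each hop carries a device vulnerability score and a link exposure score, and a route-level score aggregates them. The aggregation rule and all numbers are invented for illustration and are not the authors' scheme.

```python
# Schematic sketch of route-based scoring: every hop on an attack route carries a
# device vulnerability score (CVSS-like) and a link exposure score; the route
# score aggregates both. The aggregation below (exposure-weighted mean of hop
# scores) is purely illustrative and is NOT the formula from the paper.
route = [
    {"node": "smart_meter",  "node_score": 6.5, "link_score": 0.8},
    {"node": "concentrator", "node_score": 5.0, "link_score": 0.6},
    {"node": "head_end",     "node_score": 7.8, "link_score": 0.9},
]

def route_vulnerability(route):
    weighted = [hop["node_score"] * hop["link_score"] for hop in route]
    return sum(weighted) / len(weighted)

print(f"route vulnerability score: {route_vulnerability(route):.2f}")
```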

  1. AVQS: Attack Route-Based Vulnerability Quantification Scheme for Smart Grid

    PubMed Central

    Lim, Hyunwoo; Lee, Seokjun; Shon, Taeshik

    2014-01-01

    A smart grid is a large, consolidated electrical grid system that includes heterogeneous networks and systems. Based on the data, a smart grid system has a potential security threat in its network connectivity. To solve this problem, we develop and apply a novel scheme to measure the vulnerability in a smart grid domain. Vulnerability quantification can be the first step in security analysis because it can help prioritize the security problems. However, existing vulnerability quantification schemes are not suitable for smart grid because they do not consider network vulnerabilities. We propose a novel attack route-based vulnerability quantification scheme using a network vulnerability score and an end-to-end security score, depending on the specific smart grid network environment to calculate the vulnerability score for a particular attack route. To evaluate the proposed approach, we derive several attack scenarios from the advanced metering infrastructure domain. The experimental results of the proposed approach and the existing common vulnerability scoring system clearly show that we need to consider network connectivity for more optimized vulnerability quantification. PMID:25152923

  2. Three-Dimensional Echocardiographic Assessment of Left Heart Chamber Size and Function with Fully Automated Quantification Software in Patients with Atrial Fibrillation.

    PubMed

    Otani, Kyoko; Nakazono, Akemi; Salgo, Ivan S; Lang, Roberto M; Takeuchi, Masaaki

    2016-10-01

    Echocardiographic determination of left heart chamber volumetric parameters by using manual tracings during multiple beats is tedious in atrial fibrillation (AF). The aim of this study was to determine the usefulness of fully automated left chamber quantification software with single-beat three-dimensional transthoracic echocardiographic data sets in patients with AF. Single-beat full-volume three-dimensional transthoracic echocardiographic data sets were prospectively acquired during consecutive multiple cardiac beats (≥10 beats) in 88 patients with AF. In protocol 1, left ventricular volumes, left ventricular ejection fraction, and maximal left atrial volume were validated using automated quantification against the manual tracing method in identical beats in 10 patients. In protocol 2, automated quantification-derived averaged values from multiple beats were compared with the corresponding values obtained from the indexed beat in all patients. Excellent correlations of left chamber parameters between automated quantification and the manual method were observed (r = 0.88-0.98) in protocol 1. The time required for the analysis with the automated quantification method (5 min) was significantly less compared with the manual method (27 min) (P < .0001). In protocol 2, there were excellent linear correlations between the averaged left chamber parameters and the corresponding values obtained from the indexed beat (r = 0.94-0.99), and test-retest variability of left chamber parameters was low (3.5%-4.8%). Three-dimensional transthoracic echocardiography with fully automated quantification software is a rapid and reliable way to measure averaged values of left heart chamber parameters during multiple consecutive beats. Thus, it is a potential new approach for left chamber quantification in patients with AF in daily routine practice. Copyright © 2016 American Society of Echocardiography. Published by Elsevier Inc. All rights reserved.

  3. Uncertainty quantification of overpressure buildup through inverse modeling of compaction processes in sedimentary basins

    NASA Astrophysics Data System (ADS)

    Colombo, Ivo; Porta, Giovanni M.; Ruffo, Paolo; Guadagnini, Alberto

    2017-03-01

    This study illustrates a procedure conducive to a preliminary risk analysis of overpressure development in sedimentary basins characterized by alternating depositional events of sandstone and shale layers. The approach rests on two key elements: (1) forward modeling of fluid flow and compaction, and (2) application of a model-complexity reduction technique based on a generalized polynomial chaos expansion (gPCE). The forward model considers a one-dimensional vertical compaction process. The gPCE model is then used in an inverse modeling context to obtain efficient model parameter estimation and uncertainty quantification. The methodology is applied to two field settings considered in previous literature works, i.e. the Venture Field (Scotian Shelf, Canada) and the Navarin Basin (Bering Sea, Alaska, USA), relying on available porosity and pressure information for model calibration. It is found that the best result is obtained when porosity and pressure data are considered jointly in the model calibration procedure. Uncertainty propagation from unknown input parameters to model outputs, such as pore pressure vertical distribution, is investigated and quantified. This modeling strategy enables one to quantify the relative importance of key phenomena governing the feedback between sediment compaction and fluid flow processes and driving the buildup of fluid overpressure in stratified sedimentary basins characterized by the presence of low-permeability layers. The results here illustrated (1) allow for diagnosis of the critical role played by the parameters of quantitative formulations linking porosity and permeability in compacted shales and (2) provide an explicit and detailed quantification of the effects of their uncertainty in field settings.

  4. Recurrence quantification analysis of electrically evoked surface EMG signal.

    PubMed

    Liu, Chunling; Wang, Xu

    2005-01-01

    The Recurrence Plot is a useful tool in time-series analysis, in particular for detecting unstable periodic orbits embedded in a chaotic dynamical system. This paper introduces the structure of the Recurrence Plot and how it is constructed, and then defines its quantification. One possible application of the Recurrence Quantification Analysis (RQA) strategy, the analysis of electrically evoked surface EMG, is presented. The results show that the percent determinism increases with stimulation intensity.
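
    A minimal recurrence-plot/RQA sketch follows: a recurrence matrix is built from a toy signal with a fixed distance threshold, and the determinism (%DET) is computed as the fraction of recurrent points lying on diagonal lines of at least a minimum length, the quantity reported above to grow with stimulation intensity. The embedding choices and test signal are arbitrary.

```python
import numpy as np

# Recurrence matrix of a toy 1D signal and the determinism (%DET): the fraction
# of recurrent points lying on diagonal lines of length >= lmin. The line of
# identity is kept for brevity; RQA implementations usually exclude it.
def recurrence_matrix(x, eps):
    d = np.abs(x[:, None] - x[None, :])
    return (d <= eps).astype(int)

def percent_determinism(R, lmin=2):
    n = R.shape[0]
    diag_points = 0
    for k in range(-(n - 1), n):
        run = 0
        for v in list(np.diagonal(R, k)) + [0]:   # trailing 0 flushes the last run
            if v:
                run += 1
            else:
                if run >= lmin:
                    diag_points += run
                run = 0
    return 100.0 * diag_points / max(R.sum(), 1)

t = np.linspace(0, 4 * np.pi, 200)
x = np.sin(t) + 0.1 * np.random.default_rng(1).normal(size=t.size)
R = recurrence_matrix(x, eps=0.2)
print(f"%DET = {percent_determinism(R):.1f}")
```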

  5. Quantification of febuxostat polymorphs using powder X-ray diffraction technique.

    PubMed

    Qiu, Jing-bo; Li, Gang; Sheng, Yue; Zhu, Mu-rong

    2015-03-25

    Febuxostat is a pharmaceutical compound with more than 20 polymorphs, of which form A is the most widely used and usually exists in a mixed polymorphic form with form G. In the present study, a quantification method for polymorphic forms A and G of febuxostat (FEB) was developed using powder X-ray diffraction (PXRD). Prior to development of the quantification method, the pure polymorphic forms A and G were characterized. A continuous scan with a scan rate of 3° min(-1) over an angular range of 3-40° 2θ was applied for the construction of the calibration curve, using the characteristic peaks of form A at 12.78° 2θ (I/I₀ 100%) and form G at 11.72° 2θ (I/I₀ 100%). The linear regression analysis of the calibration plots shows a good linear relationship, with R(2)=0.9985 with respect to peak area in the concentration range 10-60 wt.%. The method was validated for precision, recovery and ruggedness. The limits of detection and quantitation are 1.5% and 4.6%, respectively. The obtained results prove that the method is repeatable, sensitive and accurate. The proposed PXRD method can be applied for the quantitative analysis of mixtures of febuxostat polymorphs (forms A and G). Copyright © 2015 Elsevier B.V. All rights reserved.
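
    The calibration step reduces to an ordinary linear regression of peak area against form A content, with LOD and LOQ derived from the residual scatter; the sketch below uses made-up peak areas and the common 3.3σ/slope and 10σ/slope rules, which may differ from the exact estimators used in the paper.

```python
import numpy as np

# Linear calibration of integrated peak area versus form A content, with LOD and
# LOQ from the residual standard deviation (3.3*sigma/slope and 10*sigma/slope).
# Peak areas below are made-up numbers, not the published data.
wt_pct = np.array([10, 20, 30, 40, 50, 60], dtype=float)              # form A content (wt %)
peak_area = np.array([1480, 3010, 4450, 6020, 7480, 9050], dtype=float)

slope, intercept = np.polyfit(wt_pct, peak_area, 1)
residuals = peak_area - (slope * wt_pct + intercept)
sigma = residuals.std(ddof=2)                                         # residual standard deviation

lod = 3.3 * sigma / slope
loq = 10.0 * sigma / slope
r2 = np.corrcoef(wt_pct, peak_area)[0, 1] ** 2
print(f"slope={slope:.1f}, R2={r2:.4f}, LOD={lod:.1f} wt%, LOQ={loq:.1f} wt%")
```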

  6. Reproducibility of Lobar Perfusion and Ventilation Quantification Using SPECT/CT Segmentation Software in Lung Cancer Patients.

    PubMed

    Provost, Karine; Leblond, Antoine; Gauthier-Lemire, Annie; Filion, Édith; Bahig, Houda; Lord, Martin

    2017-09-01

    Planar perfusion scintigraphy with 99mTc-labeled macroaggregated albumin is often used for pretherapy quantification of regional lung perfusion in lung cancer patients, particularly those with poor respiratory function. However, subdividing lung parenchyma into rectangular regions of interest, as done on planar images, is a poor reflection of true lobar anatomy. New tridimensional methods using SPECT and SPECT/CT have been introduced, including semiautomatic lung segmentation software. The present study evaluated inter- and intraobserver agreement on quantification using SPECT/CT software and compared the results for regional lung contribution obtained with SPECT/CT and planar scintigraphy. Methods: Thirty lung cancer patients underwent ventilation-perfusion scintigraphy with 99mTc-macroaggregated albumin and 99mTc-Technegas. The regional lung contribution to perfusion and ventilation was measured on both planar scintigraphy and SPECT/CT using semiautomatic lung segmentation software by 2 observers. Interobserver and intraobserver agreement for the SPECT/CT software was assessed using the intraclass correlation coefficient, Bland-Altman plots, and absolute differences in measurements. Measurements from planar and tridimensional methods were compared using the paired-sample t test and mean absolute differences. Results: Intraclass correlation coefficients were in the excellent range (above 0.9) for both interobserver and intraobserver agreement using the SPECT/CT software. Bland-Altman analyses showed very narrow limits of agreement. Absolute differences were below 2.0% in 96% of both interobserver and intraobserver measurements. There was a statistically significant difference between planar and SPECT/CT methods (P < 0.001) for quantification of perfusion and ventilation for all right lung lobes, with a maximal mean absolute difference of 20.7% for the right middle lobe. There was no statistically significant difference in quantification of perfusion and

  7. On the quantification and efficient propagation of imprecise probabilities resulting from small datasets

    NASA Astrophysics Data System (ADS)

    Zhang, Jiaxin; Shields, Michael D.

    2018-01-01

    This paper addresses the problem of uncertainty quantification and propagation when the data for characterizing probability distributions are scarce. We propose a methodology wherein the full uncertainty associated with probability model form and parameter estimation is retained and efficiently propagated. This is achieved by applying the information-theoretic multimodel inference method to identify plausible candidate probability densities and the associated probabilities that each model is the best in the Kullback-Leibler sense. The joint parameter densities for each plausible model are then estimated using Bayes' rule. We then propagate this full set of probability models by estimating an optimal importance sampling density that is representative of all plausible models, propagating this density, and reweighting the samples according to each of the candidate probability models. This is in contrast with conventional methods that try to identify a single probability model encapsulating the full uncertainty caused by lack of data and consequently underestimate uncertainty. The result is a complete probabilistic description of both aleatory and epistemic uncertainty, achieved with several orders of magnitude reduction in computational cost. It is shown how the model can be updated to adaptively accommodate added data and added candidate probability models. The method is applied to the uncertainty analysis of plate buckling strength, where it is demonstrated how dataset size affects the confidence (or lack thereof) that we can place in statistical estimates of response when data are lacking.
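
    The multimodel-inference step can be sketched compactly: several candidate distributions are fitted to a scarce dataset, ranked by AIC, and converted into Akaike weights that quantify how plausible each model is; these weights would then drive the importance-sampling mixture described above (not shown). The dataset and candidate set below are illustrative.

```python
import numpy as np
from scipy import stats

# Fit several candidate probability models to a small dataset, rank them by AIC,
# and convert the ranking into Akaike weights (the probability that each model
# is the best in the Kullback-Leibler sense). Dataset and candidates are
# illustrative; the optimal-importance-sampling propagation step is not shown.
rng = np.random.default_rng(3)
data = rng.lognormal(mean=1.0, sigma=0.4, size=25)       # scarce data

candidates = {
    "normal":    stats.norm,
    "lognormal": stats.lognorm,
    "gamma":     stats.gamma,
}

aic = {}
for name, dist in candidates.items():
    params = dist.fit(data)
    loglik = np.sum(dist.logpdf(data, *params))
    aic[name] = 2 * len(params) - 2 * loglik

delta = {m: a - min(aic.values()) for m, a in aic.items()}
unnorm = {m: np.exp(-0.5 * d) for m, d in delta.items()}
total = sum(unnorm.values())
akaike_weights = {m: w / total for m, w in unnorm.items()}
print(akaike_weights)   # model plausibilities; these drive the mixture proposal
```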

  8. An approach for quantification of platinum distribution in tissues by LA-ICP-MS imaging using isotope dilution analysis.

    PubMed

    Moraleja, I; Mena, M L; Lázaro, A; Neumann, B; Tejedor, A; Jakubowski, N; Gómez-Gómez, M M; Esteban-Fernández, D

    2018-02-01

    Laser ablation inductively coupled plasma mass spectrometry (LA-ICP-MS) has proven to be a convenient technique for trace elemental imaging in tissue sections, providing 2D elemental distributions at a quantitative level. For quantification purposes, several approaches have been proposed in recent years, such as the use of CRMs or matrix-matched standards. The use of Isotope Dilution (ID) for quantification by LA-ICP-MS has also been described, but so far it has been useful mainly for bulk analysis and not feasible for spatial measurements. In this work, a quantification method based on ID analysis was developed by using a commercial ink-jet device to print isotope-enriched inks onto kidney slices from rats treated with antitumoral Pt-based drugs, in order to perform elemental quantification in different areas of the bio-images. For the ID experiments, 194Pt-enriched platinum was used. The methodology was validated by depositing natural Pt standard droplets with a known amount of Pt onto the surface of a control tissue, where as little as 50 pg of Pt could be quantified, with recoveries higher than 90%. The amount of Pt present in the whole kidney slices was quantified for cisplatin-, carboplatin- and oxaliplatin-treated rats. The results obtained were in accordance with those previously reported. The amount of Pt distributed between the medullar and cortical areas was also quantified, revealing different behavior for the three drugs. Copyright © 2017 Elsevier B.V. All rights reserved.
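
    A minimal sketch of the standard isotope-dilution relation that underlies this kind of quantification; the abundances, spike amount and measured ratio below are illustrative values, not data from the study. The spike is taken as 194Pt-enriched material with 195Pt as the reference isotope.

```python
def isotope_dilution_moles(n_spike, a194_sample, a195_sample, a194_spike, a195_spike, r_meas):
    """Moles of natural-composition analyte, with r_meas = (195Pt / 194Pt) measured in the blend."""
    return n_spike * (a195_spike - r_meas * a194_spike) / (r_meas * a194_sample - a195_sample)

# Example with approximate natural Pt abundances and a hypothetical enriched spike.
n_x = isotope_dilution_moles(
    n_spike=1.0e-12,     # mol of spike printed onto the ablated area (hypothetical)
    a194_sample=0.329,   # ~natural 194Pt abundance
    a195_sample=0.338,   # ~natural 195Pt abundance
    a194_spike=0.96,     # assumed enrichment of the 194Pt spike
    a195_spike=0.03,
    r_meas=0.50,         # measured 195Pt/194Pt ratio in the blend (hypothetical)
)
print(f"analyte amount approx {n_x:.3e} mol")
```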

  9. Quantification is Neither Necessary Nor Sufficient for Measurement

    NASA Astrophysics Data System (ADS)

    Mari, Luca; Maul, Andrew; Torres Irribarra, David; Wilson, Mark

    2013-09-01

    Being an infrastructural, widespread activity, measurement is laden with stereotypes. Some of these concern the role of measurement in the relation between quality and quantity. In particular, it is sometimes argued or assumed that quantification is necessary for measurement; it is also sometimes argued or assumed that quantification is sufficient for or synonymous with measurement. To assess the validity of these positions the concepts of measurement and quantitative evaluation should be independently defined and their relationship analyzed. We contend that the defining characteristic of measurement should be the structure of the process, not a feature of its results. Under this perspective, quantitative evaluation is neither sufficient nor necessary for measurement.

  10. A Comparative Analysis of Computational Approaches to Relative Protein Quantification Using Peptide Peak Intensities in Label-free LC-MS Proteomics Experiments

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Matzke, Melissa M.; Brown, Joseph N.; Gritsenko, Marina A.

    2013-02-01

    Liquid chromatography coupled with mass spectrometry (LC-MS) is widely used to identify and quantify peptides in complex biological samples. In particular, label-free shotgun proteomics is highly effective for the identification of peptides and subsequently obtaining a global protein profile of a sample. As a result, this approach is widely used for discovery studies. Typically, the objective of these discovery studies is to identify proteins that are affected by some condition of interest (e.g. disease, exposure). However, for complex biological samples, label-free LC-MS proteomics experiments measure peptides and do not directly yield protein quantities. Thus, protein quantification must be inferred from one or more measured peptides. In recent years, many computational approaches to relative protein quantification of label-free LC-MS data have been published. In this review, we examine the most commonly employed quantification approaches to relative protein abundance from peak intensity values, evaluate their individual merits, and discuss challenges in the use of the various computational approaches.
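
    A minimal sketch (with made-up peptide intensities) of one common roll-up strategy that reviews of this kind survey: summarizing each protein's relative abundance between two conditions as the median log2 ratio of its peptide peak intensities.

```python
import numpy as np

# peptide intensities per protein: (condition A, condition B); values are hypothetical
peptides = {
    "ProteinX": [(2.1e6, 4.0e6), (1.5e6, 3.1e6), (8.0e5, 1.7e6)],
    "ProteinY": [(5.0e6, 4.8e6), (3.2e6, 3.5e6)],
}

for protein, pairs in peptides.items():
    log2_ratios = [np.log2(b / a) for a, b in pairs]      # per-peptide fold changes
    print(f"{protein}: median log2(B/A) = {np.median(log2_ratios):+.2f}")
```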

  11. HPC Analytics Support. Requirements for Uncertainty Quantification Benchmarks

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Paulson, Patrick R.; Purohit, Sumit; Rodriguez, Luke R.

    2015-05-01

    This report outlines techniques for extending benchmark generation products so they support uncertainty quantification by benchmarked systems. We describe how uncertainty quantification requirements can be presented to candidate analytical tools supporting SPARQL. We describe benchmark data sets for evaluating uncertainty quantification, as well as an approach for using our benchmark generator to produce such data sets.

  12. Quantification of active pharmaceutical ingredient and impurities in sildenafil citrate obtained from the Internet

    PubMed Central

    Nutan, Mohammad T.; Dodla, Uday Krishna Reddy

    2014-01-01

    Background: The accessibility of prescription drugs produced outside of the United States, most notably sildenafil citrate (innovator product, Viagra®), has been made much easier by the Internet. Of greatest concern to clinicians and policymakers is product quality and patient safety. The US Food and Drug Administration (FDA) has issued warnings to potential buyers that the safety of drugs purchased from the Internet cannot be guaranteed, and may present a health risk to consumers from substandard products. Objective: The objective of this study was to determine whether generic sildenafil citrate tablets from international markets obtained via the Internet are equivalent to the US innovator product regarding major aspects of pharmaceutical quality: potency, accuracy of labeling, and presence and level of impurities. This will help identify aspects of drug quality that may impact public health risks. Methods: A total of 15 sildenafil citrate tablets were obtained for pharmaceutical analysis: 14 generic samples from international Internet pharmacy websites and the US innovator product. According to US Pharmacopeial guidelines, tablet samples were tested using high-performance liquid chromatography for potency of active pharmaceutical ingredient (API) and levels of impurities (impurities A, B, C, and D). Impurity levels were compared with International Conference on Harmonisation (ICH) limits. Results: Among the 15 samples, 4 samples possessed higher impurity B levels than the ICH qualification threshold, 8 samples possessed higher impurity C levels than the ICH qualification threshold, and 4 samples possessed more than 1% impurity quantity of maximum daily dose (MDD). For API, 6 of the samples failed to fall within the 5% assay limit. Conclusions: Quality assurance tests are often used to detect formulation defects of drug products during the manufacturing and/or storage process. Results suggest that manufacturing standards for sildenafil citrate generic drug

  13. Quantification of Efficiency of Beneficiation of Lunar Regolith

    NASA Technical Reports Server (NTRS)

    Trigwell, Steve; Lane, John; Captain, James; Weis, Kyle; Quinn, Jacqueline; Watanabe, Fumiya

    2011-01-01

    Electrostatic beneficiation of lunar regolith is being researched at Kennedy Space Center to enhance the ilmenite concentration of the regolith for the production of oxygen in in-situ resource utilization on the lunar surface. Ilmenite enrichment of up to 200% was achieved using lunar simulants. For the most accurate quantification of the regolith particles, standard petrographic methods are typically followed, but in order to optimize the process, many hundreds of samples were generated in this study, which made the standard analysis methods time prohibitive. In the current studies, X-ray photoelectron spectroscopy (XPS) and scanning electron microscopy/energy dispersive spectroscopy (SEM/EDS) were used, which could automatically and quickly analyze many separated fractions of lunar simulant. In order to test the accuracy of the quantification, test mixture samples of known quantities of ilmenite (2, 5, 10, and 20 wt%) in silica (pure quartz powder) were analyzed by XPS and EDS. The results showed that quantification for low concentrations of ilmenite in silica could be accurately achieved by both XPS and EDS, knowing the limitations of the techniques.

  14. Quantification Bias Caused by Plasmid DNA Conformation in Quantitative Real-Time PCR Assay

    PubMed Central

    Lin, Chih-Hui; Chen, Yu-Chieh; Pan, Tzu-Ming

    2011-01-01

    Quantitative real-time PCR (qPCR) is the gold standard for the quantification of specific nucleic acid sequences. However, a serious concern has been revealed in a recent report: supercoiled plasmid standards cause significant over-estimation in qPCR quantification. In this study, we investigated the effect of plasmid DNA conformation on the quantification of DNA and the efficiency of qPCR. Our results suggest that plasmid DNA conformation has significant impact on the accuracy of absolute quantification by qPCR. DNA standard curves shifted significantly among plasmid standards with different DNA conformations. Moreover, the choice of DNA measurement method and plasmid DNA conformation may also contribute to the measurement error of DNA standard curves. Due to the multiple effects of plasmid DNA conformation on the accuracy of qPCR, efforts should be made to assure the highest consistency of plasmid standards for qPCR. Thus, we suggest that the conformation, preparation, quantification, purification, handling, and storage of standard plasmid DNA should be described and defined in the Minimum Information for Publication of Quantitative Real-Time PCR Experiments (MIQE) to assure the reproducibility and accuracy of qPCR absolute quantification. PMID:22194997
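
    A minimal sketch of absolute quantification from a plasmid standard curve, the step the study shows to be sensitive to plasmid conformation; the Cq values and copy numbers below are invented for illustration.

```python
import numpy as np

copies = np.array([1e7, 1e6, 1e5, 1e4, 1e3])        # plasmid standard copy numbers
cq     = np.array([15.1, 18.5, 21.9, 25.3, 28.7])   # measured Cq for each standard (hypothetical)

slope, intercept = np.polyfit(np.log10(copies), cq, 1)
efficiency = 10 ** (-1.0 / slope) - 1.0              # amplification efficiency from the slope

cq_unknown = 23.4                                    # Cq of an unknown sample
copies_unknown = 10 ** ((cq_unknown - intercept) / slope)
print(f"slope={slope:.2f}, efficiency={efficiency * 100:.0f}%, unknown approx {copies_unknown:.2e} copies")
```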

  15. Quantum non-Gaussianity and quantification of nonclassicality

    NASA Astrophysics Data System (ADS)

    Kühn, B.; Vogel, W.

    2018-05-01

    The algebraic quantification of nonclassicality, which naturally arises from the quantum superposition principle, is related to properties of regular nonclassicality quasiprobabilities. The latter are obtained by non-Gaussian filtering of the Glauber-Sudarshan P function. They yield lower bounds for the degree of nonclassicality. We also derive bounds for convex combinations of Gaussian states for certifying quantum non-Gaussianity directly from the experimentally accessible nonclassicality quasiprobabilities. Other quantum-state representations, such as s-parametrized quasiprobabilities, insufficiently indicate or even fail to directly uncover detailed information on the properties of quantum states. As an example, our approach is applied to multi-photon-added squeezed vacuum states.

  16. Screening, confirmation and quantification of boldenone sulfate in equine urine after administration of boldenone undecylenate (Equipoise).

    PubMed

    Weidolf, L O; Chichila, T M; Henion, J D

    1988-12-09

    Methods for screening by thin-layer chromatography, quantification by high-performance liquid chromatography with ultraviolet detection and confirmation by gas chromatography-mass spectrometry of boldenone sulfate in equine urine after administration of boldenone undecylenate (Equipoise) are presented. Sample work-up was done with C18 liquid-solid extraction followed by solvolytic cleavage of the sulfate ester. Confirmatory evidence of boldenone sulfate in equine urine was obtained from 2 h to 42 days following a therapeutic intramuscular dose of Equipoise. The use of 19-nortestosterone sulfate as the internal standard for quantification of boldenone sulfate is discussed.

  17. Radio-frequency energy quantification in magnetic resonance imaging

    NASA Astrophysics Data System (ADS)

    Alon, Leeor

    Mapping of radio-frequency (RF) energy deposition has been challenging for 50+ years, especially when scanning patients in the magnetic resonance imaging (MRI) environment. As a result, electromagnetic simulation software is often used for estimating the specific absorption rate (SAR), the rate of RF energy deposition in tissue. This thesis presents challenges associated with aligning information provided by electromagnetic simulation and MRI experiments. Given the limitations of simulations, experimental methods for the quantification of SAR were established. A system for quantification of the total RF energy deposition was developed for parallel transmit MRI (a system that uses multiple antennas to excite and image the body). The system is capable of monitoring and predicting channel-by-channel RF energy deposition and whole-body SAR, and of tracking potential hardware failures that occur in the transmit chain and may cause the deposition of excessive energy into patients. Similarly, we demonstrated that local RF power deposition can be mapped and predicted for parallel transmit systems based on a series of MRI temperature mapping acquisitions. As part of this work, we developed tools for optimal reconstruction of temperature maps from MRI acquisitions. The tools developed for temperature mapping paved the way for utilizing MRI as a diagnostic tool for evaluating the safety of RF/microwave-emitting devices. Quantification of the RF energy was demonstrated for both MRI-compatible and non-MRI-compatible devices (such as cell phones), with the advantage of being noninvasive while providing millimeter resolution and high accuracy.

  18. 43 CFR 11.71 - Quantification phase-service reduction quantification.

    Code of Federal Regulations, 2011 CFR

    2011-10-01

    ...-discharge-or-release condition. (c) Contents of the quantification. The following factors should be included...; and (6) Factors identified in the specific guidance in paragraphs (h), (i), (j), (k), and (l) of this section dealing with the different kinds of natural resources. (d) Selection of resources, services, and...

  19. 43 CFR 11.71 - Quantification phase-service reduction quantification.

    Code of Federal Regulations, 2010 CFR

    2010-10-01

    ...-discharge-or-release condition. (c) Contents of the quantification. The following factors should be included...; and (6) Factors identified in the specific guidance in paragraphs (h), (i), (j), (k), and (l) of this section dealing with the different kinds of natural resources. (d) Selection of resources, services, and...

  20. Multiscale quantification of tissue spiculation and distortion for detection of architectural distortion and spiculated mass in mammography

    NASA Astrophysics Data System (ADS)

    Lao, Zhiqiang; Zheng, Xin

    2011-03-01

    This paper proposes a multiscale method to quantify tissue spiculation and distortion in mammography CAD systems, aimed at improving the sensitivity of detecting architectural distortion and spiculated masses. This approach addresses the difficulty of predetermining the neighborhood size for feature extraction when characterizing lesions demonstrating spiculated mass/architectural distortion that may appear in different sizes. The quantification is based on the recognition of tissue spiculation and distortion patterns using a multiscale first-order phase portrait model in a texture orientation field generated by a Gabor filter bank. A feature map is generated based on the multiscale quantification for each mammogram, and two features are then extracted from the feature map. These two features are combined with other mass features to provide enhanced discriminative ability in detecting lesions demonstrating spiculated mass and architectural distortion. The efficiency and efficacy of the proposed method are demonstrated with results obtained by applying the method to over 500 cancer cases and over 1000 normal cases.
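
    A minimal sketch (not the authors' pipeline) of building a texture orientation field with a Gabor filter bank, the first step of the quantification described above; the input is a random test image and the bank parameters are illustrative.

```python
import numpy as np
from skimage.filters import gabor

image = np.random.rand(128, 128)                    # stand-in for a mammogram patch
thetas = np.linspace(0, np.pi, 8, endpoint=False)   # 8 filter orientations

responses = []
for theta in thetas:
    real, imag = gabor(image, frequency=0.15, theta=theta)
    responses.append(np.hypot(real, imag))          # magnitude response per orientation

responses = np.stack(responses)                     # shape: (orientations, H, W)
orientation_field = thetas[np.argmax(responses, axis=0)]  # dominant orientation per pixel
print(orientation_field.shape, orientation_field.min(), orientation_field.max())
```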

  1. An accurate proteomic quantification method: fluorescence labeling absolute quantification (FLAQ) using multidimensional liquid chromatography and tandem mass spectrometry.

    PubMed

    Liu, Junyan; Liu, Yang; Gao, Mingxia; Zhang, Xiangmin

    2012-08-01

    A facile proteomic quantification method, fluorescent labeling absolute quantification (FLAQ), was developed. Instead of using MS for quantification, the FLAQ method is a chromatography-based quantification in combination with MS for identification. Multidimensional liquid chromatography (MDLC) with high-accuracy laser-induced fluorescence (LIF) detection and a tandem MS system were employed for FLAQ. Several requirements should be met for fluorescent labeling in MS identification: labeling completeness, minimal side reactions, simple MS spectra, and no extra tandem MS fragmentation for structure elucidation. A fluorescence dye, 5-iodoacetamidofluorescein, was finally chosen to label proteins on all cysteine residues. The fluorescent dye was compatible with the process of trypsin digestion and MALDI MS identification. Quantitative labeling was achieved with optimization of the reaction conditions. A synthesized peptide and model proteins, BSA (35 cysteines) and OVA (five cysteines), were used for verifying the completeness of labeling. Proteins were separated through MDLC and quantified based on fluorescent intensities, followed by MS identification. High accuracy (RSD% < 1.58) and wide linearity of quantification (1-10^5) were achieved by LIF detection. The limit of quantitation for the model protein was as low as 0.34 amol. A subset of proteins in the human liver proteome was quantified to demonstrate FLAQ. © 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  2. Practical quantification of necrosis in histological whole-slide images.

    PubMed

    Homeyer, André; Schenk, Andrea; Arlt, Janine; Dahmen, Uta; Dirsch, Olaf; Hahn, Horst K

    2013-06-01

    Since the histological quantification of necrosis is a common task in medical research and practice, we evaluate different image analysis methods for quantifying necrosis in whole-slide images. In a practical usage scenario, we assess the impact of different classification algorithms and feature sets on both accuracy and computation time. We show how a well-chosen combination of multiresolution features and an efficient postprocessing step enables the accurate quantification of necrosis in gigapixel images in less than a minute. The results are general enough to be applied to other areas of histological image analysis as well. Copyright © 2013 Elsevier Ltd. All rights reserved.

  3. Osteoblast-like cell response to macro- and micro-patterned carbon scaffolds obtained from the sea rush Juncus maritimus.

    PubMed

    López-Álvarez, M; Pereiro, I; Serra, J; de Carlos, A; González, P

    2011-08-01

    Carbon scaffolds with a directionally patterned surface were obtained by pyrolysis of the sea rush Juncus maritimus. The structure of the scaffolds was investigated using scanning electron microscopy, mercury porosimetry and interferometric profilometry. X-ray diffraction and X-ray fluorescence were the techniques used for their chemical characterization. The alignment and differentiation of pre-osteoblasts (MC3T3-E1 cell line) incubated on the patterned scaffolds were evaluated by scanning electron microscopy, confocal laser scanning microscopy and by the quantification of alkaline phosphatase activity and osteocalcin synthesis. It was found that pyrolysis at 500 °C preserved and even enhanced the natural macro- and micro-patterning of the plant. The results obtained for porosity and chemical composition validated these structures as viable scaffolds for tissue engineering applications. Finally, the patterned surface was confirmed to promote the oriented growth of the pre-osteoblasts MC3T3-E1, not only after short periods of incubation (hours) but also after longer ones (several weeks). The quantification of the cell differentiation markers together with the evaluation of the cell layer morphology up to 28 days of incubation confirmed the differentiation of MC3T3-E1 cells to osteoblasts. © 2011 IOP Publishing Ltd

  4. PeakCaller: an automated graphical interface for the quantification of intracellular calcium obtained by high-content screening.

    PubMed

    Artimovich, Elena; Jackson, Russell K; Kilander, Michaela B C; Lin, Yu-Chih; Nestor, Michael W

    2017-10-16

    Intracellular calcium is an important ion involved in the regulation and modulation of many neuronal functions. From regulating cell cycle and proliferation to initiating signaling cascades and regulating presynaptic neurotransmitter release, the concentration and timing of calcium activity governs the function and fate of neurons. Changes in calcium transients can be used in high-throughput screening applications as a basic measure of neuronal maturity, especially in developing or immature neuronal cultures derived from stem cells. Using human induced pluripotent stem cell derived neurons and dissociated mouse cortical neurons combined with the calcium indicator Fluo-4, we demonstrate that PeakCaller reduces type I and type II error in automated peak calling when compared to the oft-used PeakFinder algorithm under both basal and pharmacologically induced conditions. Here we describe PeakCaller, a novel MATLAB script and graphical user interface for the quantification of intracellular calcium transients in neuronal cultures. PeakCaller allows the user to set peak parameters and smoothing algorithms to best fit their data set. This new analysis script will allow for automation of calcium measurements and is a powerful software tool for researchers interested in high-throughput measurements of intracellular calcium.
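
    A minimal sketch of automated peak calling on a calcium trace. PeakCaller itself is a MATLAB tool; this Python analogue with scipy.signal.find_peaks only illustrates the idea of user-set peak parameters (prominence, width) applied to a synthetic dF/F trace.

```python
import numpy as np
from scipy.signal import find_peaks, savgol_filter

t = np.arange(0, 60, 0.05)                                   # 20 Hz sampling, 60 s
trace = 0.05 * np.random.randn(t.size)                       # noisy baseline
for onset in (5, 18, 33, 47):                                 # synthetic calcium transients
    trace += 0.8 * np.exp(-(t - onset) / 2.0) * (t >= onset)

smoothed = savgol_filter(trace, window_length=11, polyorder=3)   # light smoothing
peaks, props = find_peaks(smoothed, prominence=0.3, width=5)     # user-set peak parameters

print(f"{peaks.size} transients detected at t = {np.round(t[peaks], 1)} s")
```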

  5. DICOM image quantification secondary capture (DICOM IQSC) integrated with numeric results, regions, and curves: implementation and applications in nuclear medicine

    NASA Astrophysics Data System (ADS)

    Cao, Xinhua; Xu, Xiaoyin; Voss, Stephan

    2017-03-01

    In this paper, we describe an enhanced DICOM Secondary Capture (SC) that integrates Image Quantification (IQ) results, Regions of Interest (ROIs), and Time Activity Curves (TACs) with screen shots by embedding extra medical imaging information into a standard DICOM header. A software toolkit of DICOM IQSC has been developed to implement the SC-centered information integration of quantitative analysis for the routine practice of nuclear medicine. Primary experiments show that the DICOM IQSC method is simple and easy to implement, seamlessly integrating post-processing workstations with PACS for archiving and retrieving IQ information. Additional DICOM IQSC applications in routine nuclear medicine and clinical research are also discussed.

  6. Group refractive index quantification using a Fourier domain short coherence Sagnac interferometer.

    PubMed

    Montonen, Risto; Kassamakov, Ivan; Lehmann, Peter; Österberg, Kenneth; Hæggström, Edward

    2018-02-15

    The group refractive index is important in length calibration of Fourier domain interferometers by transparent transfer standards. We demonstrate accurate group refractive index quantification using a Fourier domain short coherence Sagnac interferometer. Because of a justified linear length calibration function, the calibration constants cancel out in the evaluation of the group refractive index, which is then obtained accurately from two uncalibrated lengths. Measurements of two standard thickness coverslips revealed group indices of 1.5426±0.0042 and 1.5434±0.0046, with accuracies quoted at the 95% confidence level. This agreed with the dispersion data of the coverslip manufacturer and therefore validates our method. Our method provides a sample-specific and accurate group refractive index quantification using the same Fourier domain interferometer that is to be calibrated for the length. This significantly reduces the requirements on the calibration transfer standard.

  7. Biodiesel production from microalgal isolates of southern Pakistan and quantification of FAMEs by GC-MS/MS analysis

    PubMed Central

    2012-01-01

    Background Microalgae have attracted major interest as a sustainable source for biodiesel production on a commercial scale. This paper describes the screening of six microalgal species, Scenedesmus quadricauda, Scenedesmus acuminatus, Nannochloropsis sp., Anabaena sp., Chlorella sp. and Oscillatoria sp., isolated from fresh and marine water resources of southern Pakistan for biodiesel production and the GC-MS/MS analysis of their fatty acid methyl esters (FAMEs). Results Growth rate, biomass productivity and oil content of each algal species have been investigated under autotrophic conditions. Biodiesel was produced from algal oil by an acid-catalyzed transesterification reaction, and the resulting fatty acid methyl ester (FAME) content was analyzed by GC/MS. Fatty acid profiling of the biodiesel obtained from various microalgal oils showed high content of C-16:0, C-18:0, cis-Δ9C-18:1, cis-Δ11C-18:1 (except Scenedesmus quadricauda) and 10-hydroxyoctadecanoic (except Scenedesmus acuminatus). Absolute amounts of C-14:0, C-16:0 and C-18:0, determined by a validated GC-MS/MS method, were found to be 1.5-1.7, 15.0-42.5 and 4.2-18.4 mg/g, respectively, in biodiesel obtained from various microalgal oils. Biodiesel was also characterized in terms of cetane number, kinematic viscosity, density and higher heating value and compared with the standard values. Conclusion Six microalgae of local origin were screened for biodiesel production. A method for absolute quantification of three important saturated fatty acid methyl esters (C-14, C-16 and C-18) by gas chromatography-tandem mass spectrometry (GC-MS/MS), using multiple reaction monitoring (MRM) mode, was employed for the identification and quantification of biodiesels obtained from various microalgal oils. The results suggested that locally found microalgae can be sustainably harvested for the production of biodiesel. This offers a tremendous economic opportunity for an energy-deficient nation. PMID:23216896

  8. Improved quantification for local regions of interest in preclinical PET imaging

    NASA Astrophysics Data System (ADS)

    Cal-González, J.; Moore, S. C.; Park, M.-A.; Herraiz, J. L.; Vaquero, J. J.; Desco, M.; Udias, J. M.

    2015-09-01

    In Positron Emission Tomography, there are several causes of quantitative inaccuracy, such as partial volume or spillover effects. The impact of these effects is greater when using radionuclides that have a large positron range, e.g. 68Ga and 124I, which have been increasingly used in the clinic. We have implemented and evaluated a local projection algorithm (LPA), originally evaluated for SPECT, to compensate for both partial-volume and spillover effects in PET. This method is based on the use of a high-resolution CT or MR image, co-registered with a PET image, which permits a high-resolution segmentation of a few tissues within a volume of interest (VOI) centered on a region within which tissue-activity values need to be estimated. The additional boundary information is used to obtain improved activity estimates for each tissue within the VOI, by solving a simple inversion problem. We implemented this algorithm for the preclinical Argus PET/CT scanner and assessed its performance using the radionuclides 18F, 68Ga and 124I. We also evaluated and compared the results obtained when it was applied during the iterative reconstruction, as well as after the reconstruction as a postprocessing procedure. In addition, we studied how LPA can help to reduce the ‘spillover contamination’, which causes inaccurate quantification of lesions in the immediate neighborhood of large, ‘hot’ sources. Quantification was significantly improved by using LPA, which provided more accurate ratios of lesion-to-background activity concentration for hot and cold regions. For 18F, the contrast was improved from 3.0 to 4.0 in hot lesions (when the true ratio was 4.0) and from 0.16 to 0.06 in cold lesions (true ratio  =  0.0), when using the LPA postprocessing. Furthermore, activity values estimated within the VOI using LPA during reconstruction were slightly more accurate than those obtained by post-processing, while also visually improving the image contrast and uniformity
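
    A minimal sketch, on synthetic 1-D data, of the small inversion problem at the core of the local projection idea: inside a VOI, the measured (blurred) PET profile is modeled as a sum of CT-segmented tissue templates convolved with the scanner blur, and the per-tissue activities are recovered by least squares. All numbers are illustrative, not the authors' implementation.

```python
import numpy as np
from scipy.ndimage import gaussian_filter1d
from scipy.optimize import nnls

n = 120
templates = np.zeros((2, n))                  # two tissues segmented from the co-registered CT
templates[0, 30:60] = 1.0                     # lesion
templates[1, :30] = templates[1, 60:] = 1.0   # background

true_activity = np.array([4.0, 1.0])
blur = lambda v: gaussian_filter1d(v, sigma=4.0)         # stand-in for the PET resolution kernel
measured = blur(templates.T @ true_activity) + 0.02 * np.random.randn(n)

A = np.column_stack([blur(tpl) for tpl in templates])    # blurred templates as basis functions
est, _ = nnls(A, measured)                               # non-negative activity estimates
print(f"estimated lesion/background activities: {est.round(2)}")
```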

  9. Improved quantification for local regions of interest in preclinical PET imaging

    PubMed Central

    Cal-González, J.; Moore, S. C.; Park, M.-A.; Herraiz, J. L.; Vaquero, J. J.; Desco, M.; Udias, J. M.

    2015-01-01

    In Positron Emission Tomography, there are several causes of quantitative inaccuracy, such as partial volume or spillover effects. The impact of these effects is greater when using radionuclides that have a large positron range, e.g., 68Ga and 124I, which have been increasingly used in the clinic. We have implemented and evaluated a local projection algorithm (LPA), originally evaluated for SPECT, to compensate for both partial-volume and spillover effects in PET. This method is based on the use of a high-resolution CT or MR image, co-registered with a PET image, which permits a high-resolution segmentation of a few tissues within a volume of interest (VOI) centered on a region within which tissue-activity values need to be estimated. The additional boundary information is used to obtain improved activity estimates for each tissue within the VOI, by solving a simple inversion problem. We implemented this algorithm for the preclinical Argus PET/CT scanner and assessed its performance using the radionuclides 18F, 68Ga and 124I. We also evaluated and compared the results obtained when it was applied during the iterative reconstruction, as well as after the reconstruction as a postprocessing procedure. In addition, we studied how LPA can help to reduce the “spillover contamination”, which causes inaccurate quantification of lesions in the immediate neighborhood of large, “hot” sources. Quantification was significantly improved by using LPA, which provided more accurate ratios of lesion-to-background activity concentration for hot and cold regions. For 18F, the contrast was improved from 3.0 to 4.0 in hot lesions (when the true ratio was 4.0) and from 0.16 to 0.06 in cold lesions (true ratio = 0.0), when using the LPA postprocessing. Furthermore, activity values estimated within the VOI using LPA during reconstruction were slightly more accurate than those obtained by post-processing, while also visually improving the image contrast and uniformity within the VOI

  10. Breast density quantification using magnetic resonance imaging (MRI) with bias field correction: A postmortem study

    PubMed Central

    Ding, Huanjun; Johnson, Travis; Lin, Muqing; Le, Huy Q.; Ducote, Justin L.; Su, Min-Ying; Molloi, Sabee

    2013-01-01

    Purpose: Quantification of breast density based on three-dimensional breast MRI may provide useful information for the early detection of breast cancer. However, the field inhomogeneity can severely challenge the computerized image segmentation process. In this work, the effect of the bias field in breast density quantification has been investigated with a postmortem study. Methods: T1-weighted images of 20 pairs of postmortem breasts were acquired on a 1.5 T breast MRI scanner. Two computer-assisted algorithms were used to quantify the volumetric breast density. First, standard fuzzy c-means (FCM) clustering was used on raw images with the bias field present. Then, the coherent local intensity clustering (CLIC) method estimated and corrected the bias field during the iterative tissue segmentation process. Finally, FCM clustering was performed on the bias-field-corrected images produced by the CLIC method. The left–right correlation for breasts in the same pair was studied for both segmentation algorithms to evaluate the precision of the tissue classification. Finally, the breast densities measured with the three methods were compared to the gold standard tissue compositions obtained from chemical analysis. The linear correlation coefficient, Pearson's r, was used to evaluate the two image segmentation algorithms and the effect of bias field. Results: The CLIC method successfully corrected the intensity inhomogeneity induced by the bias field. In left–right comparisons, the CLIC method significantly improved the slope and the correlation coefficient of the linear fitting for the glandular volume estimation. The left–right breast density correlation was also increased from 0.93 to 0.98. When compared with the percent fibroglandular volume (%FGV) from chemical analysis, results after bias field correction from both the CLIC and FCM algorithms showed improved linear correlation. As a result, the Pearson's r increased from 0.86 to 0.92 with the bias field correction
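
    A minimal sketch of standard fuzzy c-means (FCM) clustering, the segmentation step used above, applied to a synthetic 1-D set of voxel intensities; the bias-field-correcting CLIC variant is not reproduced here, and which cluster corresponds to fibroglandular tissue depends on the sequence.

```python
import numpy as np

rng = np.random.default_rng(1)
x = np.concatenate([rng.normal(100, 10, 4000), rng.normal(200, 15, 1000)])  # synthetic voxel intensities

c, m, n_iter = 2, 2.0, 100                       # clusters, fuzzifier, iterations
centers = np.array([x.min(), x.max()], dtype=float)
for _ in range(n_iter):
    d = np.abs(x[None, :] - centers[:, None]) + 1e-12               # distance of each voxel to each center
    u = 1.0 / np.sum((d[:, None, :] / d[None, :, :]) ** (2 / (m - 1)), axis=1)  # membership update
    centers = (u ** m @ x) / np.sum(u ** m, axis=1)                 # center update

labels = np.argmax(u, axis=0)
fraction = labels.mean()                          # fraction of voxels in the brighter cluster
print(f"centers={centers.round(1)}, brighter-cluster fraction approx {fraction:.2f}")
```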

  11. Four human Plasmodium species quantification using droplet digital PCR.

    PubMed

    Srisutham, Suttipat; Saralamba, Naowarat; Malleret, Benoit; Rénia, Laurent; Dondorp, Arjen M; Imwong, Mallika

    2017-01-01

    Droplet digital polymerase chain reaction (ddPCR) is a PCR method based on partitioning the reaction into water-oil emulsion droplets. It is a highly sensitive method for detecting and delineating minor alleles from complex backgrounds and provides absolute quantification of DNA targets. The ddPCR technology has been applied for detection of many pathogens. Here, a sensitive assay utilizing ddPCR for detection and quantification of Plasmodium species was investigated. The assay was developed for two levels of detection: genus-specific detection of all Plasmodium species and species-specific detection. The ddPCR assay was developed based on primers and probes specific to the Plasmodium genus 18S rRNA gene. Using ddPCR for ultra-sensitive P. falciparum assessment, the lower level of detection from concentrated DNA obtained from a high-volume (1 mL) blood sample was 11 parasites/mL. For species identification, in particular for samples with mixed infections, a duplex reaction was developed for detection and quantification of P. falciparum/P. vivax and P. malariae/P. ovale. Amplification of each Plasmodium species in the duplex reaction showed equal sensitivity to singleplex single-species detection. The duplex ddPCR assay had higher sensitivity for identifying minor species in 32 subpatent parasitaemia samples from Cambodia, and performed better than real-time PCR. The ddPCR assay shows high sensitivity to assess very low parasitaemia of all human Plasmodium species. This provides a useful research tool for studying the role of the asymptomatic parasite reservoir for transmission in regions aiming for malaria elimination.
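
    A minimal sketch of the Poisson statistics behind absolute quantification in ddPCR: the target concentration follows from the fraction of positive droplets and the droplet volume. The counts below are hypothetical, and ~0.85 nL is only a nominal droplet volume for common ddPCR systems.

```python
import math

def ddpcr_copies_per_ul(positive, total, droplet_volume_nl=0.85):
    lam = -math.log(1.0 - positive / total)    # mean copies per droplet (Poisson correction)
    return lam / (droplet_volume_nl * 1e-3)    # copies per microliter of reaction

print(f"{ddpcr_copies_per_ul(positive=1200, total=15000):.1f} copies/uL")
```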

  12. Iterative fitting method for the evaluation and quantification of PAES spectra

    NASA Astrophysics Data System (ADS)

    Zimnik, Samantha; Hackenberg, Mathias; Hugenschmidt, Christoph

    2017-01-01

    The elemental composition of surfaces is of great importance for the understanding of many surface processes such as catalysis. For a reliable analysis and a comparison of results, the quantification of the measured data is indispensable. Positron annihilation induced Auger Electron Spectroscopy (PAES) is a spectroscopic technique that measures the elemental composition with outstanding surface sensitivity, but up to now, no standardized evaluation procedure for PAES spectra is available. In this paper we present a new approach for the evaluation of PAES spectra of compounds, using the spectra obtained for the pure elements as reference. The measured spectrum is then fitted by a linear combination of the reference spectra by varying their intensities. The comparison of the results of the fitting routine with a calculation of the full parameter range shows an excellent agreement. We present the results of the new analysis method to evaluate the PAES spectra of sub-monolayers of Ni on a Pd substrate.
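
    A minimal sketch of the fitting idea described above: a measured compound spectrum is approximated by a non-negative linear combination of pure-element reference spectra. The "spectra" here are synthetic vectors, not real PAES data.

```python
import numpy as np
from scipy.optimize import nnls

energy = np.linspace(0, 100, 400)
ref_ni = np.exp(-0.5 * ((energy - 60) / 4) ** 2)      # stand-in Ni reference spectrum
ref_pd = np.exp(-0.5 * ((energy - 40) / 5) ** 2)      # stand-in Pd reference spectrum

measured = 0.3 * ref_ni + 0.7 * ref_pd + 0.01 * np.random.randn(energy.size)

A = np.column_stack([ref_ni, ref_pd])
intensities, residual = nnls(A, measured)              # best non-negative reference intensities
print(f"fitted Ni/Pd intensities: {intensities.round(2)}, residual={residual:.3f}")
```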

  13. Cosmogenic radionuclides on LDEF: An unexpected Be-10 result

    NASA Technical Reports Server (NTRS)

    Gregory, J. C.; Albrecht, A.; Herzog, G.; Klein, J.; Middleton, R.; Dezfouly-Arjomandy, B.; Harmon, B. A.

    1993-01-01

    Following the discovery of the atmospheric derived cosmogenic radionuclide Be-7 on the Long Duration Exposure Facility (LDEF), a search began for other known nuclides produced by similar mechanisms. None of the others have the narrow gamma-ray line emission of Be-7 decay which enabled its rapid detection and quantification. A search for Be-10 atoms on LDEF clamp plates using accelerator mass spectrometry is described. An unexpected result was obtained.

  14. Semi-automated quantification and neuroanatomical mapping of heterogeneous cell populations.

    PubMed

    Mendez, Oscar A; Potter, Colin J; Valdez, Michael; Bello, Thomas; Trouard, Theodore P; Koshy, Anita A

    2018-07-15

    Our group studies the interactions between cells of the brain and the neurotropic parasite Toxoplasma gondii. Using an in vivo system that allows us to permanently mark and identify brain cells injected with Toxoplasma protein, we have identified that Toxoplasma-injected neurons (TINs) are heterogeneously distributed throughout the brain. Unfortunately, standard methods to quantify and map heterogeneous cell populations onto a reference brain atlas are time consuming and prone to user bias. We developed a novel MATLAB-based semi-automated quantification and mapping program to allow the rapid and consistent mapping of heterogeneously distributed cells onto the Allen Institute Mouse Brain Atlas. The system uses two-threshold background subtraction to identify and quantify cells of interest. We demonstrate that we reliably quantify and neuroanatomically localize TINs with low intra- or inter-observer variability. In a follow-up experiment, we show that specific regions of the mouse brain are enriched with TINs. The procedure we use takes advantage of simple immunohistochemistry labeling techniques, a standard microscope with a motorized stage, and low-cost computing that can be readily obtained at a research institute. To our knowledge there is no other program that uses such readily available techniques and equipment for mapping heterogeneous populations of cells across the whole mouse brain. The quantification method described here allows reliable visualization, quantification, and mapping of heterogeneous cell populations in immunolabeled sections across whole mouse brains. Copyright © 2018 Elsevier B.V. All rights reserved.
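
    A minimal sketch of two-threshold ("hysteresis") detection and counting of labeled cells, the core operation described above; the image is a synthetic blob image rather than real immunohistochemistry data, and the thresholds are illustrative.

```python
import numpy as np
from scipy.ndimage import gaussian_filter
from skimage.filters import apply_hysteresis_threshold
from skimage.measure import label, regionprops

rng = np.random.default_rng(0)
points = np.zeros((256, 256))
rows, cols = rng.integers(10, 246, size=(2, 40))      # 40 hypothetical cell centres
points[rows, cols] = 1.0
image = gaussian_filter(points, sigma=2)              # blur point sources into blobs
image = image / image.max() + 0.05 * rng.random((256, 256))   # add dim background noise

mask = apply_hysteresis_threshold(image, low=0.3, high=0.6)   # two-threshold detection
labels = label(mask)
centroids = [r.centroid for r in regionprops(labels)]         # coordinates to map onto an atlas
print(f"detected {labels.max()} putative cells")
```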

  15. Obtaining patient test results from clinical laboratories: a survey of state law for pharmacists.

    PubMed

    Witry, Matthew J; Doucette, William R

    2009-01-01

    To identify states with laws that restrict to whom clinical laboratories may release copies of laboratory test results and to describe how these laws may affect pharmacists' ability to obtain patient laboratory test results. Researchers examined state statutes and administrative codes for all 50 states and the District of Columbia at the University of Iowa Law Library between June and July 2007. Researchers also consulted with lawyers, state Clinical Laboratory Improvement Amendments officers, and law librarians. Laws relating to the study objective were analyzed. 34 jurisdictions do not restrict the release of laboratory test results, while 17 states have laws that restrict to whom clinical laboratories can send copies of test results. In these states, pharmacists will have to use alternative sources, such as physician offices, to obtain test results. Pharmacists must consider state law before requesting copies of laboratory test results from clinical laboratories. This may be an issue that state pharmacy associations can address to increase pharmacist access to important patient information.

  16. Quantification of synthetic cannabinoids in herbal smoking blends using NMR.

    PubMed

    Dunne, Simon J; Rosengren-Holmberg, Jenny P

    2017-05-01

    Herbal smoking blends containing synthetic cannabinoids have become popular alternatives to marijuana. These products were previously sold in pre-packaged foil bags, but nowadays seizures usually contain synthetic cannabinoid powders together with unprepared plant materials. A question often raised by the Swedish police is how much smoking blend can be prepared from certain amounts of banned substance, in order to establish the severity of the crime. To address this question, information about the synthetic cannabinoid content in both the powder and the prepared herbal blends is necessary. In this work, an extraction procedure compatible with direct NMR quantification of synthetic cannabinoids in herbal smoking blends was developed. Extraction media, time and efficiency were tested for different carrier materials containing representative synthetic cannabinoids. The developed protocol utilizes a 30 min extraction step in d4-methanol in the presence of an internal standard, allowing direct quantitation of the extract using NMR. The accuracy of the developed method was tested using in-house prepared herbal smoking blends. The results showed deviations less than 0.2% from the actual content, proving that the method is sufficiently accurate for these quantifications. Using this method, ten synthetic cannabinoids present in sixty-three different herbal blends seized by the Swedish police between October 2012 and April 2015 were quantified. Obtained results showed a variation in cannabinoid contents from 1.5% (w/w) for mixtures containing MDMB-CHMICA to over 5% (w/w) for mixtures containing 5F-AKB-48. This is important information for forensic experts when making theoretical calculations of production quantities in legal cases regarding "home-made" herbal smoking blends. Copyright © 2016 John Wiley & Sons, Ltd.
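
    A minimal sketch of the internal-standard qNMR relation underlying protocols of this kind; the integrals, proton counts, molar masses and weights below are invented for illustration (the internal standard is assumed to be something like maleic acid).

```python
def qnmr_content(i_analyte, i_std, n_analyte, n_std, mw_analyte, mw_std, m_std, m_sample):
    """Analyte content (w/w fraction) from peak integrals I, proton counts N,
    molar masses MW, internal-standard mass m_std and weighed sample mass m_sample."""
    m_analyte = (i_analyte / i_std) * (n_std / n_analyte) * (mw_analyte / mw_std) * m_std
    return m_analyte / m_sample

w = qnmr_content(i_analyte=0.38, i_std=1.00,      # normalized integrals (hypothetical)
                 n_analyte=1, n_std=2,            # protons giving each integral
                 mw_analyte=383.5, mw_std=116.1,  # illustrative molar masses (g/mol)
                 m_std=1.0, m_sample=50.0)        # weighed masses (mg)
print(f"cannabinoid content approx {100 * w:.1f}% (w/w)")
```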

  17. In-Gel Stable-Isotope Labeling (ISIL): a strategy for mass spectrometry-based relative quantification.

    PubMed

    Asara, John M; Zhang, Xiang; Zheng, Bin; Christofk, Heather H; Wu, Ning; Cantley, Lewis C

    2006-01-01

    Most proteomics approaches for relative quantification of protein expression use a combination of stable-isotope labeling and mass spectrometry. Traditionally, researchers have used difference gel electrophoresis (DIGE) from stained 1D and 2D gels for relative quantification. While differences in protein staining intensity can often be visualized, abundant proteins can obscure less abundant proteins, and quantification of post-translational modifications is difficult. A method is presented for quantifying changes in the abundance of a specific protein or changes in specific modifications of a protein using In-gel Stable-Isotope Labeling (ISIL). Proteins extracted from any source (tissue, cell line, immunoprecipitate, etc.), treated under two experimental conditions, are resolved in separate lanes by gel electrophoresis. The regions of interest (visualized by staining) are reacted separately with light versus heavy isotope-labeled reagents, and the gel slices are then mixed and digested with proteases. The resulting peptides are then analyzed by LC-MS to determine relative abundance of light/heavy isotope pairs and analyzed by LC-MS/MS for identification of sequence and modifications. The strategy compares well with other relative quantification strategies, and in silico calculations reveal its effectiveness as a global relative quantification strategy. An advantage of ISIL is that visualization of gel differences can be used as a first quantification step followed by accurate and sensitive protein level stable-isotope labeling and mass spectrometry-based relative quantification.

  18. Recurrence quantification as potential bio-markers for diagnosis of pre-cancer

    NASA Astrophysics Data System (ADS)

    Mukhopadhyay, Sabyasachi; Pratiher, Sawon; Barman, Ritwik; Pratiher, Souvik; Pradhan, Asima; Ghosh, Nirmalya; Panigrahi, Prasanta K.

    2017-03-01

    In this paper, spectroscopy signals are analyzed using recurrence plots (RP), and recurrence quantification analysis (RQA) parameters are extracted from the RP in order to classify tissues into normal and different precancerous grades. Three RQA parameters have been quantified in order to extract the important features in the spectroscopy data. These features have been fed to different classifiers for classification. Simulation results validate the efficacy of recurrence quantification as a potential bio-marker for the diagnosis of pre-cancer.
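
    A minimal sketch of two commonly used recurrence-plot quantities (recurrence rate and determinism) computed for a short synthetic signal; the embedding parameters and threshold are illustrative, not those of the study, and the main diagonal is included for simplicity.

```python
import numpy as np

def rqa(signal, dim=3, delay=2, eps=0.2, lmin=2):
    n = len(signal) - (dim - 1) * delay
    emb = np.column_stack([signal[i * delay: i * delay + n] for i in range(dim)])  # delay embedding
    dist = np.linalg.norm(emb[:, None, :] - emb[None, :, :], axis=-1)
    rp = dist < eps                                  # recurrence matrix
    rr = rp.mean()                                   # recurrence rate
    diag_pts = 0
    for k in range(-(n - 1), n):                     # count points on diagonal lines of length >= lmin
        line = np.diag(rp, k=k).astype(np.int8)
        runs = np.diff(np.flatnonzero(np.diff(np.r_[0, line, 0])))[::2]  # run lengths of 1s
        diag_pts += runs[runs >= lmin].sum()
    det = diag_pts / rp.sum() if rp.sum() else 0.0   # determinism
    return rr, det

sig = np.sin(np.linspace(0, 8 * np.pi, 300)) + 0.05 * np.random.randn(300)
rr, det = rqa(sig)
print(f"recurrence rate = {rr:.3f}, determinism = {det:.3f}")
```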

  19. Qualitative and Quantitative Detection of Botulinum Neurotoxins from Complex Matrices: Results of the First International Proficiency Test

    PubMed Central

    Worbs, Sylvia; Fiebig, Uwe; Zeleny, Reinhard; Schimmel, Heinz; Rummel, Andreas; Luginbühl, Werner; Dorner, Brigitte G.

    2015-01-01

    In the framework of the EU project EQuATox, a first international proficiency test (PT) on the detection and quantification of botulinum neurotoxins (BoNT) was conducted. Sample materials included BoNT serotypes A, B and E spiked into buffer, milk, meat extract and serum. Different methods were applied by the participants combining different principles of detection, identification and quantification. Based on qualitative assays, 95% of all results reported were correct. Successful strategies for BoNT detection were based on a combination of complementary immunological, MS-based and functional methods or on suitable functional in vivo/in vitro approaches (mouse bioassay, hemidiaphragm assay and Endopep-MS assay). Quantification of BoNT/A, BoNT/B and BoNT/E was performed by 48% of participating laboratories. It turned out that precise quantification of BoNT was difficult, resulting in a substantial scatter of quantitative data. This was especially true for results obtained by the mouse bioassay which is currently considered as “gold standard” for BoNT detection. The results clearly demonstrate the urgent need for certified BoNT reference materials and the development of methods replacing animal testing. In this context, the BoNT PT provided the valuable information that both the Endopep-MS assay and the hemidiaphragm assay delivered quantitative results superior to the mouse bioassay. PMID:26703724

  20. Bayesian Statistics and Uncertainty Quantification for Safety Boundary Analysis in Complex Systems

    NASA Technical Reports Server (NTRS)

    He, Yuning; Davies, Misty Dawn

    2014-01-01

    The analysis of a safety-critical system often requires detailed knowledge of safe regions and their high-dimensional non-linear boundaries. We present a statistical approach to iteratively detect and characterize the boundaries, which are provided as parameterized shape candidates. Using methods from uncertainty quantification and active learning, we incrementally construct a statistical model from only a few simulation runs and obtain statistically sound estimates of the shape parameters for safety boundaries.

  1. A specific endogenous reference for genetically modified common bean (Phaseolus vulgaris L.) DNA quantification by real-time PCR targeting lectin gene.

    PubMed

    Venturelli, Gustavo L; Brod, Fábio C A; Rossi, Gabriela B; Zimmermann, Naíra F; Oliveira, Jaison P; Faria, Josias C; Arisi, Ana C M

    2014-11-01

    The Embrapa 5.1 genetically modified (GM) common bean was approved for commercialization in Brazil. Methods for the quantification of this new genetically modified organism (GMO) are necessary. The development of a suitable endogenous reference is essential for GMO quantification by real-time PCR. Based on this, a new taxon-specific endogenous reference quantification assay was developed for Phaseolus vulgaris L. Three genes encoding common bean proteins (phaseolin, arcelin, and lectin) were selected as candidates for endogenous reference. Primers targeting these candidate genes were designed and the detection was evaluated using the SYBR Green chemistry. The assay targeting lectin gene showed higher specificity than the remaining assays, and a hydrolysis probe was then designed. This assay showed high specificity for 50 common bean samples from two gene pools, Andean and Mesoamerican. For GM common bean varieties, the results were similar to those obtained for non-GM isogenic varieties with PCR efficiency values ranging from 92 to 101 %. Moreover, this assay presented a limit of detection of ten haploid genome copies. The primers and probe developed in this work are suitable to detect and quantify either GM or non-GM common bean.

  2. Lamb Wave Damage Quantification Using GA-Based LS-SVM.

    PubMed

    Sun, Fuqiang; Wang, Ning; He, Jingjing; Guan, Xuefei; Yang, Jinsong

    2017-06-12

    Lamb waves have been reported to be an efficient tool for non-destructive evaluations (NDE) for various application scenarios. However, accurate and reliable damage quantification using the Lamb wave method is still a practical challenge, due to the complex underlying mechanism of Lamb wave propagation and damage detection. This paper presents a Lamb wave damage quantification method using a least square support vector machine (LS-SVM) and a genetic algorithm (GA). Three damage sensitive features, namely, normalized amplitude, phase change, and correlation coefficient, were proposed to describe changes of Lamb wave characteristics caused by damage. In view of commonly used data-driven methods, the GA-based LS-SVM model using the proposed three damage sensitive features was implemented to evaluate the crack size. The GA method was adopted to optimize the model parameters. The results of GA-based LS-SVM were validated using coupon test data and lap joint component test data with naturally developed fatigue cracks. Cases of different loading and manufacturer were also included to further verify the robustness of the proposed method for crack quantification.
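
    A minimal sketch of least-squares SVM regression with an RBF kernel, the regression core of the method above; here the kernel width and regularization are fixed by hand rather than tuned with a genetic algorithm, and the features and targets are synthetic.

```python
import numpy as np

def rbf(a, b, sigma):
    d2 = ((a[:, None, :] - b[None, :, :]) ** 2).sum(-1)
    return np.exp(-d2 / (2 * sigma ** 2))

rng = np.random.default_rng(0)
X = rng.uniform(0, 1, size=(40, 3))                     # stand-in damage-sensitive features
y = 2.0 * X[:, 0] + 0.5 * X[:, 1] ** 2 + 0.05 * rng.standard_normal(40)   # stand-in "crack size"

sigma, gamma = 0.5, 100.0                               # would be GA-tuned in the paper
K = rbf(X, X, sigma)
n = len(y)
# LS-SVM dual system: [[0, 1^T], [1, K + I/gamma]] [b; alpha] = [0; y]
A = np.block([[np.zeros((1, 1)), np.ones((1, n))],
              [np.ones((n, 1)), K + np.eye(n) / gamma]])
sol = np.linalg.solve(A, np.concatenate([[0.0], y]))
b, alpha = sol[0], sol[1:]

X_new = rng.uniform(0, 1, size=(5, 3))
y_pred = rbf(X_new, X, sigma) @ alpha + b               # LS-SVM predictions for new features
print(y_pred.round(3))
```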

  3. Lamb Wave Damage Quantification Using GA-Based LS-SVM

    PubMed Central

    Sun, Fuqiang; Wang, Ning; He, Jingjing; Guan, Xuefei; Yang, Jinsong

    2017-01-01

    Lamb waves have been reported to be an efficient tool for non-destructive evaluations (NDE) for various application scenarios. However, accurate and reliable damage quantification using the Lamb wave method is still a practical challenge, due to the complex underlying mechanism of Lamb wave propagation and damage detection. This paper presents a Lamb wave damage quantification method using a least square support vector machine (LS-SVM) and a genetic algorithm (GA). Three damage sensitive features, namely, normalized amplitude, phase change, and correlation coefficient, were proposed to describe changes of Lamb wave characteristics caused by damage. In view of commonly used data-driven methods, the GA-based LS-SVM model using the proposed three damage sensitive features was implemented to evaluate the crack size. The GA method was adopted to optimize the model parameters. The results of GA-based LS-SVM were validated using coupon test data and lap joint component test data with naturally developed fatigue cracks. Cases of different loading and manufacturer were also included to further verify the robustness of the proposed method for crack quantification. PMID:28773003

  4. Consistency of flow quantifications in tridirectional phase-contrast MRI

    NASA Astrophysics Data System (ADS)

    Unterhinninghofen, R.; Ley, S.; Dillmann, R.

    2009-02-01

    Tridirectionally encoded phase-contrast MRI is a technique to non-invasively acquire time-resolved velocity vector fields of blood flow. These may not only be used to analyze pathological flow patterns, but also to quantify flow at arbitrary positions within the acquired volume. In this paper we examine the validity of this approach by analyzing the consistency of related quantifications instead of comparing it with an external reference measurement. Datasets of the thoracic aorta were acquired from 6 pigs, 1 healthy volunteer and 3 patients with artificial aortic valves. Using in-house software an elliptical flow quantification plane was placed manually at 6 positions along the descending aorta where it was rotated to 5 different angles. For each configuration flow was computed based on the original data and data that had been corrected for phase offsets. Results reveal that quantifications are more dependent on changes in position than on changes in angle. Phase offset correction considerably reduces this dependency. Overall consistency is good with a maximum variation coefficient of 9.9% and a mean variation coefficient of 7.2%.

  5. Mixture quantification using PLS in plastic scintillation measurements.

    PubMed

    Bagán, H; Tarancón, A; Rauret, G; García, J F

    2011-06-01

    This article reports the capability of plastic scintillation (PS) combined with multivariate calibration (partial least squares, PLS) to detect and quantify alpha and beta emitters in mixtures. While several attempts have been made toward this goal using liquid scintillation (LS), none had been made using PS, which has the great advantage of not producing mixed waste after the measurements are performed. With this objective, ternary mixtures of alpha and beta emitters ((241)Am, (137)Cs and (90)Sr/(90)Y) were quantified. Procedure optimisation evaluated the use of the net spectra or the sample spectra, the inclusion of spectra obtained at different values of the pulse shape analysis parameter, and the application of the PLS1 or PLS2 algorithms. The conclusions show that PS+PLS2 applied to the sample spectra, without any pulse shape discrimination, allows quantification of the activities with relative errors below 10% in most cases. This procedure not only allows quantification of mixtures but also reduces measurement time (no blanks are required), and it does not require detectors that include the pulse shape analysis parameter. Copyright © 2011 Elsevier Ltd. All rights reserved.
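
    A minimal sketch of PLS2 calibration of activities from spectra: the "spectra" are synthetic mixtures of three made-up component shapes standing in for (241)Am, (137)Cs and (90)Sr/(90)Y, and the model settings are illustrative, not those of the paper.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(0)
channels = np.arange(512)
components = np.stack([np.exp(-0.5 * ((channels - c) / w) ** 2)
                       for c, w in [(80, 20), (220, 40), (350, 60)]])   # stand-in component shapes

activities = rng.uniform(0, 100, size=(60, 3))                 # training activities (arbitrary Bq)
spectra = activities @ components + rng.normal(0, 0.5, (60, 512))       # mixed training spectra

pls = PLSRegression(n_components=5)
pls.fit(spectra, activities)                                   # PLS2: the three responses fitted jointly

test_true = np.array([[30.0, 50.0, 20.0]])
test_spectrum = test_true @ components + rng.normal(0, 0.5, (1, 512))
print(pls.predict(test_spectrum).round(1), "vs true", test_true)
```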

  6. Species identification and quantification in meat and meat products using droplet digital PCR (ddPCR).

    PubMed

    Floren, C; Wiedemann, I; Brenig, B; Schütz, E; Beck, J

    2015-04-15

    Species fraud and product mislabelling in processed food, albeit not a direct health issue, often result in consumer distrust. Therefore, methods for quantification of undeclared species are needed. Targeting mitochondrial DNA, e.g. the CYTB gene, for species quantification is unsuitable due to a fivefold inter-tissue variation in mtDNA content per cell, resulting in either an under- (-70%) or overestimation (+160%) of species DNA contents. Here, we describe a reliable two-step droplet digital PCR (ddPCR) assay targeting the nuclear F2 gene for precise quantification of cattle, horse, and pig in processed meat products. The ddPCR assay is advantageous over qPCR, showing a limit of quantification (LOQ) and detection (LOD) in different meat products of 0.01% and 0.001%, respectively. The specificity was verified in 14 different species. Hence, determining F2 in food by ddPCR can be recommended for quality assurance and control in production systems. Copyright © 2014 The Authors. Published by Elsevier Ltd. All rights reserved.

  7. Assessment of cardiac fibrosis: a morphometric method comparison for collagen quantification.

    PubMed

    Schipke, Julia; Brandenberger, Christina; Rajces, Alexandra; Manninger, Martin; Alogna, Alessio; Post, Heiner; Mühlfeld, Christian

    2017-04-01

    Fibrotic remodeling of the heart is a frequent condition linked to various diseases and cardiac dysfunction. Collagen quantification is an important objective in cardiac fibrosis research; however, a variety of different histological methods are currently used that may differ in accuracy. Here, frequently applied collagen quantification techniques were compared. A porcine model of early stage heart failure with preserved ejection fraction was used as an example. Semiautomated threshold analyses were imprecise, mainly due to inclusion of noncollagen structures or failure to detect certain collagen deposits. In contrast, collagen assessment by automated image analysis and light microscopy (LM)-stereology was more sensitive. Depending on the quantification method, the amount of estimated collagen varied and influenced intergroup comparisons. PicroSirius Red, Masson's trichrome, and Azan staining protocols yielded similar results, whereas the measured collagen area increased with increasing section thickness. Whereas none of the LM-based methods showed significant differences between the groups, electron microscopy (EM)-stereology revealed a significant collagen increase between cardiomyocytes in the experimental group, but not at other localizations. In conclusion, in contrast to the staining protocol, section thickness and the quantification method being used directly influence the estimated collagen content and thus, possibly, intergroup comparisons. EM in combination with stereology is a precise and sensitive method for collagen quantification if certain prerequisites are considered. For subtle fibrotic alterations, consideration of collagen localization may be necessary. Among LM methods, LM-stereology and automated image analysis are appropriate to quantify fibrotic changes, the latter depending on careful control of algorithm and comparable section staining. NEW & NOTEWORTHY Direct comparison of frequently applied histological fibrosis assessment techniques

  8. Development of a Protein Standard Absolute Quantification (PSAQ™) assay for the quantification of Staphylococcus aureus enterotoxin A in serum.

    PubMed

    Adrait, Annie; Lebert, Dorothée; Trauchessec, Mathieu; Dupuis, Alain; Louwagie, Mathilde; Masselon, Christophe; Jaquinod, Michel; Chevalier, Benoît; Vandenesch, François; Garin, Jérôme; Bruley, Christophe; Brun, Virginie

    2012-06-06

    Enterotoxin A (SEA) is a staphylococcal virulence factor which is suspected to worsen septic shock prognosis. However, the presence of SEA in the blood of sepsis patients has never been demonstrated. We have developed a mass spectrometry-based assay for the targeted and absolute quantification of SEA in serum. To enhance sensitivity and specificity, we combined an immunoaffinity-based sample preparation with mass spectrometry analysis in the selected reaction monitoring (SRM) mode. Absolute quantification of SEA was performed using the PSAQ™ method (Protein Standard Absolute Quantification), which uses a full-length isotope-labeled SEA as internal standard. The lower limit of detection (LLOD) and lower limit of quantification (LLOQ) were estimated at 352 pg/mL and 1057 pg/mL, respectively. SEA recovery after immunocapture was determined to be 7.8 ± 1.4%. Therefore, we assumed that less than 1 femtomole of each SEA proteotypic peptide was injected on the liquid chromatography column before SRM analysis. From a 6-point titration experiment, quantification accuracy was determined to be 77% and precision at LLOQ was lower than 5%. With this sensitive PSAQ-SRM assay, we expect to contribute to deciphering the pathophysiological role of SEA in severe sepsis. This article is part of a Special Issue entitled: Proteomics: The clinical link. Copyright © 2011 Elsevier B.V. All rights reserved.

  9. Rapid Quantification of Low-Viscosity Acetyl-Triacylglycerols Using Electrospray Ionization Mass Spectrometry.

    PubMed

    Bansal, Sunil; Durrett, Timothy P

    2016-09-01

    Acetyl-triacylglycerols (acetyl-TAG) possess an sn-3 acetate group, which confers useful chemical and physical properties on these unusual triacylglycerols (TAG). Current methods for quantification of acetyl-TAG are time consuming and do not provide any information on the molecular species profile. Electrospray ionization mass spectrometry (ESI-MS)-based methods can overcome these drawbacks. However, the ESI-MS signal intensity for TAG depends on the aliphatic chain length and unsaturation index of the molecule. Therefore, response factors for the different molecular species need to be determined before any quantification. The effects of the chain length and the number of double bonds of the sn-1/2 acyl groups on the signal intensity for the neutral loss of short-chain sn-3 groups were quantified using a series of synthesized sn-3-specific structured TAG. The signal intensity for the neutral loss of the sn-3 acyl group was found to be negatively correlated with the aliphatic chain length and unsaturation index of the sn-1/2 acyl groups, and also with the size of the sn-3 chain itself. Further, the position of the group undergoing neutral loss was also important, with the signal from an sn-2 acyl group much lower than that from one located at sn-3. Response factors obtained from these analyses were used to develop a method for the absolute quantification of acetyl-TAG. The increased sensitivity of this ESI-MS-based approach allowed successful quantification of acetyl-TAG in various biological settings, including the products of in vitro enzyme activity assays.
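
    A minimal sketch of how species-specific response factors might be applied is shown below; it is illustrative only, and the species names, response factors and intensities are invented rather than values from the paper.

```python
# Response-factor correction for neutral-loss signals: each measured intensity
# is divided by the response factor for its sn-1/2 chain class before being
# related to an internal standard of known amount (assumed RF = 1).
measured = {                       # neutral-loss signal intensities (arbitrary units)
    "16:0/18:1-acetylTAG": 4.2e5,
    "18:2/18:2-acetylTAG": 2.9e5,
}
response_factor = {                # hypothetical species-specific response factors
    "16:0/18:1-acetylTAG": 0.82,
    "18:2/18:2-acetylTAG": 0.61,
}
istd_intensity, istd_amount_nmol = 5.0e5, 10.0   # internal standard signal and amount

for species, intensity in measured.items():
    corrected = intensity / response_factor[species]
    amount = istd_amount_nmol * corrected / istd_intensity
    print(f"{species}: {amount:.2f} nmol")
```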

  10. Heterochromatic Flicker Photometry for Objective Lens Density Quantification.

    PubMed

    Najjar, Raymond P; Teikari, Petteri; Cornut, Pierre-Loïc; Knoblauch, Kenneth; Cooper, Howard M; Gronfier, Claude

    2016-03-01

    Although several methods have been proposed to evaluate lens transmittance, to date there is no consensual in vivo approach in clinical practice. The aim of this study was to compare ocular lens density and transmittance measurements obtained by an improved psychophysical scotopic heterochromatic flicker photometry (sHFP) technique with the results obtained by three other measures: a psychophysical threshold technique, a Scheimpflug imaging technique, and a clinical assessment using a validated subjective scale. Forty-three subjects (18 young, 9 middle aged, and 16 older) were included in the study. Individual lens densities were measured and transmittance curves were derived from sHFP indexes. Ocular lens densities were compared across methods by using linear regression analysis. The four approaches showed a quadratic increase in lens opacification with age. The sHFP technique revealed that transmittance decreased with age over the entire visual spectrum. This decrease was particularly pronounced between young and older participants in the short-wavelength region of the light spectrum (53.03% decrease in the 400-500 nm range). Lens density derived from sHFP correlated highly with the values obtained with the other approaches. Compared to other objective measures, sHFP also showed the lowest variability and the best fit with a quadratic trend (r2 = 0.71) of lens density increase as a function of age. The sHFP technique offers a practical, reliable, and accurate method to measure lens density in vivo and predict lens transmittance over the visible spectrum. Such accurate quantification of lens transmittance is valuable not only in clinical practice but also in research on visual and nonvisual photoreception.

  11. Colour thresholding and objective quantification in bioimaging

    NASA Technical Reports Server (NTRS)

    Fermin, C. D.; Gerber, M. A.; Torre-Bueno, J. R.

    1992-01-01

    Computer imaging is rapidly becoming an indispensable tool for the quantification of variables in research and medicine. Whilst its use in medicine has largely been limited to qualitative observations, imaging in applied basic sciences, medical research and biotechnology demands objective quantification of the variables in question. In black and white densitometry (0-256 levels of intensity) the separation of subtle differences between closely related hues from stains is sometimes very difficult. True-colour and real-time video microscopy analysis offer choices not previously available with monochrome systems. In this paper we demonstrate the usefulness of colour thresholding, which has so far proven indispensable for proper objective quantification of the products of histochemical reactions and/or subtle differences in tissue and cells. In addition, we provide interested, but untrained readers with basic information that may assist decisions regarding the most suitable set-up for a project under consideration. Data from projects in progress at Tulane are shown to illustrate the advantage of colour thresholding over monochrome densitometry and for objective quantification of subtle colour differences between experimental and control samples.
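
    As a rough illustration of the colour-thresholding idea (not the systems described in the paper), the following sketch segments a reddish stain by thresholding hue, saturation and value; the image and the threshold bands are arbitrary assumptions.

```python
# Colour thresholding in HSV space: hue separates closely related colours
# better than a single grey level. The RGB image here is synthetic.
import numpy as np
from matplotlib.colors import rgb_to_hsv

rng = np.random.default_rng(2)
img = rng.random((200, 200, 3))                 # placeholder RGB image in [0, 1]

hsv = rgb_to_hsv(img)
h, s, v = hsv[..., 0], hsv[..., 1], hsv[..., 2]

# Keep saturated, reasonably bright pixels whose hue lies in the "red" band.
stain = ((h < 0.05) | (h > 0.95)) & (s > 0.4) & (v > 0.3)
print(f"stained area fraction: {stain.mean():.3f}")
```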

  12. Quantification of plasma exosome is a potential prognostic marker for esophageal squamous cell carcinoma.

    PubMed

    Matsumoto, Yasunori; Kano, Masayuki; Akutsu, Yasunori; Hanari, Naoyuki; Hoshino, Isamu; Murakami, Kentaro; Usui, Akihiro; Suito, Hiroshi; Takahashi, Masahiko; Otsuka, Ryota; Xin, Hu; Komatsu, Aki; Iida, Keiko; Matsubara, Hisahiro

    2016-11-01

    Exosomes play important roles in cancer progression. Although their contents (e.g., proteins and microRNAs) have been the focus of cancer research, particularly as potential diagnostic markers, exosome behavior and methods for exosome quantification remain unclear. In the present study, we analyzed the behavior of tumor-derived exosomes and assessed the quantification of exosomes in patient plasma as a biomarker for esophageal squamous cell carcinoma (ESCC). A CD63-GFP-expressing human ESCC cell line (TE2-CD63-GFP) was made by transfection, and mouse subcutaneous tumor models were established. Fluorescence imaging was performed on tumors and plasma exosomes harvested from mice. GFP-positive small vesicles were confirmed in the plasma obtained from TE2-CD63-GFP tumor-bearing mice. Patient plasma was collected at Chiba University Hospital (n=86). Exosomes were extracted from 100 µl of the plasma and quantified by acetylcholinesterase (AChE) activity. The relationship between exosome quantification and the patients' clinical characteristics was assessed. The quantification of exosomes isolated from the patient plasma revealed that esophageal cancer patients (n=66) expressed higher exosome levels than non-malignant patients (n=20) (P=0.0002). Although there was no correlation between tumor progression and exosome levels, exosome number was an independent prognostic marker, and low exosome levels predicted a poor prognosis (P=0.03). In conclusion, exosome levels may be useful as an independent prognostic factor for ESCC patients.

  13. Multiplex Droplet Digital PCR Protocols for Quantification of GM Maize Events.

    PubMed

    Dobnik, David; Spilsberg, Bjørn; Bogožalec Košir, Alexandra; Štebih, Dejan; Morisset, Dany; Holst-Jensen, Arne; Žel, Jana

    2018-01-01

    The standard-curve based simplex quantitative polymerase chain reaction (qPCR) has been the gold standard for DNA target quantification for more than a decade. The large and growing number of individual analyses needed to test for genetically modified organisms (GMOs) is reducing the cost-effectiveness of qPCR. Droplet digital PCR (ddPCR) enables absolute quantification without standard curves, avoids the amplification efficiency bias observed with qPCR, allows more accurate estimations at low target copy numbers and, in combination with multiplexing, significantly improves cost efficiency. Here we describe two protocols for multiplex quantification of GM maize events: (1) nondiscriminating, with multiplex quantification of targets as a group (12 GM maize lines) and (2) discriminating, with multiplex quantification of individual targets (events). The first enables the quantification of twelve European Union authorized GM maize events as a group with only two assays, but does not permit determination of the individual events present. The second protocol enables the quantification of four individual targets (three GM events and one endogene) in a single reaction. Both protocols can be modified for quantification of any other DNA target.
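
    The absolute quantification step in ddPCR rests on Poisson statistics of droplet occupancy. A minimal sketch of the generic calculation is given below (not the chapter's protocol); the droplet volume and the droplet counts are assumptions for illustration.

```python
# Standard ddPCR Poisson correction: target concentration follows from the
# fraction of positive droplets, assuming ~0.85 nL per droplet (typical value;
# adjust to the instrument actually used).
import math

def ddpcr_copies_per_ul(positive, total, droplet_volume_nl=0.85):
    """Copies per microlitre of reaction from droplet counts."""
    lam = -math.log(1.0 - positive / total)      # mean copies per droplet
    return lam / (droplet_volume_nl * 1e-3)      # convert nL -> µL

event_copies = ddpcr_copies_per_ul(positive=1200, total=15000)     # GM event assay
endogene_copies = ddpcr_copies_per_ul(positive=9000, total=15000)  # endogene assay
print(f"GM content: {100 * event_copies / endogene_copies:.2f} % (copy/copy)")
```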

  14. Quantification technology study on flaws in steam-filled pipelines based on image processing

    NASA Astrophysics Data System (ADS)

    Sun, Lina; Yuan, Peixin

    2009-07-01

    Starting from the development of an applied detection system for gas transmission pipelines, a set of X-ray image processing methods and quantitative flaw evaluation methods for pipelines is proposed. Defective and defect-free columns and rows in the gray-level image were extracted and their oscillograms obtained; defects can be distinguished by dividing the two gray-level images. From the gray values of defects of different thicknesses, a gray-level versus depth curve is constructed. Exponential and polynomial fitting then yield the attenuation model for the beam penetrating the pipeline, from which the flaw depth is obtained. Tests were performed on PPR pipe with simulated hole and crack flaws, using an X-ray source operated at 135 kV. The results show that the X-ray image processing method, which meets the needs of efficient flaw detection and provides quality assurance for heavy-oil recovery, can be used successfully for detecting corrosion of insulated pipe.

  15. Quantification technology study on flaws in steam-filled pipelines based on image processing

    NASA Astrophysics Data System (ADS)

    Yuan, Pei-xin; Cong, Jia-hui; Chen, Bo

    2008-03-01

    Starting from the development of an applied detection system for gas transmission pipelines, a set of X-ray image processing methods and quantitative flaw evaluation methods for pipelines is proposed. Defective and defect-free columns and rows in the gray-level image were extracted and their oscillograms obtained; defects can be distinguished by dividing the two gray-level images. From the gray values of defects of different thicknesses, a gray-level versus depth curve is constructed. Exponential and polynomial fitting then yield the attenuation model for the beam penetrating the pipeline, from which the flaw depth is obtained. Tests were performed on PPR pipe with simulated hole and crack flaws; the X-ray source tube voltage was set to 130 kV and the tube current to 1.5 mA. Test results show that the X-ray image processing methods, which meet the needs of efficient flaw detection and provide quality assurance for heavy-oil recovery, can be used successfully for detecting corrosion of insulated pipe.

  16. Development of magnetic resonance technology for noninvasive boron quantification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bradshaw, K.M.

    1990-11-01

    Boron magnetic resonance imaging (MRI) and spectroscopy (MRS) were developed in support of the noninvasive boron quantification task of the Idaho National Engineering Laboratory (INEL) Power Burst Facility/Boron Neutron Capture Therapy (PBF/BNCT) program. The hardware and software described in this report are modifications specific to a GE Signa™ MRI system, release 3.X, and are necessary for boron magnetic resonance operation. The technology developed in this task has been applied to obtaining animal pharmacokinetic data of boron compounds (drug time response) and the in-vivo localization of boron in animal tissue noninvasively. 9 refs., 21 figs.

  17. Quantification of video-taped images in microcirculation research using inexpensive imaging software (Adobe Photoshop).

    PubMed

    Brunner, J; Krummenauer, F; Lehr, H A

    2000-04-01

    Study end-points in microcirculation research are usually video-taped images rather than numeric computer print-outs. Analysis of these video-taped images for the quantification of microcirculatory parameters usually requires computer-based image analysis systems. Most software programs for image analysis are custom-made, expensive, and limited in their applicability to selected parameters and study end-points. We demonstrate herein that an inexpensive, commercially available computer software (Adobe Photoshop), run on a Macintosh G3 computer with an inbuilt graphic capture board, provides versatile, easy-to-use tools for the quantification of digitized video images. Using images obtained by intravital fluorescence microscopy from the pre- and postischemic muscle microcirculation in the skinfold chamber model in hamsters, Photoshop allows simple and rapid quantification (i) of microvessel diameters, (ii) of the functional capillary density and (iii) of postischemic leakage of FITC-labeled high molecular weight dextran from postcapillary venules. We present evidence of the technical accuracy of the software tools and of a high degree of interobserver reliability. Inexpensive commercially available imaging programs (i.e., Adobe Photoshop) provide versatile tools for image analysis with a wide range of potential applications in microcirculation research.

  18. Uncertainty Quantification applied to flow simulations in thoracic aortic aneurysms

    NASA Astrophysics Data System (ADS)

    Boccadifuoco, Alessandro; Mariotti, Alessandro; Celi, Simona; Martini, Nicola; Salvetti, Maria Vittoria

    2015-11-01

    The thoracic aortic aneurysm is a progressive dilatation of the thoracic aorta causing a weakness in the aortic wall, which may eventually lead to life-threatening events. Clinical decisions on treatment strategies are currently based on empiric criteria, such as the aortic diameter or its growth rate. Numerical simulations can quantify important indices that cannot be obtained through in-vivo measurements and can provide supplementary information. Hemodynamic simulations are carried out by using the open-source tool SimVascular and considering patient-specific geometries. One of the main issues in these simulations is the choice of suitable boundary conditions, modeling the organs and vessels not included in the computational domain. The current practice is to use outflow conditions based on resistance and capacitance, whose values are tuned to obtain a physiological behavior of the patient's pressure. However, it is not known a priori how this choice affects the results of the simulation. The impact of the uncertainties in these outflow parameters is investigated here by using the generalized Polynomial Chaos approach. This analysis also permits calibration of the outflow-boundary parameters when patient-specific in-vivo data are available.
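
    To illustrate the idea of propagating outflow-parameter uncertainty to a hemodynamic output, the sketch below substitutes plain Monte Carlo sampling through a two-element Windkessel model; the paper itself uses generalized Polynomial Chaos on full 3D simulations, and the parameter ranges, inflow waveform and values here are entirely made up.

```python
# Monte Carlo propagation of uncertain outflow resistance/capacitance through
# a 2-element Windkessel:  C dP/dt = Q(t) - P/R  (mmHg, mL, s). Illustrative only.
import numpy as np

def windkessel_peak_pressure(R, C, t_end=5.0, dt=1e-3):
    """Integrate the Windkessel ODE and return the peak pressure of the last second."""
    t = np.arange(0.0, t_end, dt)
    Q = 400.0 * np.clip(np.sin(2 * np.pi * t), 0.0, None)   # crude pulsatile inflow (mL/s)
    P = np.empty_like(t)
    P[0] = 80.0
    for i in range(1, t.size):
        P[i] = P[i - 1] + dt * (Q[i - 1] - P[i - 1] / R) / C
    return P[t > t_end - 1.0].max()

rng = np.random.default_rng(3)
n = 200
R_samples = rng.uniform(0.9, 1.5, n)    # peripheral resistance (mmHg s/mL), assumed range
C_samples = rng.uniform(1.0, 2.0, n)    # compliance (mL/mmHg), assumed range

peaks = np.array([windkessel_peak_pressure(R, C)
                  for R, C in zip(R_samples, C_samples)])
print(f"peak pressure: mean {peaks.mean():.1f} mmHg, std {peaks.std():.1f} mmHg")
```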

  19. In situ Biofilm Quantification in Bioelectrochemical Systems by using Optical Coherence Tomography.

    PubMed

    Molenaar, Sam D; Sleutels, Tom; Pereira, Joao; Iorio, Matteo; Borsje, Casper; Zamudio, Julian A; Fabregat-Santiago, Francisco; Buisman, Cees J N; Ter Heijne, Annemiek

    2018-04-25

    Detailed studies of microbial growth in bioelectrochemical systems (BESs) are required for their suitable design and operation. Here, we report the use of optical coherence tomography (OCT) as a tool for in situ and noninvasive quantification of biofilm growth on electrodes (bioanodes). An experimental platform is designed and described in which transparent electrodes are used to allow real-time, 3D biofilm imaging. The accuracy and precision of the developed method is assessed by relating the OCT results to well-established standards for biofilm quantification (chemical oxygen demand (COD) and total N content) and shows high correspondence to these standards. Biofilm thickness observed by OCT ranged between 3 and 90 μm for experimental durations ranging from 1 to 24 days. This translated to growth yields between 38 and 42 mg COD(biomass) per g COD(acetate) at an anode potential of -0.35 V versus Ag/AgCl. Time-lapse observations of an experimental run performed in duplicate show high reproducibility in the microbial growth yield obtained by the developed method. As such, we identify OCT as a powerful tool for conducting in-depth characterizations of microbial growth dynamics in BESs. Additionally, the presented platform allows concomitant application of this method with various optical and electrochemical techniques. © 2018 The Authors. Published by Wiley-VCH Verlag GmbH & Co. KGaA.

  20. Automated quantification of myocardial perfusion SPECT using simplified normal limits.

    PubMed

    Slomka, Piotr J; Nishina, Hidetaka; Berman, Daniel S; Akincioglu, Cigdem; Abidov, Aiden; Friedman, John D; Hayes, Sean W; Germano, Guido

    2005-01-01

    To simplify development of normal limits for myocardial perfusion SPECT (MPS), we implemented a quantification scheme in which normal limits are derived without visual scoring of abnormal scans or optimization of regional thresholds. Normal limits were derived from same-day Tl-201 rest/Tc-99m-sestamibi stress scans of male (n = 40) and female (n = 40) low-likelihood patients. Defect extent, total perfusion deficit (TPD), and regional perfusion extents were derived by comparison to normal limits in polar-map coordinates. MPS scans from 256 consecutive patients without known coronary artery disease, who underwent coronary angiography, were analyzed. The new method of quantification (TPD) was compared with our previously developed quantification system and with visual scoring. The receiver operator characteristic area under the curve for detection of 50% or greater stenoses by TPD (0.88 ± 0.02) was higher than by visual scoring (0.83 ± 0.03) (P = .039) or standard quantification (0.82 ± 0.03) (P = .004). For detection of 70% or greater stenoses, it was higher for TPD (0.89 ± 0.02) than for standard quantification (0.85 ± 0.02) (P = .014). Sensitivity and specificity were 93% and 79%, respectively, for TPD; 81% and 85%, respectively, for visual scoring; and 80% and 73%, respectively, for standard quantification. The use of stress mode-specific normal limits did not improve performance. Simplified quantification achieves performance better than or equivalent to visual scoring or quantification based on per-segment visual optimization of abnormality thresholds.

  1. Separation, identification and quantification of carotenoids and chlorophylls in dietary supplements containing Chlorella vulgaris and Spirulina platensis using High Performance Thin Layer Chromatography.

    PubMed

    Hynstova, Veronika; Sterbova, Dagmar; Klejdus, Borivoj; Hedbavny, Josef; Huska, Dalibor; Adam, Vojtech

    2018-01-30

    In this study, 14 commercial products (dietary supplements) containing the alga Chlorella vulgaris and the cyanobacterium Spirulina platensis, originating from China and Japan, were analysed. A UV-vis spectrophotometric method was applied for rapid determination of chlorophylls, carotenoids and pheophytins, the latter being degradation products of chlorophylls. High Performance Thin-Layer Chromatography (HPTLC) was used for effective separation of these compounds, and Atomic Absorption Spectrometry for determination of heavy metals as indicators of environmental pollution. Based on the results obtained from the UV-vis spectrophotometric determination of photosynthetic pigments (chlorophylls and carotenoids), it was confirmed that Chlorella vulgaris contains more of all these pigments than the cyanobacterium Spirulina platensis. The compound with the fastest mobility identified in Chlorella vulgaris and Spirulina platensis using the HPTLC method was β-carotene. Spectral analysis and the standard calibration curve method were used for identification and quantification of the separated substances on the Thin-Layer Chromatographic plate. Quantification of copper (Cu2+, at 324.7 nm) and zinc (Zn2+, at 213.9 nm) was performed using Flame Atomic Absorption Spectrometry with air-acetylene flame atomization; cadmium (Cd2+, at 228.8 nm), nickel (Ni2+, at 232.0 nm) and lead (Pb2+, at 283.3 nm) were quantified by Electrothermal Graphite Furnace Atomic Absorption Spectrometry, and mercury (Hg2+, at 254 nm) by Cold Vapour Atomic Absorption Spectrometry. Copyright © 2017 Elsevier B.V. All rights reserved.

  2. Increasing the feasibility of minimally invasive procedures in type A aortic dissections: a framework for segmentation and quantification.

    PubMed

    Morariu, Cosmin Adrian; Terheiden, Tobias; Dohle, Daniel Sebastian; Tsagakis, Konstantinos; Pauli, Josef

    2016-02-01

    Our goal is to provide precise measurements of the aortic dimensions in cases of dissection pathologies. Quantification of surface lengths and aortic radii/diameters, together with visualization of the dissection membrane, represents a crucial prerequisite for enabling minimally invasive treatment of type A dissections, which always involve the ascending aorta. We seek a measure invariant to luminance and contrast for aortic outer wall segmentation. Therefore, we propose a 2D graph-based approach using phase congruency combined with additional features. Phase congruency is extended to 3D by designing a novel conic directional filter and adding a lowpass component to the 3D Log-Gabor filterbank for extracting the fine dissection membrane, which separates the true lumen from the false one within the aorta. The result of the outer wall segmentation is compared with manually annotated axial slices belonging to 11 CTA datasets. Quantitative assessment of our novel 2D/3D membrane extraction algorithms has been performed for 10 datasets and reveals subvoxel accuracy in all cases. Aortic inner and outer surface lengths, determined within 2 cadaveric CT datasets, are validated against manual measurements performed by a vascular surgeon on the excised aortas of the body donors. This contribution proposes a complete pipeline for segmentation and quantification of aortic dissections. Validation against ground truth of the 3D contour-length quantification represents a significant step toward custom-designed stent-grafts.

  3. Selective Distance-Based K+ Quantification on Paper-Based Microfluidics.

    PubMed

    Gerold, Chase T; Bakker, Eric; Henry, Charles S

    2018-04-03

    In this study, paper-based microfluidic devices (μPADs) capable of K+ quantification in aqueous samples, as well as in human serum, using both colorimetric and distance-based methods are described. A lipophilic phase containing potassium ionophore I (valinomycin) was utilized to achieve highly selective quantification of K+ in the presence of Na+, Li+, and Mg2+ ions. Successful addition of a suspended lipophilic phase to a wax printed paper-based device is described and offers a solution to current approaches that rely on organic solvents, which damage wax barriers. The approach provides an avenue for future alkali/alkaline quantification utilizing μPADs. Colorimetric spot tests allowed for K+ quantification from 0.1-5.0 mM using only 3.00 μL of sample solution. Selective distance-based quantification required small sample volumes (6.00 μL) and gave responses sensitive enough to distinguish between 1.0 and 2.5 mM of sample K+. μPADs using distance-based methods were also capable of differentiating between 4.3 and 6.9 mM K+ in human serum samples. Distance-based methods required no digital analysis, electronic hardware, or pumps; any steps required for quantification could be carried out using the naked eye.

  4. Breast density quantification using magnetic resonance imaging (MRI) with bias field correction: a postmortem study.

    PubMed

    Ding, Huanjun; Johnson, Travis; Lin, Muqing; Le, Huy Q; Ducote, Justin L; Su, Min-Ying; Molloi, Sabee

    2013-12-01

    Quantification of breast density based on three-dimensional breast MRI may provide useful information for the early detection of breast cancer. However, field inhomogeneity can severely challenge the computerized image segmentation process. In this work, the effect of the bias field on breast density quantification has been investigated in a postmortem study. T1-weighted images of 20 pairs of postmortem breasts were acquired on a 1.5 T breast MRI scanner. Two computer-assisted algorithms were used to quantify the volumetric breast density. First, standard fuzzy c-means (FCM) clustering was used on raw images with the bias field present. Then, the coherent local intensity clustering (CLIC) method estimated and corrected the bias field during the iterative tissue segmentation process. Finally, FCM clustering was performed on the bias-field-corrected images produced by the CLIC method. The left-right correlation for breasts in the same pair was studied for both segmentation algorithms to evaluate the precision of the tissue classification. Finally, the breast densities measured with the three methods were compared to the gold standard tissue compositions obtained from chemical analysis. The linear correlation coefficient, Pearson's r, was used to evaluate the two image segmentation algorithms and the effect of the bias field. The CLIC method successfully corrected the intensity inhomogeneity induced by the bias field. In left-right comparisons, the CLIC method significantly improved the slope and the correlation coefficient of the linear fit for the glandular volume estimation. The left-right breast density correlation was also increased from 0.93 to 0.98. When compared with the percent fibroglandular volume (%FGV) from chemical analysis, results after bias field correction from both the CLIC and FCM algorithms showed improved linear correlation. As a result, Pearson's r increased from 0.86 to 0.92 with the bias field correction. The investigated CLIC method
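
    A from-scratch sketch of the fuzzy c-means step is given below for readers unfamiliar with the algorithm; it is illustrative only, omits the CLIC bias-field correction, and clusters synthetic 1-D intensities rather than the study's images.

```python
# Minimal fuzzy c-means (FCM) for two-class intensity clustering.
import numpy as np

def fcm(x, c=2, m=2.0, n_iter=100, tol=1e-5, seed=0):
    """Cluster 1-D intensities x into c fuzzy classes; return centers, memberships."""
    rng = np.random.default_rng(seed)
    u = rng.dirichlet(np.ones(c), size=x.size)            # memberships, rows sum to 1
    for _ in range(n_iter):
        um = u ** m
        centers = um.T @ x / um.sum(axis=0)               # weighted cluster centers
        d = np.abs(x[:, None] - centers[None, :]) + 1e-12  # point-to-center distances
        # Standard FCM membership update: u_ik = 1 / sum_j (d_ik / d_jk)^(2/(m-1))
        p = 2.0 / (m - 1.0)
        u_new = 1.0 / (d ** p * np.sum(d ** (-p), axis=1, keepdims=True))
        if np.max(np.abs(u_new - u)) < tol:
            u = u_new
            break
        u = u_new
    return centers, u

# Synthetic "fat" (bright on T1) and "fibroglandular" (darker) intensity pools.
rng = np.random.default_rng(4)
intensities = np.concatenate([rng.normal(300, 30, 7000), rng.normal(120, 25, 3000)])
centers, u = fcm(intensities)
dense = u[:, np.argmin(centers)] > 0.5                     # darker class assumed fibroglandular
print(f"centers: {np.sort(centers).round(1)}, percent dense: {100 * dense.mean():.1f} %")
```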

  5. New method of scoliosis assessment: preliminary results using computerized photogrammetry.

    PubMed

    Aroeira, Rozilene Maria Cota; Leal, Jefferson Soares; de Melo Pertence, Antônio Eustáquio

    2011-09-01

    A new method for nonradiographic evaluation of scoliosis was independently compared with the Cobb radiographic method for the quantification of scoliotic curvature. The aims were to develop a protocol for computerized photogrammetry, as a nonradiographic method, for the quantification of scoliosis, and to relate this proposed method mathematically to the Cobb radiographic method. Repeated exposure of children to radiation can be harmful to their health. Nevertheless, none of the nonradiographic methods proposed so far has gained popularity as a routine evaluation method, mainly because of low correspondence with the Cobb radiographic method. Patients undergoing standing posteroanterior full-length spine radiographs who were willing to participate in this study were submitted to dorsal digital photography in the orthostatic position, with special surface markers over the spinous processes of vertebrae C7 to L5. The radiographic and photographic images were sent separately for independent analysis to two examiners, each trained in the quantification of scoliosis for the type of image received. The scoliosis curvature angles obtained through computerized photogrammetry (the new method) were compared with those obtained through the Cobb radiographic method. Sixteen individuals were evaluated (14 female and 2 male). All presented idiopathic scoliosis and were 21.4 ± 6.1 years of age, 52.9 ± 5.8 kg in weight, and 1.63 ± 0.05 m in height, with a body mass index of 19.8 ± 0.2. There was no statistically significant difference between the scoliosis angle measurements obtained in the comparative analysis of the two methods, and a mathematical relationship between them was formulated. The preliminary results demonstrate equivalence between the two methods. More studies are needed to firmly assess the potential of this new method as a coadjuvant tool in the routine follow-up of scoliosis treatment.

  6. Results obtained with a low cost software-based audiometer for hearing screening.

    PubMed

    Ferrari, Deborah Viviane; Lopez, Esteban Alejandro; Lopes, Andrea Cintra; Aiello, Camila Piccini; Jokura, Pricila Reis

    2013-07-01

    The implementation of hearing screening programs can be facilitated by reducing operating costs, including the cost of equipment. The Telessaúde (TS) audiometer is a low-cost, software-based, and easy-to-use piece of equipment for conducting audiometric screening. The objective of this study was to evaluate the TS audiometer for conducting audiometric screening. A prospective randomized study was performed. Sixty subjects, divided into those who did not have (group A, n = 30) and those who had otologic complaints (group B, n = 30), underwent audiometric screening with conventional and TS audiometers in a randomized order. Pure tones at 25 dB HL were presented at frequencies of 500, 1000, 2000, and 4000 Hz. A "fail" result was recorded when the individual failed to respond to at least one of the stimuli. Pure-tone audiometry was also performed on all participants. The concordance of the screening results from the two audiometers was evaluated. The sensitivity, specificity, and positive and negative predictive values of screening with the TS audiometer were calculated. For group A, 100% of the ears tested passed the screening. For group B, "pass" results were obtained in 34.2% (TS) and 38.3% (conventional) of the ears tested. The agreement between procedures (TS vs. conventional) ranged from 93% to 98%. For group B, screening with the TS audiometer showed 95.5% sensitivity, 90.4% specificity, and positive and negative predictive values of 94.9% and 91.5%, respectively. The results obtained with the TS audiometer were similar to those obtained with the conventional audiometer, indicating that the TS audiometer can be used for audiometric screening.

  7. Quantification of trace metals in water using complexation and filter concentration.

    PubMed

    Dolgin, Bella; Bulatov, Valery; Japarov, Julia; Elish, Eyal; Edri, Elad; Schechter, Israel

    2010-06-15

    Various metals undergo complexation with organic reagents, resulting in colored products. In practice, their molar absorptivities allow for quantification in the ppm range. However, a proper pre-concentration of the colored complex on paper filter lowers the quantification limit to the low ppb range. In this study, several pre-concentration techniques have been examined and compared: filtering the already complexed mixture, complexation on filter, and dipping of dye-covered filter in solution. The best quantification has been based on the ratio of filter reflectance at a certain wavelength to that at zero metal concentration. The studied complex formations (Ni ions with TAN and Cd ions with PAN) involve production of nanoparticle suspensions, which are associated with complicated kinetics. The kinetics of the complexation of Ni ions with TAN has been investigated and optimum timing could be found. Kinetic optimization in regard to some interferences has also been suggested.

  8. Quantification of taurine in energy drinks using ¹H NMR.

    PubMed

    Hohmann, Monika; Felbinger, Christine; Christoph, Norbert; Wachter, Helmut; Wiest, Johannes; Holzgrabe, Ulrike

    2014-05-01

    The consumption of so-called energy drinks is increasing, especially among adolescents. These beverages commonly contain considerable amounts of the amino sulfonic acid taurine, which is associated with a multitude of physiological effects. The customary method to control the legal limit of taurine in energy drinks is LC-UV/vis with postcolumn derivatization using ninhydrin. In this paper we describe the quantification of taurine in energy drinks by (1)H NMR as an alternative to existing methods of quantification. Variation of pH values revealed the separation of a distinct taurine signal in the (1)H NMR spectra, which was used for integration and quantification. Quantification was performed using external calibration (R(2)>0.9999; linearity verified by Mandel's fitting test with a 95% confidence level) and PULCON. Taurine concentrations in 20 different energy drinks were analyzed both by (1)H NMR and by LC-UV/vis. The deviation between (1)H NMR and LC-UV/vis results was always below the expanded measurement uncertainty of 12.2% for the LC-UV/vis method (95% confidence level) and was at worst 10.4%. Owing to the high agreement with the LC-UV/vis data and adequate recovery rates (ranging between 97.1% and 108.2%), (1)H NMR measurement is a suitable method to quantify taurine in energy drinks. Copyright © 2013 Elsevier B.V. All rights reserved.

  9. The applications of statistical quantification techniques in nanomechanics and nanoelectronics.

    PubMed

    Mai, Wenjie; Deng, Xinwei

    2010-10-08

    Although nanoscience and nanotechnology have been developing for approximately two decades and have achieved numerous breakthroughs, experimental results from nanomaterials, with their higher noise levels and poorer repeatability than those from bulk materials, remain a practical issue and challenge many techniques for the quantification of nanomaterials. This work proposes a physical-statistical modeling approach and a global-fitting statistical method that use all the available discrete data or quasi-continuous curves to quantify a few targeted physical parameters, which can provide more accurate, efficient and reliable parameter estimates and give reasonable physical explanations. In the resonance method for measuring the elastic modulus of ZnO nanowires (Zhou et al 2006 Solid State Commun. 139 222-6), our statistical technique gives E = 128.33 GPa instead of the original E = 108 GPa, and unveils a negative bias adjustment f(0). The causes are suggested by the systematic bias in measuring the length of the nanowires. In the electronic measurement of the resistivity of a Mo nanowire (Zach et al 2000 Science 290 2120-3), the proposed new method automatically identified the importance of accounting for the Ohmic contact resistance in the model of the Ohmic behavior in nanoelectronics experiments. The 95% confidence interval of the resistivity in the proposed one-step procedure is determined to be 3.57 ± 0.0274 × 10(-5) ohm cm, which should be a more reliable and precise estimate. The statistical quantification technique should find wide applications in obtaining better estimations from various systematic errors and biased effects that become more significant at the nanoscale.

  10. Crop residues quantification to obtain self-consumption compost in an organic garden

    NASA Astrophysics Data System (ADS)

    Lopez de Fuentes, Pilar; Lopez Merino, María; Remedios Alvir, María; Briz de Felipe, Teresa

    2013-04-01

    This research focuses on quantifying the crop residues left after the fall/winter 2011 growing season in the organic garden of the Agricultural ETSI, located in its practice fields, with the aim of producing compost for self-consumption from the residues generated within the fields themselves. This compost is produced by mixing the plant material with an organic residue of animal origin. In this way the plant organic residues provide the nitrogen required for an appropriate C/N ratio, and the animal organic residues provide the amount of carbon required to achieve an optimal scenario. The garden has a surface area of 180 m2, cultivated with different seasonal vegetables of different families, following the rotations and species associations proper to organic farming techniques. The organic material of animal origin is bedding from a sheep pen, managed according to the precepts of organic farming, which also belongs to the practice fields. At the end of the crop cycle, the usable crop residues were harvested and sorted and were considered as the net crop residues. In each case, these residues were cut up with a mincing machine and then weighed to estimate the amounts contributed by each crop. For the sheep bedding residue, 1 m2 was collected three months after the bedding had been renewed. The bedding had been prepared by providing 84 kg of straw bales in July, introducing about 12 kg at a time. The herd consisted of three females and one male, each fed between 300 g and 600 g of straw per day. Two alternating pens were used to simulate a semi-intensive housing regime. A balance of how much organic residue material was obtained at the end, and how much entered the composting process, is discussed in terms of volume and nutrient content.

  11. Quantification of Vibrio species in oysters from the Gulf of Mexico with two procedures based on MPN and PCR.

    PubMed

    Barrera-Escorcia, Guadalupe; Wong-Chang, Irma; Fernández-Rendón, Carlos Leopoldo; Botello, Alfonso Vázquez; Gómez-Gil, Bruno; Lizárraga-Partida, Marcial Leonardo

    2016-11-01

    Oysters can accumulate potentially pathogenic waterborne bacteria. The objective of this study was to compare two procedures for quantifying the Vibrio species present in oysters and to determine the more sensitive method. We analyzed oyster samples from the Gulf of Mexico, commercialized in Mexico City. The samples were inoculated into tubes with alkaline peptone water (APW), based on three tubes and four dilutions (10(-1) to 10(-4)). From these tubes, the first quantification of Vibrio species was performed (most probable number (MPN) from tubes) and bacteria were inoculated by streaking on thiosulfate-citrate-bile salts-sucrose (TCBS) petri dishes. Colonies were isolated for a second quantification (MPN from dishes). Polymerase chain reaction (PCR) was used to determine species with specific primers: ompW for Vibrio cholerae, tlh for Vibrio parahaemolyticus, and VvhA for Vibrio vulnificus. Simultaneously, the sanitary quality of the oysters was determined. The quantification of V. parahaemolyticus was significantly higher in APW tubes than on TCBS dishes. Regarding V. vulnificus counts, the differences between the two approaches were not significant. In contrast, the MPNs of V. cholerae obtained from dishes were higher than those from tubes. Quantification of the MPNs of V. parahaemolyticus and V. vulnificus through PCR from APW was sensitive and is recommendable for the detection of both species. In contrast, to quantify V. cholerae it was necessary to isolate colonies on TCBS prior to PCR. Culturing in APW at 42 °C could be an alternative to avoid colony isolation. The MPNs of V. cholerae from dishes were associated with the poor sanitary quality of the samples.
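
    The MPN estimate itself follows from a simple binomial/Poisson likelihood over the tube outcomes. The sketch below shows the generic calculation for a 3-tube, 4-dilution design (general MPN math, not the study's software); the positive-tube counts and inoculum sizes are illustrative.

```python
# Maximum-likelihood MPN estimate: each tube at inoculum v is positive with
# probability 1 - exp(-lambda * v); maximize the log-likelihood over a grid.
import numpy as np

def mpn(positive, tubes, volumes_g):
    """MPN per gram from positive-tube counts via a log-likelihood grid search."""
    positive, tubes, volumes_g = map(np.asarray, (positive, tubes, volumes_g))
    lam = np.logspace(-2, 6, 20000)                       # candidate MPN/g values
    p_pos = 1.0 - np.exp(-lam[:, None] * volumes_g)       # P(tube positive) per dilution
    loglik = (positive * np.log(p_pos + 1e-300)
              - (tubes - positive) * lam[:, None] * volumes_g).sum(axis=1)
    return lam[np.argmax(loglik)]

# Three tubes per dilution; sample amounts of 0.1, 0.01, 0.001 and 0.0001 g per tube.
estimate = mpn(positive=[3, 2, 1, 0], tubes=[3, 3, 3, 3],
               volumes_g=[1e-1, 1e-2, 1e-3, 1e-4])
print(f"MPN estimate: {estimate:.0f} per gram")
```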

  12. Quantification of active pharmaceutical ingredient and impurities in sildenafil citrate obtained from the Internet.

    PubMed

    Veronin, Michael A; Nutan, Mohammad T; Dodla, Uday Krishna Reddy

    2014-10-01

    The accessibility of prescription drugs produced outside of the United States, most notably sildenafil citrate (innovator product, Viagra®), has been made much easier by the Internet. Of greatest concern to clinicians and policymakers is product quality and patient safety. The US Food and Drug Administration (FDA) has issued warnings to potential buyers that the safety of drugs purchased from the Internet cannot be guaranteed, and may present a health risk to consumers from substandard products. The objective of this study was to determine whether generic sildenafil citrate tablets from international markets obtained via the Internet are equivalent to the US innovator product regarding major aspects of pharmaceutical quality: potency, accuracy of labeling, and presence and level of impurities. This will help identify aspects of drug quality that may impact public health risks. A total of 15 sildenafil citrate tablets were obtained for pharmaceutical analysis: 14 generic samples from international Internet pharmacy websites and the US innovator product. According to US Pharmacopeial guidelines, tablet samples were tested using high-performance liquid chromatography for potency of active pharmaceutical ingredient (API) and levels of impurities (impurities A, B, C, and D). Impurity levels were compared with International Conference on Harmonisation (ICH) limits. Among the 15 samples, 4 samples possessed higher impurity B levels than the ICH qualification threshold, 8 samples possessed higher impurity C levels than the ICH qualification threshold, and 4 samples possessed more than 1% impurity quantity of maximum daily dose (MDD). For API, 6 of the samples failed to fall within the 5% assay limit. Quality assurance tests are often used to detect formulation defects of drug products during the manufacturing and/or storage process. Results suggest that manufacturing standards for sildenafil citrate generic drug products compared with the US innovator product are not

  13. Quantification of uncertainties for application in detonation simulation

    NASA Astrophysics Data System (ADS)

    Zheng, Miao; Ma, Zhibo

    2016-06-01

    Numerical simulation has become an important means of designing detonation systems, and quantification of its uncertainty is also necessary for reliability certification. In quantifying this uncertainty, the most important steps are to analyze how the uncertainties arise and propagate, and how the simulations evolve from benchmark models to new models. Based on the practical needs of engineering and on verification & validation technology, a framework for QU (quantification of uncertainty) is put forward for the case in which simulation of a detonation system is used for scientific prediction. An example is offered to illustrate the general idea of quantifying simulation uncertainties.

  14. A validated ultra high pressure liquid chromatographic method for qualification and quantification of folic acid in pharmaceutical preparations.

    PubMed

    Deconinck, E; Crevits, S; Baten, P; Courselle, P; De Beer, J

    2011-04-05

    A fully validated UHPLC method for the identification and quantification of folic acid in pharmaceutical preparations was developed. The starting conditions for the development were calculated starting from the HPLC conditions of a validated method. These start conditions were tested on four different UHPLC columns: Grace Vision HT™ C18-P, C18, C18-HL and C18-B (2 mm × 100 mm, 1.5 μm). After selection of the stationary phase, the method was further optimised by testing two aqueous and two organic phases and by adapting to a gradient method. The obtained method was fully validated based on its measurement uncertainty (accuracy profile) and robustness tests. A UHPLC method was obtained for the identification and quantification of folic acid in pharmaceutical preparations, which will cut analysis times and solvent consumption. Copyright © 2010 Elsevier B.V. All rights reserved.

  15. Relative quantification in seed GMO analysis: state of art and bottlenecks.

    PubMed

    Chaouachi, Maher; Bérard, Aurélie; Saïd, Khaled

    2013-06-01

    Reliable quantitative methods are needed to comply with current EU regulations on the mandatory labeling of genetically modified organisms (GMOs) and GMO-derived food and feed products with a minimum GMO content of 0.9%. The implementation of EU Commission Recommendation 2004/787/EC on technical guidance for sampling and detection, meant as a helpful tool for the practical implementation of EC Regulation 1830/2003 and stating that "the results of quantitative analysis should be expressed as the number of target DNA sequences per target taxon specific sequences calculated in terms of haploid genomes", has led to an intense debate on the type of calibrator best suited for GMO quantification. The main question addressed in this review is whether reference materials and calibrators should be matrix based or whether pure DNA analytes should be used for relative quantification in GMO analysis. The state of the art of using plasmid DNA (compared with genomic DNA reference materials) as calibrators, including the advantages and drawbacks, is described in detail. In addition, the influence of the genetic structure of seeds on the real-time PCR quantitative results obtained for seed lots is discussed. The specific composition of a seed kernel, the mode of inheritance, and the ploidy level ensure that there is discordance between a GMO % expressed as a haploid genome equivalent and a GMO % based on numbers of seeds. This means that a threshold fixed as a percentage of seeds cannot be used as such for RT-PCR. All critical points that affect the expression of the GMO content in seeds are discussed in this paper.

  16. Quantification of Campylobacter spp. in pig feces by direct real-time PCR with an internal control of extraction and amplification.

    PubMed

    Leblanc-Maridor, Mily; Garénaux, Amélie; Beaudeau, François; Chidaine, Bérangère; Seegers, Henri; Denis, Martine; Belloc, Catherine

    2011-04-01

    The rapid and direct quantification of Campylobacter spp. in complex substrates like feces or environmental samples is crucial to facilitate epidemiological studies on Campylobacter in pig production systems. We developed a real-time PCR assay for detecting and quantifying Campylobacter spp. directly in pig feces with the use of an internal control. Campylobacter spp. and Yersinia ruckeri primer-probe sets were designed and checked for specificity against diverse Campylobacter strains, related organisms, and other bacterial pathogens before being used on field samples. Quantification of Campylobacter spp. by real-time PCR was then carried out on 531 fecal samples obtained from experimentally and naturally infected pigs; enumeration of Campylobacter on Karmali plates was performed in parallel. Yersinia ruckeri, used as a bacterial internal control, was added to the samples before DNA extraction to control for DNA extraction and PCR amplification. The sensitivity of the PCR assay was 10 genome copies. The established Campylobacter real-time PCR assay showed a 7-log-wide linear dynamic range of quantification (R²=0.99) with a detection limit of 200 Colony Forming Units of Campylobacter per gram of feces. A high correlation was found between the results obtained by real-time PCR and those obtained by culture, at both the qualitative and quantitative levels. Moreover, DNA extraction followed by real-time PCR reduced the time needed for analysis to a few hours (within a working day). In conclusion, the real-time PCR assay developed in this study provides new tools for further epidemiological surveys to investigate the carriage and excretion of Campylobacter by pigs. Copyright © 2011 Elsevier B.V. All rights reserved.
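
    For readers comparing such assays with culture counts, quantification from a qPCR standard curve follows the usual Ct-versus-log(copies) straight line. The sketch below uses the generic formula with hypothetical Ct values, not the study's actual calibration data.

```python
# Generic qPCR standard-curve quantification: fit Ct = slope*log10(copies) + b
# on a dilution series, then invert the line for unknown samples.
import numpy as np

std_copies = np.array([1e1, 1e2, 1e3, 1e4, 1e5, 1e6])
std_ct = np.array([34.1, 30.8, 27.4, 24.0, 20.7, 17.3])   # hypothetical Ct values

slope, intercept = np.polyfit(np.log10(std_copies), std_ct, 1)
efficiency = 10 ** (-1.0 / slope) - 1.0                    # amplification efficiency

def copies_from_ct(ct):
    return 10 ** ((ct - intercept) / slope)

print(f"slope {slope:.2f}, efficiency {100 * efficiency:.0f} %")
print(f"sample at Ct 26.0 corresponds to {copies_from_ct(26.0):.2e} genome copies")
```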

  17. HPLC Quantification of astaxanthin and canthaxanthin in Salmonidae eggs.

    PubMed

    Tzanova, Milena; Argirova, Mariana; Atanasov, Vasil

    2017-04-01

    Astaxanthin and canthaxanthin are naturally occurring antioxidants referred to as xanthophylls. They are used as food additives in fish farms to improve the organoleptic qualities of salmonid products and to prevent reproductive diseases. This study reports the development and single-laboratory validation of a rapid method for quantification of astaxanthin and canthaxanthin in eggs of rainbow trout (Oncorhynchus mykiss) and brook trout (Salvelinus fontinalis M.). An advantage of the proposed method is the perfect combination of selective extraction of the xanthophylls and analysis of the extract by high-performance liquid chromatography and photodiode array detection. The method validation was carried out in terms of linearity, accuracy, precision, recovery and limits of detection and quantification. The method was applied for simultaneous quantification of the two xanthophylls in eggs of rainbow trout and brook trout after their selective extraction. The results show that astaxanthin accumulations in salmonid fish eggs are larger than those of canthaxanthin. As the levels of these two xanthophylls affect fish fertility, this method can be used to improve the nutritional quality and to minimize the occurrence of the M74 syndrome in fish populations. Copyright © 2016 John Wiley & Sons, Ltd.

  18. Mass spectrometry–based relative quantification of proteins in precatalytic and catalytically active spliceosomes by metabolic labeling (SILAC), chemical labeling (iTRAQ), and label-free spectral count

    PubMed Central

    Schmidt, Carla; Grønborg, Mads; Deckert, Jochen; Bessonov, Sergey; Conrad, Thomas; Lührmann, Reinhard; Urlaub, Henning

    2014-01-01

    The spliceosome undergoes major changes in protein and RNA composition during pre-mRNA splicing. Knowing the proteins—and their respective quantities—at each spliceosomal assembly stage is critical for understanding the molecular mechanisms and regulation of splicing. Here, we applied three independent mass spectrometry (MS)–based approaches for quantification of these proteins: (1) metabolic labeling by SILAC, (2) chemical labeling by iTRAQ, and (3) label-free spectral count for quantification of the protein composition of the human spliceosomal precatalytic B and catalytic C complexes. In total we were able to quantify 157 proteins by at least two of the three approaches. Our quantification shows that only a very small subset of spliceosomal proteins (the U5 and U2 Sm proteins, a subset of U5 snRNP-specific proteins, and the U2 snRNP-specific proteins U2A′ and U2B′′) remains unaltered upon transition from the B to the C complex. The MS-based quantification approaches classify the majority of proteins as dynamically associated specifically with the B or the C complex. In terms of experimental procedure and the methodical aspect of this work, we show that metabolically labeled spliceosomes are functionally active in terms of their assembly and splicing kinetics and can be utilized for quantitative studies. Moreover, we obtain consistent quantification results from all three methods, including the relatively straightforward and inexpensive label-free spectral count technique. PMID:24448447

  19. A fully automated colorimetric sensing device using smartphone for biomolecular quantification

    NASA Astrophysics Data System (ADS)

    Dutta, Sibasish; Nath, Pabitra

    2017-03-01

    In the present work, the use of a smartphone for colorimetric quantification of biomolecules has been demonstrated. As a proof of concept, BSA protein and carbohydrate have been used as biomolecular samples. BSA protein and carbohydrate at different concentrations have been treated with Lowry's reagent and Anthrone's reagent, respectively. The change in color of the reagent-treated samples at different concentrations has been recorded with the camera of a smartphone in combination with a custom-designed optomechanical hardware attachment. This change in color has been correlated with the color channels of two different color models, namely the RGB (Red Green Blue) and HSV (Hue Saturation Value) models. In addition, the change in color intensity has also been correlated with the grayscale value for each of the imaged samples. A custom-designed Android app has been developed to quantify the biomolecular concentration and display the result on the phone itself. The obtained results have been compared with those of a standard spectrophotometer usually used for this purpose, and highly reliable data have been obtained with the designed sensor. The device is robust, portable and low cost compared with its commercially available counterparts. The data obtained from the sensor can be transmitted anywhere in the world through the existing cellular network. It is envisioned that the designed sensing device will find a wide range of applications in the field of analytical and bioanalytical sensing research.
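
    The underlying calibration is a straight line between channel intensity and concentration. A minimal sketch of this idea is given below (not the app's code); the reagent response, channel choice and all values are simulated assumptions.

```python
# Colorimetric calibration sketch: average a colour channel over the imaged
# sample region for known concentrations, fit a line, invert it for an unknown.
import numpy as np

rng = np.random.default_rng(5)
concentrations = np.array([0.0, 0.2, 0.4, 0.6, 0.8, 1.0])   # mg/mL BSA (assumed)

def mean_blue_channel(conc):
    """Stand-in for imaging: the Lowry colour deepens, so intensity falls with conc."""
    patch = np.full((50, 50), 210.0 - 120.0 * conc) + rng.normal(0, 2.0, (50, 50))
    return patch.mean()

signal = np.array([mean_blue_channel(c) for c in concentrations])
slope, intercept = np.polyfit(concentrations, signal, 1)

unknown_signal = mean_blue_channel(0.55)
print(f"estimated concentration: {(unknown_signal - intercept) / slope:.2f} mg/mL")
```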

  20. Rapid quantification and sex determination of forensic evidence materials.

    PubMed

    Andréasson, Hanna; Allen, Marie

    2003-11-01

    DNA quantification of forensic evidence is very valuable for an optimal use of the available biological material. Moreover, sex determination is of great importance as additional information in criminal investigations as well as in identification of missing persons, no suspect cases, and ancient DNA studies. While routine forensic DNA analysis based on short tandem repeat markers includes a marker for sex determination, analysis of samples containing scarce amounts of DNA is often based on mitochondrial DNA, and sex determination is not performed. In order to allow quantification and simultaneous sex determination on minute amounts of DNA, an assay based on real-time PCR analysis of a marker within the human amelogenin gene has been developed. The sex determination is based on melting curve analysis, while an externally standardized kinetic analysis allows quantification of the nuclear DNA copy number in the sample. This real-time DNA quantification assay has proven to be highly sensitive, enabling quantification of single DNA copies. Although certain limitations were apparent, the system is a rapid, cost-effective, and flexible assay for analysis of forensic casework samples.

  1. Quantification and imaging of HER2 protein using nanocrystals conjugated with single-domain antibodies

    NASA Astrophysics Data System (ADS)

    Glukhov, S.; Berestovoy, M.; Chames, P.; Baty, D.; Nabiev, I.; Sukhanova, A.

    2017-01-01

    This study dealt with quantification and imaging of human epidermal growth factor receptor 2 (HER2), an important prognostic marker for cancer diagnosis and treatment, using specific quantum-dot-based conjugates. Fluorescent inorganic nanocrystals, or quantum dots (QDs), are highly resistant to photobleaching and have a high emission quantum yield and a continuous range of emission spectra, from the ultraviolet to the infrared regions. Ultrasmall nanoprobes consisting of highly affine anti-HER2 single-domain antibodies (sdAbs or "nanobodies") conjugated with QDs in a strictly oriented manner have been designed. QDs with fluorescence peak maxima at wavelengths of 562 nm, 569 nm, 570 nm or in the near-infrared region were used. Here, we present our results of ISA quantification of HER2 protein, in situ imaging of HER2 protein on the surface of HER2-positive SK-BR-3 cells in immunohistochemical experiments, and counting of HER2-positive SK-BR-3 cells stained with the anti-HER2 conjugates in a mixture with unstained cells of the same culture in flow cytometry experiments. The data demonstrate that the anti-HER2 QD-sdAb conjugates obtained are highly specific and sensitive and could be used in numerous applications for advanced integrated diagnosis.

  2. Hydrophilic interaction liquid chromatography for the separation, purification, and quantification of raffinose family oligosaccharides from Lycopus lucidus Turcz.

    PubMed

    Liang, Tu; Fu, Qing; Li, Fangbing; Zhou, Wei; Xin, Huaxia; Wang, Hui; Jin, Yu; Liang, Xinmiao

    2015-08-01

    A systematic strategy based on hydrophilic interaction liquid chromatography was developed for the separation, purification and quantification of raffinose family oligosaccharides from Lycopus lucidus Turcz. Methods with sufficient hydrophilicity and selectivity were used to resolve the problems encountered in the separation of oligosaccharides, such as low retention, low resolution and poor solubility. The raffinose family oligosaccharides in L. lucidus Turcz. were isolated using solid-phase extraction followed by hydrophilic interaction liquid chromatography at semi-preparative scale to obtain standards of stachyose, verbascose and ajugose. Using the obtained oligosaccharides as standards, a quantitative determination method was developed, validated and applied to determine the content of raffinose family oligosaccharides in both the aerial and root parts of L. lucidus Turcz. There were no oligosaccharides in the aerial parts, while in the root parts the total content was 686.5 mg/g, with the average distribution: raffinose 66.5 mg/g, stachyose 289.0 mg/g, verbascose 212.4 mg/g, and ajugose 118.6 mg/g. These results highlight the potential of the roots of L. lucidus Turcz. as a new source of raffinose family oligosaccharides for functional foods. Moreover, since the present systematic strategy is efficient, sensitive and robust, it demonstrates that separation, purification and quantification of oligosaccharides by hydrophilic interaction liquid chromatography are feasible. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  3. Simple quantification of phenolic compounds present in the minor fraction of virgin olive oil by LC-DAD-FLD.

    PubMed

    Godoy-Caballero, M P; Acedo-Valenzuela, M I; Galeano-Díaz, T

    2012-11-15

    This paper presents the results of a study on the extraction, identification and quantification of a group of important phenolic compounds in virgin olive oil (VOO) samples, obtained from olives of various varieties, by liquid chromatography coupled to UV-vis and fluorescence detection. Sixteen phenolic compounds belonging to different families were identified and quantified in a total run time of 25 min. Linearity was examined by establishing external standard calibration curves. Linear ranges spanning four orders of magnitude, with limits of detection ranging from 0.02 to 0.6 μg mL(-1) and 0.006 to 0.3 μg mL(-1), were achieved using UV-vis and fluorescence detection, respectively. Regarding the real samples, for the determination of the phenolic compounds present at higher concentrations (hydroxytyrosol and tyrosol), a simple liquid-liquid extraction with ethanol was used to make the sample compatible with the mobile phase. Recovery values close to 100% were obtained. However, a prior solid-phase extraction with Diol cartridges was necessary to concentrate the minor phenolic compounds and separate them from the main interferences. The parameters affecting this step were carefully optimized and, after that, recoveries near 80-100% were obtained for the rest of the studied phenolic compounds. The limits of detection were also improved 15-fold. Finally, the standard addition method was carried out for each of the analytes and no matrix effect was found, so the quantification of the 16 phenolic compounds in the different monovarietal VOOs was carried out using the corresponding external standard calibration plot. Copyright © 2012 Elsevier B.V. All rights reserved.

  4. Electrochemical study of the anticancer drug daunorubicin at a water/oil interface: drug lipophilicity and quantification.

    PubMed

    Ribeiro, José A; Silva, F; Pereira, Carlos M

    2013-02-05

    In this work, the ion transfer mechanism of the anticancer drug daunorubicin (DNR) at a liquid/liquid interface has been studied for the first time. This study was carried out using electrochemical techniques, namely cyclic voltammetry (CV) and differential pulse voltammetry (DPV). The lipophilicity of DNR was investigated at the water/1,6-dichlorohexane (DCH) interface, and the results obtained were presented in the form of an ionic partition diagram. The partition coefficients of both neutral and ionic forms of the drug were determined. The analytical parameter for the detection of DNR was also investigated in this work. An electrochemical DNR sensor is proposed by means of simple ion transfer at the water/DCH interface, using DPV as the quantification technique. Experimental conditions for the analytical determination of DNR were established, and a detection limit of 0.80 μM was obtained.

  5. Fingerprinting and quantification of GMOs in the agro-food sector.

    PubMed

    Taverniers, I; Van Bockstaele, E; De Loose, M

    2003-01-01

    Most strategies for analyzing GMOs in plants and derived food and feed products are based on the polymerase chain reaction (PCR) technique. In conventional PCR methods, a 'known' sequence between two specific primers is amplified. By contrast, with the 'anchor PCR' technique, unknown sequences adjacent to a known sequence can be amplified. Because T-DNA/plant border sequences are amplified, anchor PCR is well suited for the unique identification of transgenes, including non-authorized GMOs. In this work, anchor PCR was applied to characterize the 'transgene locus' and to clarify the complete molecular structure of at least six different commercial transgenic plants. Based on sequences of T-DNA/plant border junctions obtained by anchor PCR, event-specific primers were developed. The junction fragments, together with endogenous reference gene targets, were cloned in plasmids. The latter were then used as event-specific calibrators in real-time PCR, a new technique for the accurate relative quantification of GMOs. We demonstrate here the importance of anchor PCR for identification and the usefulness of plasmid DNA calibrators in quantification strategies for GMOs throughout the agro-food sector.
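
    The following sketch illustrates, under stated assumptions, how event-specific plasmid calibrators can support relative GMO quantification: both the event-specific junction target and the endogenous reference gene are quantified against standard curves built from the same plasmid dilution series, and the GMO content is expressed as their copy-number ratio. All Ct values and dilutions are hypothetical and do not reproduce the authors' protocol.

    ```python
    # Relative GMO quantification with a plasmid calibrator carrying both the
    # event-specific junction and the endogenous reference target.
    # Standard-curve and sample Ct values are hypothetical placeholders.
    import numpy as np

    def make_curve(copies, ct):
        """Return a function mapping Ct -> estimated copies for one target."""
        slope, intercept = np.polyfit(np.log10(copies), ct, 1)
        return lambda c: 10 ** ((c - intercept) / slope)

    plasmid_dilutions = np.array([1e2, 1e3, 1e4, 1e5])
    event_curve = make_curve(plasmid_dilutions, np.array([33.9, 30.5, 27.2, 23.8]))
    ref_curve   = make_curve(plasmid_dilutions, np.array([33.5, 30.1, 26.8, 23.4]))

    # Ct values measured for one unknown sample
    event_copies = event_curve(31.2)   # transgene junction target
    ref_copies   = ref_curve(24.9)     # endogenous reference gene target

    gmo_percent = 100.0 * event_copies / ref_copies
    print(f"Estimated GMO content: {gmo_percent:.2f}%")
    ```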

  6. Distortion of genetically modified organism quantification in processed foods: influence of particle size compositions and heat-induced DNA degradation.

    PubMed

    Moreano, Francisco; Busch, Ulrich; Engel, Karl-Heinz

    2005-12-28

    Milling fractions from conventional and transgenic corn were prepared at laboratory scale and used to study the influence of sample composition and heat-induced DNA degradation on the relative quantification of genetically modified organisms (GMO) in food products. Particle size distributions of the obtained fractions (coarse grits, regular grits, meal, and flour) were characterized using a laser diffraction system. The application of two DNA isolation protocols revealed a strong correlation between the degree of comminution of the milling fractions and the DNA yield in the extracts. Mixtures of milling fractions from conventional and transgenic material (1%) were prepared and analyzed via real-time polymerase chain reaction. Accurate quantification of the adjusted GMO content was only possible in mixtures containing conventional and transgenic material in the form of analogous milling fractions, whereas mixtures of fractions exhibiting different particle size distributions delivered significantly over- and underestimated GMO contents depending on their compositions. The process of heat-induced nucleic acid degradation was followed by applying two established quantitative assays that differ in the lengths of the recombinant and reference target sequences (A, Δl_A = -25 bp; B, Δl_B = +16 bp; values relative to the amplicon length of the reference gene). Data obtained by the application of method A resulted in underestimated recoveries of GMO contents in the samples of heat-treated products, reflecting the favored degradation of the longer target sequence used for the detection of the transgene. In contrast, data yielded by the application of method B resulted in increasingly overestimated recoveries of GMO contents. The results show how commonly used food technological processes may lead to distortions in the results of quantitative GMO analyses.

  7. Quantity and quality of black carbon molecular markers as obtained by two chromatographic methods (GC-FID and HPLC-DAD) - How do results compare?

    NASA Astrophysics Data System (ADS)

    Schneider, M. P. W.; Smittenberg, R. H.; Dittmar, T.; Schmidt, M. W. I.

    2009-04-01

    Chars produced by wildfires are an important source of black carbon (BC) in the environment. After their deposition on the soil surface they can be distributed into rivers, marine waters and sediments. The analysis of benzenepolycarboxylic acids (BPCAs) as a quantitative measure for black carbon (BC) in soil and sediment samples is a well-established method (Glaser et al., 1998; Brodowski et al., 2005). Briefly, the nitric acid oxidation of fused aromatic ring systems in BC forms eight molecular markers (BPCAs), which can be assigned to BC, and which subsequently can be quantified by GC-FID (gas chromatography with flame ionization detector). Recently, this method was modified for the quantification of BC in seawater samples using HPLC-DAD (High performance liquid chromatography with diode array detector) for the determination of individual BPCAs (Dittmar, 2008). A direct comparison of both analytical techniques is lacking but would be important for future data comparison aimed at the calculation of global BC budgets. Here we present a systematic comparison of the two BPCA quantification methods. We prepared chars under well-defined laboratory conditions. In order to cover a broad spectrum of char properties we used two sources of biomass and a wide range of pyrolysis temperatures. Chestnut hardwood chips (Castanea sativa) and rice straw (Oryza sativa) were pyrolysed at temperatures between 200 and 1000°C under a constant N2 stream. The maximum temperatures were held constant for 5 hours (Hammes et al., 2006). The BC contents of the chars have been analysed using the BPCA extraction method followed by either GC-FID or HPLC-DAD quantification. Preliminary results suggest that both methods yield similar total quantities of BPCA, and also the proportions of the individual markers are similar. Ongoing experiments will allow for a more detailed comparison of the two methods. The BPCA composition of chars formed at different temperatures and from different precursor

  8. CometQ: An automated tool for the detection and quantification of DNA damage using comet assay image analysis.

    PubMed

    Ganapathy, Sreelatha; Muraleedharan, Aparna; Sathidevi, Puthumangalathu Savithri; Chand, Parkash; Rajkumar, Ravi Philip

    2016-09-01

    DNA damage analysis plays an important role in determining the approaches for treatment and prevention of various diseases like cancer, schizophrenia and other heritable diseases. The comet assay is a sensitive and versatile method for DNA damage analysis. The main objective of this work is to implement a fully automated tool for the detection and quantification of DNA damage by analysing comet assay images. The comet assay image analysis consists of four stages: (1) classification, (2) comet segmentation, (3) comet partitioning and (4) comet quantification. The main features of the proposed software are the design and development of four comet segmentation methods, and the automatic routing of the input comet assay image to the most suitable one among these methods depending on the type of the image (silver stained or fluorescent stained) as well as the level of DNA damage (heavily damaged or lightly/moderately damaged). A classifier stage, based on a support vector machine (SVM), is designed and implemented at the front end to categorise the input image into one of the above four groups and ensure proper routing. Comet segmentation is followed by comet partitioning, which is implemented using a novel technique coined modified fuzzy clustering. Comet parameters are calculated in the comet quantification stage and are saved in an Excel file. Our dataset consists of 600 silver stained images obtained from 40 schizophrenia patients with different levels of severity, admitted to a tertiary hospital in South India, and 56 fluorescent stained images obtained from different internet sources. The performance of "CometQ", the proposed standalone application for automated analysis of comet assay images, is evaluated by a clinical expert and is also compared with that of a recent related software tool, OpenComet. CometQ gave 90.26% positive predictive value (PPV) and 93.34% sensitivity, which are much higher than those of OpenComet, especially in the case of silver stained images. The
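
    As a generic illustration of the comet quantification stage, the sketch below computes two common comet parameters, percent DNA in tail and tail moment, from a one-dimensional, background-corrected intensity profile of a single segmented comet. The profile and the head/tail split position are hypothetical, and this is not the CometQ pipeline itself.

    ```python
    # Comet quantification for a single segmented comet:
    # % DNA in tail and tail moment from a background-corrected intensity profile.
    # The profile and the head/tail split position are hypothetical placeholders.
    import numpy as np

    profile = np.array([5, 40, 90, 120, 95, 60, 35, 20, 12, 6], dtype=float)
    head_end = 5  # index where the head region ends and the tail begins

    head = profile[:head_end]
    tail = profile[head_end:]

    total_intensity = profile.sum()
    tail_dna_percent = 100.0 * tail.sum() / total_intensity

    # Tail moment: tail DNA fraction weighted by distance between centroids.
    positions = np.arange(len(profile), dtype=float)
    head_centroid = np.average(positions[:head_end], weights=head)
    tail_centroid = np.average(positions[head_end:], weights=tail)
    tail_moment = (tail_dna_percent / 100.0) * (tail_centroid - head_centroid)

    print(f"% DNA in tail: {tail_dna_percent:.1f}")
    print(f"Tail moment:   {tail_moment:.2f} (in pixel units)")
    ```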

  9. Reliability and discriminatory power of methods for dental plaque quantification

    PubMed Central

    RAGGIO, Daniela Prócida; BRAGA, Mariana Minatel; RODRIGUES, Jonas Almeida; FREITAS, Patrícia Moreira; IMPARATO, José Carlos Pettorossi; MENDES, Fausto Medeiros

    2010-01-01

    Objective This in situ study evaluated the discriminatory power and reliability of methods of dental plaque quantification and the relationship between visual indices (VI) and fluorescence camera (FC) to detect plaque. Material and Methods Six volunteers used palatal appliances with six bovine enamel blocks presenting different stages of plaque accumulation. The presence of plaque with and without disclosing was assessed using VI. Images were obtained with FC and digital camera in both conditions. The area covered by plaque was assessed. Examinations were done by two independent examiners. Data were analyzed by Kruskal-Wallis and Kappa tests to compare different conditions of samples and to assess the inter-examiner reproducibility. Results Some methods presented adequate reproducibility. The Turesky index and the assessment of area covered by disclosed plaque in the FC images presented the highest discriminatory powers. Conclusions The Turesky index and images with FC with disclosing present good reliability and discriminatory power in quantifying dental plaque. PMID:20485931

  10. A stochastic approach to uncertainty quantification in residual moveout analysis

    NASA Astrophysics Data System (ADS)

    Johng-Ay, T.; Landa, E.; Dossou-Gbété, S.; Bordes, L.

    2015-06-01

    Oil and gas exploration and production usually rely on the interpretation of a single seismic image, which is obtained from observed data. However, the statistical nature of seismic data and the various approximations and assumptions are sources of uncertainty which may corrupt the evaluation of parameters. The quantification of these uncertainties is a major issue, intended to support decisions that have important social and commercial implications. Residual moveout analysis, which is an important step in seismic data processing, is usually performed with a deterministic approach. In this paper we discuss a Bayesian approach to the uncertainty analysis.

  11. A precise and accurate acupoint location obtained on the face using consistency matrix pointwise fusion method.

    PubMed

    Yanq, Xuming; Ye, Yijun; Xia, Yong; Wei, Xuanzhong; Wang, Zheyu; Ni, Hongmei; Zhu, Ying; Xu, Lingyu

    2015-02-01

    The aim was to develop a more precise and accurate method for locating acupoints and to identify a procedure for verifying whether an acupoint has been correctly located. On the face, acupoint locations from different acupuncture experts were used, and the most precise and accurate acupoint location values were obtained with a consistency information fusion algorithm, through a virtual simulation of the facial orientation coordinate system. Because of inconsistencies in each acupuncture expert's original data, systematic error had to be taken into account in the general weight calculation. First, each expert's own systematic error in acupoint location was corrected, to obtain a rational quantification of the consistency support degree of each expert's acupoint location and thereby pointwise variable-precision fusion results, raising every expert's acupoint location fusion error to pointwise variable precision. Then, the measured characteristics of the different experts' acupoint locations were used more effectively, improving the utilization efficiency of the measurement information and the precision and accuracy of acupoint location. By applying the consistency matrix pointwise fusion method to the experts' acupoint location values, each expert's acupoint location information could be calculated, and the most precise and accurate values of each expert's acupoint location could be obtained.

  12. Survey of Existing Uncertainty Quantification Capabilities for Army Relevant Problems

    DTIC Science & Technology

    2017-11-27

    ARL-TR-8218, November 2017. US Army Research Laboratory technical report: Survey of Existing Uncertainty Quantification Capabilities for Army-Relevant Problems, by James J Ramsey.

  13. Accurate proteome-wide protein quantification from high-resolution 15N mass spectra

    PubMed Central

    2011-01-01

    In quantitative mass spectrometry-based proteomics, the metabolic incorporation of a single source of 15N-labeled nitrogen has many advantages over using stable isotope-labeled amino acids. However, the lack of a robust computational framework for analyzing the resulting spectra has impeded wide use of this approach. We have addressed this challenge by introducing a new computational methodology for analyzing 15N spectra in which quantification is integrated with identification. Application of this method to an Escherichia coli growth transition reveals significant improvement in quantification accuracy over previous methods. PMID:22182234

  14. A novel strategy to obtain quantitative data for modelling: combined enrichment and real-time PCR for enumeration of salmonellae from pig carcasses.

    PubMed

    Krämer, Nadine; Löfström, Charlotta; Vigre, Håkan; Hoorfar, Jeffrey; Bunge, Cornelia; Malorny, Burkhard

    2011-03-01

    Salmonella is a major zoonotic pathogen which causes outbreaks and sporadic cases of gastroenteritis in humans worldwide. The primary sources for Salmonella are food-producing animals such as pigs and poultry. For risk assessment and hazard analysis and critical control point (HACCP) concepts, it is essential to produce large amounts of quantitative data, which is currently not achievable with the standard culture-based methods for enumeration of Salmonella. This study presents the development of a novel strategy to enumerate low numbers of Salmonella in cork borer samples taken from pig carcasses as a first concept and proof of principle for a new sensitive and rapid quantification method based on combined enrichment and real-time PCR. The novelty of the approach is the short pre-enrichment step, during which most bacteria are in the log growth phase. The method consists of an 8 h pre-enrichment of the cork borer sample diluted 1:10 in non-selective buffered peptone water, followed by DNA extraction, and Salmonella detection and quantification by real-time PCR. The limit of quantification was 1.4 colony forming units (CFU)/20 cm(2) (approximately 10 g) of artificially contaminated sample with a 95% confidence interval of ± 0.7 log CFU/sample. The precision was similar to the standard reference most probable number (MPN) method. A screening of 200 potentially naturally contaminated cork borer samples obtained over seven weeks in a slaughterhouse resulted in 25 Salmonella-positive samples. The analysis of salmonellae within these samples showed that the PCR method had a higher sensitivity for samples with a low contamination level (<6.7 CFU/sample), where 15 of the samples negative with the MPN method were detected with the PCR method and 5 were found to be negative by both methods. For the samples with a higher contamination level (6.7-310 CFU/sample) a good agreement between the results obtained with the PCR and MPN methods was obtained. The quantitative real

  15. Calibration of BCR-ABL1 mRNA quantification methods using genetic reference materials is a valid strategy to report results on the international scale.

    PubMed

    Mauté, Carole; Nibourel, Olivier; Réa, Delphine; Coiteux, Valérie; Grardel, Nathalie; Preudhomme, Claude; Cayuela, Jean-Michel

    2014-09-01

    Until recently, diagnostic laboratories that wanted to report on the international scale had limited options: they had to align their BCR-ABL1 quantification methods through a sample exchange with a reference laboratory to derive a conversion factor. However, commercial methods calibrated on the World Health Organization genetic reference panel are now available. We report results from a study designed to assess the comparability of the two alignment strategies. Sixty follow-up samples from chronic myeloid leukemia patients were included. Two commercial methods calibrated on the genetic reference panel were compared to two conversion factor methods routinely used at Saint-Louis Hospital, Paris, and at Lille University Hospital. Results were matched against concordance criteria (i.e., obtaining at least two of the three following landmarks: 50, 75 and 90% of the patient samples within a 2-fold, 3-fold and 5-fold range, respectively). Out of the 60 samples, more than 32 were available for comparison. Compared to the conversion factor method, the two commercial methods were within a 2-fold, 3-fold and 5-fold range for 53 and 59%, 89 and 88%, 100 and 97%, respectively of the samples analyzed at Saint-Louis. At Lille, results were 45 and 85%, 76 and 97%, 100 and 100%, respectively. Agreements between methods were observed in the four comparisons performed. Our data show that the two commercial methods selected are concordant with the conversion factor methods. This study brings the proof of principle that alignment on the international scale using the genetic reference panel is compatible with the patient sample exchange procedure. We believe that these results are particularly important for diagnostic laboratories wishing to adopt commercial methods. Copyright © 2014 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.
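
    A minimal sketch of the concordance check described above: for paired results from two methods, the fraction of samples falling within 2-fold, 3-fold and 5-fold of each other is compared against the 50/75/90% landmarks, with at least two of the three required. The paired BCR-ABL1 ratios below are hypothetical placeholders.

    ```python
    # Fold-range concordance check between two BCR-ABL1 quantification methods:
    # criteria require >= 50%, 75% and 90% of paired results within 2-, 3- and
    # 5-fold of each other, with at least two of the three landmarks met.
    # The paired ratios are hypothetical placeholders.
    import numpy as np

    method_a = np.array([0.12, 1.5, 0.034, 10.2, 0.8, 0.05, 2.4, 0.9])   # %IS
    method_b = np.array([0.10, 2.9, 0.030, 12.5, 1.7, 0.04, 2.0, 4.2])   # %IS

    fold = np.maximum(method_a, method_b) / np.minimum(method_a, method_b)

    landmarks = [(2, 50), (3, 75), (5, 90)]
    met = 0
    for fold_limit, required_percent in landmarks:
        percent_within = 100.0 * np.mean(fold <= fold_limit)
        ok = percent_within >= required_percent
        met += ok
        print(f"within {fold_limit}-fold: {percent_within:.0f}% (need {required_percent}%) -> {ok}")

    print("Concordant" if met >= 2 else "Not concordant")
    ```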

  16. Compositional Solution Space Quantification for Probabilistic Software Analysis

    NASA Technical Reports Server (NTRS)

    Borges, Mateus; Pasareanu, Corina S.; Filieri, Antonio; d'Amorim, Marcelo; Visser, Willem

    2014-01-01

    Probabilistic software analysis aims at quantifying how likely a target event is to occur during program execution. Current approaches rely on symbolic execution to identify the conditions to reach the target event and try to quantify the fraction of the input domain satisfying these conditions. Precise quantification is usually limited to linear constraints, while only approximate solutions can be provided in general through statistical approaches. However, statistical approaches may fail to converge to an acceptable accuracy within a reasonable time. We present a compositional statistical approach for the efficient quantification of solution spaces for arbitrarily complex constraints over bounded floating-point domains. The approach leverages interval constraint propagation to improve the accuracy of the estimation by focusing the sampling on the regions of the input domain containing the sought solutions. Preliminary experiments show significant improvement on previous approaches both in results accuracy and analysis time.
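
    For contrast with the compositional approach described above, the sketch below shows the plain hit-or-miss Monte Carlo baseline for solution-space quantification over a bounded input domain. The constraint, the bounds and the sample size are made up for illustration; the paper's method additionally focuses the sampling using interval constraint propagation.

    ```python
    # Plain Monte Carlo baseline for solution-space quantification: estimate the
    # probability that a path condition holds over a bounded input domain.
    # The constraint, bounds and sample size are hypothetical; the paper's
    # approach additionally focuses sampling via interval constraint propagation.
    import numpy as np

    rng = np.random.default_rng(0)

    # Bounded floating-point input domain: x in [0, 10], y in [-5, 5]
    n = 100_000
    x = rng.uniform(0.0, 10.0, n)
    y = rng.uniform(-5.0, 5.0, n)

    # Example path condition (arbitrarily chosen non-linear constraint)
    satisfied = (np.sin(x) + 0.1 * y**2 > 0.5) & (x * y < 8.0)

    p_hat = satisfied.mean()
    std_err = np.sqrt(p_hat * (1.0 - p_hat) / n)
    print(f"Estimated probability of reaching the target: {p_hat:.4f} +/- {1.96*std_err:.4f}")
    ```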

  17. Metering error quantification under voltage and current waveform distortion

    NASA Astrophysics Data System (ADS)

    Wang, Tao; Wang, Jia; Xie, Zhi; Zhang, Ran

    2017-09-01

    With the integration of more and more renewable energy sources and distorting loads into the power grid, voltage and current waveform distortion results in metering errors in smart meters. Because of its negative effects on metering accuracy and fairness, the combined energy-metering error is an important subject of study. In this paper, after comparing the theoretical metering value with the actually recorded value under different meter modes for linear and nonlinear loads, a quantification method for the metering mode error under waveform distortion is proposed. Based on the metering and time-division multiplier principles, a quantification method for the metering accuracy error is also proposed. By analyzing the mode error and the accuracy error, a comprehensive error analysis method is presented that is suitable for new energy sources and nonlinear loads. The proposed method has been verified by simulation.
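
    The following sketch illustrates one way a metering mode error can arise under waveform distortion: the true active power of a distorted voltage/current pair is compared with the power a meter that registers only the fundamental component would record. The harmonic content and operating point are hypothetical and are not taken from the paper.

    ```python
    # Metering-mode error under waveform distortion: compare the full-waveform
    # active power of a distorted voltage/current pair with the power recorded
    # by a fundamental-only metering mode. Harmonic content is a hypothetical
    # example, not a measured load.
    import numpy as np

    f0, fs, t_end = 50.0, 10_000.0, 0.2          # fundamental, sampling rate, window
    t = np.arange(0.0, t_end, 1.0 / fs)

    # Distorted voltage and current: fundamental plus 3rd and 5th harmonics
    v = 230 * np.sqrt(2) * (np.sin(2*np.pi*f0*t) + 0.05*np.sin(2*np.pi*3*f0*t))
    i = 10  * np.sqrt(2) * (np.sin(2*np.pi*f0*t - 0.3)
                            + 0.20*np.sin(2*np.pi*3*f0*t - 0.8)
                            + 0.10*np.sin(2*np.pi*5*f0*t))

    true_power = np.mean(v * i)                   # full-waveform active power

    # Fundamental-only metering mode: keep only the 50 Hz bins of the spectra
    V, I = np.fft.rfft(v) / len(t), np.fft.rfft(i) / len(t)
    k = int(round(f0 * t_end))                    # index of the fundamental bin
    fundamental_power = 2 * np.real(V[k] * np.conj(I[k]))

    error_percent = 100.0 * (fundamental_power - true_power) / true_power
    print(f"True power: {true_power:.1f} W, fundamental-only: {fundamental_power:.1f} W")
    print(f"Metering-mode error: {error_percent:.2f}%")
    ```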

  18. A machine learning approach for efficient uncertainty quantification using multiscale methods

    NASA Astrophysics Data System (ADS)

    Chan, Shing; Elsheikh, Ahmed H.

    2018-02-01

    Several multiscale methods account for sub-grid scale features using coarse scale basis functions. For example, in the Multiscale Finite Volume method the coarse scale basis functions are obtained by solving a set of local problems over dual-grid cells. We introduce a data-driven approach for the estimation of these coarse scale basis functions. Specifically, we employ a neural network predictor fitted using a set of solution samples, from which it learns to generate subsequent basis functions at a lower computational cost than solving the local problems. The computational advantage of this approach is realized for uncertainty quantification tasks, where a large number of realizations have to be evaluated. We attribute the ability to learn these basis functions to the modularity of the local problems and the redundancy of the permeability patches between samples. The proposed method is evaluated on elliptic problems, yielding very promising results.

  19. Secondary Students' Quantification of Ratio and Rate: A Framework for Reasoning about Change in Covarying Quantities

    ERIC Educational Resources Information Center

    Johnson, Heather Lynn

    2015-01-01

    Contributing to a growing body of research addressing secondary students' quantitative and covariational reasoning, the multiple case study reported in this article investigated secondary students' quantification of ratio and rate. This article reports results from a study investigating students' quantification of rate and ratio as…

  20. Accurate Quantification of Cardiovascular Biomarkers in Serum Using Protein Standard Absolute Quantification (PSAQ™) and Selected Reaction Monitoring*

    PubMed Central

    Huillet, Céline; Adrait, Annie; Lebert, Dorothée; Picard, Guillaume; Trauchessec, Mathieu; Louwagie, Mathilde; Dupuis, Alain; Hittinger, Luc; Ghaleh, Bijan; Le Corvoisier, Philippe; Jaquinod, Michel; Garin, Jérôme; Bruley, Christophe; Brun, Virginie

    2012-01-01

    Development of new biomarkers needs to be significantly accelerated to improve diagnostic, prognostic, and toxicity monitoring as well as therapeutic follow-up. Biomarker evaluation is the main bottleneck in this development process. Selected Reaction Monitoring (SRM) combined with stable isotope dilution has emerged as a promising option to speed this step, particularly because of its multiplexing capacities. However, analytical variabilities because of upstream sample handling or incomplete trypsin digestion still need to be resolved. In 2007, we developed the PSAQ™ method (Protein Standard Absolute Quantification), which uses full-length isotope-labeled protein standards to quantify target proteins. In the present study we used clinically validated cardiovascular biomarkers (LDH-B, CKMB, myoglobin, and troponin I) to demonstrate that the combination of PSAQ and SRM (PSAQ-SRM) allows highly accurate biomarker quantification in serum samples. A multiplex PSAQ-SRM assay was used to quantify these biomarkers in clinical samples from myocardial infarction patients. Good correlation between PSAQ-SRM and ELISA assay results was found and demonstrated the consistency between these analytical approaches. Thus, PSAQ-SRM has the capacity to improve both accuracy and reproducibility in protein analysis. This will be a major contribution to efficient biomarker development strategies. PMID:22080464

  1. Results of the 2013 UT modeling benchmark obtained with models implemented in CIVA

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Toullelan, Gwénaël; Raillon, Raphaële; Chatillon, Sylvain

    The 2013 Ultrasonic Testing (UT) modeling benchmark concerns direct echoes from side drilled holes (SDH), flat bottom holes (FBH) and corner echoes from backwall breaking artificial notches inspected with a matrix phased array probe. This communication presents the results obtained with the models implemented in the CIVA software: the pencil model is used to compute the field radiated by the probe, the Kirchhoff approximation is applied to predict the response of FBH and notches, and the SOV (Separation Of Variables) model is used for the SDH responses. The comparison between simulated and experimental results is presented and discussed.

  2. Comparison of results obtained with various sensors used to measure fluctuating quantities in jets.

    NASA Technical Reports Server (NTRS)

    Parthasarathy, S. P.; Massier, P. F.; Cuffel, R. F.

    1973-01-01

    An experimental investigation has been conducted to compare the results obtained with six different instruments that sense fluctuating quantities in free jets. These sensors are typical of those that have recently been used by various investigators who are engaged in experimental studies of jet noise. Intensity distributions and two-point correlations with space separation and time delay were obtained. The static pressure, density, and velocity fluctuations are well correlated over the entire cross section of the jet and the cross-correlations persist for several jet diameters along the flow direction. The eddies appear to be flattened in the flow direction by a ratio of 0.4.

  3. PSEA-Quant: a protein set enrichment analysis on label-free and label-based protein quantification data.

    PubMed

    Lavallée-Adam, Mathieu; Rauniyar, Navin; McClatchy, Daniel B; Yates, John R

    2014-12-05

    The majority of large-scale proteomics quantification methods yield long lists of quantified proteins that are often difficult to interpret and poorly reproduced. Computational approaches are required to analyze such intricate quantitative proteomics data sets. We propose a statistical approach to computationally identify protein sets (e.g., Gene Ontology (GO) terms) that are significantly enriched with abundant proteins with reproducible quantification measurements across a set of replicates. To this end, we developed PSEA-Quant, a protein set enrichment analysis algorithm for label-free and label-based protein quantification data sets. It offers an alternative approach to classic GO analyses, models protein annotation biases, and allows the analysis of samples originating from a single condition, unlike analogous approaches such as GSEA and PSEA. We demonstrate that PSEA-Quant produces results complementary to GO analyses. We also show that PSEA-Quant provides valuable information about the biological processes involved in cystic fibrosis using label-free protein quantification of a cell line expressing a CFTR mutant. Finally, PSEA-Quant highlights the differences in the mechanisms taking place in the human, rat, and mouse brain frontal cortices based on tandem mass tag quantification. Our approach, which is available online, will thus improve the analysis of proteomics quantification data sets by providing meaningful biological insights.

  4. PSEA-Quant: A Protein Set Enrichment Analysis on Label-Free and Label-Based Protein Quantification Data

    PubMed Central

    2015-01-01

    The majority of large-scale proteomics quantification methods yield long lists of quantified proteins that are often difficult to interpret and poorly reproduced. Computational approaches are required to analyze such intricate quantitative proteomics data sets. We propose a statistical approach to computationally identify protein sets (e.g., Gene Ontology (GO) terms) that are significantly enriched with abundant proteins with reproducible quantification measurements across a set of replicates. To this end, we developed PSEA-Quant, a protein set enrichment analysis algorithm for label-free and label-based protein quantification data sets. It offers an alternative approach to classic GO analyses, models protein annotation biases, and allows the analysis of samples originating from a single condition, unlike analogous approaches such as GSEA and PSEA. We demonstrate that PSEA-Quant produces results complementary to GO analyses. We also show that PSEA-Quant provides valuable information about the biological processes involved in cystic fibrosis using label-free protein quantification of a cell line expressing a CFTR mutant. Finally, PSEA-Quant highlights the differences in the mechanisms taking place in the human, rat, and mouse brain frontal cortices based on tandem mass tag quantification. Our approach, which is available online, will thus improve the analysis of proteomics quantification data sets by providing meaningful biological insights. PMID:25177766

  5. High-Throughput HPLC-MS/MS Method for Quantification of Ibuprofen Enantiomers in Human Plasma: Focus on Investigation of Metabolite Interference.

    PubMed

    Nakov, Natalija; Bogdanovska, Liljana; Acevska, Jelena; Tonic-Ribarska, Jasmina; Petkovska, Rumenka; Dimitrovska, Aneta; Kasabova, Lilia; Svinarov, Dobrin

    2016-11-01

    In this research, as part of the development of a fast and reliable HPLC-MS/MS method for quantification of ibuprofen (IBP) enantiomers in human plasma, the possibility of back-conversion of IBP acylglucuronide (IBP-Glu) was assessed. This involved investigation of in-source and in vitro back-conversion. The separation of the IBP enantiomers, the metabolite and rac-IBP-d3 (internal standard) was achieved within 6 min using a Chiracel OJ-RH chromatographic column (150 × 2.1 mm, 5 μm). The selected reaction monitoring transitions followed for IBP-Glu (m/z 381.4 → 205.4, m/z 381.4 → 161.4 and m/z 205.4 → 161.4) indicated that, under the optimized electrospray ionization parameters, in-source back-conversion of IBP-Glu was insignificant. The results obtained after liquid-liquid extraction of plasma samples spiked with IBP-Glu revealed that the amount of IBP enantiomers generated by IBP-Glu back-conversion was far below 20% of the lower limit of quantification sample. These results indicate that the presence of IBP-Glu in real samples will not affect the quantification of the IBP enantiomers, thereby improving the reliability of the method. An additional advantage of the method is the short analysis time, making it suitable for large numbers of samples. The method was fully validated according to the EMA guideline and was shown to meet all requirements for application in a pharmacokinetic study. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  6. Quantification of Na+,K+ pumps and their transport rate in skeletal muscle: Functional significance

    PubMed Central

    2013-01-01

    During excitation, muscle cells gain Na+ and lose K+, leading to a rise in extracellular K+ ([K+]o), depolarization, and loss of excitability. Recent studies support the idea that these events are important causes of muscle fatigue and that full use of the Na+,K+-ATPase (also known as the Na+,K+ pump) is often essential for adequate clearance of extracellular K+. As a result of their electrogenic action, Na+,K+ pumps also help reverse depolarization arising during excitation, hyperkalemia, and anoxia, or from cell damage resulting from exercise, rhabdomyolysis, or muscle diseases. The ability to evaluate Na+,K+-pump function and the capacity of the Na+,K+ pumps to fill these needs require quantification of the total content of Na+,K+ pumps in skeletal muscle. Inhibition of Na+,K+-pump activity, or a decrease in their content, reduces muscle contractility. Conversely, stimulation of the Na+,K+-pump transport rate or increasing the content of Na+,K+ pumps enhances muscle excitability and contractility. Measurements of [3H]ouabain binding to skeletal muscle in vivo or in vitro have enabled the reproducible quantification of the total content of Na+,K+ pumps in molar units in various animal species, and in both healthy people and individuals with various diseases. In contrast, measurements of 3-O-methylfluorescein phosphatase activity associated with the Na+,K+-ATPase may show inconsistent results. Measurements of Na+ and K+ fluxes in intact isolated muscles show that, after Na+ loading or intense excitation, all the Na+,K+ pumps are functional, allowing calculation of the maximum Na+,K+-pumping capacity, expressed in molar units/g muscle/min. The activity and content of Na+,K+ pumps are regulated by exercise, inactivity, K+ deficiency, fasting, age, and several hormones and pharmaceuticals. Studies on the α-subunit isoforms of the Na+,K+-ATPase have detected a relative increase in their number in response to exercise and the glucocorticoid dexamethasone but have not

  7. Mass Spectrometric Quantification of N-Linked Glycans by Reference to Exogenous Standards.

    PubMed

    Mehta, Nickita; Porterfield, Mindy; Struwe, Weston B; Heiss, Christian; Azadi, Parastoo; Rudd, Pauline M; Tiemeyer, Michael; Aoki, Kazuhiro

    2016-09-02

    Environmental and metabolic processes shape the profile of glycoprotein glycans expressed by cells, whether in culture, developing tissues, or mature organisms. Quantitative characterization of glycomic changes associated with these conditions has been achieved historically by reductive coupling of oligosaccharides to various fluorophores following release from glycoprotein and subsequent HPLC or capillary electrophoretic separation. Such labeling-based approaches provide a robust means of quantifying glycan amount based on fluorescence yield. Mass spectrometry, on the other hand, has generally been limited to relative quantification in which the contribution of the signal intensity for an individual glycan is expressed as a percent of the signal intensity summed over the total profile. Relative quantification has been valuable for highlighting changes in glycan expression between samples; sensitivity is high, and structural information can be derived by fragmentation. We have investigated whether MS-based glycomics is amenable to absolute quantification by referencing signal intensities to well-characterized oligosaccharide standards. We report the qualification of a set of N-linked oligosaccharide standards by NMR, HPLC, and MS. We also demonstrate the dynamic range, sensitivity, and recovery from complex biological matrices for these standards in their permethylated form. Our results indicate that absolute quantification for MS-based glycomic analysis is reproducible and robust utilizing currently available glycan standards.

  8. Quantification of differential gene expression by multiplexed targeted resequencing of cDNA

    PubMed Central

    Arts, Peer; van der Raadt, Jori; van Gestel, Sebastianus H.C.; Steehouwer, Marloes; Shendure, Jay; Hoischen, Alexander; Albers, Cornelis A.

    2017-01-01

    Whole-transcriptome or RNA sequencing (RNA-Seq) is a powerful and versatile tool for functional analysis of different types of RNA molecules, but sample reagent and sequencing cost can be prohibitive for hypothesis-driven studies where the aim is to quantify differential expression of a limited number of genes. Here we present an approach for quantification of differential mRNA expression by targeted resequencing of complementary DNA using single-molecule molecular inversion probes (cDNA-smMIPs) that enable highly multiplexed resequencing of cDNA target regions of ∼100 nucleotides and counting of individual molecules. We show that accurate estimates of differential expression can be obtained from molecule counts for hundreds of smMIPs per reaction and that smMIPs are also suitable for quantification of relative gene expression and allele-specific expression. Compared with low-coverage RNA-Seq and a hybridization-based targeted RNA-Seq method, cDNA-smMIPs are a cost-effective high-throughput tool for hypothesis-driven expression analysis in large numbers of genes (10 to 500) and samples (hundreds to thousands). PMID:28474677

  9. In vivo behavior of NTBI revealed by automated quantification system.

    PubMed

    Ito, Satoshi; Ikuta, Katsuya; Kato, Daisuke; Lynda, Addo; Shibusa, Kotoe; Niizeki, Noriyasu; Toki, Yasumichi; Hatayama, Mayumi; Yamamoto, Masayo; Shindo, Motohiro; Iizuka, Naomi; Kohgo, Yutaka; Fujiya, Mikihiro

    2016-08-01

    Non-Tf-bound iron (NTBI), which appears in serum in iron overload, is thought to contribute to organ damage; the monitoring of serum NTBI levels may therefore be clinically useful in iron-overloaded patients. However, NTBI quantification methods remain complex, limiting their use in clinical practice. To overcome the technical difficulties often encountered, we recently developed a novel automated NTBI quantification system capable of measuring large numbers of samples. In the present study, we investigated the in vivo behavior of NTBI in human and animal serum using this newly established automated system. Average NTBI in healthy volunteers was 0.44 ± 0.076 μM (median 0.45 μM, range 0.28-0.66 μM), with no significant difference between sexes. Additionally, serum NTBI rapidly increased after iron loading, followed by a sudden disappearance. NTBI levels also decreased in inflammation. The results indicate that NTBI is a unique marker of iron metabolism, unlike other markers of iron metabolism, such as serum ferritin. Our new automated NTBI quantification method may help to reveal the clinical significance of NTBI and contribute to our understanding of iron overload.

  10. MPQ-cytometry: a magnetism-based method for quantification of nanoparticle-cell interactions

    NASA Astrophysics Data System (ADS)

    Shipunova, V. O.; Nikitin, M. P.; Nikitin, P. I.; Deyev, S. M.

    2016-06-01

    Precise quantification of interactions between nanoparticles and living cells is among the imperative tasks for research in nanobiotechnology, nanotoxicology and biomedicine. To meet the challenge, a rapid method called MPQ-cytometry is developed, which measures the integral non-linear response produced by magnetically labeled nanoparticles in a cell sample with an original magnetic particle quantification (MPQ) technique. MPQ-cytometry provides a sensitivity limit of 0.33 ng of nanoparticles and is devoid of the background signal present in many label-based assays. Each measurement takes only a few seconds, and no complicated sample preparation or data processing is required. The capabilities of the method have been demonstrated by quantification of interactions of iron oxide nanoparticles with eukaryotic cells. The total amount of targeted nanoparticles that specifically recognized the HER2/neu oncomarker on the human cancer cell surface was successfully measured, the specificity of interaction permitting the detection of HER2/neu positive cells in a cell mixture. Moreover, it has been shown that MPQ-cytometry analysis of a HER2/neu-specific iron oxide nanoparticle interaction with six cell lines of different tissue origins quantitatively reflects the HER2/neu status of the cells. High correlation of MPQ-cytometry data with those obtained by three other methods commonly used in molecular and cell biology supports consideration of this method as a prospective alternative for both quantifying cell-bound nanoparticles and estimating the expression level of cell surface antigens. The proposed method does not require expensive sophisticated equipment or highly skilled personnel and it can be easily applied for rapid diagnostics, especially under field conditions.

  11. GMO quantification: valuable experience and insights for the future.

    PubMed

    Milavec, Mojca; Dobnik, David; Yang, Litao; Zhang, Dabing; Gruden, Kristina; Zel, Jana

    2014-10-01

    Cultivation and marketing of genetically modified organisms (GMOs) have been unevenly adopted worldwide. To facilitate international trade and to provide information to consumers, labelling requirements have been set up in many countries. Quantitative real-time polymerase chain reaction (qPCR) is currently the method of choice for detection, identification and quantification of GMOs. This has been critically assessed and the requirements for the method performance have been set. Nevertheless, there are challenges that should still be highlighted, such as measuring the quantity and quality of DNA, and determining the qPCR efficiency, possible sequence mismatches, characteristics of taxon-specific genes and appropriate units of measurement, as these remain potential sources of measurement uncertainty. To overcome these problems and to cope with the continuous increase in the number and variety of GMOs, new approaches are needed. Statistical strategies of quantification have already been proposed and expanded with the development of digital PCR. The first attempts have been made to use new generation sequencing also for quantitative purposes, although accurate quantification of the contents of GMOs using this technology is still a challenge for the future, and especially for mixed samples. New approaches are needed also for the quantification of stacks, and for potential quantification of organisms produced by new plant breeding techniques.

  12. StakeMeter: value-based stakeholder identification and quantification framework for value-based software systems.

    PubMed

    Babar, Muhammad Imran; Ghazali, Masitah; Jawawi, Dayang N A; Bin Zaheer, Kashif

    2015-01-01

    Value-based requirements engineering plays a vital role in the development of value-based software (VBS). Stakeholders are the key players in the requirements engineering process, and the selection of critical stakeholders for the VBS systems is highly desirable. Based on the stakeholder requirements, the innovative or value-based idea is realized. The quality of the VBS system is associated with the concrete set of valuable requirements, and the valuable requirements can only be obtained if all the relevant valuable stakeholders participate in the requirements elicitation phase. The existing value-based approaches focus on the design of the VBS systems. However, the focus on the valuable stakeholders and requirements is inadequate. The current stakeholder identification and quantification (SIQ) approaches are neither state-of-the-art nor systematic for the VBS systems. The existing approaches are time-consuming, complex and inconsistent which makes the initiation process difficult. Moreover, the main motivation of this research is that the existing SIQ approaches do not provide the low level implementation details for SIQ initiation and stakeholder metrics for quantification. Hence, keeping in view the existing SIQ problems, this research contributes in the form of a new SIQ framework called 'StakeMeter'. The StakeMeter framework is verified and validated through case studies. The proposed framework provides low-level implementation guidelines, attributes, metrics, quantification criteria and application procedure as compared to the other methods. The proposed framework solves the issues of stakeholder quantification or prioritization, higher time consumption, complexity, and process initiation. The framework helps in the selection of highly critical stakeholders for the VBS systems with less judgmental error.

  13. StakeMeter: Value-Based Stakeholder Identification and Quantification Framework for Value-Based Software Systems

    PubMed Central

    Babar, Muhammad Imran; Ghazali, Masitah; Jawawi, Dayang N. A.; Zaheer, Kashif Bin

    2015-01-01

    Value-based requirements engineering plays a vital role in the development of value-based software (VBS). Stakeholders are the key players in the requirements engineering process, and the selection of critical stakeholders for the VBS systems is highly desirable. Based on the stakeholder requirements, the innovative or value-based idea is realized. The quality of the VBS system is associated with the concrete set of valuable requirements, and the valuable requirements can only be obtained if all the relevant valuable stakeholders participate in the requirements elicitation phase. The existing value-based approaches focus on the design of the VBS systems. However, the focus on the valuable stakeholders and requirements is inadequate. The current stakeholder identification and quantification (SIQ) approaches are neither state-of-the-art nor systematic for the VBS systems. The existing approaches are time-consuming, complex and inconsistent which makes the initiation process difficult. Moreover, the main motivation of this research is that the existing SIQ approaches do not provide the low level implementation details for SIQ initiation and stakeholder metrics for quantification. Hence, keeping in view the existing SIQ problems, this research contributes in the form of a new SIQ framework called ‘StakeMeter’. The StakeMeter framework is verified and validated through case studies. The proposed framework provides low-level implementation guidelines, attributes, metrics, quantification criteria and application procedure as compared to the other methods. The proposed framework solves the issues of stakeholder quantification or prioritization, higher time consumption, complexity, and process initiation. The framework helps in the selection of highly critical stakeholders for the VBS systems with less judgmental error. PMID:25799490

  14. A refined methodology for modeling volume quantification performance in CT

    NASA Astrophysics Data System (ADS)

    Chen, Baiyu; Wilson, Joshua; Samei, Ehsan

    2014-03-01

    The utility of the CT lung nodule volume quantification technique depends on the precision of the quantification. To enable the evaluation of quantification precision, we previously developed a mathematical model that related precision to image resolution and noise properties in uniform backgrounds in terms of an estimability index (e'). The e' was shown to predict empirical precision across 54 imaging and reconstruction protocols, but with different correlation qualities for FBP and iterative reconstruction (IR) due to the non-linearity of IR impacted by anatomical structure. To better account for the non-linearity of IR, this study aimed to refine the noise characterization of the model in the presence of textured backgrounds. Repeated scans of an anthropomorphic lung phantom were acquired. Subtracted images were used to measure the image quantum noise, which was then used to adjust the noise component of the e' calculation measured from a uniform region. In addition to the model refinement, the validation of the model was further extended to 2 nodule sizes (5 and 10 mm) and 2 segmentation algorithms. Results showed that the magnitude of IR's quantum noise was significantly higher in structured backgrounds than in uniform backgrounds (ASiR, 30-50%; MBIR, 100-200%). With the refined model, the correlation between e' values and empirical precision no longer depended on the reconstruction algorithm. In conclusion, the model with refined noise characterization reflected the non-linearity of iterative reconstruction in structured backgrounds, and further showed successful prediction of quantification precision across a variety of nodule sizes, dose levels, slice thicknesses, reconstruction algorithms, and segmentation software.
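
    A minimal sketch, under assumed noise levels, of the noise measurement described above: the image quantum noise is estimated from the difference of two repeated scans, in which the fixed anatomical background cancels and the independent noise contributions add in quadrature.

    ```python
    # Measuring image quantum noise from repeated scans: the standard deviation
    # of the subtracted image divided by sqrt(2). The "scans" here are simulated
    # noise fields on a fixed background texture, purely for illustration.
    import numpy as np

    rng = np.random.default_rng(1)
    anatomy = rng.normal(0.0, 30.0, size=(256, 256))           # fixed background texture
    scan1 = anatomy + rng.normal(0.0, 12.0, size=(256, 256))   # quantum noise realization 1
    scan2 = anatomy + rng.normal(0.0, 12.0, size=(256, 256))   # quantum noise realization 2

    diff = scan1 - scan2                  # anatomy cancels, noise adds in quadrature
    quantum_noise = diff.std(ddof=1) / np.sqrt(2.0)

    print(f"Estimated quantum noise: {quantum_noise:.2f} HU")
    ```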

  15. Recent application of quantification II in Japanese medical research.

    PubMed Central

    Suzuki, T; Kudo, A

    1979-01-01

    Hayashi's Quantification II is a method of multivariate discriminant analysis for handling attribute data as predictor variables. It is very useful in the medical research field for estimation, diagnosis, prognosis, evaluation of epidemiological factors, and other problems based on a multiplicity of attribute data. In Japan, this method is so well known that most computer program packages include the Hayashi Quantification, but the method still seems to be unfamiliar to researchers outside Japan. In view of this situation, we introduced 19 selected articles representing recent applications of Quantification II in Japanese medical research. In reviewing these papers, special attention is paid to how far the researchers were satisfied with the findings provided by the method. At the same time, some recommendations are made about terminology and program packages. A brief discussion of the background of the quantification methods is also given, with special reference to the Behaviormetric Society of Japan. PMID:540587

  16. Applicability of plasmid calibrant pTC1507 in quantification of TC1507 maize: an interlaboratory study.

    PubMed

    Meng, Yanan; Liu, Xin; Wang, Shu; Zhang, Dabing; Yang, Litao

    2012-01-11

    To enforce the labeling regulations of genetically modified organisms (GMOs), the application of DNA plasmids as calibrants is becoming essential for the practical quantification of GMOs. This study reports the construction of the plasmid pTC1507 for a quantification assay of genetically modified (GM) maize TC1507 and an international collaborative ring trial validating its applicability as a plasmid calibrant. pTC1507 includes one event-specific sequence of TC1507 maize and one unique sequence of the maize endogenous gene zSSIIb. A total of eight GMO detection laboratories worldwide were invited to join the validation process, and test results were returned from all eight participants. Statistical analysis of the returned results showed that real-time PCR assays using pTC1507 as calibrant in both GM event-specific and endogenous gene quantifications had high PCR efficiency (ranging from 0.80 to 1.15) and good linearity (ranging from 0.9921 to 0.9998). In a quantification assay of five blind samples, the bias between the test values and true values ranged from 2.6 to 24.9%. All results indicated that the developed pTC1507 plasmid is applicable for the quantitative analysis of TC1507 maize and can be used as a suitable substitute for dried powder certified reference materials (CRMs).

  17. Segmentation And Quantification Of Black Holes In Multiple Sclerosis

    PubMed Central

    Datta, Sushmita; Sajja, Balasrinivasa Rao; He, Renjie; Wolinsky, Jerry S.; Gupta, Rakesh K.; Narayana, Ponnada A.

    2006-01-01

    A technique that involves minimal operator intervention was developed and implemented for identification and quantification of black holes on T1-weighted magnetic resonance images (T1 images) in multiple sclerosis (MS). Black holes were segmented on T1 images based on grayscale morphological operations. False classification of black holes was minimized by masking the segmented images with images obtained from the orthogonalization of T2-weighted and T1 images. Enhancing lesion voxels on postcontrast images were automatically identified and eliminated from being included in the black hole volume. Fuzzy connectivity was used for the delineation of black holes. The performance of this algorithm was quantitatively evaluated on 14 MS patients. PMID:16126416

  18. Find Pairs: The Module for Protein Quantification of the PeakQuant Software Suite

    PubMed Central

    Eisenacher, Martin; Kohl, Michael; Wiese, Sebastian; Hebeler, Romano; Meyer, Helmut E.

    2012-01-01

    Abstract Accurate quantification of proteins is one of the major tasks in current proteomics research. To address this issue, a wide range of stable isotope labeling techniques have been developed, allowing one to quantitatively study thousands of proteins by means of mass spectrometry. In this article, the FindPairs module of the PeakQuant software suite is detailed. It facilitates the automatic determination of protein abundance ratios based on the automated analysis of stable isotope-coded mass spectrometric data. Furthermore, it implements statistical methods to determine outliers due to biological as well as technical variance of proteome data obtained in replicate experiments. This provides an important means to evaluate the significance in obtained protein expression data. For demonstrating the high applicability of FindPairs, we focused on the quantitative analysis of proteome data acquired in 14N/15N labeling experiments. We further provide a comprehensive overview of the features of the FindPairs software, and compare these with existing quantification packages. The software presented here supports a wide range of proteomics applications, allowing one to quantitatively assess data derived from different stable isotope labeling approaches, such as 14N/15N labeling, SILAC, and iTRAQ. The software is publicly available at http://www.medizinisches-proteom-center.de/software and free for academic use. PMID:22909347

  19. Quantification of maltol in Korean ginseng (Panax ginseng) products by high-performance liquid chromatography-diode array detector

    PubMed Central

    Jeong, Hyun Cheol; Hong, Hee-Do; Kim, Young-Chan; Rhee, Young Kyoung; Choi, Sang Yoon; Kim, Kyung-Tack; Kim, Sung Soo; Lee, Young-Chul; Cho, Chang-Won

    2015-01-01

    Background: Maltol, a phenolic compound, is produced by the browning reaction during the high-temperature treatment of ginseng. Thus, maltol can be used as a marker for the quality control of various ginseng products manufactured by high-temperature treatment, including red ginseng. For the quantification of maltol in Korean ginseng products, an effective high-performance liquid chromatography-diode array detector (HPLC-DAD) method was developed. Materials and Methods: The HPLC-DAD method for maltol quantification, coupled with a liquid-liquid extraction (LLE) method, was developed and validated in terms of linearity, precision, and accuracy. HPLC separation was performed on a C18 column. Results: The LLE methods and HPLC running conditions for maltol quantification were optimized. The calibration curve of maltol exhibited good linearity (R2 = 1.00). The limit of detection value of maltol was 0.26 μg/mL, and the limit of quantification value was 0.79 μg/mL. The relative standard deviations (RSDs) for the intra- and inter-day experiments were <1.27% and 0.61%, respectively. The results of the recovery test were 101.35–101.75% with an RSD value of 0.21–1.65%. The developed method was applied successfully to quantify maltol in three ginseng products manufactured by different methods. Conclusion: The results of validation demonstrated that the proposed HPLC-DAD method was useful for the quantification of maltol in various ginseng products. PMID:26246746
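
    As a minimal illustration of the validation arithmetic described above (calibration linearity, detection and quantification limits, and recovery from a spiked sample), the sketch below assumes peak-area data are already in hand and uses the common ICH-style 3.3σ/S and 10σ/S formulas; all concentrations, areas, and the spike level are made-up numbers, not the paper's values, and the paper does not state which LOD/LOQ formula was used.

    ```python
    import numpy as np

    # Illustrative calibration data (not from the paper): maltol standard
    # concentrations in ug/mL and the corresponding HPLC-DAD peak areas.
    conc = np.array([1.0, 5.0, 10.0, 25.0, 50.0, 100.0])
    area = np.array([12.1, 60.8, 121.5, 303.9, 608.2, 1216.0])

    # Least-squares calibration line: area = slope * conc + intercept.
    slope, intercept = np.polyfit(conc, area, 1)
    pred = slope * conc + intercept
    r2 = 1 - np.sum((area - pred) ** 2) / np.sum((area - area.mean()) ** 2)

    # ICH-style limits from the residual standard deviation of the regression.
    sigma = np.sqrt(np.sum((area - pred) ** 2) / (len(conc) - 2))
    lod = 3.3 * sigma / slope
    loq = 10.0 * sigma / slope

    # Recovery for a hypothetical spiked sample: measured vs. added concentration.
    added, measured_area = 20.0, 246.5
    recovery = ((measured_area - intercept) / slope) / added * 100

    print(f"R^2 = {r2:.4f}, LOD = {lod:.2f} ug/mL, LOQ = {loq:.2f} ug/mL")
    print(f"Recovery = {recovery:.1f} %")
    ```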

  20. [11C]Harmine Binding to Brain Monoamine Oxidase A: Test-Retest Properties and Noninvasive Quantification.

    PubMed

    Zanderigo, Francesca; D'Agostino, Alexandra E; Joshi, Nandita; Schain, Martin; Kumar, Dileep; Parsey, Ramin V; DeLorenzo, Christine; Mann, J John

    2018-02-08

    Inhibition of the isoform A of monoamine oxidase (MAO-A), a mitochondrial enzyme catalyzing deamination of monoamine neurotransmitters, is useful in the treatment of depression and anxiety disorders. [11C]harmine, a MAO-A PET radioligand, has been used to study mood disorders and antidepressant treatment. However, the test-retest characteristics of [11C]harmine binding have to date only been partially investigated. Furthermore, since MAO-A is ubiquitously expressed, no reference region is available, thus requiring arterial blood sampling during PET scanning. Here, we investigate the test-retest properties of [11C]harmine binding measurements; assess the effects of using a minimally invasive input function estimation on binding quantification and repeatability; and explore binding potential estimation using a reference region-free approach. Quantification of [11C]harmine distribution volume (VT) via kinetic models and graphical analyses was compared based on absolute test-retest percent difference (TRPD), intraclass correlation coefficient (ICC), and identifiability. The optimal procedure was also used with a simultaneously estimated input function in place of the measured curve. Lastly, an approach for binding potential quantification in the absence of a reference region was evaluated. [11C]harmine VT estimates quantified using arterial blood and kinetic modeling showed average absolute TRPD values of 7.7 to 15.6%, and ICC values between 0.56 and 0.86, across brain regions. Using simultaneous estimation (SIME) of the input function resulted in VT estimates close to those obtained using the arterial input function (r = 0.951, slope = 1.073, intercept = -1.037), with numerically but not statistically higher test-retest differences (range 16.6 to 22.0%), but with overall poor ICC values, between 0.30 and 0.57. Prospective studies using [11C]harmine are possible given its test-retest repeatability when binding is quantified using arterial blood. Results with SIME of

  1. In-line multipoint near-infrared spectroscopy for moisture content quantification during freeze-drying.

    PubMed

    Kauppinen, Ari; Toiviainen, Maunu; Korhonen, Ossi; Aaltonen, Jaakko; Järvinen, Kristiina; Paaso, Janne; Juuti, Mikko; Ketolainen, Jarkko

    2013-02-19

    During the past decade, near-infrared (NIR) spectroscopy has been applied to in-line moisture content quantification during the freeze-drying process. However, NIR has been used as a single-vial technique and thus is not representative of the entire batch. This has been considered one of the main barriers to NIR spectroscopy becoming widely used in process analytical technology (PAT) for freeze-drying. Clearly, it is essential to monitor samples that reliably represent the whole batch. The present study evaluated multipoint NIR spectroscopy for in-line moisture content quantification during a freeze-drying process. Aqueous sucrose solutions were used as model formulations. NIR data were calibrated to predict the moisture content using partial least-squares (PLS) regression, with Karl Fischer titration used as the reference method. PLS calibrations resulted in root-mean-square error of prediction (RMSEP) values lower than 0.13%. Three noncontact, diffuse reflectance NIR probe heads were positioned on the freeze-dryer shelf to measure the moisture content in a noninvasive manner, through the side of the glass vials. The results showed that the detection of unequal sublimation rates within a freeze-dryer shelf was possible with the multipoint NIR system in use. Furthermore, in-line moisture content quantification was reliable, especially toward the end of the process. These findings indicate that the use of multipoint NIR spectroscopy can achieve representative quantification of moisture content and hence a drying end point determination to a desired residual moisture level.
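
    A minimal sketch of the PLS calibration step described in the abstract, assuming spectra and Karl Fischer reference values are available as arrays; the synthetic data, the choice of five latent variables, and the train/test split are illustrative assumptions rather than the study's settings.

    ```python
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)

    # Synthetic stand-in for NIR spectra (n samples x m wavelengths) and the
    # Karl Fischer reference moisture contents (%).
    n, m = 120, 200
    moisture = rng.uniform(0.1, 5.0, n)
    spectra = np.outer(moisture, rng.normal(size=m)) + 0.05 * rng.normal(size=(n, m))

    X_train, X_test, y_train, y_test = train_test_split(
        spectra, moisture, test_size=0.25, random_state=0)

    pls = PLSRegression(n_components=5)
    pls.fit(X_train, y_train)
    y_pred = pls.predict(X_test).ravel()

    # Root-mean-square error of prediction, the figure of merit quoted above.
    rmsep = np.sqrt(np.mean((y_test - y_pred) ** 2))
    print(f"RMSEP = {rmsep:.3f} % moisture")
    ```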

  2. Automatic segmentation and quantification of the cardiac structures from non-contrast-enhanced cardiac CT scans

    NASA Astrophysics Data System (ADS)

    Shahzad, Rahil; Bos, Daniel; Budde, Ricardo P. J.; Pellikaan, Karlijn; Niessen, Wiro J.; van der Lugt, Aad; van Walsum, Theo

    2017-05-01

    Early structural changes to the heart, including the chambers and the coronary arteries, provide important information on pre-clinical heart disease like cardiac failure. Currently, contrast-enhanced cardiac computed tomography angiography (CCTA) is the preferred modality for the visualization of the cardiac chambers and the coronaries. In clinical practice not every patient undergoes a CCTA scan; many patients receive only a non-contrast-enhanced calcium scoring CT scan (CTCS), which has a lower radiation dose and does not require the administration of contrast agent. Quantifying cardiac structures in such images is challenging, as they lack the contrast present in CCTA scans. Such quantification would however be relevant, as it enables population-based studies with only a CTCS scan. The purpose of this work is therefore to investigate the feasibility of automatic segmentation and quantification of cardiac structures, viz. the whole heart, left atrium, left ventricle, right atrium, right ventricle, and aortic root, from CTCS scans. A fully automatic multi-atlas-based segmentation approach is used to segment the cardiac structures. Results show that the segmentation overlap between the automatic method and the reference standard has an average Dice similarity coefficient of 0.91 for the cardiac chambers. The mean surface-to-surface distance error over all the cardiac structures is 1.4 ± 1.7 mm. The automatically obtained cardiac chamber volumes using the CTCS scans have an excellent correlation when compared to the volumes in corresponding CCTA scans; a Pearson correlation coefficient (R) of 0.95 is obtained. Our fully automatic method enables large-scale assessment of cardiac structures on non-contrast-enhanced CT scans.
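
    The overlap metric quoted above reduces to a few array operations once binary masks exist. The sketch below is a generic Dice similarity coefficient computation on toy 3D masks; it is not the authors' multi-atlas pipeline, only the evaluation step.

    ```python
    import numpy as np

    def dice_coefficient(auto_mask: np.ndarray, ref_mask: np.ndarray) -> float:
        """Dice similarity coefficient between two binary segmentation masks."""
        auto = auto_mask.astype(bool)
        ref = ref_mask.astype(bool)
        intersection = np.logical_and(auto, ref).sum()
        denom = auto.sum() + ref.sum()
        return 2.0 * intersection / denom if denom else 1.0

    # Toy volumes standing in for an automatic and a reference chamber segmentation.
    rng = np.random.default_rng(1)
    reference = rng.random((64, 64, 32)) > 0.7
    automatic = reference.copy()
    automatic[:5] = ~automatic[:5]   # perturb a few slices to mimic segmentation errors

    print(f"DSC = {dice_coefficient(automatic, reference):.3f}")
    ```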

  3. Three-dimensional morphological analysis of intracranial aneurysms: a fully automated method for aneurysm sac isolation and quantification.

    PubMed

    Larrabide, Ignacio; Cruz Villa-Uriol, Maria; Cárdenes, Rubén; Pozo, Jose Maria; Macho, Juan; San Roman, Luis; Blasco, Jordi; Vivas, Elio; Marzo, Alberto; Hose, D Rod; Frangi, Alejandro F

    2011-05-01

    Morphological descriptors are practical and essential biomarkers for diagnosis and treatment selection in intracranial aneurysm management according to the current guidelines in use. Nevertheless, relatively little work has been dedicated to improving the three-dimensional quantification of aneurysmal morphology, to automating the analysis, and hence to reducing the inherent intra- and interobserver variability of manual analysis. In this paper we propose a methodology for the automated isolation and morphological quantification of saccular intracranial aneurysms based on a 3D representation of the vascular anatomy. This methodology is based on the analysis of the vasculature skeleton's topology and the subsequent application of concepts from deformable cylinders. These are expanded inside the parent vessel to identify different regions and discriminate the aneurysm sac from the parent vessel wall. The method renders as output the surface representation of the isolated aneurysm sac, which can then be quantified automatically. The proposed method provides the means for identifying the aneurysm neck in a deterministic way. The results obtained by the method were assessed in two ways: they were compared to manual measurements obtained by three independent clinicians as normally done during diagnosis, and to automated measurements from manually isolated aneurysms by three independent operators, nonclinicians, experts in vascular image analysis. All the measurements were obtained using in-house tools. The results were qualitatively and quantitatively compared for a set of saccular intracranial aneurysms (n = 26). Measurements performed on a synthetic phantom showed that the automated measurements obtained from manually isolated aneurysms were the most accurate. The differences between the measurements obtained by the clinicians and the manually isolated sacs were statistically significant (neck width: p <0.001, sac height: p = 0.002). When comparing clinicians

  4. Novel isotopic N, N-Dimethyl Leucine (iDiLeu) Reagents Enable Absolute Quantification of Peptides and Proteins Using a Standard Curve Approach

    NASA Astrophysics Data System (ADS)

    Greer, Tyler; Lietz, Christopher B.; Xiang, Feng; Li, Lingjun

    2015-01-01

    Absolute quantification of protein targets using liquid chromatography-mass spectrometry (LC-MS) is a key component of candidate biomarker validation. One popular method combines multiple reaction monitoring (MRM) using a triple quadrupole instrument with stable isotope-labeled standards (SIS) for absolute quantification (AQUA). LC-MRM AQUA assays are sensitive and specific, but they are also expensive because of the cost of synthesizing stable isotope peptide standards. While the chemical modification approach using mass differential tags for relative and absolute quantification (mTRAQ) represents a more economical approach when quantifying large numbers of peptides, these reagents are costly and still suffer from lower throughput because only two concentration values per peptide can be obtained in a single LC-MS run. Here, we have developed and applied a set of five novel mass difference reagents, isotopic N, N-dimethyl leucine (iDiLeu). These labels contain an amine-reactive group (a triazine ester), are cost-effective because of their synthetic simplicity, and increase throughput compared with previous LC-MS quantification methods by allowing construction of a four-point standard curve in one run. iDiLeu-labeled peptides show remarkably similar retention time shifts, slightly lower energy thresholds for higher-energy collisional dissociation (HCD) fragmentation, and high quantification accuracy for trypsin-digested protein samples (median errors <15%). By spiking an iDiLeu-labeled neuropeptide, allatostatin, into mouse urine matrix, two quantification methods are validated. The first uses one labeled peptide as an internal standard to normalize labeled peptide peak areas across runs (<19% error), whereas the second enables standard curve creation and analyte quantification in one run (<8% error).
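
    The single-run standard curve idea can be illustrated as follows, assuming extracted ion signals for the four standard-labeled channels and the unknown channel are already available; the amounts and intensities below are hypothetical, not the paper's data.

    ```python
    import numpy as np

    # Hypothetical single-run data: four iDiLeu channels carry the standard
    # peptide at known amounts (fmol); a fifth channel carries the unknown.
    std_amount = np.array([10.0, 50.0, 100.0, 500.0])           # fmol on column
    std_intensity = np.array([2.1e5, 1.05e6, 2.08e6, 1.04e7])   # extracted ion signal
    unknown_intensity = 3.4e6

    # Linear standard curve built within the same LC-MS run.
    slope, intercept = np.polyfit(std_amount, std_intensity, 1)
    unknown_amount = (unknown_intensity - intercept) / slope
    print(f"Estimated analyte amount: {unknown_amount:.1f} fmol")
    ```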

  5. Direct Quantification of Carotenoids in Low Fat Baby Foods Via Laser Photoacoustics and Colorimetric Index *

    NASA Astrophysics Data System (ADS)

    Dóka, O.; Ajtony, Zs.; Bicanic, D.; Valinger, D.; Végvári, Gy.

    2014-12-01

    Carotenoids are important antioxidants found in various foods, including those intended for infant nutrition. In this investigation, the total carotenoid content (TCC) of nine different commercially available baby foods was quantified using a colorimetric index obtained via reflectance colorimetry (RC) and by laser photoacoustic spectroscopy (LPAS) at 473 nm. The latter requires minimal sample preparation and only a one-time calibration step, which enables practically direct quantification of TCC. Results were verified against UV-Vis spectrophotometry (SP) as the reference technique. RC and LPAS (at 473 nm) provided satisfactory results for this colorimetric index, with correlation coefficients of 0.9925 and 0.9972, respectively. Other color indices did not show a correlation with TCC. When determining the TCC in baby foods containing tomatoes, it is necessary to select a different analytical wavelength to compensate for the presence of lycopene in the test samples.

  6. An online sleep apnea detection method based on recurrence quantification analysis.

    PubMed

    Nguyen, Hoa Dinh; Wilkins, Brek A; Cheng, Qi; Benjamin, Bruce Allen

    2014-07-01

    This paper introduces an online sleep apnea detection method based on heart rate complexity as measured by recurrence quantification analysis (RQA) statistics of heart rate variability (HRV) data. RQA statistics can capture the nonlinear dynamics of a complex cardiorespiratory system during obstructive sleep apnea. To obtain a more robust measurement of the nonstationarity of the cardiorespiratory system, we use several fixed-amount-of-neighbors thresholds for the recurrence plot calculation. We integrate a feature selection algorithm based on conditional mutual information to select the most informative RQA features for classification and hence to speed up the real-time classification process without degrading the performance of the system. Two types of binary classifiers, i.e., support vector machine and neural network, are used to differentiate apnea from normal sleep. A soft decision fusion rule is developed to combine the results of these classifiers in order to improve the classification performance of the whole system. Experimental results show that our proposed method achieves better classification results compared with the previous recurrence analysis-based approach. We also show that our method is flexible and a strong candidate for a practical, efficient sleep apnea detection system.
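
    To make the RQA features concrete, the sketch below builds a recurrence plot from a toy RR-interval series using a fixed amount of nearest neighbours (one of several possible thresholding choices; the embedding dimension, delay, and neighbour count are illustrative assumptions) and computes two standard statistics, recurrence rate and determinism. It is not the authors' full feature set or classifier.

    ```python
    import numpy as np

    def recurrence_matrix(x, dim=3, tau=2, k=8):
        """Recurrence plot with a fixed amount of nearest neighbours (FAN):
        each embedded point recurs with its k closest neighbours."""
        n = len(x) - (dim - 1) * tau
        emb = np.column_stack([x[i * tau:i * tau + n] for i in range(dim)])
        d = np.linalg.norm(emb[:, None, :] - emb[None, :, :], axis=-1)
        np.fill_diagonal(d, np.inf)
        rp = np.zeros((n, n), dtype=bool)
        idx = np.argsort(d, axis=1)[:, :k]
        rp[np.repeat(np.arange(n), k), idx.ravel()] = True
        return rp

    def rqa_features(rp, lmin=2):
        """Recurrence rate and determinism from diagonal line structures."""
        n = rp.shape[0]
        rr = rp.sum() / rp.size
        diag_points, det_points = 0, 0
        for o in range(-(n - 1), n):
            line = np.diag(rp, k=o).astype(int)
            padded = np.concatenate(([0], line, [0]))
            starts = np.flatnonzero(np.diff(padded) == 1)
            ends = np.flatnonzero(np.diff(padded) == -1)
            for s, e in zip(starts, ends):
                length = e - s
                diag_points += length
                if length >= lmin:
                    det_points += length
        det = det_points / diag_points if diag_points else 0.0
        return rr, det

    # Toy RR-interval series standing in for one HRV analysis window.
    rng = np.random.default_rng(2)
    rr_intervals = 0.8 + 0.05 * np.sin(np.arange(300) / 10) + 0.02 * rng.normal(size=300)
    print("RR = %.3f, DET = %.3f" % rqa_features(recurrence_matrix(rr_intervals)))
    ```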

  7. Collagen Quantification in Tissue Specimens.

    PubMed

    Coentro, João Quintas; Capella-Monsonís, Héctor; Graceffa, Valeria; Wu, Zhuning; Mullen, Anne Maria; Raghunath, Michael; Zeugolis, Dimitrios I

    2017-01-01

    Collagen is the major extracellular protein in mammals. Accurate quantification of collagen is essential in the biomaterials (e.g., reproducible collagen scaffold fabrication), drug discovery (e.g., assessment of collagen in pathophysiologies, such as fibrosis), and tissue engineering (e.g., quantification of cell-synthesized collagen) fields. Although measuring hydroxyproline content is the most widely used method to quantify collagen in biological specimens, the process is very laborious. To this end, the Sircol™ Collagen Assay is widely used due to its inherent simplicity and convenience. However, this method leads to overestimation of collagen content due to the interaction of Sirius red with basic amino acids of non-collagenous proteins. Herein, we describe the addition of an ultrafiltration purification step in the process to accurately determine collagen content in tissues.

  8. Round robin test on quantification of amyloid-β 1-42 in cerebrospinal fluid by mass spectrometry.

    PubMed

    Pannee, Josef; Gobom, Johan; Shaw, Leslie M; Korecka, Magdalena; Chambers, Erin E; Lame, Mary; Jenkins, Rand; Mylott, William; Carrillo, Maria C; Zegers, Ingrid; Zetterberg, Henrik; Blennow, Kaj; Portelius, Erik

    2016-01-01

    Cerebrospinal fluid (CSF) amyloid-β 1-42 (Aβ42) is an important biomarker for Alzheimer's disease, both in diagnostics and to monitor disease-modifying therapies. However, there is a great need for standardization of methods used for quantification. To overcome problems associated with immunoassays, liquid chromatography-tandem mass spectrometry (LC-MS/MS) has emerged as a critical orthogonal alternative. We compared results for CSF Aβ42 quantification in a round robin study performed in four laboratories using similar sample preparation methods and LC-MS instrumentation. The LC-MS results showed excellent correlation between laboratories (r² > 0.98), high analytical precision, and good correlation with enzyme-linked immunosorbent assay (r² > 0.85). The use of a common reference sample further decreased interlaboratory variation. Our results indicate that LC-MS is suitable for absolute quantification of Aβ42 in CSF and highlight the importance of developing a certified reference material. Copyright © 2016 The Alzheimer's Association. Published by Elsevier Inc. All rights reserved.

  9. High-performance thin-layer chromatography (HPTLC) for the simultaneous quantification of the cyclic lipopeptides Surfactin, Iturin A and Fengycin in culture samples of Bacillus species.

    PubMed

    Geissler, Mareen; Oellig, Claudia; Moss, Karin; Schwack, Wolfgang; Henkel, Marius; Hausmann, Rudolf

    2017-02-15

    A high-performance thin-layer chromatography method has been established for the identification and simultaneous quantification of the cyclic lipopeptides Surfactin, Iturin A and Fengycin in Bacillus culture samples. B. subtilis DSM 10ᵀ, B. amyloliquefaciens DSM 7ᵀ and B. methylotrophicus DSM 23117 were used as model strains. Culture samples indicated that a sample pretreatment is necessary in order to run HPTLC analyses. A threefold extraction of the cell-free broth with the solvent chloroform/methanol (2:1, v/v) gave the best results when all three lipopeptides were included in the analysis. For the mobile phase, a two-step development was considered most suitable. The first development is conducted with chloroform/methanol/water (65:25:4, v/v/v) over a migration distance of 60 mm and the second development using butanol/ethanol/0.1% acetic acid (1:4:1, v/v/v) over a migration distance of 60 mm as well. The method was validated according to Validation of Analytical Procedures: Methodology (FDA Guidance) with respect to the parameters linearity, limit of detection (LOD), limit of quantification (LOQ), precision, accuracy and recovery rate. A linear range with R² > 0.99 was obtained for all samples from 30 ng/zone up to 600 ng/zone. The results indicated that quantification of Surfactin has to be performed after the first development (hRF = 44), while Fengycin is quantified after the second development (hRF = 36, hRF range = 20-40). For Iturin A, the results demonstrated that quantification is preferable after the first development (hRF = 19), but also possible after the second development (hRF = 59). LOD and LOQ for Surfactin and Iturin A after the first development, and Fengycin after the second development, were determined to be 16 ng/zone and 47 ng/zone, 13 ng/zone and 39 ng/zone, and 27 ng/zone and 82 ng/zone, respectively. Results further revealed the highly accurate and precise character of the developed method, with good inter- and intraday reproducibility. For the

  10. Using Riemannian geometry to obtain new results on Dikin and Karmarkar methods

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Oliveira, P.; Joao, X.; Piaui, T.

    1994-12-31

    We are motivated by a 1990 Karmarkar paper on Riemannian geometry and Interior Point Methods. In this talk we show three results. (1) The Karmarkar direction can be derived from the Dikin one. This is obtained by constructing a certain Z(x) representation of the null space of the unitary simplex (e, x) = 1; then the projective direction is the image under Z(x) of the affine-scaling one, when it is restricted to that simplex. (2) Second order information on Dikin and Karmarkar methods. We establish computable Hessians for each of the metrics corresponding to both directions, thus permitting the generation of "second order" methods. (3) Dikin and Karmarkar geodesic descent methods. For those directions, we make the theoretical Luenberger geodesic descent method computable, since we are able to give very accurate explicit expressions for the corresponding geodesics. Convergence results are given.

  11. Comparative quantification of human intestinal bacteria based on cPCR and LDR/LCR.

    PubMed

    Tang, Zhou-Rui; Li, Kai; Zhou, Yu-Xun; Xiao, Zhen-Xian; Xiao, Jun-Hua; Huang, Rui; Gu, Guo-Hao

    2012-01-21

    The aim was to establish a multiple detection method based on comparative polymerase chain reaction (cPCR) and ligase detection reaction (LDR)/ligase chain reaction (LCR) to quantify intestinal bacterial components. Comparative quantification of 16S rDNAs from different intestinal bacterial components was used to quantify multiple intestinal bacteria. The 16S rDNAs of different bacteria were amplified simultaneously by cPCR, and LDR/LCR was then examined to carry out the genotyping and quantification. Two beneficial bacteria (Bifidobacterium, Lactobacillus) and three conditionally pathogenic bacteria (Enterococcus, Enterobacterium and Eubacterium) were used in this detection. With cloned standard bacterial 16S rDNAs, standard curves were prepared to validate the quantitative relations between the ratio of the original concentrations of two templates and the ratio of the fluorescence signals of their final ligation products. Internal controls were added to monitor the whole detection workflow. The quantity ratio between each pair of bacteria was tested. cPCR and LDR revealed clear linear correlations with standard DNAs, but cPCR and LCR did not. In the sample test, the distributions of the quantity ratios between each pair of bacterial species were obtained. There were significant differences among these distributions in the total samples, but the distributions of the quantity ratios for each pair of bacteria remained stable among groups divided by age or sex. The detection method in this study can be used to conduct multiple intestinal bacteria genotyping and quantification, and to monitor human intestinal health status as well.

  12. A fully automated system for quantification of background parenchymal enhancement in breast DCE-MRI

    NASA Astrophysics Data System (ADS)

    Ufuk Dalmiş, Mehmet; Gubern-Mérida, Albert; Borelli, Cristina; Vreemann, Suzan; Mann, Ritse M.; Karssemeijer, Nico

    2016-03-01

    Background parenchymal enhancement (BPE) observed in breast dynamic contrast enhanced magnetic resonance imaging (DCE-MRI) has been identified as an important biomarker associated with risk for developing breast cancer. In this study, we present a fully automated framework for quantification of BPE. We initially segmented fibroglandular tissue (FGT) of the breasts using an improved version of an existing method. Subsequently, we computed BPEabs (volume of the enhancing tissue), BPErf (BPEabs divided by FGT volume) and BPErb (BPEabs divided by breast volume), using different relative enhancement threshold values between 1% and 100%. To evaluate and compare the previous and improved FGT segmentation methods, we used 20 breast DCE-MRI scans and we computed Dice similarity coefficient (DSC) values with respect to manual segmentations. For evaluation of the BPE quantification, we used a dataset of 95 breast DCE-MRI scans. Two radiologists, in individual reading sessions, visually analyzed the dataset and categorized each breast into minimal, mild, moderate and marked BPE. To measure the correlation between automated BPE values to the radiologists' assessments, we converted these values into ordinal categories and we used Spearman's rho as a measure of correlation. According to our results, the new segmentation method obtained an average DSC of 0.81 ± 0.09, which was significantly higher (p<0.001) compared to the previous method (0.76 ± 0.10). The highest correlation values between automated BPE categories and radiologists' assessments were obtained with the BPErf measurement (r=0.55, r=0.49, p<0.001 for both), while the correlation between the scores given by the two radiologists was 0.82 (p<0.001). The presented framework can be used to systematically investigate the correlation between BPE and risk in large screening cohorts.
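
    The three BPE measures and the agreement analysis described above reduce to simple array operations once the masks exist. A minimal sketch follows; the 10% enhancement threshold, the toy volumes, and the hypothetical category scores are assumptions for illustration only.

    ```python
    import numpy as np
    from scipy.stats import spearmanr

    def bpe_measures(pre, post, fgt_mask, breast_mask, threshold=0.10):
        """BPEabs, BPErf and BPErb from pre-/post-contrast volumes;
        threshold is the relative-enhancement cut-off, voxel volume taken as 1."""
        rel_enh = (post - pre) / np.maximum(pre, 1e-6)
        enhancing = (rel_enh > threshold) & fgt_mask
        bpe_abs = int(enhancing.sum())
        bpe_rf = bpe_abs / max(int(fgt_mask.sum()), 1)
        bpe_rb = bpe_abs / max(int(breast_mask.sum()), 1)
        return bpe_abs, bpe_rf, bpe_rb

    # Toy volumes standing in for one DCE-MRI breast scan.
    rng = np.random.default_rng(3)
    pre = rng.uniform(50, 100, (32, 64, 64))
    post = pre * rng.uniform(1.0, 1.3, pre.shape)
    breast = np.ones(pre.shape, dtype=bool)
    fgt = rng.random(pre.shape) > 0.6
    print(bpe_measures(pre, post, fgt, breast))

    # Agreement between automated and reader BPE categories (hypothetical scores).
    auto_category = [1, 2, 2, 3, 4, 1, 3, 2]
    reader_category = [1, 2, 3, 3, 4, 1, 2, 2]
    rho, p = spearmanr(auto_category, reader_category)
    print(f"Spearman rho = {rho:.2f} (p = {p:.3f})")
    ```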

  13. Automated solid-phase extraction coupled online with HPLC-FLD for the quantification of zearalenone in edible oil.

    PubMed

    Drzymala, Sarah S; Weiz, Stefan; Heinze, Julia; Marten, Silvia; Prinz, Carsten; Zimathies, Annett; Garbe, Leif-Alexander; Koch, Matthias

    2015-05-01

    Established maximum levels for the mycotoxin zearalenone (ZEN) in edible oil require monitoring by reliable analytical methods. Therefore, an automated SPE-HPLC online system based on dynamic covalent hydrazine chemistry has been developed. The SPE step comprises a reversible hydrazone formation by ZEN and a hydrazine moiety covalently attached to a solid phase. Seven hydrazine materials with different properties regarding the resin backbone, pore size, particle size, specific surface area, and loading have been evaluated. As a result, a hydrazine-functionalized silica gel was chosen. The final automated online method was validated and applied to the analysis of three maize germ oil samples including a provisionally certified reference material. Important performance criteria for the recovery (70-120 %) and precision (RSDr <25 %) as set by the Commission Regulation EC 401/2006 were fulfilled: The mean recovery was 78 % and RSDr did not exceed 8 %. The results of the SPE-HPLC online method were further compared to results obtained by liquid-liquid extraction with stable isotope dilution analysis LC-MS/MS and found to be in good agreement. The developed SPE-HPLC online system with fluorescence detection allows a reliable, accurate, and sensitive quantification (limit of quantification, 30 μg/kg) of ZEN in edible oils while significantly reducing the workload. To our knowledge, this is the first report on an automated SPE-HPLC method based on a covalent SPE approach.

  14. Simultaneous quantification and semi-quantification of ginkgolic acids and their metabolites in rat plasma by UHPLC-LTQ-Orbitrap-MS and its application to pharmacokinetics study.

    PubMed

    Qian, Yiyun; Zhu, Zhenhua; Duan, Jin-Ao; Guo, Sheng; Shang, Erxin; Tao, Jinhua; Su, Shulan; Guo, Jianming

    2017-01-15

    A highly sensitive method using ultra-high-pressure liquid chromatography coupled with linear ion trap-Orbitrap tandem mass spectrometry (UHPLC-LTQ-Orbitrap-MS) has been developed and validated for the simultaneous identification and quantification of ginkgolic acids and semi-quantification of their metabolites in rat plasma. For the five selected ginkgolic acids, the method showed good linearity (r>0.9991), good intra- and inter-day precision (RSD<15%), and good accuracy (RE, from -10.33% to 4.92%). Extraction recoveries, matrix effects and stabilities for rat plasma samples were within the required limits. The validated method was successfully applied to investigate the pharmacokinetics of the five ginkgolic acids in rat plasma after oral administration in three dosage groups (900 mg/kg, 300 mg/kg and 100 mg/kg). Meanwhile, six metabolites of GA (15:1) and GA (17:1) were identified by comparison of MS data with reported values. The results of validation in terms of linear ranges, precisions and stabilities were established for semi-quantification of the metabolites. The curves of relative changes of these metabolites during the metabolic process were constructed by plotting the peak area ratios of metabolites to salicylic acid (internal standard, IS). Double peaks were observed in all three dose groups. Different types of metabolites and different dosages both resulted in different Tmax values. Copyright © 2016 Elsevier B.V. All rights reserved.

  15. The effect of applied transducer force on acoustic radiation force impulse quantification within the left lobe of the liver.

    PubMed

    Porra, Luke; Swan, Hans; Ho, Chien

    2015-08-01

    Introduction: Acoustic Radiation Force Impulse (ARFI) Quantification measures shear wave velocities (SWVs) within the liver. It is a reliable method for predicting the severity of liver fibrosis and has the potential to assess fibrosis in any part of the liver, but previous research has found ARFI quantification in the right lobe more accurate than in the left lobe. A lack of standardised applied transducer force when performing ARFI quantification in the left lobe of the liver may account for some of this inaccuracy. The research hypothesis of this present study predicted that an increase in applied transducer force would result in an increase in SWVs measured. Methods: ARFI quantification within the left lobe of the liver was performed within a group of healthy volunteers (n = 28). During each examination, each participant was subjected to ARFI quantification at six different levels of transducer force applied to the epigastric abdominal wall. Results: A repeated measures ANOVA test showed that ARFI quantification was significantly affected by applied transducer force (p = 0.002). Significant pairwise comparisons using Bonferroni correction for multiple comparisons showed that with an increase in applied transducer force, there was a decrease in SWVs. Conclusion: Applied transducer force has a significant effect on SWVs within the left lobe of the liver and it may explain some of the less accurate and less reliable results in previous studies where transducer force was not taken into consideration. Future studies in the left lobe of the liver should take this into account and control for applied transducer force.

  16. Interpretation of biological and mechanical variations between the Lowry versus Bradford method for protein quantification.

    PubMed

    Lu, Tzong-Shi; Yiao, Szu-Yu; Lim, Kenneth; Jensen, Roderick V; Hsiao, Li-Li

    2010-07-01

    The identification of differences in protein expression resulting from methodical variations is an essential component of the interpretation of true, biologically significant results. We used the Lowry and Bradford methods, the two most commonly used methods for protein quantification, to assess whether differential protein expressions are a result of true biological or methodical variations. Materials and Methods: Differential protein expression patterns were assessed by western blot following protein quantification by the Lowry and Bradford methods. We observed significant variations in protein concentrations following assessment with the Lowry versus Bradford methods, using identical samples. Greater variations in protein concentration readings were observed over time and in samples with higher concentrations with the Bradford method. Identical samples quantified using both methods yielded significantly different expression patterns on western blot. We show for the first time that methodical variations observed in these protein assay techniques can potentially translate into differential protein expression patterns that can be falsely taken to be biologically significant. Our study therefore highlights the pivotal need to carefully consider methodical approaches to protein quantification in techniques that report quantitative differences.

  17. Uncertainty Quantification given Discontinuous Climate Model Response and a Limited Number of Model Runs

    NASA Astrophysics Data System (ADS)

    Sargsyan, K.; Safta, C.; Debusschere, B.; Najm, H.

    2010-12-01

    Uncertainty quantification in complex climate models is challenged by the sparsity of available climate model predictions due to the high computational cost of model runs. Another feature that prevents classical uncertainty analysis from being readily applicable is bifurcative behavior in climate model response with respect to certain input parameters. A typical example is the Atlantic Meridional Overturning Circulation. The predicted maximum overturning stream function exhibits discontinuity across a curve in the space of two uncertain parameters, namely climate sensitivity and CO2 forcing. We outline a methodology for uncertainty quantification given discontinuous model response and a limited number of model runs. Our approach is two-fold. First we detect the discontinuity with Bayesian inference, thus obtaining a probabilistic representation of the discontinuity curve shape and location for arbitrarily distributed input parameter values. Then, we construct spectral representations of uncertainty, using Polynomial Chaos (PC) expansions on either side of the discontinuity curve, leading to an averaged-PC representation of the forward model that allows efficient uncertainty quantification. The approach is enabled by a Rosenblatt transformation that maps each side of the discontinuity to regular domains where desirable orthogonality properties for the spectral bases hold. We obtain PC modes by either orthogonal projection or Bayesian inference, and argue for a hybrid approach that targets a balance between the accuracy provided by the orthogonal projection and the flexibility provided by the Bayesian inference - where the latter allows obtaining reasonable expansions without extra forward model runs. The model output, and its associated uncertainty at specific design points, are then computed by taking an ensemble average over PC expansions corresponding to possible realizations of the discontinuity curve. The methodology is tested on synthetic examples of

  18. Near-Infrared Scintillation of Liquid Argon: Recent Results Obtained with the NIR Facility at Fermilab

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Escobar, C. O.; Rubinov, P.; Tilly, E.

    After a short review of previous attempts to observe and measure the near-infrared scintillation in liquid argon, we present new results obtained with NIR, a dedicated cryostat at the Fermilab Proton Assembly Building (PAB). The new results give confidence that the near-infrared light can be used as the much-needed light signal in large liquid argon time projection chambers.

  19. Quantification of terpene trilactones in Ginkgo biloba with a 1H NMR method.

    PubMed

    Liang, Tingfu; Miyakawa, Takuya; Yang, Jinwei; Ishikawa, Tsutomu; Tanokura, Masaru

    2018-06-01

    Ginkgo biloba L. has been used as a herbal medicine in the traditional treatment of insufficient blood flow, memory deficits, and cerebral insufficiency. The terpene trilactone components, the bioactive agents of Ginkgo biloba L., have also been reported to exhibit useful functionality such as anti-inflammatory and neuroprotective effects. Therefore, in the present research, we quantitatively analyzed the terpene trilactone components in Ginkgo biloba leaf extract with quantitative ¹H NMR (qNMR) and obtained results almost identical to data reported using HPLC. Application of the qNMR method to the analysis of the terpene trilactone contents in commercial Ginkgo extract products, such as soft gel capsules and tablets, produced the same levels as noted on the package labels. Thus, qNMR is an alternative method for quantification of the terpene trilactone components in commercial Ginkgo extract products.

  20. Novel Methods of Automated Quantification of Gap Junction Distribution and Interstitial Collagen Quantity from Animal and Human Atrial Tissue Sections

    PubMed Central

    Yan, Jiajie; Thomson, Justin K.; Wu, Xiaomin; Zhao, Weiwei; Pollard, Andrew E.; Ai, Xun

    2014-01-01

    Background: Gap junctions (GJs) are the principal membrane structures that conduct electrical impulses between cardiac myocytes while interstitial collagen (IC) can physically separate adjacent myocytes and limit cell-cell communication. Emerging evidence suggests that both GJ and interstitial structural remodeling are linked to cardiac arrhythmia development. However, automated quantitative identification of GJ distribution and IC deposition from microscopic histological images has proven to be challenging. Such quantification is required to improve the understanding of functional consequences of GJ and structural remodeling in cardiac electrophysiology studies. Methods and Results: Separate approaches were employed for GJ and IC identification in images from histologically stained tissue sections obtained from rabbit and human atria. For GJ identification, we recognized N-Cadherin (N-Cad) as part of the gap junction connexin 43 (Cx43) molecular complex. Because N-Cad anchors Cx43 on intercalated discs (ID) to form functional GJ channels on cell membranes, we computationally dilated N-Cad pixels to create N-Cad units that covered all ID-associated Cx43 pixels on Cx43/N-Cad double immunostained confocal images. This approach allowed segmentation between ID-associated and non-ID-associated Cx43. Additionally, use of N-Cad as a unique internal reference with Z-stack layer-by-layer confocal images potentially limits sample processing related artifacts in Cx43 quantification. For IC quantification, color map thresholding of Masson's trichrome blue-stained sections allowed straightforward and automated segmentation of collagen from non-collagen pixels. Our results strongly demonstrate that the two novel image-processing approaches can minimize potential overestimation or underestimation of gap junction and structural remodeling in healthy and pathological hearts. The results of using the two novel methods will significantly improve our understanding of the molecular and
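
    In the spirit of the colour-thresholding step for interstitial collagen, the sketch below classifies blue-dominant pixels in an RGB image of a trichrome-stained section; the threshold values and the synthetic test image are assumptions for illustration, not the published settings.

    ```python
    import numpy as np

    def collagen_fraction(rgb_image: np.ndarray) -> float:
        """Fraction of tissue pixels classified as collagen by a simple
        blue-dominance colour threshold (illustrative values)."""
        img = rgb_image.astype(float)
        r, g, b = img[..., 0], img[..., 1], img[..., 2]
        tissue = img.sum(axis=-1) < 700                # exclude bright background
        collagen = tissue & (b > 120) & (b > 1.2 * r)  # blue-dominant pixels
        return collagen.sum() / max(int(tissue.sum()), 1)

    # Synthetic test image: left half collagen-like blue, right half reddish tissue.
    demo = np.zeros((100, 100, 3), dtype=np.uint8)
    demo[:, :50] = (60, 80, 180)
    demo[:, 50:] = (190, 70, 90)
    print(f"Collagen fraction: {collagen_fraction(demo):.2f}")
    ```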

  1. Image-Based Quantification of Plant Immunity and Disease.

    PubMed

    Laflamme, Bradley; Middleton, Maggie; Lo, Timothy; Desveaux, Darrell; Guttman, David S

    2016-12-01

    Measuring the extent and severity of disease is a critical component of plant pathology research and crop breeding. Unfortunately, existing visual scoring systems are qualitative, subjective, and the results are difficult to transfer between research groups, while existing quantitative methods can be quite laborious. Here, we present plant immunity and disease image-based quantification (PIDIQ), a quantitative, semi-automated system to rapidly and objectively measure disease symptoms in a biologically relevant context. PIDIQ applies an ImageJ-based macro to plant photos in order to distinguish healthy tissue from tissue that has yellowed due to disease. It can process a directory of images in an automated manner and report the relative ratios of healthy to diseased leaf area, thereby providing a quantitative measure of plant health that can be statistically compared with appropriate controls. We used the Arabidopsis thaliana-Pseudomonas syringae model system to show that PIDIQ is able to identify both enhanced plant health associated with effector-triggered immunity as well as elevated disease symptoms associated with effector-triggered susceptibility. Finally, we show that the quantitative results provided by PIDIQ correspond to those obtained via traditional in planta pathogen growth assays. PIDIQ provides a simple and effective means to nondestructively quantify disease from whole plants and we believe it will be equally effective for monitoring disease on excised leaves and stems.
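
    A minimal sketch of the kind of healthy-versus-yellowed area ratio PIDIQ reports, implemented here as a plain RGB threshold on a synthetic image; the colour cut-offs are illustrative assumptions and do not reproduce the published ImageJ macro settings.

    ```python
    import numpy as np

    def healthy_to_diseased_ratio(rgb_image: np.ndarray) -> float:
        """Ratio of healthy (green) to diseased (yellowed) leaf area in an RGB photo."""
        img = rgb_image.astype(float)
        r, g, b = img[..., 0], img[..., 1], img[..., 2]
        plant = (g > 60) & (b < g)           # crude plant mask
        healthy = plant & (g > 1.2 * r)      # green clearly dominates red
        diseased = plant & ~healthy          # yellowed: red comparable to green
        return healthy.sum() / max(int(diseased.sum()), 1)

    # Synthetic leaf: 70% healthy green pixels, 30% chlorotic yellow pixels.
    leaf = np.zeros((100, 100, 3), dtype=np.uint8)
    leaf[:70] = (40, 160, 40)
    leaf[70:] = (150, 150, 40)
    print(f"Healthy/diseased area ratio: {healthy_to_diseased_ratio(leaf):.2f}")
    ```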

  2. Quantification of micro stickies

    Treesearch

    Mahendra Doshi; Jeffrey Dyer; Salman Aziz; Kristine Jackson; Said M. Abubakr

    1997-01-01

    The objective of this project was to compare the different methods for the quantification of micro stickies. The hydrophobic materials investigated in this project for the collection of micro stickies were Microfoam* (polypropylene packing material), low density polyethylene film (LDPE), high density polyethylene (HDPE; a flat piece from a square plastic bottle), paper...

  3. Quantification and characterization of leakage errors

    NASA Astrophysics Data System (ADS)

    Wood, Christopher J.; Gambetta, Jay M.

    2018-03-01

    We present a general framework for the quantification and characterization of leakage errors that result when a quantum system is encoded in the subspace of a larger system. To do this we introduce metrics for quantifying the coherent and incoherent properties of the resulting errors and we illustrate this framework with several examples relevant to superconducting qubits. In particular, we propose two quantities, the leakage and seepage rates, which together with average gate fidelity allow for characterizing the average performance of quantum gates in the presence of leakage and show how the randomized benchmarking protocol can be modified to enable the robust estimation of all three quantities for a Clifford gate set.

  4. Quantification of extracellular matrix expansion by CMR in infiltrative heart disease.

    PubMed

    Mongeon, François-Pierre; Jerosch-Herold, Michael; Coelho-Filho, Otávio Rizzi; Blankstein, Ron; Falk, Rodney H; Kwong, Raymond Y

    2012-09-01

    The aim of this study was to perform direct quantification of myocardial extracellular volume fraction (ECF) with T1-weighted cardiac magnetic resonance (CMR) imaging in patients suspected to have infiltrative heart disease. Infiltrative heart disease refers to accumulation of abnormal substances within the myocardium. Qualitative assessment of late gadolinium enhancement (LGE) remains the most commonly used method for CMR evaluation of patients suspected with myocardial infiltration. This technique is widely available and can be performed in a reproducible and standardized manner. However, the degree of extracellular matrix expansion due to myocardial infiltration in the intercellular space has, to date, not been amenable to noninvasive quantification with LGE. We performed 3-T CMR in 38 patients (mean age 68 ± 15 years) who were referred for assessment of infiltrative heart disease and also in 9 healthy volunteers as control subjects. The T1 quantification by Look-Locker gradient-echo before and after contrast determined segmental myocardial partition coefficients. The ECF was obtained by referencing the tissue partition coefficient for gadolinium to the plasma volume fraction in blood, derived from serum hematocrit. Cine CMR and LGE imaging in matching locations were also performed. Seventeen patients (45%) had cardiac amyloidosis (CA) (biopsy-confirmed or clinically highly probable), 20 (53%) had a non-amyloid cardiomyopathy, and 1 had lysosomal storage disease. Median global ECF was substantially higher in CA patients (0.49) compared with non-amyloid cardiomyopathy patients (0.33, p < 0.0001) and volunteers (0.24, p = 0.0001). The ECF strongly correlated with visually assessed segmental LGE (r = 0.80, p < 0.0001) and LV mass index (r = 0.69, p < 0.0001), reflecting severity of myocardial infiltration. In patients with CA, ECF was highest in segments with LGE, although it remained elevated in segments without qualitative LGE. The CMR ECF quantification
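
    The ECF calculation described above amounts to scaling the gadolinium partition coefficient by the plasma volume fraction of blood. A minimal sketch with illustrative numbers (not patient data), assuming the contrast-induced changes in relaxation rate R1 = 1/T1 have already been measured for myocardium and blood:

    ```python
    def extracellular_volume_fraction(d_r1_myocardium: float,
                                      d_r1_blood: float,
                                      hematocrit: float) -> float:
        """ECF = lambda * (1 - Hct), where lambda is the myocardium/blood ratio
        of contrast-induced changes in relaxation rate R1 = 1/T1."""
        lam = d_r1_myocardium / d_r1_blood
        return lam * (1.0 - hematocrit)

    # Example: a segment with a high partition coefficient, as seen in amyloidosis.
    print(extracellular_volume_fraction(d_r1_myocardium=2.1,
                                        d_r1_blood=3.0,
                                        hematocrit=0.40))   # -> 0.42
    ```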

  5. Results of Investigative Tests of Gas Turbine Engine Compressor Blades Obtained by Electrochemical Machining

    NASA Astrophysics Data System (ADS)

    Kozhina, T. D.; Kurochkin, A. V.

    2016-04-01

    The paper highlights results of investigative tests of GTE compressor Ti-alloy blades produced by electrochemical machining (ECM) with oscillating tool-electrodes, carried out in order to define the optimal ECM process parameters that attain the blade quality specified in the design documentation while providing maximal performance. New technological methods are suggested based on the test results; in particular, the application of vibrating tool-electrodes and the use of locating elements made of high-strength materials significantly extend the capabilities of this method.

  6. 18O-labeled proteome reference as global internal standards for targeted quantification by selected reaction monitoring-mass spectrometry.

    PubMed

    Kim, Jong-Seo; Fillmore, Thomas L; Liu, Tao; Robinson, Errol; Hossain, Mahmud; Champion, Boyd L; Moore, Ronald J; Camp, David G; Smith, Richard D; Qian, Wei-Jun

    2011-12-01

    Selected reaction monitoring (SRM)-MS is an emerging technology for high throughput targeted protein quantification and verification in biomarker discovery studies; however, the cost associated with the application of stable isotope-labeled synthetic peptides as internal standards can be prohibitive for screening a large number of candidate proteins as often required in the preverification phase of discovery studies. Herein we present a proof of concept study using an (18)O-labeled proteome reference as global internal standards (GIS) for SRM-based relative quantification. The (18)O-labeled proteome reference (or GIS) can be readily prepared and contains a heavy isotope ((18)O)-labeled internal standard for every possible tryptic peptide. Our results showed that the percentage of heavy isotope ((18)O) incorporation applying an improved protocol was >99.5% for most peptides investigated. The accuracy, reproducibility, and linear dynamic range of quantification were further assessed based on known ratios of standard proteins spiked into the labeled mouse plasma reference. Reliable quantification was observed with high reproducibility (i.e., coefficient of variation <10%) for analyte concentrations that were set at 100-fold higher or lower than those of the GIS, based on the light ((16)O)/heavy ((18)O) peak area ratios. The utility of the (18)O-labeled GIS was further illustrated by accurate relative quantification of 45 major human plasma proteins. Moreover, quantification of the concentrations of C-reactive protein and prostate-specific antigen was illustrated by coupling the GIS with standard additions of purified protein standards. Collectively, our results demonstrated that the use of an (18)O-labeled proteome reference as GIS provides a convenient, low cost, and effective strategy for relative quantification of a large number of candidate proteins in biological or clinical samples using SRM.
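
    The relative-quantification arithmetic behind the GIS approach is a ratio of ratios: because both samples are mixed with the same 18O-labeled reference, dividing their light/heavy peak-area ratios cancels the GIS amount. A minimal sketch with hypothetical SRM peak areas:

    ```python
    def relative_abundance(light_sample: float, heavy_gis_in_sample: float,
                           light_reference: float, heavy_gis_in_reference: float) -> float:
        """Sample/reference abundance ratio for one peptide, using the common
        18O-labeled global internal standard to normalize both measurements."""
        ratio_sample = light_sample / heavy_gis_in_sample
        ratio_reference = light_reference / heavy_gis_in_reference
        return ratio_sample / ratio_reference

    # Hypothetical peak areas for one SRM transition of one peptide.
    print(relative_abundance(light_sample=4.2e6, heavy_gis_in_sample=2.0e6,
                             light_reference=1.1e6, heavy_gis_in_reference=2.1e6))
    ```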

  7. Lesion Quantification in Dual-Modality Mammotomography

    NASA Astrophysics Data System (ADS)

    Li, Heng; Zheng, Yibin; More, Mitali J.; Goodale, Patricia J.; Williams, Mark B.

    2007-02-01

    This paper describes a novel x-ray/SPECT dual modality breast imaging system that provides 3D structural and functional information. While only a limited number of views on one side of the breast can be acquired due to mechanical and time constraints, we developed a technique to compensate for the limited angle artifact in reconstruction images and accurately estimate both the lesion size and radioactivity concentration. Various angular sampling strategies were evaluated using both simulated and experimental data. It was demonstrated that quantification of lesion size to an accuracy of 10% and quantification of radioactivity to an accuracy of 20% are feasible from limited-angle data acquired with clinically practical dosage and acquisition time

  8. Analysis of laser fluorosensor systems for remote algae detection and quantification

    NASA Technical Reports Server (NTRS)

    Browell, E. V.

    1977-01-01

    The development and performance of single- and multiple-wavelength laser fluorosensor systems for use in the remote detection and quantification of algae are discussed. The appropriate equation for the fluorescence power received by a laser fluorosensor system is derived in detail. Experimental development of a single wavelength system and a four wavelength system, which selectively excites the algae contained in the four primary algal color groups, is reviewed, and test results are presented. A comprehensive error analysis is reported which evaluates the uncertainty in the remote determination of the chlorophyll a concentration contained in algae by single- and multiple-wavelength laser fluorosensor systems. Results of the error analysis indicate that the remote quantification of chlorophyll a by a laser fluorosensor system requires optimum excitation wavelength(s), remote measurement of marine attenuation coefficients, and supplemental instrumentation to reduce uncertainties in the algal fluorescence cross sections.

  9. Magnetic immunoassay coupled with inductively coupled plasma mass spectrometry for simultaneous quantification of alpha-fetoprotein and carcinoembryonic antigen in human serum

    NASA Astrophysics Data System (ADS)

    Zhang, Xing; Chen, Beibei; He, Man; Zhang, Yiwen; Xiao, Guangyang; Hu, Bin

    2015-04-01

    The absolute quantification of glycoproteins in complex biological samples is a challenge and of great significance. Herein, 4-mercaptophenylboronic acid functionalized magnetic beads were prepared to selectively capture glycoproteins, while antibody conjugated gold and silver nanoparticles were synthesized as element tags to label two different glycoproteins. Based on that, a new approach of magnetic immunoassay-inductively coupled plasma mass spectrometry (ICP-MS) was established for simultaneous quantitative analysis of glycoproteins. Taking the biomarkers alpha-fetoprotein (AFP) and carcinoembryonic antigen (CEA) as two model glycoproteins, experimental parameters involved in the immunoassay procedure were carefully optimized and the analytical performance of the proposed method was evaluated. The limits of detection (LODs) for AFP and CEA were 0.086 μg L⁻¹ and 0.054 μg L⁻¹, with relative standard deviations (RSDs, n = 7, c = 5 μg L⁻¹) of 6.5% and 6.2% for AFP and CEA, respectively. The linear range for both AFP and CEA was 0.2-50 μg L⁻¹. To validate the applicability of the proposed method, human serum samples were analyzed, and the obtained results were in good agreement with those obtained by the clinical chemiluminescence immunoassay. The developed method exhibited good selectivity and sensitivity for the simultaneous determination of AFP and CEA, and extended the applicability of metal nanoparticle tags based on ICP-MS methodology in multiple glycoprotein quantifications.

  10. Validation of a DIXON-based fat quantification technique for the measurement of visceral fat using a CT-based reference standard.

    PubMed

    Heckman, Katherine M; Otemuyiwa, Bamidele; Chenevert, Thomas L; Malyarenko, Dariya; Derstine, Brian A; Wang, Stewart C; Davenport, Matthew S

    2018-06-27

    The purpose of the study is to determine whether a novel semi-automated DIXON-based fat quantification algorithm can reliably quantify visceral fat using a CT-based reference standard. This was an IRB-approved retrospective cohort study of 27 subjects who underwent abdominopelvic CT within 7 days of proton density fat fraction (PDFF) mapping on a 1.5T MRI. Cross-sectional visceral fat area per slice (cm²) was measured in blinded fashion in each modality at intervertebral disc levels from T12 to L4. CT estimates were obtained using a previously published semi-automated computational image processing system that sums pixels with attenuation -205 to -51 HU. MR estimates were obtained using two novel semi-automated DIXON-based fat quantification algorithms that measure visceral fat area by spatially regularizing non-uniform fat-only signal intensity or de-speckling PDFF 2D images and summing pixels with PDFF ≥ 50%. Pearson's correlations and Bland-Altman analyses were performed. Visceral fat area per slice ranged from 9.2 to 429.8 cm² for MR and from 1.6 to 405.5 cm² for CT. There was a strong correlation between CT and MR methods in measured visceral fat area across all studied vertebral body levels (r = 0.97; n = 101 observations); the least (r = 0.93) correlation was at T12. Bland-Altman analysis revealed a bias of 31.7 cm² (95% CI -27.1 to 90.4 cm²), indicating modestly higher visceral fat assessed by MR. MR- and CT-based visceral fat quantification are highly correlated and have good cross-modality reliability, indicating that visceral fat quantification by either method can yield a stable and reliable biomarker.
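
    The agreement statistics quoted above (Pearson's r, Bland-Altman bias and limits of agreement) can be computed directly from the paired per-slice areas. The sketch below uses simulated pairs, not the study's measurements.

    ```python
    import numpy as np

    def bland_altman(mr: np.ndarray, ct: np.ndarray):
        """Bias, 95% limits of agreement and Pearson's r for paired measurements."""
        diff = mr - ct
        bias = diff.mean()
        loa = 1.96 * diff.std(ddof=1)
        r = np.corrcoef(mr, ct)[0, 1]
        return bias, bias - loa, bias + loa, r

    # Simulated paired visceral fat areas (cm^2) standing in for the real observations.
    rng = np.random.default_rng(4)
    ct_area = rng.uniform(10, 400, 50)
    mr_area = ct_area + 30 + rng.normal(0, 25, 50)

    bias, lo, hi, r = bland_altman(mr_area, ct_area)
    print(f"bias = {bias:.1f} cm^2, LoA = [{lo:.1f}, {hi:.1f}] cm^2, r = {r:.3f}")
    ```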

  11. An efficient assisted history matching and uncertainty quantification workflow using Gaussian processes proxy models and variogram based sensitivity analysis: GP-VARS

    NASA Astrophysics Data System (ADS)

    Rana, Sachin; Ertekin, Turgay; King, Gregory R.

    2018-05-01

    Reservoir history matching is frequently viewed as an optimization problem which involves minimizing the misfit between simulated and observed data. Many gradient- and evolutionary-strategy-based optimization algorithms have been proposed to solve this problem, which typically require a large number of numerical simulations to find feasible solutions. Therefore, a new methodology referred to as GP-VARS is proposed in this study, which uses forward and inverse Gaussian process (GP) based proxy models combined with a novel application of variogram analysis of response surface (VARS) based sensitivity analysis to efficiently solve high dimensional history matching problems. An empirical Bayes approach is proposed to optimally train GP proxy models for any given data. The history matching solutions are found via Bayesian optimization (BO) on forward GP models and via predictions of the inverse GP model in an iterative manner. An uncertainty quantification method using MCMC sampling in conjunction with the GP model is also presented to obtain a probabilistic estimate of reservoir properties and estimated ultimate recovery (EUR). An application of the proposed GP-VARS methodology to the PUNQ-S3 reservoir is presented, in which it is shown that GP-VARS provides history match solutions with approximately four times fewer numerical simulations than the differential evolution (DE) algorithm. Furthermore, a comparison of uncertainty quantification results obtained by GP-VARS, EnKF and other previously published methods shows that the P50 estimate of oil EUR obtained by GP-VARS is in close agreement with the true values for the PUNQ-S3 reservoir.
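
    The core of the forward-proxy idea is to fit a cheap surrogate to misfit values from a limited set of simulations and then search the surrogate instead of the simulator. The sketch below substitutes a two-parameter toy misfit function for the reservoir simulator and uses a scikit-learn Gaussian process with a plain grid search rather than the paper's Bayesian optimization and VARS steps; every function name and parameter here is an illustrative assumption.

    ```python
    import numpy as np
    from sklearn.gaussian_process import GaussianProcessRegressor
    from sklearn.gaussian_process.kernels import Matern

    def misfit(theta):
        """Toy stand-in for the reservoir simulator's data misfit."""
        k, phi = theta
        return (k - 2.5) ** 2 + 10.0 * (phi - 0.2) ** 2

    rng = np.random.default_rng(5)
    X = rng.uniform([0.5, 0.05], [5.0, 0.35], size=(30, 2))   # simulated designs
    y = np.array([misfit(t) for t in X])

    # Forward GP proxy of the misfit surface.
    gp = GaussianProcessRegressor(kernel=Matern(nu=2.5), normalize_y=True)
    gp.fit(X, y)

    # Cheap proxy-based search for a history-match candidate on a dense grid.
    grid = np.column_stack([g.ravel() for g in np.meshgrid(
        np.linspace(0.5, 5.0, 100), np.linspace(0.05, 0.35, 100))])
    best = grid[np.argmin(gp.predict(grid))]
    print("proxy-suggested history match:", best)
    ```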

  12. Photoacoustic bio-quantification of graphene based nanomaterials at a single cell level (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Nedosekin, Dmitry A.; Nolan, Jacqueline; Biris, Alexandru S.; Zharov, Vladimir P.

    2017-03-01

    The Arkansas Nanomedicine Center at the University of Arkansas for Medical Sciences, in collaboration with other Arkansas universities and the FDA's National Center for Toxicological Research in Jefferson, AR, is developing novel techniques for rapid quantification of graphene-based nanomaterials (GBNs) in various biological samples. All-carbon GBNs have a wide range of potential applications in industry, agriculture, food processing and medicine; however, quantification of GBNs is difficult in carbon-rich biological tissues. The accurate quantification of GBNs is essential for research on material toxicity and the development of GBN-based drug delivery platforms. We have developed microscopy and cytometry platforms for detection and quantification of GBNs in single cells, tissue and blood samples using the photoacoustic (PA) contrast of GBNs. We demonstrated PA quantification of individual graphene uptake by single cells. High-resolution PA microscopy provided mapping of GBN distribution within live cells to establish correlation with intracellular toxic phenomena using apoptotic and necrotic assays. This new methodology and the corresponding technical platform provide insight into possible toxicological risks of GBNs at the single-cell level. In addition, in vivo PA image flow cytometry demonstrated the capability to monitor GBN pharmacokinetics in a mouse model and to map the resulting biodistribution of GBNs in mouse tissues. The integrated PA platform provided an unprecedented sensitivity toward GBNs and allowed us to enhance conventional toxicology research by providing a direct correlation between uptake of GBNs at a single cell level and cell viability status.

  13. Quantitative fluorescence spectroscopy and flow cytometry analyses of cell-penetrating peptides internalization pathways: optimization, pitfalls, comparison with mass spectrometry quantification

    NASA Astrophysics Data System (ADS)

    Illien, Françoise; Rodriguez, Nicolas; Amoura, Mehdi; Joliot, Alain; Pallerla, Manjula; Cribier, Sophie; Burlina, Fabienne; Sagan, Sandrine

    2016-11-01

    The mechanism of cell-penetrating peptide (CPP) entry into cells is unclear, preventing the development of more efficient vectors for biotechnological or therapeutic purposes. Here, we developed a protocol relying on fluorometry to distinguish endocytosis from direct membrane translocation, using Penetratin, TAT and R9. The quantities of internalized CPPs measured by fluorometry in cell lysates converge with those obtained by our previously reported mass spectrometry quantification method. By contrast, flow cytometry quantification faces several limitations due to fluorescence quenching processes that depend on the cell line and occur at peptide/cell ratios >6 × 10⁸ for CF-Penetratin. The analysis of cellular internalization of a doubly labeled fluorescent and biotinylated Penetratin analogue by the two independent techniques, fluorometry and mass spectrometry, gave consistent results at the quantitative and qualitative levels. Both techniques revealed the use of two alternative translocation and endocytosis pathways, whose relative efficacy depends on cell-surface sugars and peptide concentration. We confirmed that Penetratin translocates at low concentration and uses endocytosis at high μM concentrations. We further demonstrate that the hydrophobic/hydrophilic nature of the N-terminal extremity impacts the internalization efficiency of CPPs. We expect these results and the associated protocols to help unravel the translocation pathway to the cytosol of cells.

  14. Quantification of protein carbonylation.

    PubMed

    Wehr, Nancy B; Levine, Rodney L

    2013-01-01

    Protein carbonylation is the most commonly used measure of oxidative modification of proteins. It is most often measured spectrophotometrically or immunochemically by derivatizing proteins with the classical carbonyl reagent 2,4-dinitrophenylhydrazine (DNPH). We present protocols for the derivatization and quantification of protein carbonylation with these two methods, including a newly described dot blot with greatly increased sensitivity.

  15. Uncertainty quantification for PZT bimorph actuators

    NASA Astrophysics Data System (ADS)

    Bravo, Nikolas; Smith, Ralph C.; Crews, John

    2018-03-01

    In this paper, we discuss the development of a high-fidelity model for a PZT bimorph actuator used for micro-air vehicles, including the Robobee. We developed the model using the homogenized energy model (HEM) framework, which quantifies the nonlinear, hysteretic, and rate-dependent behavior inherent to PZT in dynamic operating regimes. We then discuss an inverse problem on the model and include local and global sensitivity analyses of the parameters in the high-fidelity model. Finally, we discuss the results of Bayesian inference and uncertainty quantification on the HEM.

  16. Waste generated in high-rise buildings construction: a quantification model based on statistical multiple regression.

    PubMed

    Parisi Kern, Andrea; Ferreira Dias, Michele; Piva Kulakowski, Marlova; Paulo Gomes, Luciana

    2015-05-01

    Reducing construction waste is becoming a key environmental issue in the construction industry. The quantification of waste generation rates in the construction sector is an invaluable management tool in supporting mitigation actions. However, the quantification of waste can be a difficult process because of the specific characteristics and the wide range of materials used in different construction projects. Large variations are observed in the methods used to predict the amount of waste generated because of the range of variables involved in construction processes and the different contexts in which these methods are employed. This paper proposes a statistical model to determine the amount of waste generated in the construction of high-rise buildings by assessing the influence of the design process and the production system, often mentioned as the major culprits behind the generation of waste in construction. Multiple regression was used to conduct a case study based on multiple sources of data from eighteen residential buildings. The resulting statistical model relates the dependent variable (the amount of waste generated) to independent variables associated with the design and the production system used. The best regression model obtained from the sample data resulted in an adjusted R² value of 0.694, meaning that it explains approximately 69% of the variation in waste generation for similar construction projects. Most independent variables showed a low determination coefficient when assessed in isolation, which emphasizes the importance of assessing their joint influence on the response (dependent) variable. Copyright © 2015 Elsevier Ltd. All rights reserved.
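
    The model itself is not reproduced in the record. As a minimal, hedged sketch of the kind of calculation involved, the following Python snippet fits an ordinary least-squares multiple regression and computes the adjusted R² reported above; the predictor columns (design and production-system variables) and all numbers are illustrative assumptions.

      # Multiple linear regression for waste generation and adjusted R^2.
      import numpy as np

      rng = np.random.default_rng(1)
      n, p = 18, 3                          # 18 buildings, 3 predictors (illustrative)
      X = rng.normal(size=(n, p))
      y = 2.0 + X @ np.array([1.5, -0.8, 0.5]) + rng.normal(scale=1.0, size=n)

      Xd = np.column_stack([np.ones(n), X])           # add intercept column
      beta, *_ = np.linalg.lstsq(Xd, y, rcond=None)   # OLS coefficients
      resid = y - Xd @ beta

      ss_res = np.sum(resid ** 2)
      ss_tot = np.sum((y - y.mean()) ** 2)
      r2 = 1.0 - ss_res / ss_tot
      adj_r2 = 1.0 - (1.0 - r2) * (n - 1) / (n - p - 1)
      print(f"R2 = {r2:.3f}, adjusted R2 = {adj_r2:.3f}")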

  17. Quantification of abdominal aortic deformation after EVAR

    NASA Astrophysics Data System (ADS)

    Demirci, Stefanie; Manstad-Hulaas, Frode; Navab, Nassir

    2009-02-01

    Quantification of abdominal aortic deformation is an important requirement for the evaluation of endovascular stenting procedures and the further refinement of stent graft design. During endovascular aortic repair (EVAR) treatment, the aortic shape is subject to severe deformation imposed by medical instruments such as guide wires, catheters, and the stent graft. This deformation can affect the flow characteristics and morphology of the aorta, which have been shown to be elicitors of stent graft failure and a reason for the reappearance of aneurysms. We present a method for quantifying the deformation of an aneurysmatic aorta imposed by an inserted stent graft device. The outline of the procedure includes initial rigid alignment of the two abdominal scans, segmentation of the abdominal vessel trees, and automatic reduction of their centerline structures to one specified region of interest around the aorta. This is accomplished by preprocessing and remodeling of the pre- and postoperative aortic shapes before performing a non-rigid registration. We further narrow the resulting displacement fields to include only local non-rigid deformation and therefore eliminate all remaining global rigid transformations. Finally, deformations for specified locations can be calculated from the resulting displacement fields. In order to evaluate our method, experiments for the extraction of aortic deformation fields were conducted on 15 patient datasets from endovascular aortic repair (EVAR) treatment. A visual assessment of the registration results and an evaluation of the usage of deformation quantification were performed by two vascular surgeons and one interventional radiologist, all of whom are experts in EVAR procedures.

  18. An assessment of consistence of exhaust gas emission test results obtained under controlled NEDC conditions

    NASA Astrophysics Data System (ADS)

    Balawender, K.; Jaworski, A.; Kuszewski, H.; Lejda, K.; Ustrzycki, A.

    2016-09-01

    Measurement of pollutant emissions in automobile combustion engine exhaust gases is of primary importance in view of their harmful impact on the natural environment. This paper presents the results of tests aimed at determining exhaust gas pollutant emissions from a passenger car engine obtained under repeatable conditions on a chassis dynamometer. The test set-up was installed in a controlled climate chamber, allowing temperature conditions to be maintained within the range from -20°C to +30°C. The analysis covered emissions of such components as CO, CO2, NOx, CH4, THC, and NMHC. The purpose of the study was to assess the repeatability of results obtained in a number of tests performed according to the NEDC test plan. The study is an introductory stage of a wider research project concerning the effect of climate conditions and fuel type on the emission of pollutants contained in exhaust gases generated by automotive vehicles.

  19. Performance of the cobas Hepatitis B virus (HBV) test using the cobas 4800 system and comparison of HBV DNA quantification ability between the COBAS AmpliPrep/COBAS TaqMan HBV test version 2.0 and cobas HBV test.

    PubMed

    Shin, Kyung-Hwa; Lee, Hyun-Ji; Chang, Chulhun L; Kim, Hyung-Hoi

    2018-04-01

    Hepatitis B virus (HBV) DNA levels are used to predict the response to therapy, determine therapy initiation, monitor resistance to therapy, and establish treatment success. The aims of this study were to verify the performance of the cobas HBV test using the cobas 4800 system for HBV DNA quantification and to compare the HBV DNA quantification ability of the cobas HBV test with that of the COBAS AmpliPrep/COBAS TaqMan HBV version 2.0 (CAP/CTM v2.0). The precision, linearity, and limit of detection of the cobas HBV test were evaluated using the 4th World Health Organization International Standard material and plasma samples. Clinical samples that yielded quantitative results using the CAP/CTM v2.0 and cobas HBV tests were subjected to correlational analysis. Three hundred forty-nine samples were subjected to correlational analysis, among which 114 samples showed results above the lower limit of quantification. Comparable results were obtained in these 114 samples ([cobas HBV test] = 1.038 × [CAP/CTM v2.0] − 0.173, r = 0.914). The results for 86.8% of the samples obtained using the cobas HBV test were within 0.5 log10 IU/mL of the CAP/CTM v2.0 results. The total precision values against the low and high positive controls were 1.4% (mean level: 2.25 log10 IU/mL) and 3.2% (mean level: 6.23 log10 IU/mL), respectively. The cobas HBV test demonstrated linearity over 1.15-6.75 log10 IU/mL (y = 0.95x + 0.17, r² = 0.994). The cobas HBV test showed good correlation with CAP/CTM v2.0, and had good precision and an acceptable limit of detection. The cobas HBV test using the cobas 4800 is a reliable method for quantifying HBV DNA levels in the clinical setting. Copyright © 2018. Published by Elsevier B.V.

  20. Glucose Meters: A Review of Technical Challenges to Obtaining Accurate Results

    PubMed Central

    Tonyushkina, Ksenia; Nichols, James H.

    2009-01-01

    , anemia, hypotension, and other disease states. This article reviews the challenges involved in obtaining accurate glucose meter results. PMID:20144348

  1. Perfusion quantification in contrast-enhanced ultrasound (CEUS)--ready for research projects and routine clinical use.

    PubMed

    Tranquart, F; Mercier, L; Frinking, P; Gaud, E; Arditi, M

    2012-07-01

    With contrast-enhanced ultrasound (CEUS) now established as a valuable imaging modality for many applications, a more specific demand has recently emerged for quantifying perfusion and using measured parameters as objective indicators for various disease states. However, CEUS perfusion quantification remains challenging and is not well integrated in daily clinical practice. The development of VueBox™ alleviates existing limitations and enables quantification in a standardized way. VueBox™ operates as an off-line software application, after dynamic contrast-enhanced ultrasound (DCE-US) is performed. It enables linearization of DICOM clips, assessment of perfusion using patented curve-fitting models, and generation of parametric images by synthesizing perfusion information at the pixel level using color coding. VueBox™ is compatible with most of the available ultrasound platforms (nonlinear contrast-enabled), has the ability to process both bolus and disruption-replenishment kinetics loops, allows analysis results and their context to be saved, and generates analysis reports automatically. Specific features have been added to VueBox™, such as fully automatic in-plane motion compensation and an easy-to-use clip editor. Processing time has been reduced as a result of parallel programming optimized for multi-core processors. A long list of perfusion parameters is available for each of the two administration modes to address all possible demands currently reported in the literature for diagnosis or treatment monitoring. In conclusion, VueBox™ is a valid and robust quantification tool to be used for standardizing perfusion quantification and to improve the reproducibility of results across centers. © Georg Thieme Verlag KG Stuttgart · New York.

  2. Simple quantification of surface carboxylic acids on chemically oxidized multi-walled carbon nanotubes

    NASA Astrophysics Data System (ADS)

    Gong, Hyejin; Kim, Seong-Taek; Lee, Jong Doo; Yim, Sanggyu

    2013-02-01

    The surface of multi-walled carbon nanotubes (MWCNTs) was chemically oxidized using nitric acid and sulfuric-nitric acid mixtures. Thermogravimetric analysis, transmission electron microscopy and infrared spectroscopy revealed that the use of acid mixtures led to a higher degree of oxidation. More quantitative identification of surface carboxylic acids was carried out using X-ray photoelectron spectroscopy (XPS) and acid-base titration. However, these techniques are costly, and their long analysis times make it difficult to respond promptly to the extent of the reaction. We propose a much simpler method using pH measurements and a pre-determined pKa value in order to estimate the concentration of carboxylic acids on the oxidized MWCNT surfaces. The results from this technique were consistent with those obtained from XPS and titration, and it is expected that this simple quantification method can provide a cheap and fast way to monitor and control the oxidation reaction of MWCNTs.
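
    The record does not give the exact relation used. As a hedged sketch of one plausible way to turn a measured pH and an assumed pKa into an acid concentration, the following snippet treats the surface -COOH groups as a simple monoprotic weak acid; the pH and pKa values are illustrative, not data from the paper.

      # Estimate total surface carboxylic-acid concentration from pH and pKa,
      # assuming [H+] ~ [A-] (protons come only from -COOH dissociation), so
      # Ka = [H+]^2 / (C - [H+])  =>  C = [H+]^2 / Ka + [H+].

      def carboxylic_acid_concentration(pH, pKa):
          """Return total acid concentration C (mol/L) from weak-acid equilibrium."""
          h = 10.0 ** (-pH)
          ka = 10.0 ** (-pKa)
          return h * h / ka + h

      print(carboxylic_acid_concentration(pH=4.2, pKa=4.8))  # mol/L, illustrative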

  3. Information transduction capacity reduces the uncertainties in annotation-free isoform discovery and quantification

    PubMed Central

    Deng, Yue; Bao, Feng; Yang, Yang; Ji, Xiangyang; Du, Mulong; Zhang, Zhengdong

    2017-01-01

    The automated transcript discovery and quantification of high-throughput RNA sequencing (RNA-seq) data are important tasks of next-generation sequencing (NGS) research. However, these tasks are challenging due to the uncertainties that arise in the inference of complete splicing isoform variants from partially observed short reads. Here, we address this problem by explicitly reducing the inherent uncertainties in a biological system caused by missing information. In our approach, the RNA-seq procedure for transforming transcripts into short reads is considered an information transmission process. Consequently, the data uncertainties are substantially reduced by exploiting the information transduction capacity concept from information theory. The experimental results obtained from the analyses of simulated datasets and RNA-seq datasets from cell lines and tissues demonstrate the advantages of our method over state-of-the-art competitors. Our algorithm, MaxInfo, is available as an open-source implementation. PMID:28911101

  4. Quantification of myocardial fibrosis by digital image analysis and interactive stereology.

    PubMed

    Daunoravicius, Dainius; Besusparis, Justinas; Zurauskas, Edvardas; Laurinaviciene, Aida; Bironaite, Daiva; Pankuweit, Sabine; Plancoulaine, Benoit; Herlin, Paulette; Bogomolovas, Julius; Grabauskiene, Virginija; Laurinavicius, Arvydas

    2014-06-09

    Cardiac fibrosis disrupts the normal myocardial structure and has a direct impact on heart function and survival. Despite already available digital methods, the pathologist's visual score is still widely considered as ground truth and used as a primary method in histomorphometric evaluations. The aim of this study was to compare the accuracy of digital image analysis tools and the pathologist's visual scoring for evaluating fibrosis in human myocardial biopsies, based on reference data obtained by point counting performed on the same images. Endomyocardial biopsy material from 38 patients diagnosed with inflammatory dilated cardiomyopathy was used. The extent of total cardiac fibrosis was assessed by image analysis on Masson's trichrome-stained tissue specimens using automated Colocalization and Genie software, by Stereology grid count and manually by Pathologist's visual score. A total of 116 slides were analyzed. The mean results obtained by the Colocalization software (13.72 ± 12.24%) were closest to the reference value of stereology (RVS), while the Genie software and Pathologist score gave a slight underestimation. RVS values correlated strongly with values obtained using the Colocalization and Genie (r>0.9, p<0.001) software as well as the pathologist visual score. Differences in fibrosis quantification by Colocalization and RVS were statistically insignificant. However, significant bias was found in the results obtained by using Genie versus RVS and pathologist score versus RVS with mean difference values of: -1.61% and 2.24%. Bland-Altman plots showed a bidirectional bias dependent on the magnitude of the measurement: Colocalization software overestimated the area fraction of fibrosis in the lower end, and underestimated in the higher end of the RVS values. Meanwhile, Genie software as well as the pathologist score showed more uniform results throughout the values, with a slight underestimation in the mid-range for both. Both applied digital image analysis

  5. Aerosol-type retrieval and uncertainty quantification from OMI data

    NASA Astrophysics Data System (ADS)

    Kauppi, Anu; Kolmonen, Pekka; Laine, Marko; Tamminen, Johanna

    2017-11-01

    We discuss uncertainty quantification for aerosol-type selection in satellite-based atmospheric aerosol retrieval. The retrieval procedure uses precalculated aerosol microphysical models stored in look-up tables (LUTs) and top-of-atmosphere (TOA) spectral reflectance measurements to solve the aerosol characteristics. The forward model approximations cause systematic differences between the modelled and observed reflectance. Acknowledging this model discrepancy as a source of uncertainty allows us to produce more realistic uncertainty estimates and assists the selection of the most appropriate LUTs for each individual retrieval. This paper focuses on the aerosol microphysical model selection and characterisation of uncertainty in the retrieved aerosol type and aerosol optical depth (AOD). The concept of model evidence is used as a tool for model comparison. The method is based on a Bayesian inference approach, in which all uncertainties are described as a posterior probability distribution. When there is no single best-matching aerosol microphysical model, we use a statistical technique based on Bayesian model averaging to combine AOD posterior probability densities of the best-fitting models to obtain an averaged AOD estimate. We also determine the shared evidence of the best-matching models of a certain main aerosol type in order to quantify how plausible it is that it represents the underlying atmospheric aerosol conditions. The developed method is applied to Ozone Monitoring Instrument (OMI) measurements using a multiwavelength approach for retrieving the aerosol type and AOD estimate with uncertainty quantification for cloud-free over-land pixels. Several larger pixel set areas were studied in order to investigate the robustness of the developed method. We evaluated the retrieved AOD by comparison with ground-based measurements at example sites. We found that the uncertainty of AOD expressed by posterior probability distribution reflects the difficulty in model
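
    As a rough, hedged illustration of the Bayesian model averaging step described above (not the OMI retrieval code itself), the following Python sketch combines per-model AOD estimates weighted by relative model evidence and sums the weights within each main aerosol type; all evidences, means, variances and type labels are assumed values.

      # Bayesian model averaging over candidate aerosol microphysical models.
      import numpy as np

      log_evidence = np.array([-12.3, -11.8, -14.1, -12.0])   # assumed log-evidences
      aod_mean = np.array([0.42, 0.47, 0.55, 0.44])           # per-model AOD estimates
      aod_var = np.array([0.002, 0.003, 0.004, 0.002])        # per-model AOD variances

      # Posterior model probabilities (uniform prior over models)
      w = np.exp(log_evidence - log_evidence.max())
      w /= w.sum()

      # BMA mean and variance (law of total variance)
      aod_bma = np.sum(w * aod_mean)
      var_bma = np.sum(w * (aod_var + (aod_mean - aod_bma) ** 2))
      print(f"BMA AOD = {aod_bma:.3f} +/- {np.sqrt(var_bma):.3f}")

      # "Shared evidence" of models belonging to the same main aerosol type
      aerosol_type = np.array(["absorbing", "absorbing", "weakly_abs", "weakly_abs"])
      for t in np.unique(aerosol_type):
          print(t, w[aerosol_type == t].sum())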

  6. Quantification of Lignin and Its Structural Features in Plant Biomass Using 13C Lignin as Internal Standard for Pyrolysis-GC-SIM-MS.

    PubMed

    van Erven, Gijs; de Visser, Ries; Merkx, Donny W H; Strolenberg, Willem; de Gijsel, Peter; Gruppen, Harry; Kabel, Mirjam A

    2017-10-17

    Understanding the mechanisms underlying plant biomass recalcitrance at the molecular level can only be achieved by accurate analyses of both the content and structural features of the molecules involved. Current quantification of lignin is, however, majorly based on unspecific gravimetric analysis after sulfuric acid hydrolysis. Hence, our research aimed at specific lignin quantification with concurrent characterization of its structural features. Hereto, for the first time, a polymeric 13C lignin was used as internal standard (IS) for lignin quantification via analytical pyrolysis coupled to gas chromatography with mass-spectrometric detection in selected ion monitoring mode (py-GC-SIM-MS). In addition, relative response factors (RRFs) for the various pyrolysis products obtained were determined and applied. First, 12C and 13C lignin were isolated from nonlabeled and uniformly 13C-labeled wheat straw, respectively, and characterized by heteronuclear single quantum coherence (HSQC) nuclear magnetic resonance (NMR) and py-GC/MS. The two lignin isolates were found to have identical structures. Second, 13C-IS based lignin quantification by py-GC-SIM-MS was validated in reconstituted biomass model systems with known contents of the 12C lignin analogue and was shown to be extremely accurate (>99.9%, R² > 0.999) and precise (RSD < 1.5%). Third, 13C-IS based lignin quantification was applied to four common poaceous biomass sources (wheat straw, barley straw, corn stover, and sugar cane bagasse), and lignin contents were in good agreement with the total gravimetrically determined lignin contents. Our robust method proves to be a promising alternative for the high-throughput quantification of lignin in milled biomass samples directly and simultaneously provides a direct insight into the structural features of lignin.
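
    The quantification principle is a classic internal-standard ratio. As a hedged sketch only (the peak areas, RRFs, product names and IS amount below are illustrative, not the paper's data or exact calculation), the analyte lignin amount can be estimated from the RRF-corrected ratio of 12C (analyte) to 13C (IS) pyrolysis-product peak areas, scaled by the known amount of IS added:

      # Internal-standard quantification with a 13C lignin IS (illustrative sketch).

      def lignin_amount(peaks_12c, peaks_13c, rrf, is_amount_mg):
          """Estimate analyte lignin (mg) from summed, RRF-corrected area ratios."""
          total_12c = sum(peaks_12c[p] * rrf.get(p, 1.0) for p in peaks_12c)
          total_13c = sum(peaks_13c[p] for p in peaks_13c)
          return is_amount_mg * total_12c / total_13c

      peaks_12c = {"guaiacol": 1.8e6, "syringol": 2.4e6, "4-vinylphenol": 3.1e6}
      peaks_13c = {"guaiacol": 1.6e6, "syringol": 2.2e6, "4-vinylphenol": 2.9e6}
      rrf = {"guaiacol": 1.05, "syringol": 0.98, "4-vinylphenol": 1.10}

      print(lignin_amount(peaks_12c, peaks_13c, rrf, is_amount_mg=0.10))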

  7. The Qiagen Investigator® Quantiplex HYres as an alternative kit for DNA quantification.

    PubMed

    Frégeau, Chantal J; Laurin, Nancy

    2015-05-01

    The Investigator® Quantiplex HYres kit was evaluated as a potential replacement for dual DNA quantification of casework samples. This kit was determined to be highly sensitive, with a limit of quantification and limit of detection of 0.0049 ng/μL and 0.0003 ng/μL, respectively, for both human and male DNA, using full or half reaction volumes. It was also accurate in assessing the amount of male DNA present in 96 mock and actual casework male:female mixtures (various ratios) processed in this exercise. The close correlation between the male/human DNA ratios (expressed in percentages) derived from the Investigator® Quantiplex HYres quantification results and the male DNA proportion calculated in mixed AmpFlSTR® Profiler® Plus or AmpFlSTR® Identifiler® Plus profiles, using the Amelogenin Y peak and STR loci, allowed guidelines to be developed to facilitate decisions regarding when to submit samples to Y-STR rather than autosomal STR profiling. The internal control (IC) target was shown to be more sensitive to inhibitors than the human and male DNA targets included in the Investigator® Quantiplex HYres kit, making it a good quality indicator for DNA extracts. The new kit met our criteria of enhanced sensitivity, accuracy, consistency, reliability and robustness for casework DNA quantification. Crown Copyright © 2015. Published by Elsevier Ireland Ltd. All rights reserved.

  8. STEM VQ Method, Using Scanning Transmission Electron Microscopy (STEM) for Accurate Virus Quantification

    DTIC Science & Technology

    2017-02-02

    Accurate virus quantification is sought, but a perfect method still eludes the scientific community. Electron microscopy (EM) ... provides morphology data and counts all viral particles, including partial or noninfectious particles; however, EM methods ... a consistent, reproducible virus quantification method called Scanning Transmission Electron Microscopy – Virus Quantification (STEM-VQ), which simplifies

  9. Comparative quantification of dietary supplemented neural creatine concentrations with (1)H-MRS peak fitting and basis spectrum methods.

    PubMed

    Turner, Clare E; Russell, Bruce R; Gant, Nicholas

    2015-11-01

    Magnetic resonance spectroscopy (MRS) is an analytical procedure that can be used to non-invasively measure the concentration of a range of neural metabolites. Creatine is an important neurometabolite, with dietary supplementation offering therapeutic potential for neurological disorders with dysfunctional energetic processes. Neural creatine concentrations can be probed using proton MRS and quantified using a range of software packages based on different analytical methods. This experiment examines the differences in quantification performance of two commonly used analysis packages following a creatine supplementation strategy with potential therapeutic application. Human participants followed a seven-day dietary supplementation regime in a placebo-controlled, cross-over design interspersed with a five-week wash-out period. Spectroscopy data were acquired the day immediately following supplementation and analyzed with two commonly used software packages that employ vastly different quantification methods. Results demonstrate that neural creatine concentration was augmented following creatine supplementation when analyzed using the peak fitting method of quantification (105.9% ± 10.1%). In contrast, no change in neural creatine levels was detected with supplementation when analysis was conducted using the basis spectrum method of quantification (102.6% ± 8.6%). Results suggest that software packages that employ the peak fitting procedure for spectral quantification are possibly more sensitive to subtle changes in neural creatine concentrations. The relative simplicity of the spectroscopy sequence and the data analysis procedure suggests that peak fitting procedures may be the most effective means of metabolite quantification when detection of subtle alterations in neural metabolites is necessary. The straightforward technique can be used on a clinical magnetic resonance imaging system. Copyright © 2015 Elsevier Inc. All rights reserved.

  10. Comparison of indirect and direct quantification of esters of monochloropropanediol in vegetable oil.

    PubMed

    Dubois, Mathieu; Tarres, Adrienne; Goldmann, Till; Empl, Anna Maria; Donaubauer, Alfred; Seefelder, Walburga

    2012-05-04

    The presence of fatty acid esters of monochloropropanediol (MEs) in food is a recent concern raised due to the carcinogenicity of their hydrolysable moieties 2- and 3-monochloropropanediol (2- and 3-MCPD). Several indirect methods for the quantification of MEs have been developed and are still commonly in use; however, significant discrepancies among the analytical results obtained challenge their reliability. The aim of the present study was therefore to test the trueness of an indirect method by comparing it to a newly developed direct method, using palm oil and palm olein as examples. The indirect method was based on ester cleavage under acidic conditions, derivatization of the liberated 2- and 3-MCPD with heptafluorobutyryl imidazole, and GC-MS determination. The direct method comprised two extraction procedures, targeting 2- and 3-MCPD monoesters (co-extracting glycidyl esters as well) by the use of double solid phase extraction (SPE), and 2- and 3-MCPD diesters by the use of a silica gel column, respectively. Detection was carried out by liquid chromatography coupled to time-of-flight mass spectrometry (LC-ToF-MS). Accurate quantification of the intact compounds was ensured by means of matrix-matched standard addition on extracts. Analysis of 22 palm oil and 7 palm olein samples (2- plus 3-MCPD contamination ranged from 0.3 to 8.8 μg/g) by both methods revealed no significant bias. Both methods were therefore considered comparable in terms of results; however, the indirect method was shown to require fewer analytical standards and to be less tedious, and it is furthermore applicable to all types of vegetable oils and hence recommended for routine application. Copyright © 2012 Elsevier B.V. All rights reserved.

  11. Comparison Study of MS-HRM and Pyrosequencing Techniques for Quantification of APC and CDKN2A Gene Methylation

    PubMed Central

    Migheli, Francesca; Stoccoro, Andrea; Coppedè, Fabio; Wan Omar, Wan Adnan; Failli, Alessandra; Consolini, Rita; Seccia, Massimo; Spisni, Roberto; Miccoli, Paolo; Mathers, John C.; Migliore, Lucia

    2013-01-01

    There is increasing interest in the development of cost-effective techniques for the quantification of DNA methylation biomarkers. We analyzed 90 samples of surgically resected colorectal cancer tissues for APC and CDKN2A promoter methylation using methylation sensitive-high resolution melting (MS-HRM) and pyrosequencing. MS-HRM is a less expensive technique compared with pyrosequencing but is usually more limited because it gives a range of methylation estimates rather than a single value. Here, we developed a method for deriving single estimates, rather than a range, of methylation using MS-HRM and compared the values obtained in this way with those obtained using the gold standard quantitative method of pyrosequencing. We derived an interpolation curve using standards of known methylated/unmethylated ratio (0%, 12.5%, 25%, 50%, 75%, and 100% of methylation) to obtain the best estimate of the extent of methylation for each of our samples. We observed similar profiles of methylation and a high correlation coefficient between the two techniques. Overall, our new approach allows MS-HRM to be used as a quantitative assay which provides results which are comparable with those obtained by pyrosequencing. PMID:23326336
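
    As a hedged sketch of the interpolation step described above (deriving a single methylation estimate from an MS-HRM readout using standards of known methylated/unmethylated ratio), the following Python snippet interpolates a sample signal against a calibration series; the normalized melt-signal values are illustrative assumptions, not the study's data.

      # Single methylation estimate by interpolation against known standards.
      import numpy as np

      known_methylation = np.array([0.0, 12.5, 25.0, 50.0, 75.0, 100.0])   # percent
      standard_signal = np.array([0.02, 0.11, 0.23, 0.48, 0.71, 0.97])     # assumed
      sample_signal = 0.35

      # Monotonic interpolation of the calibration curve (signal -> % methylation)
      estimate = np.interp(sample_signal, standard_signal, known_methylation)
      print(f"Estimated methylation: {estimate:.1f}%")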

  12. Plasma protein absolute quantification by nano-LC Q-TOF UDMSE for clinical biomarker verification

    PubMed Central

    ILIES, MARIA; IUGA, CRISTINA ADELA; LOGHIN, FELICIA; DHOPLE, VISHNU MUKUND; HAMMER, ELKE

    2017-01-01

    Background and aims. Proteome-based biomarker studies target proteins that could serve as diagnostic, prognostic, and predictive molecules. In the clinical routine, immunoassays are currently used for the absolute quantification of such biomarkers, with the major limitation that only one molecule can be targeted per assay. The aim of our study was to test a mass spectrometry-based absolute quantification method for the verification of plasma protein sets that might serve as reliable biomarker panels for clinical practice. Methods. Six EDTA plasma samples were analyzed after tryptic digestion using a high-throughput data-independent acquisition nano-LC Q-TOF UDMSE proteomics approach. Synthetic Escherichia coli standard peptides were spiked into each sample for absolute quantification. Data analysis was performed using ProgenesisQI v2.0 software (Waters Corporation). Results. Our method ensured absolute quantification of 242 non-redundant plasma proteins in a single-run analysis. The dynamic range covered was 10⁵, and 86% of the quantified proteins were classical plasma proteins. The overall median coefficient of variation was 0.36, while a set of 63 proteins was found to be highly stable. Absolute protein concentrations correlated strongly with values reviewed in the literature. Conclusions. Nano-LC Q-TOF UDMSE proteomic analysis can be used for simple and rapid determination of absolute amounts of plasma proteins. A large number of plasma proteins could be analyzed, while a wide dynamic range was covered with low coefficients of variation at the protein level. The method proved to be a reliable tool for the quantification of protein panels for biomarker verification in clinical practice. PMID:29151793

  13. Rapid Quantification of Melamine in Different Brands/Types of Milk Powders Using Standard Addition Net Analyte Signal and Near-Infrared Spectroscopy

    PubMed Central

    2016-01-01

    Multivariate calibration (MVC) and near-infrared (NIR) spectroscopy have demonstrated potential for the rapid analysis of melamine in various dairy products. However, the practical application of ordinary MVC can be largely restricted because the prediction of a new sample from an uncalibrated batch would be subject to significant bias due to matrix effects. In this study, the feasibility of using NIR spectroscopy and the standard addition (SA) net analyte signal (NAS) method (SANAS) for rapid quantification of melamine in different brands/types of milk powders was investigated. In SANAS, the NAS vector of melamine in an unknown sample as well as in a series of samples spiked with melamine standards is calculated, and then the Euclidean norms of the standard series are used to build a straightforward univariate regression model. The analysis results for 10 different brands/types of milk powders with melamine levels of 0-0.12% (w/w) indicate that SANAS obtained accurate results, with root mean squared error of prediction (RMSEP) values ranging from 0.0012 to 0.0029. An additional advantage of NAS is the ability to visualize and control possible unwanted variations during standard addition. The proposed method will provide a practically useful tool for rapid and nondestructive quantification of melamine in different brands/types of milk powders. PMID:27525154

  14. Comparison of machine learning and semi-quantification algorithms for (I123)FP-CIT classification: the beginning of the end for semi-quantification?

    PubMed

    Taylor, Jonathan Christopher; Fenner, John Wesley

    2017-11-29

    Semi-quantification methods are well established in the clinic for assisted reporting of (I123) Ioflupane images. Arguably, these are limited diagnostic tools. Recent research has demonstrated the potential for improved classification performance offered by machine learning algorithms. A direct comparison between methods is required to establish whether a move towards widespread clinical adoption of machine learning algorithms is justified. This study compared three machine learning algorithms with a range of semi-quantification methods, using the Parkinson's Progression Markers Initiative (PPMI) research database and a locally derived clinical database for validation. The machine learning algorithms were based on support vector machine classifiers with three different sets of features: (1) voxel intensities; (2) principal components of image voxel intensities; and (3) striatal binding ratios from the putamen and caudate. The semi-quantification methods were based on striatal binding ratios (SBRs) from both putamina, with and without consideration of the caudates. Normal limits for the SBRs were defined through four different methods: (1) the minimum of age-matched controls; (2) the mean minus 1/1.5/2 standard deviations of age-matched controls; (3) linear regression of normal patient data against age (minus 1/1.5/2 standard errors); and (4) selection of the optimum operating point on the receiver operating characteristic curve from normal and abnormal training data. Each machine learning and semi-quantification technique was evaluated with stratified, nested 10-fold cross-validation, repeated 10 times. The mean accuracy of the semi-quantitative methods for classification of local data into Parkinsonian and non-Parkinsonian groups varied from 0.78 to 0.87, contrasting with 0.89 to 0.95 for classifying PPMI data into healthy controls and Parkinson's disease groups. The machine learning algorithms gave mean accuracies between 0.88 and 0.92 for local data and between 0.95 and 0.97 for PPMI data. Classification
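
    As a hedged sketch of one of the machine-learning pipelines described above (an SVM on striatal binding ratios, evaluated with nested, stratified 10-fold cross-validation repeated 10 times), the following Python snippet uses scikit-learn on synthetic SBR features; the feature values, class sizes and hyperparameter grid are illustrative assumptions, not the study's configuration.

      # SVM on striatal binding ratios with nested, repeated stratified 10-fold CV.
      import numpy as np
      from sklearn.svm import SVC
      from sklearn.pipeline import make_pipeline
      from sklearn.preprocessing import StandardScaler
      from sklearn.model_selection import GridSearchCV, RepeatedStratifiedKFold, cross_val_score

      rng = np.random.default_rng(0)
      n = 200
      # Synthetic SBR features: left/right putamen and caudate (4 columns)
      X_normal = rng.normal(loc=[2.5, 2.5, 3.0, 3.0], scale=0.4, size=(n // 2, 4))
      X_pd = rng.normal(loc=[1.2, 1.3, 2.2, 2.3], scale=0.4, size=(n // 2, 4))
      X = np.vstack([X_normal, X_pd])
      y = np.array([0] * (n // 2) + [1] * (n // 2))   # 0 = normal, 1 = Parkinsonian

      # Inner loop tunes the SVM; outer loop estimates generalisation accuracy.
      inner = GridSearchCV(make_pipeline(StandardScaler(), SVC(kernel="rbf")),
                           param_grid={"svc__C": [0.1, 1.0, 10.0]}, cv=5)
      outer = RepeatedStratifiedKFold(n_splits=10, n_repeats=10, random_state=0)
      scores = cross_val_score(inner, X, y, cv=outer, scoring="accuracy")
      print(f"Mean accuracy: {scores.mean():.3f} +/- {scores.std():.3f}")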

  15. Quantification of pericardial effusions by echocardiography and computed tomography.

    PubMed

    Leibowitz, David; Perlman, Gidon; Planer, David; Gilon, Dan; Berman, Philip; Bogot, Naama

    2011-01-15

    Echocardiography is a well-accepted tool for the diagnosis and quantification of pericardial effusion (PEff). Given the increasing use of computed tomographic (CT) scanning, more PEffs are being initially diagnosed by computed tomography. No study has compared quantification of PEff by computed tomography and echocardiography. The objective of this study was to assess the accuracy of quantification of PEff by 2-dimensional echocardiography and computed tomography compared to the amount of pericardial fluid drained at pericardiocentesis. We retrospectively reviewed an institutional database to identify patients who underwent chest computed tomography and echocardiography before percutaneous pericardiocentesis with documentation of the amount of fluid withdrawn. Digital 2-dimensional echocardiographic and CT images were retrieved, and quantification of PEff volume was performed by applying the formula for the volume of a prolate ellipsoid, 4/3 × π × maximal long-axis dimension/2 × maximal transverse dimension/2 × maximal anteroposterior dimension/2, to the pericardial sac and to the heart. Nineteen patients meeting study qualifications were entered into the study. The amount of PEff drained was 200 to 1,700 ml (mean 674 ± 340). The echocardiographically calculated PEff volume correlated relatively well with the drained volume (r = 0.73, p <0.001, mean difference -41 ± 225 ml). There was only moderate correlation between CT volume quantification and the actual volume drained (r = 0.4, p = 0.004, mean difference 158 ± 379 ml). In conclusion, echocardiography appears to be a more accurate imaging technique than computed tomography for the quantitative assessment of nonloculated PEffs and should remain the primary imaging modality in these patients. Copyright © 2011 Elsevier Inc. All rights reserved.
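
    The prolate-ellipsoid formula above is straightforward to apply. A minimal sketch, with purely illustrative dimensions (in centimetres, giving volumes in mL), computes the sac and heart volumes and takes their difference as the effusion estimate:

      # Prolate-ellipsoid volume estimate: V = (4/3)*pi*(L/2)*(W/2)*(AP/2).
      import math

      def ellipsoid_volume(long_axis, transverse, anteroposterior):
          return (4.0 / 3.0) * math.pi * (long_axis / 2) * (transverse / 2) * (anteroposterior / 2)

      pericardial_sac = ellipsoid_volume(14.0, 11.0, 10.0)   # cm -> mL (illustrative)
      heart = ellipsoid_volume(11.5, 8.5, 7.5)
      effusion_volume = pericardial_sac - heart
      print(f"Estimated effusion volume: {effusion_volume:.0f} mL")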

  16. Computational analysis of PET by AIBL (CapAIBL): a cloud-based processing pipeline for the quantification of PET images

    NASA Astrophysics Data System (ADS)

    Bourgeat, Pierrick; Dore, Vincent; Fripp, Jurgen; Villemagne, Victor L.; Rowe, Chris C.; Salvado, Olivier

    2015-03-01

    With the advances of PET tracers for β-Amyloid (Aβ) detection in neurodegenerative diseases, automated quantification methods are desirable. For clinical use, there is a great need for a PET-only quantification method, as MR images are not always available. In this paper, we validate a previously developed PET-only quantification method against MR-based quantification using 6 tracers: 18F-Florbetaben (N=148), 18F-Florbetapir (N=171), 18F-NAV4694 (N=47), 18F-Flutemetamol (N=180), 11C-PiB (N=381) and 18F-FDG (N=34). The results show an overall mean absolute percentage error of less than 5% for each tracer. The method has been implemented as a remote service called CapAIBL (http://milxcloud.csiro.au/capaibl). PET images are uploaded to a cloud platform where they are spatially normalised to a standard template and quantified. A report containing global as well as local quantification, along with a surface projection of the β-Amyloid deposition, is automatically generated at the end of the pipeline and emailed to the user.

  17. Targeted quantification of low ng/mL level proteins in human serum without immunoaffinity depletion

    PubMed Central

    Shi, Tujin; Sun, Xuefei; Gao, Yuqian; Fillmore, Thomas L.; Schepmoes, Athena A.; Zhao, Rui; He, Jintang; Moore, Ronald J.; Kagan, Jacob; Rodland, Karin D.; Liu, Tao; Liu, Alvin Y.; Smith, Richard D.; Tang, Keqi; Camp, David G.; Qian, Wei-Jun

    2013-01-01

    We recently reported an antibody-free targeted protein quantification strategy, termed high-pressure, high-resolution separations with intelligent selection and multiplexing (PRISM), for achieving significantly enhanced sensitivity using selected reaction monitoring (SRM) mass spectrometry. Integrating PRISM with front-end IgY14 immunoaffinity depletion, sensitive detection of targeted proteins at 50–100 pg/mL levels in human blood plasma/serum was demonstrated. However, immunoaffinity depletion is often associated with undesired losses of target proteins of interest. Herein we report further evaluation of PRISM-SRM quantification of low-abundance serum proteins without immunoaffinity depletion. Limits of quantification (LOQ) at low ng/mL levels with a median coefficient of variation (CV) of ~12% were achieved for proteins spiked into human female serum. PRISM-SRM provided a >100-fold improvement in the LOQ when compared to conventional LC-SRM measurements. PRISM-SRM was then applied to measure several low-abundance endogenous serum proteins, including prostate-specific antigen (PSA), in clinical prostate cancer patient sera. PRISM-SRM enabled confident detection of all target endogenous serum proteins except the low pg/mL-level cardiac troponin T. A correlation coefficient >0.99 was observed for PSA between the results from PRISM-SRM and immunoassays. Our results demonstrate that PRISM-SRM can successfully quantify low ng/mL proteins in human plasma or serum without depletion. We anticipate broad applications for PRISM-SRM quantification of low-abundance proteins in candidate biomarker verification and systems biology studies. PMID:23763644

  18. Quantification of mixed chimerism by real time PCR on whole blood-impregnated FTA cards.

    PubMed

    Pezzoli, N; Silvy, M; Woronko, A; Le Treut, T; Lévy-Mozziconacci, A; Reviron, D; Gabert, J; Picard, C

    2007-09-01

    This study investigated the quantification of mixed chimerism (MC) in sex-mismatched transplantations by quantitative real-time PCR (RQ-PCR) using FTA paper for blood sampling. First, we demonstrate that the quantification of DNA from EDTA-blood deposited on FTA cards is accurate and reproducible. Secondly, we show that the fraction of recipient cells detected by RQ-PCR was concordant between the FTA method and the salting-out method, the reference DNA extraction method. Furthermore, the sensitivity of detection of recipient cells is similar with the two methods. Our results show that this innovative method can be used for MC assessment by RQ-PCR.

  19. Ariadne's Thread: A Robust Software Solution Leading to Automated Absolute and Relative Quantification of SRM Data.

    PubMed

    Nasso, Sara; Goetze, Sandra; Martens, Lennart

    2015-09-04

    Selected reaction monitoring (SRM) MS is a highly selective and sensitive technique to quantify protein abundances in complex biological samples. To enhance the pace of large SRM studies, a validated, robust method to fully automate absolute quantification and substitute for interactive evaluation would be valuable. To address this demand, we present Ariadne, a Matlab software tool. To quantify monitored targets, Ariadne exploits metadata imported from the transition lists, and targets can be filtered according to mProphet output. Signal processing and statistical learning approaches are combined to compute peptide quantifications. To robustly estimate absolute abundances, the external calibration curve method is applied, ensuring linearity over the measured dynamic range. Ariadne was benchmarked against mProphet and Skyline by comparing its quantification performance on three different dilution series, featuring either noisy/smooth traces without background or smooth traces with complex background. The results, evaluated as efficiency, linearity, accuracy, and precision of quantification, showed that Ariadne's performance is independent of data smoothness and of the presence of complex background, that Ariadne outperforms mProphet on the noisier data set, and that it improved Skyline's accuracy and precision 2-fold for the lowest-abundance dilution with complex background. Remarkably, Ariadne could statistically distinguish all the different abundances from each other, discriminating dilutions as low as 0.1 and 0.2 fmol. These results suggest that Ariadne offers reliable and automated analysis of large-scale SRM differential expression studies.
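
    The external-calibration-curve approach mentioned above is a standard linear fit that is then inverted for unknowns. A minimal, hedged sketch follows; the dilution amounts, peak areas and the unknown's area are illustrative values, not the Ariadne benchmark data.

      # External calibration curve: fit peak area vs. known amount, then invert.
      import numpy as np

      amount_fmol = np.array([0.1, 0.2, 0.5, 1.0, 2.0, 5.0, 10.0])   # dilution series
      peak_area = np.array([210.0, 390.0, 1030.0, 2010.0, 4080.0, 10150.0, 20300.0])

      slope, intercept = np.polyfit(amount_fmol, peak_area, 1)
      pred = slope * amount_fmol + intercept
      r2 = 1 - np.sum((peak_area - pred) ** 2) / np.sum((peak_area - peak_area.mean()) ** 2)

      def quantify(area):
          """Invert the calibration curve to obtain absolute abundance (fmol)."""
          return (area - intercept) / slope

      print(f"R2 = {r2:.4f}, unknown = {quantify(3050.0):.2f} fmol")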

  20. Airglow during ionospheric modifications by the sura facility radiation. experimental results obtained in 2010

    NASA Astrophysics Data System (ADS)

    Grach, S. M.; Klimenko, V. V.; Shindin, A. V.; Nasyrov, I. A.; Sergeev, E. N.; Yashnov, V. A.; Pogorelko, N. A.

    2012-06-01

    We present the results of studying the structure and dynamics of the HF-heated volume above the Sura facility obtained in 2010 by measurements of ionospheric airglow in the red (λ = 630 nm) and green (λ = 557.7 nm) lines of atomic oxygen. Vertical sounding of the ionosphere (followed by modeling of the pump-wave propagation) and measurements of stimulated electromagnetic emission were used for additional diagnostics of ionospheric parameters and the processes occurring in the heated volume.

  1. [Classical and molecular methods for identification and quantification of domestic moulds].

    PubMed

    Fréalle, E; Bex, V; Reboux, G; Roussel, S; Bretagne, S

    2017-12-01

    To study the impact of the constant and inevitable inhalation of moulds, it is necessary to sample, identify and count the spores. Environmental sampling methods can be separated into three categories: surface sampling, which is easy to perform but non-quantitative; air sampling, which is easy to calibrate but provides time-limited information; and dust sampling, which is more representative of long-term exposure to moulds. The sampling strategy depends on the objectives (evaluation of the risk of exposure for individuals; quantification of household contamination; evaluation of the efficacy of remediation). The mould colonies obtained in culture are identified using microscopy, MALDI-TOF, and/or DNA sequencing. Electrostatic dust collectors are an alternative to older methods for identifying and quantifying household mould spores; they are easy to use and relatively cheap. Colony counting should be progressively replaced by quantitative real-time PCR, which is already validated, while waiting for more standardised high-throughput sequencing methods for the assessment of mould contamination without technical bias. Despite some technical recommendations for obtaining reliable and comparable results, the huge diversity of environmental moulds, the variable quantity of spores inhaled and the association with other allergens (mites, plants) make the evaluation of their impact on human health difficult. Hence there is a need for reliable and generally applicable quantitative methods. Copyright © 2017 SPLF. Published by Elsevier Masson SAS. All rights reserved.

  2. Proposal for study on IR light and glucose phantom interaction for human glucose quantification applications

    NASA Astrophysics Data System (ADS)

    Romo-Cárdenas, Gerardo S.; Sanchez-Lopez, Juan D.; Nieto-Hipolito, Juan I.; Cosio-León, María.; Luque-Morales, Priscy; Vazquez-Briseno, Mabel

    2016-09-01

    The importance of constant glucose monitoring for maintaining regular control in diabetes patients is well established. Several medical studies acknowledge the need to explore alternatives to the traditional digital glucometer, given the pain and discomfort associated with this technique, which can lead to compromised control of the disease. Several efforts based on the application of IR spectroscopy have produced favorable, yet not conclusive, results. Therefore, a comprehensive, interdisciplinary study based on the biochemical and optical properties of glucose in the human body is necessary in order to understand the interaction between this substance, its surroundings, and IR light. This study proposes a comprehensive approach to the glucose-IR light interaction, considering and combining important biochemical, physiological and optical properties, as well as machine learning techniques for the data analysis. The results of this work would help define the right parameters for an optical glucose quantification system and protocol.

  3. Installation Restoration Program. Confirmation/Quantification Stage 1. Phase 2

    DTIC Science & Technology

    1985-03-07

    Installation Restoration Program, Phase II - Confirmation/Quantification, Stage 1, Kirtland AFB, Kirtland AFB, New Mexico 87117. Prepared by Science Applications International Corporation, 505 Marquette NW, Suite 1200, Albuquerque, New Mexico 87102, March 1985. Final report covering February 1983 to March 1985, prepared for Headquarters Military Airlift Command, Command Surgeon's Office (HQ MAC

  4. Quantification of extra virgin olive oil in dressing and edible oil blends using the representative TMS-4,4'-desmethylsterols gas-chromatographic-normalized fingerprint.

    PubMed

    Pérez-Castaño, Estefanía; Sánchez-Viñas, Mercedes; Gázquez-Evangelista, Domingo; Bagur-González, M Gracia

    2018-01-15

    This paper describes and discusses the application of chromatographic fingerprints of trimethylsilyl (TMS) derivatives of 4,4'-desmethylsterols (obtained from an off-line HPLC-GC-FID system) for the quantification of extra virgin olive oil in commercial vinaigrettes, dressing salads and in-house reference materials (i-HRM) using two different Partial Least Squares Regression (PLS-R) multivariate quantification models. Several data pre-processing steps were applied: (i) internal normalization; (ii) sampling based on the Nyquist theorem; (iii) internal correlation optimized shifting (icoshift); (iv) baseline correction; (v) mean centering; and (vi) selection of zones. The first model corresponds to a matrix of dimensions 'n×911' variables and the second one to a matrix of dimensions 'n×431' variables. It should be highlighted that the two proposed PLS-R models allow the quantification of extra virgin olive oil in binary blends, foodstuffs, etc., when its percentage is greater than 25%. Copyright © 2017 Elsevier Ltd. All rights reserved.
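
    As a hedged sketch of the kind of PLS-R quantification model described above (not the published models themselves), the following Python snippet fits a PLS regression to normalized fingerprints and reports a cross-validated prediction error; the fingerprint matrix is synthetic, and the component count and sample sizes are assumptions.

      # PLS regression for EVOO content from normalized chromatographic fingerprints.
      import numpy as np
      from sklearn.cross_decomposition import PLSRegression
      from sklearn.model_selection import cross_val_predict

      rng = np.random.default_rng(0)
      n_samples, n_points = 60, 431                 # e.g. the 'n x 431' variable model
      evoo_fraction = rng.uniform(0.25, 1.0, n_samples)        # known EVOO content
      base = rng.normal(size=n_points)
      fingerprints = (np.outer(evoo_fraction, base)
                      + 0.05 * rng.normal(size=(n_samples, n_points)))

      pls = PLSRegression(n_components=5)
      pred = cross_val_predict(pls, fingerprints, evoo_fraction, cv=10).ravel()
      rmsecv = np.sqrt(np.mean((pred - evoo_fraction) ** 2))
      print(f"RMSECV = {rmsecv:.3f}")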

  5. Whole farm quantification of GHG emissions within smallholder farms in developing countries

    NASA Astrophysics Data System (ADS)

    Seebauer, Matthias

    2014-03-01

    The IPCC has compiled the best available scientific methods into published guidelines for estimating greenhouse gas emissions and emission removals from the land-use sector. In order to evaluate whether existing GHG quantification tools can comprehensively quantify GHG emissions and removals under smallholder conditions, farm-scale quantification was tested with farm data from Western Kenya. After conducting a cluster analysis to identify different farm typologies, GHG quantification was carried out using the VCS SALM methodology complemented with IPCC livestock emission factors and the Cool Farm Tool. The emission profiles of four farm clusters representing the baseline conditions in the year 2009 are compared with 2011, when farmers had adopted sustainable land management practices (SALM). The results demonstrate the variation both in the magnitude of the estimated GHG emissions per ha between different smallholder farm typologies and in the emissions estimated by applying two different accounting tools. The farm-scale quantification further shows that the adoption of SALM has a significant impact on emission reductions and removals, and the mitigation benefits range between 4 and 6.5 tCO2 ha⁻¹ yr⁻¹, with significantly different mitigation benefits depending on the typologies of the crop-livestock systems, their different agricultural practices, and the adoption rates of improved practices. However, the inherent uncertainty related to the emission factors applied by accounting tools has substantial implications for reported agricultural emissions. With regard to uncertainty related to activity data, the assessment confirms the high variability within different farm types as well as between different parameters surveyed to comprehensively quantify GHG emissions within smallholder farms.

  6. Quantification of polyhydroxyalkanoates in mixed and pure cultures biomass by Fourier transform infrared spectroscopy: comparison of different approaches.

    PubMed

    Isak, I; Patel, M; Riddell, M; West, M; Bowers, T; Wijeyekoon, S; Lloyd, J

    2016-08-01

    Fourier transform infrared (FTIR) spectroscopy was used in this study for the rapid quantification of polyhydroxyalkanoates (PHA) in mixed and pure culture bacterial biomass. Three different statistical analysis methods (regression, partial least squares (PLS) and nonlinear) were applied to the FTIR data, and the results were plotted against the PHA values measured with the reference gas chromatography technique. All methods predicted PHA content in mixed culture biomass with comparable efficiency, indicated by similar residual values. The PHA in these cultures ranged from low to medium concentration (0-44 wt% of dried biomass content). However, for the analysis of the combined mixed and pure culture biomass, with PHA concentrations ranging from low to high (0-93% of dried biomass content), the PLS method was the most efficient. This paper reports, for the first time, the use of a single calibration model constructed with a combination of mixed and pure cultures covering a wide PHA range for predicting PHA content in biomass. Currently, no single universal method exists for processing FTIR data for polyhydroxyalkanoate (PHA) quantification. This study compares three different methods of analysing FTIR data for quantification of PHAs in biomass. A new data-processing approach was proposed and the results were compared against existing literature methods. Most publications report PHA quantification over a medium concentration range in pure cultures. However, in our study we encompassed both mixed and pure culture biomass containing a broader range of PHA in the calibration curve. The resulting prediction model is useful for rapid quantification of a wider range of PHA contents in biomass. © 2016 The Society for Applied Microbiology.

  7. Quantification of real thermal, catalytic, and hydrodeoxygenated bio-oils via comprehensive two-dimensional gas chromatography with mass spectrometry.

    PubMed

    Silva, Raquel V S; Tessarolo, Nathalia S; Pereira, Vinícius B; Ximenes, Vitor L; Mendes, Fábio L; de Almeida, Marlon B B; Azevedo, Débora A

    2017-03-01

    The elucidation of bio-oil composition is important to evaluate the processes of biomass conversion and its upgrading, and to suggest the proper use for each sample. Comprehensive two-dimensional gas chromatography with time-of-flight mass spectrometry (GC×GC-TOFMS) is a widely applied analytical approach for bio-oil investigation due to the higher separation and resolution capacity from this technique. This work addresses the issue of analytical performance to assess the comprehensive characterization of real bio-oil samples via GC×GC-TOFMS. The approach was applied to the individual quantification of compounds of real thermal (PWT), catalytic process (CPO), and hydrodeoxygenation process (HDO) bio-oils. Quantification was performed with reliability using the analytical curves of oxygenated and hydrocarbon standards as well as the deuterated internal standards. The limit of quantification was set at 1 ng µL⁻¹ for major standards, except for hexanoic acid, which was set at 5 ng µL⁻¹. The GC×GC-TOFMS method provided good precision (<10%) and excellent accuracy (recovery range of 70-130%) for the quantification of individual hydrocarbons and oxygenated compounds in real bio-oil samples. Sugars, furans, and alcohols appear as the major constituents of the PWT, CPO, and HDO samples, respectively. In order to obtain bio-oils with better quality, the catalytic pyrolysis process may be a better option than hydrogenation due to the effective reduction of oxygenated compound concentrations and the lower cost of the process, when hydrogen is not required to promote deoxygenation in the catalytic pyrolysis process. Copyright © 2016 Elsevier B.V. All rights reserved.

  8. Quantification of myocardial fibrosis by digital image analysis and interactive stereology

    PubMed Central

    2014-01-01

    Background Cardiac fibrosis disrupts the normal myocardial structure and has a direct impact on heart function and survival. Despite already available digital methods, the pathologist’s visual score is still widely considered as ground truth and used as a primary method in histomorphometric evaluations. The aim of this study was to compare the accuracy of digital image analysis tools and the pathologist’s visual scoring for evaluating fibrosis in human myocardial biopsies, based on reference data obtained by point counting performed on the same images. Methods Endomyocardial biopsy material from 38 patients diagnosed with inflammatory dilated cardiomyopathy was used. The extent of total cardiac fibrosis was assessed by image analysis on Masson’s trichrome-stained tissue specimens using automated Colocalization and Genie software, by Stereology grid count and manually by Pathologist’s visual score. Results A total of 116 slides were analyzed. The mean results obtained by the Colocalization software (13.72 ± 12.24%) were closest to the reference value of stereology (RVS), while the Genie software and Pathologist score gave a slight underestimation. RVS values correlated strongly with values obtained using the Colocalization and Genie (r > 0.9, p < 0.001) software as well as the pathologist visual score. Differences in fibrosis quantification by Colocalization and RVS were statistically insignificant. However, significant bias was found in the results obtained by using Genie versus RVS and pathologist score versus RVS with mean difference values of: -1.61% and 2.24%. Bland-Altman plots showed a bidirectional bias dependent on the magnitude of the measurement: Colocalization software overestimated the area fraction of fibrosis in the lower end, and underestimated in the higher end of the RVS values. Meanwhile, Genie software as well as the pathologist score showed more uniform results throughout the values, with a slight underestimation in the mid

  9. Analytical validation of a flow cytometric protocol for quantification of platelet microparticles in dogs.

    PubMed

    Cremer, Signe E; Krogh, Anne K H; Hedström, Matilda E K; Christiansen, Liselotte B; Tarnow, Inge; Kristensen, Annemarie T

    2018-06-01

    Platelet microparticles (PMPs) are subcellular procoagulant vesicles released upon platelet activation. In people with clinical diseases, alterations in PMP concentrations have been extensively investigated, but few canine studies exist. This study aimed to validate a canine flow cytometric protocol for PMP quantification and to assess the influence of calcium on PMP concentrations. Microparticles (MPs) were quantified in citrated whole blood (WB) and platelet-poor plasma (PPP) using flow cytometry. Anti-CD61 antibody and Annexin V (AnV) were used to detect platelets and phosphatidylserine, respectively. In 13 healthy dogs, CD61+/AnV- concentrations were analyzed with/without a calcium buffer. CD61+/AnV-, CD61+/AnV+, and CD61-/AnV+ MP quantification was validated in 10 healthy dogs. The coefficients of variation (CVs) for duplicate (intra-assay) and parallel (inter-assay) analyses and the detection limits (DLs) were calculated. CD61+/AnV- concentrations were higher with calcium buffer, 841,800 MP/μL (526,000-1,666,200), than without, 474,200 MP/μL (278,800-997,500), P < .05. In WB, PMPs were above DLs and demonstrated acceptable (<20%) intra-assay and inter-assay CVs in 9/10 dogs: 1.7% (0.5-8.9) and 9.0% (0.9-11.9), respectively, for CD61+/AnV- and 2.4% (0.2-8.7) and 7.8% (0.0-12.8), respectively, for CD61+/AnV+. Acceptable CVs were not seen for the CD61-/AnV+ MPs. In PPP, quantification was challenged by high inter-assay CVs and overlapping DLs, and hemolysis and lipemia interfered with quantification in 5/10 dogs. Calcium induced higher in vitro PMP concentrations, likely due to platelet activation. PMP concentrations were reliably quantified in WB, indicating the potential for clinical applications. PPP analyses were unreliable due to high inter-assay CVs and DL overlap, and not obtainable due to hemolysis and lipemia interference. © 2018 American Society for Veterinary Clinical Pathology.
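    A small sketch of the intra- and inter-assay coefficient-of-variation calculation used as the acceptance criterion above; the duplicate and parallel PMP counts are hypothetical.

```python
import statistics

def cv_percent(values):
    """Coefficient of variation (%) of replicate measurements."""
    return 100 * statistics.stdev(values) / statistics.mean(values)

# Hypothetical duplicate (intra-assay) and parallel (inter-assay) PMP counts (MP/uL).
intra = [835_000, 848_000]   # same sample measured twice within one analysis
inter = [820_000, 905_000]   # same sample measured in two independent analyses
print(f"intra-assay CV: {cv_percent(intra):.1f}%, inter-assay CV: {cv_percent(inter):.1f}%")
# Values below the 20% threshold cited above would be considered acceptable.
```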

  10. Quantification of Lignin and Its Structural Features in Plant Biomass Using 13C Lignin as Internal Standard for Pyrolysis-GC-SIM-MS

    PubMed Central

    2017-01-01

    Understanding the mechanisms underlying plant biomass recalcitrance at the molecular level can only be achieved by accurate analyses of both the content and the structural features of the molecules involved. Current quantification of lignin is, however, largely based on nonspecific gravimetric analysis after sulfuric acid hydrolysis. Hence, our research aimed at specific lignin quantification with concurrent characterization of its structural features. To this end, for the first time, a polymeric 13C lignin was used as internal standard (IS) for lignin quantification via analytical pyrolysis coupled to gas chromatography with mass-spectrometric detection in selected ion monitoring mode (py-GC-SIM-MS). In addition, relative response factors (RRFs) for the various pyrolysis products obtained were determined and applied. First, 12C and 13C lignin were isolated from nonlabeled and uniformly 13C-labeled wheat straw, respectively, and characterized by heteronuclear single quantum coherence (HSQC) nuclear magnetic resonance (NMR) and py-GC/MS. The two lignin isolates were found to have identical structures. Second, 13C-IS based lignin quantification by py-GC-SIM-MS was validated in reconstituted biomass model systems with known contents of the 12C lignin analogue and was shown to be extremely accurate (>99.9%, R2 > 0.999) and precise (RSD < 1.5%). Third, 13C-IS based lignin quantification was applied to four common poaceous biomass sources (wheat straw, barley straw, corn stover, and sugar cane bagasse), and lignin contents were in good agreement with the total gravimetrically determined lignin contents. Our robust method proves to be a promising alternative for the high-throughput quantification of lignin in milled biomass samples directly and simultaneously provides direct insight into the structural features of lignin. PMID:28926698
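    One plausible way to turn 13C-IS-normalised peak areas and relative response factors into a lignin amount is sketched below; the pyrolysis products, peak areas, RRFs and spiked IS mass are hypothetical, and the exact computation in the paper may differ.

```python
# Hypothetical peak areas for a few pyrolysis products, each detected in its 12C
# (analyte) and 13C (internal standard) form, with assumed relative response factors.
is_added_mg = 0.50                      # mass of 13C lignin IS spiked into the sample
peaks = {
    #               area_12C, area_13C, RRF
    "guaiacol":      (1.8e6,   1.5e6,   1.05),
    "4-vinylphenol": (2.4e6,   2.1e6,   0.92),
    "syringol":      (0.9e6,   0.8e6,   1.10),
}

# Each product yields an estimate of the 12C/13C lignin mass ratio; average them.
ratios = [a12 / a13 / rrf for a12, a13, rrf in peaks.values()]
lignin_mg = is_added_mg * sum(ratios) / len(ratios)
print(f"estimated lignin: {lignin_mg:.2f} mg")
```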

  11. Relative quantification of biomarkers using mixed-isotope labeling coupled with MS

    PubMed Central

    Chapman, Heidi M; Schutt, Katherine L; Dieter, Emily M; Lamos, Shane M

    2013-01-01

    The identification and quantification of important biomarkers is a critical first step in the elucidation of biological systems. Biomarkers take many forms as cellular responses to stimuli and can be manifested during transcription, translation, and/or metabolic processing. Increasingly, researchers have relied upon mixed-isotope labeling (MIL) coupled with MS to perform relative quantification of biomarkers between two or more biological samples. MIL effectively tags biomarkers of interest for ease of identification and quantification within the mass spectrometer by using isotopic labels that introduce a heavy and light form of the tag. In addition to MIL coupled with MS, a number of other approaches have been used to quantify biomarkers including protein gel staining, enzymatic labeling, metabolic labeling, and several label-free approaches that generate quantitative data from the MS signal response. This review focuses on MIL techniques coupled with MS for the quantification of protein and small-molecule biomarkers. PMID:23157360

  12. pyQms enables universal and accurate quantification of mass spectrometry data.

    PubMed

    Leufken, Johannes; Niehues, Anna; Sarin, L Peter; Wessel, Florian; Hippler, Michael; Leidel, Sebastian A; Fufezan, Christian

    2017-10-01

    Quantitative mass spectrometry (MS) is a key technique in many research areas (1), including proteomics, metabolomics, glycomics, and lipidomics. Because all of the corresponding molecules can be described by chemical formulas, universal quantification tools are highly desirable. Here, we present pyQms, an open-source software for accurate quantification of all types of molecules measurable by MS. pyQms uses isotope pattern matching that offers an accurate quality assessment of all quantifications and the ability to directly incorporate mass spectrometer accuracy. pyQms is, due to its universal design, applicable to every research field, labeling strategy, and acquisition technique. This opens ultimate flexibility for researchers to design experiments employing innovative and hitherto unexplored labeling strategies. Importantly, pyQms performs very well to accurately quantify partially labeled proteomes in large scale and high throughput, the most challenging task for a quantification algorithm. © 2017 by The American Society for Biochemistry and Molecular Biology, Inc.
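    As a rough illustration of isotope-pattern-based quantification, the sketch below scores a measured isotopologue intensity vector against a theoretical pattern using cosine similarity; pyQms' own matching score and its explicit mass-accuracy model are more elaborate, and the pattern shown is hypothetical.

```python
import numpy as np

def pattern_match_score(measured, theoretical):
    """Cosine similarity between measured and theoretical isotope patterns.

    Both vectors hold intensities at the same isotopologue positions (M, M+1, ...).
    Illustrative only; not the scoring function implemented in pyQms.
    """
    m, t = np.asarray(measured, float), np.asarray(theoretical, float)
    return float(np.dot(m, t) / (np.linalg.norm(m) * np.linalg.norm(t)))

theoretical = [1.00, 0.55, 0.18, 0.04]          # hypothetical relative abundances
measured = [9.8e5, 5.6e5, 1.6e5, 4.1e4]         # hypothetical raw intensities
print(f"match score: {pattern_match_score(measured, theoretical):.3f}")
```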

  13. Quantification by aberration corrected (S)TEM of boundaries formed by symmetry breaking phase transformations.

    PubMed

    Schryvers, D; Salje, E K H; Nishida, M; De Backer, A; Idrissi, H; Van Aert, S

    2017-05-01

    The present contribution gives a review of recent quantification work of atom displacements, atom site occupations and level of crystallinity in various systems and based on aberration corrected HR(S)TEM images. Depending on the case studied, picometer range precisions for individual distances can be obtained, boundary widths at the unit cell level determined or statistical evolutions of fractions of the ordered areas calculated. In all of these cases, these quantitative measures imply new routes for the applications of the respective materials. Copyright © 2017 Elsevier B.V. All rights reserved.

  14. Definition of a new thermal contrast and pulse correction for defect quantification in pulsed thermography

    NASA Astrophysics Data System (ADS)

    Benítez, Hernán D.; Ibarra-Castanedo, Clemente; Bendada, AbdelHakim; Maldague, Xavier; Loaiza, Humberto; Caicedo, Eduardo

    2008-01-01

    It is well known that methods of thermographic non-destructive testing based on the thermal contrast are strongly affected by non-uniform heating at the surface. Hence, the results obtained from these methods depend considerably on the chosen reference point. The differential absolute contrast (DAC) method was developed to eliminate the need to determine a reference point by defining the thermal contrast with respect to an ideal sound area. Although very useful at early times, the DAC accuracy decreases when the heat front approaches the sample rear face. We propose a new DAC version that explicitly introduces the sample thickness using thermal quadrupoles theory, and we show that the new DAC's range of validity extends to long times while preserving its validity at short times. This new contrast is used for defect quantification in composite, Plexiglas™ and aluminum samples.
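    For orientation, the classical DAC formulation for a pulse-heated semi-infinite body is DAC(t) = T(t) - sqrt(t'/t)·T(t'), where t' is an early reference time; the thickness-corrected, quadrupole-based version proposed in the paper is not reproduced here. The sketch below applies the classical form to a synthetic cooling curve with an artificial defect signature.

```python
import numpy as np

def dac_classical(T, t, t_ref):
    """Classical differential absolute contrast: DAC(t) = T(t) - sqrt(t_ref/t) * T(t_ref)."""
    T, t = np.asarray(T, float), np.asarray(t, float)
    T_ref = np.interp(t_ref, t, T)          # surface temperature at the reference time
    return T - np.sqrt(t_ref / t) * T_ref

# Synthetic 1/sqrt(t) cooling curve with a small "defect" bump appearing after 2 s.
t = np.linspace(0.05, 5.0, 200)
T = 1.0 / np.sqrt(t) + 0.05 * (t > 2.0)
print(dac_classical(T, t, t_ref=0.1)[-5:])  # nonzero late-time contrast reveals the defect
```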

  15. Fast quantification of bovine milk proteins employing external cavity-quantum cascade laser spectroscopy.

    PubMed

    Schwaighofer, Andreas; Kuligowski, Julia; Quintás, Guillermo; Mayer, Helmut K; Lendl, Bernhard

    2018-06-30

    Analysis of proteins in bovine milk is usually tackled by time-consuming analytical approaches involving wet-chemical, multi-step sample clean-up procedures. The use of external cavity-quantum cascade laser (EC-QCL) based IR spectroscopy was evaluated as an alternative screening tool for direct and simultaneous quantification of individual proteins (i.e. casein and β-lactoglobulin) and total protein content in commercial bovine milk samples. Mid-IR spectra of protein standard mixtures were used for building partial least squares (PLS) regression models. A sample set comprising different milk types (pasteurized; differently processed extended shelf life, ESL; ultra-high temperature, UHT) was analysed and results were compared to reference methods. Concentration values of the QCL-IR spectroscopy approach obtained within several minutes are in good agreement with reference methods involving multiple sample preparation steps. The potential application as a fast screening method for estimating the heat load applied to liquid milk is demonstrated. Copyright © 2018 Elsevier Ltd. All rights reserved.
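    A minimal sketch of the PLS calibration step described above, using scikit-learn's PLSRegression; the spectra and casein concentrations are random placeholders, and in practice the number of latent variables would be chosen by cross-validation on real standard-mixture spectra.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(0)
X_train = rng.normal(size=(30, 400))     # 30 calibration spectra, 400 wavenumber points
y_train = rng.uniform(20, 35, size=30)   # reference casein concentrations (g/L)

pls = PLSRegression(n_components=5)      # latent variables; tune by cross-validation
pls.fit(X_train, y_train)

X_milk = rng.normal(size=(3, 400))       # spectra of three unknown milk samples
print(pls.predict(X_milk).ravel())       # predicted casein concentrations (g/L)
```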

  16. Microbial quantification in activated sludge: the hits and misses.

    PubMed

    Hall, S J; Keller, J; Blackall, L L

    2003-01-01

    Since the implementation of the activated sludge process for treating wastewater, there has been a reliance on chemical and physical parameters to monitor the system. However, in biological nutrient removal (BNR) processes, the microorganisms responsible for some of the transformations should be used to monitor the processes with the overall goal to achieve better treatment performance. The development of in situ identification and rapid quantification techniques for key microorganisms involved in BNR are required to achieve this goal. This study explored the quantification of Nitrospira, a key organism in the oxidation of nitrite to nitrate in BNR. Two molecular genetic microbial quantification techniques were evaluated: real-time polymerase chain reaction (PCR) and fluorescence in situ hybridisation (FISH) followed by digital image analysis. A correlation between the Nitrospira quantitative data and the nitrate production rate, determined in batch tests, was attempted. The disadvantages and advantages of both methods will be discussed.

  17. [Development and validation of an HPLC method for the quantification of vitamin A in human milk. Its application to a rural population in Argentina].

    PubMed

    López, Laura B; Baroni, Andrea V; Rodríguez, Viviana G; Greco, Carola B; de Costa, Sara Macías; de Ferrer, Patricia Ronayne; Rodríguez de Pece, Silvia

    2005-06-01

    A methodology for the quantification of vitamin A in human milk was developed and validated. Vitamin A levels were assessed in 223 samples corresponding to the 5th, 6th and 7th postpartum months, obtained in the province of Santiago del Estero, Argentina. The samples (500 microL) were saponified with potassium hydroxide/ethanol, extracted with hexane, evaporated to dryness and reconstituted with methanol. An RP-C18 column, a methanol/water mobile phase (91:9 v/v) and a fluorescence detector (lambda excitation 330 nm, lambda emission 470 nm) were used for the separation and quantification of vitamin A. The analytical parameters of linearity (r2: 0.9995), detection (0.010 microg/mL) and quantification (0.025 microg/mL) limits, precision of the method (relative standard deviation, RSD = 9.0% within a day and RSD = 8.9% among days) and accuracy (recovery = 83.8%) demonstrate that the developed method allows the quantification of vitamin A in an efficient way. The mean values ± standard deviation (SD) obtained for the analyzed samples were 0.60 ± 0.32, 0.65 ± 0.33 and 0.61 ± 0.26 microg/mL for the 5th, 6th and 7th postpartum months, respectively. There were no significant differences among the three months studied and the values found were similar to those in the literature. Considering the whole population under study, 19.3% showed vitamin A levels below 0.40 microg/mL, which represents a risk to the children in this group since at least 0.50 microg/mL is necessary to meet the infant's daily needs.

  18. Quantification and Statistical Analysis Methods for Vessel Wall Components from Stained Images with Masson's Trichrome

    PubMed Central

    Hernández-Morera, Pablo; Castaño-González, Irene; Travieso-González, Carlos M.; Mompeó-Corredera, Blanca; Ortega-Santana, Francisco

    2016-01-01

    Purpose To develop a digital image processing method to quantify structural components (smooth muscle fibers and extracellular matrix) in the vessel wall stained with Masson’s trichrome, and a statistical method suitable for small sample sizes to analyze the results previously obtained. Methods The quantification method comprises two stages. The pre-processing stage improves tissue image appearance and the vessel wall area is delimited. In the feature extraction stage, the vessel wall components are segmented by grouping pixels with a similar color. The area of each component is calculated by normalizing the number of pixels of each group by the vessel wall area. Statistical analyses are implemented by permutation tests, based on resampling without replacement from the set of the observed data to obtain a sampling distribution of an estimator. The implementation can be parallelized on a multicore machine to reduce execution time. Results The methods have been tested on 48 vessel wall samples of the internal saphenous vein stained with Masson’s trichrome. The results show that the segmented areas are consistent with the perception of a team of doctors and demonstrate good correlation between the expert judgments and the measured parameters for evaluating vessel wall changes. Conclusion The proposed methodology offers a powerful tool to quantify some components of the vessel wall. It is more objective, sensitive and accurate than the biochemical and qualitative methods traditionally used. The permutation tests are suitable statistical techniques to analyze the numerical measurements obtained when the underlying assumptions of the other statistical techniques are not met. PMID:26761643
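    The permutation testing described above can be sketched as follows for two small groups: the group labels are repeatedly reshuffled (resampling without replacement) to build the null distribution of the difference in means. The area-fraction values are invented.

```python
import numpy as np

def permutation_test(x, y, n_perm=10_000, seed=0):
    """Two-sided permutation test on the difference of group means."""
    rng = np.random.default_rng(seed)
    x, y = np.asarray(x, float), np.asarray(y, float)
    observed = abs(x.mean() - y.mean())
    pooled = np.concatenate([x, y])
    hits = 0
    for _ in range(n_perm):
        rng.shuffle(pooled)               # relabel the pooled observations
        if abs(pooled[:len(x)].mean() - pooled[len(x):].mean()) >= observed:
            hits += 1
    return (hits + 1) / (n_perm + 1)      # small-sample-safe p-value

# Hypothetical smooth-muscle area fractions (%) in two small groups of vessel walls.
group_a = [41.2, 38.5, 44.1, 40.0, 39.7]
group_b = [35.9, 33.4, 36.8, 34.2, 37.1]
print(permutation_test(group_a, group_b))
```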

  19. Quantification of source uncertainties in Seismic Probabilistic Tsunami Hazard Analysis (SPTHA)

    NASA Astrophysics Data System (ADS)

    Selva, J.; Tonini, R.; Molinari, I.; Tiberti, M. M.; Romano, F.; Grezio, A.; Melini, D.; Piatanesi, A.; Basili, R.; Lorito, S.

    2016-06-01

    We propose a procedure for uncertainty quantification in Probabilistic Tsunami Hazard Analysis (PTHA), with a special emphasis on the uncertainty related to statistical modelling of the earthquake source in Seismic PTHA (SPTHA), and on the separate treatment of subduction and crustal earthquakes (treated as background seismicity). An event tree approach and ensemble modelling are used instead of more classical approaches, such as the hazard integral and the logic tree. This procedure consists of four steps: (1) exploration of aleatory uncertainty through an event tree, with alternative implementations for exploring epistemic uncertainty; (2) numerical computation of tsunami generation and propagation up to a given offshore isobath; (3) (optional) site-specific quantification of inundation; (4) simultaneous quantification of aleatory and epistemic uncertainty through ensemble modelling. The proposed procedure is general and independent of the kind of tsunami source considered; however, we implement step 1, the event tree, specifically for SPTHA, focusing on seismic source uncertainty. To exemplify the procedure, we develop a case study considering seismic sources in the Ionian Sea (central-eastern Mediterranean Sea), using the coasts of Southern Italy as a target zone. The results show that an efficient and complete quantification of all the uncertainties is feasible even when treating a large number of potential sources and a large set of alternative model formulations. We also find that (i) treating subduction and background (crustal) earthquakes separately allows for optimal use of available information and for avoiding significant biases; (ii) both subduction interface and crustal faults contribute to the SPTHA, with different proportions that depend on source-target position and tsunami intensity; (iii) the proposed framework allows sensitivity and deaggregation analyses, demonstrating the applicability of the method for operational assessments.

  20. Subnuclear foci quantification using high-throughput 3D image cytometry

    NASA Astrophysics Data System (ADS)

    Wadduwage, Dushan N.; Parrish, Marcus; Choi, Heejin; Engelward, Bevin P.; Matsudaira, Paul; So, Peter T. C.

    2015-07-01

    Ionising radiation causes various types of DNA damage, including double strand breaks (DSBs). DSBs are often recognized by the DNA repair protein ATM, which leads to the formation of gamma-H2AX foci at the sites of the DSBs that can be visualized using immunohistochemistry. However, most such experiments are of low throughput in terms of imaging and image analysis techniques. Most studies still use manual counting or classification, and hence they are limited to counting a low number of foci per cell (about 5 foci per nucleus) because the quantification process is extremely labour intensive. Therefore we have developed a high-throughput instrumentation and computational pipeline specialized for gamma-H2AX foci quantification. A population of cells with highly clustered foci inside nuclei was imaged in 3D with submicron resolution using an in-house developed high-throughput image cytometer. Imaging speeds as high as 800 cells/second in 3D were achieved by using HiLo wide-field depth-resolved imaging and a remote z-scanning technique. The number of foci per cell nucleus was then quantified using a 3D extended-maxima-transform-based algorithm. Our results suggest that while most other 2D imaging and manual quantification studies can count only up to about 5 foci per nucleus, our method is capable of counting more than 100. Moreover, we show that 3D analysis is significantly superior to 2D techniques.
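    A rough sketch of extended-maxima-style foci counting on a 3D stack, here approximated with scikit-image's h_maxima followed by connected-component labelling; this is not the authors' exact algorithm, and the synthetic nucleus is for illustration only.

```python
import numpy as np
from scipy import ndimage as ndi
from skimage.morphology import h_maxima

def count_foci(nucleus_stack, h=0.2):
    """Count bright foci in a 3D intensity stack of a single nucleus."""
    maxima = h_maxima(nucleus_stack, h)   # keep maxima rising at least h above surroundings
    _, n_foci = ndi.label(maxima)         # merge touching voxels so each focus counts once
    return n_foci

# Synthetic nucleus: dark background with three Gaussian-like foci.
z, y, x = np.indices((20, 40, 40))
stack = np.zeros((20, 40, 40))
for cz, cy, cx in [(5, 10, 10), (10, 25, 30), (15, 12, 28)]:
    stack += np.exp(-((z - cz) ** 2 + (y - cy) ** 2 + (x - cx) ** 2) / 6.0)
print(count_foci(stack))                  # -> 3
```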

  1. Identification and quantification of Cu-chlorophyll adulteration of edible oils.

    PubMed

    Fang, Mingchih; Tsai, Chia-Fen; Wu, Guan-Yan; Tseng, Su-Hsiang; Cheng, Hwei-Fang; Kuo, Ching-Hao; Hsu, Che-Lun; Kao, Ya-Min; Shih, Daniel Yang-Chih; Chiang, Yu-Mei

    2015-01-01

    Cu-pyropheophytin a, the major Cu-pigment of Cu-chlorophyll, was determined in edible oil by high-resolution mass spectrometry with a high-performance liquid chromatography-quadrupole (HPLC-Q)-Orbitrap system and by HPLC coupled with a photodiode-array detector. Respective limit of detection and limit of quantification levels of 0.02 μg/g and 0.05 μg/g were obtained. Twenty-nine commercial oil products marked as olive oil, grapeseed oil and blended oil, all sourced directly from a food company that committed adulteration with Cu-chlorophyll, were investigated. In this company, four green dyes illegally used in oils were seized during factory investigation by the health authorities. The food additive Cu-pyropheophytin a was found in all confiscated samples in concentrations between 0.02 and 0.39 μg/g. Survey results of another 235 commercial oil samples manufactured from other companies, including olive pomace oil, extra virgin olive oil, olive oil, grapeseed oil and blended oil, indicated high positive incidences of 63%, 39%, 44%, 97% and 8%, respectively, with a concentration range between 0.02 and 0.54 μg/g. High Cu-chlorophyll concentrations are indications for fraudulent adulteration of oils.

  2. Low order models for uncertainty quantification in acoustic propagation problems

    NASA Astrophysics Data System (ADS)

    Millet, Christophe

    2016-11-01

    Long-range sound propagation problems are characterized by both a large number of length scales and a large number of normal modes. In the atmosphere, these modes are confined within waveguides, causing the sound to propagate through multiple paths to the receiver. For uncertain atmospheres, the modes are described as random variables. Concise mathematical models and analysis reveal fundamental limitations of classical projection techniques, which stem from the fact that modes carrying small variance can nevertheless have important effects on the large-variance modes. In the present study, we propose a systematic strategy for obtaining statistically accurate low-order models. The normal modes are sorted by decreasing Sobol indices using asymptotic expansions, and the relevant modes are extracted using a modified iterative Krylov-based method. The statistics of acoustic signals are computed by decomposing the original pulse into a truncated sum of modal pulses that can be described by a stationary phase method. As the low-order acoustic model preserves the overall structure of waveforms under perturbations of the atmosphere, it can be applied to uncertainty quantification. The result of this study is a new algorithm that applies to the entire phase space of acoustic fields.

  3. freeQuant: A Mass Spectrometry Label-Free Quantification Software Tool for Complex Proteome Analysis.

    PubMed

    Deng, Ning; Li, Zhenye; Pan, Chao; Duan, Huilong

    2015-01-01

    The study of complex proteomes places higher demands on mass spectrometry-based quantification methods. In this paper, we present a mass spectrometry label-free quantification tool for complex proteomes, called freeQuant, which effectively integrates quantification with functional analysis. freeQuant consists of two well-integrated modules: label-free quantification and functional analysis with biomedical knowledge. freeQuant supports label-free quantitative analysis that makes full use of tandem mass spectrometry (MS/MS) spectral counts, protein sequence length, shared peptides, and ion intensity. It adopts spectral counting for quantitative analysis and introduces a new method for handling shared peptides to accurately evaluate the abundance of isoforms. For proteins with low abundance, MS/MS total ion count coupled with spectral count is included to ensure accurate protein quantification. Furthermore, freeQuant supports large-scale functional annotation for complex proteomes. Mitochondrial proteomes from the mouse heart, the mouse liver, and the human heart were used to evaluate the usability and performance of freeQuant. The evaluation showed that the quantitative algorithms implemented in freeQuant improve quantification accuracy and dynamic range.
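    As one common spectral-count normalisation (not necessarily the formula implemented in freeQuant), the sketch below computes normalised spectral abundance factors (NSAF) from spectral counts and protein lengths; all values are hypothetical.

```python
# Hypothetical spectral counts and sequence lengths for three proteins.
proteins = {
    # name:     (MS/MS spectral count, length in residues)
    "ProteinA": (120, 450),
    "ProteinB": (35, 300),
    "ProteinC": (80, 1200),
}

# NSAF: each protein's length-normalised spectral count divided by the sum over all proteins.
saf = {name: spc / length for name, (spc, length) in proteins.items()}
total = sum(saf.values())
for name, value in saf.items():
    print(f"{name}: NSAF = {value / total:.3f}")
```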

  4. Evaluation of a Commercial Sandwich Enzyme-Linked Immunosorbent Assay for the Quantification of Beta-Casomorphin 7 in Yogurt Using Solid-Phase Extraction Coupled to Liquid Chromatography-Tandem Mass Spectrometry as the "Gold Standard" Method.

    PubMed

    Nguyen, Duc Doan; Busetti, Francesco; Johnson, Stuart Keith; Solah, Vicky Ann

    2018-03-01

    This study investigated beta-casomorphin 7 (BCM7) in yogurt by means of LC-tandem MS (MS/MS) and enzyme-linked immunosorbent assay (ELISA), using LC-MS/MS as the "gold standard" method to evaluate the applicability of a commercial ELISA. The level of BCM7 in milk obtained from ELISA analysis was much lower than that obtained by LC-MS/MS analysis and tended to increase during fermentation and storage of yogurt. Meanwhile, the results obtained from LC-MS/MS showed that BCM7 degraded during the stages of yogurt processing, and its degradation may have been caused by X-prolyl dipeptidyl aminopeptidase activity. As a result, the commercial sandwich ELISA kit was not suitable for the quantification of BCM7 in fermented dairy milk.

  5. Single cell genomic quantification by non-fluorescence nonlinear microscopy

    NASA Astrophysics Data System (ADS)

    Kota, Divya; Liu, Jing

    2017-02-01

    Human epidermal growth factor receptor 2 (Her2) is a gene that plays a major role in breast cancer development. The quantification of Her2 expression in single cells is limited by several drawbacks of existing fluorescence-based single-molecule techniques, such as low signal-to-noise ratio (SNR), strong autofluorescence, and background signals from biological components. For rigorous genomic quantification, a robust method of orthogonal detection is highly desirable, and we demonstrated it by two non-fluorescent imaging techniques: transient absorption microscopy (TAM) and second harmonic generation (SHG). In TAM, gold nanoparticles (AuNPs) are chosen as orthogonal probes for detection of single molecules, which gives background-free quantification of single mRNA transcripts. In SHG, emission from barium titanium oxide (BTO) nanoprobes was demonstrated, which allows a stable signal beyond the autofluorescence window. Her2 mRNA was specifically labeled with nanoprobes conjugated with antibodies or oligonucleotides and quantified at single-copy sensitivity in cancer cells and tissues. Furthermore, a non-fluorescent super-resolution concept, named second harmonic super-resolution microscopy (SHaSM), was proposed to quantify individual Her2 transcripts in cancer cells beyond the diffraction limit. These non-fluorescent imaging modalities will provide new dimensions in biomarker quantification at single-molecule sensitivity in turbid biological samples, offering a strong cross-platform strategy for clinical monitoring at single-cell resolution.

  6. Sensitivity of Chemical Shift-Encoded Fat Quantification to Calibration of Fat MR Spectrum

    PubMed Central

    Wang, Xiaoke; Hernando, Diego; Reeder, Scott B.

    2015-01-01

    Purpose To evaluate the impact of different fat spectral models on proton density fat-fraction (PDFF) quantification using chemical shift-encoded (CSE) MRI. Material and Methods Simulations and in vivo imaging were performed. In a simulation study, spectral models of fat were compared pairwise. Comparison of magnitude fitting and mixed fitting was performed over a range of echo times and fat fractions. In vivo acquisitions from 41 patients were reconstructed using 7 published spectral models of fat. T2-corrected STEAM-MRS was used as reference. Results Simulations demonstrate that imperfectly calibrated spectral models of fat result in biases that depend on echo times and fat fraction. Mixed fitting is more robust against this bias than magnitude fitting. Multi-peak spectral models showed much smaller differences among themselves than when compared to the single-peak spectral model. In vivo studies show that all multi-peak models agree better with the reference standard (for mixed fitting, slopes ranged from 0.967 to 1.045 using linear regression) than the single-peak model (for mixed fitting, slope = 0.76). Conclusion It is essential to use a multi-peak fat model for accurate quantification of fat with CSE-MRI. Further, fat quantification techniques using multi-peak fat models are comparable and no specific choice of spectral model is shown to be superior to the rest. PMID:25845713

  7. Quantification of Training and Competition Loads in Endurance Sports: Methods and Applications.

    PubMed

    Mujika, Iñigo

    2017-04-01

    Training quantification is basic to evaluate an endurance athlete's responses to training loads, ensure adequate stress/recovery balance, and determine the relationship between training and performance. Quantifying both external and internal workload is important, because external workload does not measure the biological stress imposed by the exercise sessions. Generally used quantification methods include retrospective questionnaires, diaries, direct observation, and physiological monitoring, often based on the measurement of oxygen uptake, heart rate, and blood lactate concentration. Other methods in use in endurance sports include speed measurement and the measurement of power output, made possible by recent technological advances such as power meters in cycling and triathlon. Among subjective methods of quantification, rating of perceived exertion stands out because of its wide use. Concurrent assessments of the various quantification methods allow researchers and practitioners to evaluate stress/recovery balance, adjust individual training programs, and determine the relationships between external load, internal load, and athletes' performance. This brief review summarizes the most relevant external- and internal-workload-quantification methods in endurance sports and provides practical examples of their implementation to adjust the training programs of elite athletes in accordance with their individualized stress/recovery balance.

  8. Quantification of the Keto-Hydroperoxide (HOOCH2OCHO) and Other Elusive Intermediates during Low-Temperature Oxidation of Dimethyl Ether.

    PubMed

    Moshammer, Kai; Jasper, Ahren W; Popolan-Vaida, Denisia M; Wang, Zhandong; Bhavani Shankar, Vijai Shankar; Ruwe, Lena; Taatjes, Craig A; Dagaut, Philippe; Hansen, Nils

    2016-10-04

    This work provides new temperature-dependent mole fractions of elusive intermediates relevant to the low-temperature oxidation of dimethyl ether (DME). It extends the previous study of Moshammer et al. [J. Phys. Chem. A 2015, 119, 7361-7374] in which a combination of a jet-stirred reactor and molecular beam mass spectrometry with single-photon ionization via tunable synchrotron-generated vacuum-ultraviolet radiation was used to identify (but not quantify) several highly oxygenated species. Here, temperature-dependent concentration profiles of 17 components were determined in the range of 450-1000 K and compared to up-to-date kinetic modeling results. Special emphasis is paid to the validation and application of a theoretical method for predicting photoionization cross sections that are hard to obtain experimentally but essential for turning mass spectral data into mole fraction profiles. The presented approach enabled the quantification of hydroperoxymethyl formate (HOOCH2OCHO), which is a key intermediate in the low-temperature oxidation of DME. The quantification of this keto-hydroperoxide together with the temperature-dependent concentration profiles of other intermediates, including H2O2, HCOOH, CH3OCHO, and CH3OOH, reveals new opportunities for the development of a next-generation DME combustion chemistry mechanism.

  9. Fully Automated Quantification of Cytomegalovirus (CMV) in Whole Blood with the New Sensitive Abbott RealTime CMV Assay in the Era of the CMV International Standard

    PubMed Central

    Schnepf, Nathalie; Scieux, Catherine; Resche-Riggon, Matthieu; Feghoul, Linda; Xhaard, Alienor; Gallien, Sébastien; Molina, Jean-Michel; Socié, Gérard; Viglietti, Denis; Simon, François; Mazeron, Marie-Christine

    2013-01-01

    Fully standardized reproducible and sensitive quantification assays for cytomegalovirus (CMV) are needed to better define thresholds for antiviral therapy initiation and interruption. We evaluated the newly released Abbott RealTime CMV assay for CMV quantification in whole blood (WB) that includes automated extraction and amplification (m2000 RealTime system). Sensitivity, accuracy, linearity, and intra- and interassay variability were validated in a WB matrix using Quality Control for Molecular Diagnostics (QCMD) panels and the WHO international standard (IS). The intra- and interassay coefficients of variation were 1.37% and 2.09% at 5 log10 copies/ml and 2.41% and 3.80% at 3 log10 copies/ml, respectively. According to expected values for the QCMD and Abbott RealTime CMV methods, the lower limits of quantification were 104 and <50 copies/ml, respectively. The conversion factor between international units and copies (2.18), determined from serial dilutions of the WHO IS in WB, was significantly different from the factor provided by the manufacturer (1.56) (P = 0.001). Results from 302 clinical samples were compared with those from the Qiagen artus CMV assay on the same m2000 RealTime system. The two assays provided highly concordant results (concordance correlation coefficient, 0.92), but the Abbott RealTime CMV assay detected and quantified, respectively, 20.6% and 47.8% more samples than the Qiagen/artus CMV assay. The sensitivity and reproducibility of the results, along with the automation, fulfilled the quality requirements for implementation of the Abbott RealTime CMV assay in clinical settings. Our results highlight the need for careful validation of conversion factors provided by the manufacturers for the WHO IS in WB to allow future comparison of results obtained with different assays. PMID:23616450

  10. Comparison of Calibration of Sensors Used for the Quantification of Nuclear Energy Rate Deposition

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Brun, J.; Reynard-Carette, C.; Tarchalski, M.

    paper will concern these two kinds of calorimetric sensors. It will focus in particular on studies on their out-of-pile calibrations. Firstly, the characteristics of the sensor designs will be detailed (such as geometry, dimension, material sample, assembly, instrumentation). Then the out-of-pile calibration methods will be described. Furthermore numerical results obtained thanks to 2D axisymmetrical thermal simulations (Finite Element Method, CAST3M) and experimental results will be presented for each sensor. A comparison of the two different thermal sensor behaviours will be realized. To conclude a discussion of the advantages and the drawbacks of each sensor will be performed especially regarding measurement methods. (authors)

  11. Quantification of Pulmonary Inflammatory Processes Using Chest Radiography: Tuberculosis as the Motivating Application

    PubMed Central

    Giacomini, Guilherme; Miranda, José R.A.; Pavan, Ana Luiza M.; Duarte, Sérgio B.; Ribeiro, Sérgio M.; Pereira, Paulo C.M.; Alves, Allan F.F.; de Oliveira, Marcela; Pina, Diana R.

    2015-01-01

    Abstract The purpose of this work was to develop a quantitative method for evaluating the pulmonary inflammatory process (PIP) through the computational analysis of chest radiography exams in posteroanterior (PA) and lateral views. The quantification procedure was applied to patients with tuberculosis (TB) as the motivating application. A study of high-resolution computed tomography (HRCT) examinations of patients with TB was developed to establish a relation between the inflammatory process and the signal difference-to-noise ratio (SDNR) measured in the PA projection. A phantom study was used to validate this relation, which was implemented using an algorithm that is able to estimate the volume of the inflammatory region based solely on SDNR values in the chest radiographs of patients. The PIP volumes quantified for 30 patients with TB were used for comparisons with direct HRCT analysis of the same patients. The Bland–Altman statistical analyses showed no significant differences between the 2 quantification methods. The linear regression line had a correlation coefficient of R2 = 0.97 and P < 0.001, showing a strong association between the volume determined by our evaluation method and the results obtained by direct HRCT scan analysis. Since the diagnosis and follow-up of patients with TB are commonly performed using X-ray exams, the method developed herein can be considered an adequate tool for quantifying the PIP with a lower patient radiation dose and lower institutional cost. Although we used patients with TB for the application of the method, it may be used for other pulmonary diseases characterized by a PIP. PMID:26131814

  12. Sensitive and selective liquid chromatography-tandem mass spectrometry method for the quantification of aniracetam in human plasma.

    PubMed

    Zhang, Jingjing; Liang, Jiabi; Tian, Yuan; Zhang, Zunjian; Chen, Yun

    2007-10-15

    A rapid, sensitive and selective LC-MS/MS method was developed and validated for the quantification of aniracetam in human plasma using estazolam as internal standard (IS). Following liquid-liquid extraction, the analytes were separated using a mobile phase of methanol-water (60:40, v/v) on a reverse phase C18 column and analyzed by a triple-quadrupole mass spectrometer in the selected reaction monitoring (SRM) mode using the respective [M+H]+ ions, m/z 220-->135 for aniracetam and m/z 295-->205 for the IS. The assay exhibited a linear dynamic range of 0.2-100 ng/mL for aniracetam in human plasma. The lower limit of quantification (LLOQ) was 0.2 ng/mL with a relative standard deviation of less than 15%. Acceptable precision and accuracy were obtained for concentrations over the standard curve range. The validated LC-MS/MS method has been successfully applied to study the pharmacokinetics of aniracetam in healthy male Chinese volunteers.

  13. Uncertainty Quantification of CFD Data Generated for a Model Scramjet Isolator Flowfield

    NASA Technical Reports Server (NTRS)

    Baurle, R. A.; Axdahl, E. L.

    2017-01-01

    Computational fluid dynamics is now considered to be an indispensable tool for the design and development of scramjet engine components. Unfortunately, the quantification of uncertainties is rarely addressed with anything other than sensitivity studies, so the degree of confidence associated with the numerical results remains exclusively with the subject matter expert that generated them. This practice must be replaced with a formal uncertainty quantification process for computational fluid dynamics to play an expanded role in the system design, development, and flight certification process. Given the limitations of current hypersonic ground test facilities, this expanded role is believed to be a requirement by some in the hypersonics community if scramjet engines are to be given serious consideration as a viable propulsion system. The present effort describes a simple, relatively low cost, nonintrusive approach to uncertainty quantification that includes the basic ingredients required to handle both aleatoric (random) and epistemic (lack of knowledge) sources of uncertainty. The nonintrusive nature of the approach allows the computational fluid dynamicist to perform the uncertainty quantification with the flow solver treated as a "black box". Moreover, a large fraction of the process can be automated, allowing the uncertainty assessment to be readily adapted into the engineering design and development workflow. In the present work, the approach is applied to a model scramjet isolator problem where the desire is to validate turbulence closure models in the presence of uncertainty. In this context, the relevant uncertainty sources are determined and accounted for to allow the analyst to delineate turbulence model-form errors from other sources of uncertainty associated with the simulation of the facility flow.

  14. Multiplex electrochemical DNA platform for femtomolar-level quantification of genetically modified soybean.

    PubMed

    Manzanares-Palenzuela, C Lorena; de-Los-Santos-Álvarez, Noemí; Lobo-Castañón, María Jesús; López-Ruiz, Beatriz

    2015-06-15

    Current EU regulations on the mandatory labeling of genetically modified organisms (GMOs) with a minimum content of 0.9% would benefit from the availability of reliable and rapid methods to detect and quantify DNA sequences specific for GMOs. Different genosensors have been developed to this aim, mainly intended for GMO screening. A remaining challenge, however, is the development of genosensing platforms for GMO quantification, which should be expressed as the number of event-specific DNA sequences per taxon-specific sequences. Here we report a simple and sensitive multiplexed electrochemical approach for the quantification of Roundup-Ready Soybean (RRS). Two DNA sequences, taxon (lectin) and event-specific (RR), are targeted via hybridization onto magnetic beads. Both sequences are simultaneously detected by performing the immobilization, hybridization and labeling steps in a single tube and parallel electrochemical readout. Hybridization is performed in a sandwich format using signaling probes labeled with fluorescein isothiocyanate (FITC) or digoxigenin (Dig), followed by dual enzymatic labeling using Fab fragments of anti-Dig and anti-FITC conjugated to peroxidase or alkaline phosphatase, respectively. Electrochemical measurement of the enzyme activity is finally performed on screen-printed carbon electrodes. The assay gave a linear range of 2-250 pM for both targets, with LOD values of 650 fM (160 amol) and 190 fM (50 amol) for the event-specific and the taxon-specific targets, respectively. Results indicate that the method could be applied for GMO quantification below the European labeling threshold level (0.9%), offering a general approach for the rapid quantification of specific GMO events in foods. Copyright © 2015 Elsevier B.V. All rights reserved.
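    Once the event-specific and taxon-specific copy numbers have been back-calculated from their respective calibration curves, the GMO content reduces to a simple ratio; the sketch below checks the result against the 0.9% EU labelling threshold using invented copy numbers.

```python
# Hypothetical copy numbers back-calculated from the two calibration curves for one sample.
rr_copies = 1.6e4        # Roundup-Ready event-specific sequences
lectin_copies = 2.0e6    # soybean taxon-specific (lectin) sequences

gmo_percent = 100 * rr_copies / lectin_copies   # GMO content as event/taxon ratio
labelling_required = gmo_percent > 0.9          # EU mandatory-labelling threshold
print(f"GMO content: {gmo_percent:.2f}% -> labelling required: {labelling_required}")
```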

  15. Laser-induced plasma characterization through self-absorption quantification

    NASA Astrophysics Data System (ADS)

    Hou, JiaJia; Zhang, Lei; Zhao, Yang; Yan, Xingyu; Ma, Weiguang; Dong, Lei; Yin, Wangbao; Xiao, Liantuan; Jia, Suotang

    2018-07-01

    A self-absorption quantification method is proposed to quantify the self-absorption degree of spectral lines, in which plasma characteristics including electron temperature, elemental concentration ratio, and absolute species number density can be deduced directly. Since there is no spectral intensity involved in the calculation, the analysis results are independent of the self-absorption effects and the additional spectral efficiency calibration is not required. In order to evaluate the practicality, the limitation for application and the precision of this method are also discussed. Experimental results of aluminum-lithium alloy prove that the proposed method is qualified to realize semi-quantitative measurements and fast plasma characteristics diagnostics.

  16. The use of self-quantification systems for personal health information: big data management activities and prospects.

    PubMed

    Almalki, Manal; Gray, Kathleen; Sanchez, Fernando Martin

    2015-01-01

    Self-quantification is seen as an emerging paradigm for health care self-management. Self-quantification systems (SQS) can be used for tracking, monitoring, and quantifying health aspects including mental, emotional, physical, and social aspects in order to gain self-knowledge. However, there has been a lack of a systematic approach for conceptualising and mapping the essential activities that are undertaken by individuals who are using SQS in order to improve health outcomes. In this paper, we propose a new model of personal health information self-quantification systems (PHI-SQS). PHI-SQS model describes two types of activities that individuals go through during their journey of health self-managed practice, which are 'self-quantification' and 'self-activation'. In this paper, we aimed to examine thoroughly the first type of activity in PHI-SQS which is 'self-quantification'. Our objectives were to review the data management processes currently supported in a representative set of self-quantification tools and ancillary applications, and provide a systematic approach for conceptualising and mapping these processes with the individuals' activities. We reviewed and compared eleven self-quantification tools and applications (Zeo Sleep Manager, Fitbit, Actipressure, MoodPanda, iBGStar, Sensaris Senspod, 23andMe, uBiome, Digifit, BodyTrack, and Wikilife), that collect three key health data types (Environmental exposure, Physiological patterns, Genetic traits). We investigated the interaction taking place at different data flow stages between the individual user and the self-quantification technology used. We found that these eleven self-quantification tools and applications represent two major tool types (primary and secondary self-quantification systems). In each type, the individuals experience different processes and activities which are substantially influenced by the technologies' data management capabilities. Self-quantification in personal health maintenance

  17. Clinical applications of MS-based protein quantification.

    PubMed

    Sabbagh, Bassel; Mindt, Sonani; Neumaier, Michael; Findeisen, Peter

    2016-04-01

    Mass spectrometry-based assays are increasingly important in clinical laboratory medicine and nowadays are already commonly used in several areas of routine diagnostics. These include therapeutic drug monitoring, toxicology, endocrinology, pediatrics, and microbiology. Accordingly, some of the most common analyses are therapeutic drug monitoring of immunosuppressants, vitamin D, steroids, newborn screening, and bacterial identification. However, MS-based quantification of peptides and proteins for routine diagnostic use is rather rare up to now despite excellent analytical specificity and good sensitivity. Here, we want to give an overview over current fit-for-purpose assays for MS-based protein quantification. Advantages as well as challenges of this approach will be discussed with focus on feasibility for routine diagnostic use. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  18. Quantification and Formalization of Security

    DTIC Science & Technology

    2010-02-01

    [Only fragments of this report's table of contents and abstract were recovered.] Listed sections include "Quantification of Information Flow" and "2.4 Language Semantics". The recovered text concerns a policy restricting the system behavior observed by users holding low clearances; this policy, or a variant of it, is enforced by many programming-language-based mechanisms. The report illustrates the model with a particular programming language (while-programs plus probabilistic choice); the model is extended in §2.5 to programs in which

  19. Two-stream Convolutional Neural Network for Methane Emissions Quantification

    NASA Astrophysics Data System (ADS)

    Wang, J.; Ravikumar, A. P.; McGuire, M.; Bell, C.; Tchapmi, L. P.; Brandt, A. R.

    2017-12-01

    Methane, a key component of natural gas, has a 25x higher global warming potential than carbon dioxide on a 100-year basis. Accurately monitoring and mitigating methane emissions requires cost-effective detection and quantification technologies. Optical gas imaging, one of the most commonly used leak detection technologies and one adopted by the Environmental Protection Agency, cannot estimate leak sizes. In this work, we harness advances in computer science to allow for rapid and automatic leak quantification. In particular, we utilize two-stream deep Convolutional Networks (ConvNets) to estimate leak size by capturing complementary spatial information from still plume frames and temporal information from plume motion between frames. We built large leak datasets for training and evaluation by collecting about 20 videos (i.e., 397,400 frames) of leaks. The videos were recorded at six distances from the source, covering 10-60 ft. Leak sources included natural gas well-heads, separators, and tanks. All frames were labeled with a true leak size, which has eight levels ranging from 0 to 140 MCFH. Preliminary analysis shows that two-stream ConvNets provide a significant accuracy advantage over single-stream ConvNets. The spatial-stream ConvNet achieves an accuracy of 65.2% by extracting important features, including texture, plume area, and pattern. The temporal stream, fed by the results of optical flow analysis, results in an accuracy of 58.3%. The integration of the two streams gives a combined accuracy of 77.6%. For future work, we will split the training and testing datasets in distinct ways in order to test the generalization of the algorithm for different leak sources. Several analytic metrics, including confusion matrices and visualization of key features, will be used to understand accuracy rates and occurrences of false positives. The quantification algorithm can help find and fix super-emitters and improve the cost-effectiveness of leak detection and repair.
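    A minimal PyTorch sketch of a two-stream architecture of the kind described: one branch sees a still frame, the other a stack of optical-flow fields, and their class scores are averaged (late fusion). The layer sizes, the number of stacked flow fields and the eight leak-size classes are placeholders, not the authors' actual network.

```python
import torch
import torch.nn as nn

class StreamCNN(nn.Module):
    """Tiny convolutional branch; in_ch = 3 for a still frame, 2*K for stacked optical flow."""
    def __init__(self, in_ch, n_classes=8):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(in_ch, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
        )
        self.classifier = nn.Linear(32, n_classes)

    def forward(self, x):
        return self.classifier(self.features(x).flatten(1))

class TwoStream(nn.Module):
    """Late fusion: average the class scores of the spatial and temporal streams."""
    def __init__(self, n_classes=8, flow_stack=10):
        super().__init__()
        self.spatial = StreamCNN(3, n_classes)
        self.temporal = StreamCNN(2 * flow_stack, n_classes)

    def forward(self, frame, flow):
        return (self.spatial(frame) + self.temporal(flow)) / 2

model = TwoStream()
frame = torch.randn(4, 3, 112, 112)    # batch of still plume frames
flow = torch.randn(4, 20, 112, 112)    # 10 stacked (dx, dy) optical-flow fields
print(model(frame, flow).shape)        # -> torch.Size([4, 8]) leak-size class scores
```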

  20. VAVUQ, Python and Matlab freeware for Verification and Validation, Uncertainty Quantification

    NASA Astrophysics Data System (ADS)

    Courtney, J. E.; Zamani, K.; Bombardelli, F. A.; Fleenor, W. E.

    2015-12-01

    A package of scripts is presented for automated Verification and Validation (V&V) and Uncertainty Quantification (UQ) for engineering codes that approximate partial differential equations (PDEs). The code post-processes model results to produce V&V and UQ information. This information can be used to assess model performance. Automated information on code performance can allow for a systematic methodology to assess the quality of model approximations. The software implements common and accepted code verification schemes. The software uses the Method of Manufactured Solutions (MMS), the Method of Exact Solution (MES), Cross-Code Verification, and Richardson Extrapolation (RE) for solution (calculation) verification. It also includes common statistical measures that can be used for model skill assessment. Complete RE can be conducted for complex geometries by implementing high-order non-oscillating numerical interpolation schemes within the software. Model approximation uncertainty is quantified by calculating lower and upper bounds of numerical error from the RE results. The software is also able to calculate the Grid Convergence Index (GCI), and to handle adaptive meshes and models that implement mixed order schemes. Four examples are provided to demonstrate the use of the software for code and solution verification, model validation and uncertainty quantification. The software is used for code verification of a mixed-order compact difference heat transport solver; the solution verification of a 2D shallow-water-wave solver for tidal flow modeling in estuaries; the model validation of a two-phase flow computation in a hydraulic jump compared to experimental data; and numerical uncertainty quantification for 3D CFD modeling of the flow patterns in a Gust erosion chamber.
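    The Richardson-extrapolation-based solution verification mentioned above typically reports an observed order of accuracy and a Grid Convergence Index; a generic textbook-style sketch follows (not VAVUQ's own code), with a safety factor of 1.25 and three invented grid solutions.

```python
import math

def observed_order(f_coarse, f_medium, f_fine, r):
    """Observed order of accuracy p from three solutions on grids refined by factor r."""
    return math.log(abs((f_coarse - f_medium) / (f_medium - f_fine))) / math.log(r)

def gci_fine(f_medium, f_fine, r, p, fs=1.25):
    """Grid Convergence Index (%) on the fine grid, with safety factor fs."""
    e = abs((f_fine - f_medium) / f_fine)       # relative change between the two finest grids
    return 100 * fs * e / (r ** p - 1)

# Hypothetical values of one scalar result on coarse, medium and fine grids (r = 2).
f3, f2, f1 = 0.970, 0.988, 0.9935
p = observed_order(f3, f2, f1, r=2.0)
print(f"observed order p = {p:.2f}, GCI_fine = {gci_fine(f2, f1, 2.0, p):.2f}%")
```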

  1. A validated Fourier transform infrared spectroscopy method for quantification of total lactones in Inula racemosa and Andrographis paniculata.

    PubMed

    Shivali, Garg; Praful, Lahorkar; Vijay, Gadgil

    2012-01-01

    Fourier transform infrared (FT-IR) spectroscopy is a technique widely used for detection and quantification of various chemical moieties. This paper describes the use of the FT-IR spectroscopy technique for the quantification of total lactones present in Inula racemosa and Andrographis paniculata. To validate the FT-IR spectroscopy method for quantification of total lactones in I. racemosa and A. paniculata. Dried and powdered I. racemosa roots and A. paniculata plant were extracted with ethanol and dried to remove ethanol completely. The ethanol extract was analysed in a KBr pellet by FT-IR spectroscopy. The FT-IR spectroscopy method was validated and compared with a known spectrophotometric method for quantification of lactones in A. paniculata. By FT-IR spectroscopy, the amount of total lactones was found to be 2.12 ± 0.47% (n = 3) in I. racemosa and 8.65 ± 0.51% (n = 3) in A. paniculata. The method showed comparable results with a known spectrophotometric method used for quantification of such lactones: 8.42 ± 0.36% (n = 3) in A. paniculata. Limits of detection and quantification for isoalantolactone were 1 µg and 10 µg respectively; for andrographolide they were 1.5 µg and 15 µg respectively. Recoveries were over 98%, with good intra- and interday repeatability: RSD ≤ 2%. The FT-IR spectroscopy method proved linear, accurate, precise and specific, with low limits of detection and quantification, for estimation of total lactones, and is less tedious than the UV spectrophotometric method for the compounds tested. This validated FT-IR spectroscopy method is readily applicable for the quality control of I. racemosa and A. paniculata. Copyright © 2011 John Wiley & Sons, Ltd.

  2. Chaotic behavior in Malaysian stock market: A study with recurrence quantification analysis

    NASA Astrophysics Data System (ADS)

    Niu, Betty Voon Wan; Noorani, Mohd Salmi Md; Jaaman, Saiful Hafizah

    2016-11-01

    The dynamics of the stock market have been questioned for decades. Its behavior appears random, yet some have found it to be chaotic. Up to 5000 daily adjusted closing prices of the FTSE Bursa Malaysia Kuala Lumpur Composite Index (KLSE) were investigated through recurrence plots and recurrence quantification analysis. The results were compared against stochastic, chaotic and deterministic reference systems. The results show that the KLSE daily adjusted closing data behave chaotically.
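    A compact sketch of the machinery behind recurrence quantification analysis: the series is time-delay embedded, a thresholded distance matrix gives the recurrence matrix, and the recurrence rate is read from it. The embedding parameters, threshold and the two toy series are illustrative, not the settings used in the study.

```python
import numpy as np

def recurrence_matrix(x, dim=3, delay=1, eps=0.1):
    """Binary recurrence matrix of a time-delay embedded scalar series."""
    x = np.asarray(x, float)
    n = len(x) - (dim - 1) * delay
    emb = np.column_stack([x[i * delay: i * delay + n] for i in range(dim)])
    dist = np.linalg.norm(emb[:, None, :] - emb[None, :, :], axis=-1)
    return (dist < eps).astype(int)

def recurrence_rate(R):
    """Fraction of recurrent points, excluding the trivial main diagonal."""
    n = len(R)
    return (R.sum() - n) / (n * n - n)

rng = np.random.default_rng(1)
noise = rng.normal(0, 1, 500)              # stochastic reference series
sine = np.sin(np.linspace(0, 60, 500))     # deterministic reference series
for name, series in [("noise", noise), ("sine", sine)]:
    R = recurrence_matrix((series - series.mean()) / series.std())
    print(name, round(recurrence_rate(R), 3))
```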

  3. Relative quantification of N(epsilon)-(Carboxymethyl)lysine, imidazolone A, and the Amadori product in glycated lysozyme by MALDI-TOF mass spectrometry.

    PubMed

    Kislinger, Thomas; Humeny, Andreas; Peich, Carlo C; Zhang, Xiaohong; Niwa, Toshimitsu; Pischetsrieder, Monika; Becker, Cord-Michael

    2003-01-01

    The nonenzymatic glycation of proteins by reducing sugars, also known as the Maillard reaction, has received increasing recognition from nutritional science and medical research. In this study, we applied matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOF-MS) to perform relative and simultaneous quantification of the Amadori product, which is an early glycation product, and of N(epsilon)-(carboxymethyl)lysine and imidazolone A, two important advanced glycation end products. Therefore, native lysozyme was incubated with d-glucose for increasing periods of time (1, 4, 8, and 16 weeks) in phosphate-buffered saline pH 7.8 at 50 degrees C. After enzymatic digestion with endoproteinase Glu-C, the N-terminal peptide fragment (m/z 838; amino acid sequence KVFGRCE) and the C-terminal peptide fragment (m/z 1202; amino acid sequence VQAWIRGCRL) were used for relative quantification of the three Maillard products. The Amadori product, N(epsilon)-(carboxymethyl)lysine, and imidazolone A were the main glycation products formed under these conditions. Their formation was dependent on glucose concentration and reaction time. The kinetics were similar to those obtained by competitive ELISA, an established method for quantification of N(epsilon)-(carboxymethyl)lysine and imidazolone A. Inhibition experiments showed that coincubation with N(alpha)-acetylarginine suppressed the formation of imidazolone A but not of the Amadori product or N(epsilon)-(carboxymethyl)lysine. The presence of N(alpha)-acetyllysine resulted in the inhibition of lysine modifications but in higher concentrations of imidazolone A. o-Phenylenediamine decreased the yield of the Amadori product and completely inhibited the formation of N(epsilon)-(carboxymethyl)lysine and imidazolone A. MALDI-TOF-MS proved to be a new analytical tool for the simultaneous, relative quantification of specific products of the Maillard reaction. For the first time, kinetic data of defined products on

  4. Fully automated system for the quantification of human osteoarthritic knee joint effusion volume using magnetic resonance imaging

    PubMed Central

    2010-01-01

    Introduction Joint effusion is frequently associated with osteoarthritis (OA) flare-up and is an important marker of therapeutic response. This study aimed at developing and validating a fully automated system based on magnetic resonance imaging (MRI) for the quantification of joint effusion volume in knee OA patients. Methods MRI examinations consisted of two axial sequences: a T2-weighted true fast imaging with steady-state precession and a T1-weighted gradient echo. An automated joint effusion volume quantification system using MRI was developed and validated (a) with calibrated phantoms (cylinder and sphere) and effusion from knee OA patients; (b) with assessment by manual quantification; and (c) by direct aspiration. Twenty-five knee OA patients with joint effusion were included in the study. Results The automated joint effusion volume quantification was developed as a four stage sequencing process: bone segmentation, filtering of unrelated structures, segmentation of joint effusion, and subvoxel volume calculation. Validation experiments revealed excellent coefficients of variation with the calibrated cylinder (1.4%) and sphere (0.8%) phantoms. Comparison of the OA knee joint effusion volume assessed by the developed automated system and by manual quantification was also excellent (r = 0.98; P < 0.0001), as was the comparison with direct aspiration (r = 0.88; P = 0.0008). Conclusions The newly developed fully automated MRI-based system provided precise quantification of OA knee joint effusion volume with excellent correlation with data from phantoms, a manual system, and joint aspiration. Such an automated system will be instrumental in improving the reproducibility/reliability of the evaluation of this marker in clinical application. PMID:20846392

  5. Uncertainty Quantification in Geomagnetic Field Modeling

    NASA Astrophysics Data System (ADS)

    Chulliat, A.; Nair, M. C.; Alken, P.; Meyer, B.; Saltus, R.; Woods, A.

    2017-12-01

    Geomagnetic field models are mathematical descriptions of the various sources of the Earth's magnetic field, and are generally obtained by solving an inverse problem. They are widely used in research to separate and characterize field sources, but also in many practical applications such as aircraft and ship navigation, smartphone orientation, satellite attitude control, and directional drilling. In recent years, more sophisticated models have been developed, thanks to the continuous availability of high quality satellite data and to progress in modeling techniques. Uncertainty quantification has become an integral part of model development, both to assess the progress made and to address specific users' needs. Here we report on recent advances made by our group in quantifying the uncertainty of geomagnetic field models. We first focus on NOAA's World Magnetic Model (WMM) and the International Geomagnetic Reference Field (IGRF), two reference models of the main (core) magnetic field produced every five years. We describe the methods used in quantifying the model commission error as well as the omission error attributed to various un-modeled sources such as magnetized rocks in the crust and electric current systems in the atmosphere and near-Earth environment. A simple error model was derived from this analysis, to facilitate usage in practical applications. We next report on improvements brought by combining a main field model with a high resolution crustal field model and a time-varying, real-time external field model, like in NOAA's High Definition Geomagnetic Model (HDGM). The obtained uncertainties are used by the directional drilling industry to mitigate health, safety and environment risks.

  6. Quantification of Wilms' tumor 1 mRNA by digital polymerase chain reaction.

    PubMed

    Koizumi, Yuki; Furuya, Daisuke; Endo, Teruo; Asanuma, Kouichi; Yanagihara, Nozomi; Takahashi, Satoshi

    2018-02-01

    Wilms' tumor 1 (WT1) is overexpressed in various hematopoietic tumors and widely used as a marker of minimal residual disease. WT1 mRNA has been analyzed using quantitative real-time polymerase chain reaction (real-time PCR). In the present study, we analyzed 40 peripheral blood and bone marrow samples obtained from cases of acute myeloid leukemia, acute lymphoblastic leukemia, and myelodysplastic syndrome at Sapporo Medical University Hospital from April 2012 to January 2015. Quantification of WT1 was performed using the QuantStudio 3D Digital PCR System (Thermo Fisher Scientific), and the results were compared between digital PCR and real-time PCR technology. The correlation between digital PCR and real-time PCR was very strong (R = 0.99), and the detection limits of the two methods were equivalent. Digital PCR was able to accurately detect lower WT1 levels compared with real-time PCR. Digital PCR technology can thus be utilized to predict WT1/ABL1 expression levels accurately and should be useful for diagnosis or the evaluation of drug efficacy in patients with leukemia.
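
    In chip-based digital PCR of this kind, the target concentration is typically recovered from the fraction of positive partitions with a Poisson correction rather than read directly. The sketch below illustrates that calculation; the partition volume and the counts are illustrative assumptions, not values from the study.

```python
import math

def dpcr_copies(positive: int, total: int, partition_volume_nl: float = 0.755) -> float:
    """Estimate target copies per microliter from digital PCR partition counts.

    Uses the standard Poisson correction: the mean copies per partition is
    lambda = -ln(1 - p), where p is the fraction of positive partitions.
    The partition volume (0.755 nL) is a hypothetical chip parameter.
    """
    p = positive / total
    if p >= 1.0:
        raise ValueError("All partitions positive: concentration above quantifiable range")
    lam = -math.log(1.0 - p)                             # mean copies per partition
    copies_per_ul = lam / (partition_volume_nl * 1e-3)   # convert nL to uL
    return copies_per_ul

# Example: 4,500 positive partitions out of 18,000
print(f"{dpcr_copies(4500, 18000):.0f} copies/uL")
```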

  7. Application of Stochastic Labeling with Random-Sequence Barcodes for Simultaneous Quantification and Sequencing of Environmental 16S rRNA Genes.

    PubMed

    Hoshino, Tatsuhiko; Inagaki, Fumio

    2017-01-01

    Next-generation sequencing (NGS) is a powerful tool for analyzing environmental DNA and provides a comprehensive molecular view of microbial communities. To obtain the copy number of particular sequences in an NGS library, however, additional quantitative analysis such as quantitative PCR (qPCR) or digital PCR (dPCR) is required. Furthermore, the number of sequences in a library does not always reflect the original copy number of a target gene because of biases caused by PCR amplification, making it difficult to convert the proportion of particular sequences in the NGS library to a copy number using the mass of input DNA. To address this issue, we applied a stochastic labeling approach with random-tag sequences and developed an NGS-based quantification protocol, which enables simultaneous sequencing and quantification of the targeted DNA. This quantitative sequencing (qSeq) is initiated from single-primer extension (SPE) using a primer with a random tag adjacent to the 5' end of the target-specific sequence. During SPE, each DNA molecule is stochastically labeled with the random tag. Subsequently, first-round PCR is conducted, specifically targeting the SPE product, followed by second-round PCR to index the library for NGS. The number of random tags is determined only during the SPE step and is therefore not affected by the two rounds of PCR that may introduce amplification biases. In the case of 16S rRNA genes, after NGS sequencing and taxonomic classification, the absolute copy number of a target phylotype's 16S rRNA gene can be estimated by Poisson statistics from the number of distinct random tags incorporated at the end of the sequences. To test the feasibility of this approach, the 16S rRNA gene of Sulfolobus tokodaii was subjected to qSeq, which resulted in accurate quantification of 5.0 × 10(3) to 5.0 × 10(4) copies of the 16S rRNA gene. Furthermore, qSeq was applied to mock microbial communities and environmental samples, and the results were comparable to those obtained using digital PCR and
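
    Because random tags can collide when many molecules draw from a finite tag space, the observed number of distinct tags underestimates the number of labeled molecules; the Poisson correction mentioned above inverts that relationship. A minimal sketch, assuming an 8-base random tag (the tag length and the observed count are assumptions, not values from the study):

```python
import math

def qseq_copy_estimate(distinct_tags_observed: int, tag_length: int = 8) -> float:
    """Estimate the number of template molecules from the count of distinct
    random tags observed after sequencing (a sketch of the Poisson correction
    used in stochastic labeling; an 8-base random tag is an assumption).

    With T possible tags and m molecules each labeled uniformly at random,
    the expected number of distinct tags is T * (1 - exp(-m / T)); inverting
    gives m = -T * ln(1 - k / T) for k distinct observed tags.
    """
    T = 4 ** tag_length                  # size of the random-tag space
    k = distinct_tags_observed
    if k >= T:
        raise ValueError("Tag space saturated; use a longer random tag")
    return -T * math.log(1.0 - k / T)

print(f"Estimated template molecules: {qseq_copy_estimate(48000):.0f}")
```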

  8. Detection and quantification of proteins and cells by use of elemental mass spectrometry: progress and challenges.

    PubMed

    Yan, Xiaowen; Yang, Limin; Wang, Qiuquan

    2013-07-01

    Much progress has been made in identification of the proteins in proteomes, and quantification of these proteins has attracted much interest. In addition to popular tandem mass spectrometric methods based on soft ionization, inductively coupled plasma mass spectrometry (ICPMS), a typical example of mass spectrometry based on hard ionization, usually used for analysis of elements, has unique advantages in absolute quantification of proteins by determination of an element with a definite stoichiometry in a protein or attached to the protein. In this Trends article, we briefly describe state-of-the-art ICPMS-based methods for quantification of proteins, emphasizing protein-labeling and element-tagging strategies developed on the basis of chemically selective reactions and/or biospecific interactions. Recent progress from protein to cell quantification by use of ICPMS is also discussed, and the possibilities and challenges of ICPMS-based protein quantification for universal, selective, or targeted quantification of proteins and cells in a biological sample are critically assessed. We believe ICPMS-based protein quantification will become ever more important in targeted quantitative proteomics and bioanalysis in the near future.

  9. Recent advances in stable isotope labeling based techniques for proteome relative quantification.

    PubMed

    Zhou, Yuan; Shan, Yichu; Zhang, Lihua; Zhang, Yukui

    2014-10-24

    The large-scale relative quantification of all proteins expressed in biological samples in different states is of great importance for discovering proteins with important biological functions, as well as for screening disease-related biomarkers and drug targets. Therefore, the accurate quantification of proteins at the proteome level has become one of the key issues in protein science. Herein, recent advances in stable isotope labeling based techniques for proteome relative quantification are reviewed, from the aspects of metabolic labeling, chemical labeling, and enzyme-catalyzed labeling. Furthermore, future research directions in this field are discussed. Copyright © 2014 Elsevier B.V. All rights reserved.

  10. Epicocconone, a sensitive and specific fluorescent dye for in situ quantification of extracellular proteins within bacterial biofilms.

    PubMed

    Randrianjatovo, I; Girbal-Neuhauser, E; Marcato-Romain, C-E

    2015-06-01

    Biofilms are ecosystems of closely associated bacteria encapsulated in an extracellular matrix mainly composed of polysaccharides and proteins. A novel approach was developed for in situ quantification of extracellular proteins (ePN) in various bacterial biofilms using epicocconone, a natural, fluorescent compound that binds amine residues of proteins. Six commercial proteins were tested for their reaction with epicocconone, and bovine serum albumin (BSA) was selected for assay optimization. The optimized protocol, performed as a microassay, allowed protein amounts as low as 0.7 μg and as high as 50 μg per well to be detected. Addition of monosaccharides or polysaccharides (glucose, dextran or alginate) to the standard BSA solutions (0 to 250 μg ml(-1)) showed little or no sugar interference up to 2000 μg ml(-1), thus providing an assessment of the specificity of epicocconone for proteins. The optimized protocol was then applied to three different biofilms, and in situ quantification of ePN showed contrasting protein amounts of 22.1 ± 3.1, 38.3 ± 7.1 and 0.3 ± 0.1 μg equivalent BSA of proteins for 48-h biofilms of Pseudomonas aeruginosa, Bacillus licheniformis and Weissella confusa, respectively. Possible interference due to global matrix compounds on the in situ quantification of proteins was also investigated by applying the standard addition method (SAM). Low error percentages were obtained, indicating a correct quantification of both the ePN and the added proteins. For the first time, a specific and sensitive assay has been developed for in situ determination of ePN produced by bacterial cells. This advance should lead to an accurate, rapid tool for further protein labelling and microscopic observation of the extracellular matrix of biofilms.

  11. Quantification of DNA using the luminescent oxygen channeling assay.

    PubMed

    Patel, R; Pollner, R; de Keczer, S; Pease, J; Pirio, M; DeChene, N; Dafforn, A; Rose, S

    2000-09-01

    Simplified and cost-effective methods for the detection and quantification of nucleic acid targets are still a challenge in molecular diagnostics. Luminescent oxygen channeling assay (LOCI(TM)) latex particles can be conjugated to synthetic oligodeoxynucleotides and hybridized, via linking probes, to different DNA targets. These oligomer-conjugated LOCI particles survive thermocycling in a PCR reaction and allow quantified detection of DNA targets in both real-time and endpoint formats. The endpoint DNA quantification format utilized two sensitizer bead types that are sensitive to separate illumination wavelengths. These two bead types were uniquely annealed to target or control amplicons, and separate illuminations generated time-resolved chemiluminescence, which distinguished the two amplicon types. In the endpoint method, ratios of the two signals allowed determination of the target DNA concentration over a three-log range. The real-time format allowed quantification of the DNA target over a six-log range with a linear relationship between threshold cycle and log of the number of DNA targets. This is the first report of the use of an oligomer-labeled latex particle assay capable of producing DNA quantification and sequence-specific chemiluminescent signals in a homogeneous format. It is also the first report of the generation of two signals from a LOCI assay. The methods described here have been shown to be easily adaptable to new DNA targets because of the generic nature of the oligomer-labeled LOCI particles.

  12. Combining image-derived and venous input functions enables quantification of serotonin-1A receptors with [carbonyl-11C]WAY-100635 independent of arterial sampling.

    PubMed

    Hahn, Andreas; Nics, Lukas; Baldinger, Pia; Ungersböck, Johanna; Dolliner, Peter; Frey, Richard; Birkfellner, Wolfgang; Mitterhauser, Markus; Wadsak, Wolfgang; Karanikas, Georgios; Kasper, Siegfried; Lanzenberger, Rupert

    2012-08-01

    Image-derived input functions (IDIFs) represent a promising technique for a simpler and less invasive quantification of PET studies as compared to arterial cannulation. However, a number of limitations complicate the routine use of IDIFs in clinical research protocols, and the full substitution of manual arterial samples by venous ones has hardly been evaluated. This study aims at a direct validation of IDIFs and venous data for the quantification of serotonin-1A receptor binding (5-HT(1A)) with [carbonyl-(11)C]WAY-100635 before and after hormone treatment. Fifteen PET measurements with arterial and venous blood sampling were obtained from 10 healthy women, 8 scans before and 7 after eight weeks of hormone replacement therapy. Image-derived input functions were derived automatically from cerebral blood vessels, corrected for partial volume effects and combined with venous manual samples from 10 min onward (IDIF+VIF). Corrections for plasma/whole-blood ratio and metabolites were done separately with arterial and venous samples. 5-HT(1A) receptor quantification was achieved with arterial input functions (AIF) and IDIF+VIF using a two-tissue compartment model. Comparison between arterial and venous manual blood samples yielded excellent reproducibility. Variability (VAR) was less than 10% for whole-blood activity (p>0.4) and below 2% for plasma to whole-blood ratios (p>0.4). Variability was slightly higher for parent fractions (VARmax=24% at 5 min, p<0.05 and VAR<13% after 20 min, p>0.1) but still within previously reported values. IDIFs after partial volume correction had peak values comparable to AIFs (mean difference Δ=-7.6 ± 16.9 kBq/ml, p>0.1), whereas AIFs exhibited a delay (Δ=4 ± 6.4s, p<0.05) and higher peak width (Δ=15.9 ± 5.2s, p<0.001). Linear regression analysis showed strong agreement for 5-HT(1A) binding as obtained with AIF and IDIF+VIF at baseline (R(2)=0.95), after treatment (R(2)=0.93) and when pooling all scans (R(2)=0.93), with slopes and

  13. Correlation between radio-induced lymphocyte apoptosis measurements obtained from two French centres.

    PubMed

    Mirjolet, C; Merlin, J L; Dalban, C; Maingon, P; Azria, D

    2016-07-01

    In the era of modern treatment delivery, increasing the dose delivered to the target to improve local control might be modulated by the patient's intrinsic radio-sensitivity. A predictive assay based on radio-induced lymphocyte apoptosis quantification highlighted the significant correlation between CD4 and CD8 T-lymphocyte apoptosis and grade 2 or 3 radiation-induced late toxicities. Because this assay is conducted at several technical platforms, the aim of this study was to demonstrate that radio-induced lymphocyte apoptosis values obtained from two different platforms were comparable. For 25 patients included in the PARATOXOR trial running in Dijon, the radio-induced lymphocyte apoptosis results obtained from the laboratory of Montpellier (IRCM, Inserm U1194, France), considered as the reference (referred to as Lab 1), were compared with those from the laboratory located at the Institut de cancérologie de Lorraine (ICL, France), referred to as Lab 2. Different statistical methods were used to measure the agreement between the radio-induced lymphocyte apoptosis data from the two laboratories (quantitative data). The Bland-Altman plot was used to identify potential bias. All statistical tests demonstrated good agreement between radio-induced lymphocyte apoptosis values obtained from both sites, and no major bias was identified. Since radio-induced lymphocyte apoptosis values, which predict tolerance to radiotherapy, could be assessed by two laboratories and showed a high level of robustness and consistency, we suggest that this assay could be extended to any laboratory that uses the same technique. Copyright © 2016 Société française de radiothérapie oncologique (SFRO). Published by Elsevier SAS. All rights reserved.
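
    The Bland-Altman analysis mentioned above summarizes between-laboratory agreement as a bias (mean of the paired differences) and 95% limits of agreement. A minimal sketch of that computation follows; the values are illustrative, not data from the trial.

```python
import numpy as np

def bland_altman(lab1: np.ndarray, lab2: np.ndarray):
    """Bland-Altman agreement statistics for paired apoptosis measurements.

    Returns the bias (mean difference) and the 95% limits of agreement
    (bias +/- 1.96 * SD of the differences).
    """
    diff = lab1 - lab2
    bias = diff.mean()
    sd = diff.std(ddof=1)
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

lab1 = np.array([12.1, 18.4, 25.0, 9.8, 14.3])   # % apoptotic CD8 T-lymphocytes, Lab 1 (illustrative)
lab2 = np.array([11.5, 19.0, 24.1, 10.2, 13.7])  # same patients measured at Lab 2 (illustrative)
bias, lo, hi = bland_altman(lab1, lab2)
print(f"bias = {bias:.2f}, limits of agreement = [{lo:.2f}, {hi:.2f}]")
```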

  14. Quantification and Segmentation of Brain Tissues from MR Images: A Probabilistic Neural Network Approach

    PubMed Central

    Wang, Yue; Adalý, Tülay; Kung, Sun-Yuan; Szabo, Zsolt

    2007-01-01

    This paper presents a probabilistic neural network-based technique for unsupervised quantification and segmentation of brain tissues from magnetic resonance images. It is shown that this problem can be solved by distribution learning and relaxation labeling, resulting in an efficient method that may be particularly useful in quantifying and segmenting abnormal brain tissues, where the number of tissue types is unknown and the distributions of tissue types heavily overlap. The new technique uses suitable statistical models for both the pixel and context images and formulates the problem in terms of model-histogram fitting and global consistency labeling. The quantification is achieved by probabilistic self-organizing mixtures and the segmentation by a probabilistic constraint relaxation network. The experimental results show the efficient and robust performance of the new algorithm and that it outperforms conventional classification-based approaches. PMID:18172510

  15. qPCR-based mitochondrial DNA quantification: Influence of template DNA fragmentation on accuracy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jackson, Christopher B., E-mail: Christopher.jackson@insel.ch; Gallati, Sabina, E-mail: sabina.gallati@insel.ch; Schaller, Andre, E-mail: andre.schaller@insel.ch

    2012-07-06

    Highlights: Serial qPCR accurately determines the fragmentation state of any given DNA sample; it demonstrates the different preservation of the nuclear and mitochondrial genomes; it provides a diagnostic tool to validate the integrity of bioptic material; and it excludes degradation-induced erroneous quantification. -- Abstract: Real-time PCR (qPCR) is the method of choice for quantification of mitochondrial DNA (mtDNA) by relative comparison of a nuclear to a mitochondrial locus. Quantitatively abnormal mtDNA content is indicative of mitochondrial disorders and is mostly confined to specific tissues. Thus, handling of degradation-prone bioptic material is inevitable. We established a serial qPCR assay based on increasing amplicon size to measure the degradation status of any DNA sample. Using this approach we can exclude erroneous mtDNA quantification due to degraded samples (e.g. long post-excision time, autolytic processes, freeze-thaw cycles) and ensure abnormal DNA content measurements (e.g. depletion) in non-degraded patient material. By preparing degraded DNA under controlled conditions using sonication and DNase I digestion, we show that erroneous quantification is due to the different preservation qualities of the nuclear and the mitochondrial genome. This disparate degradation of the two genomes results in over- or underestimation of mtDNA copy number in degraded samples. Moreover, as analysis of defined archival tissue would allow the molecular pathomechanism of mitochondrial disorders presenting with abnormal mtDNA content to be refined, we compared fresh frozen (FF) with formalin-fixed paraffin-embedded (FFPE) skeletal muscle tissue of the same sample. By extrapolation of measured decay constants for nuclear DNA (λ(nDNA)) and mtDNA (λ(mtDNA)) we present an approach to possibly correct
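
    The serial assay measures apparent copy number as a function of amplicon length; fitting an exponential decay to each genome yields the decay constants mentioned above and a degradation-corrected copy number at zero amplicon length. A hedged sketch of such a fit, with made-up copy numbers rather than data from the paper:

```python
import numpy as np
from scipy.optimize import curve_fit

# Hypothetical serial qPCR data: apparent copy number versus amplicon size (bp)
amplicon_bp = np.array([80, 160, 320, 640])
copies_mt = np.array([9.6e5, 8.1e5, 5.9e5, 3.2e5])   # mitochondrial target (illustrative)
copies_n = np.array([1.0e3, 6.5e2, 2.8e2, 5.0e1])    # nuclear target (illustrative)

def decay(size, c0, lam):
    """Exponential loss of intact template with amplicon length:
    copies(size) = c0 * exp(-lam * size)."""
    return c0 * np.exp(-lam * size)

(c0_mt, lam_mt), _ = curve_fit(decay, amplicon_bp, copies_mt, p0=(1e6, 1e-3))
(c0_n, lam_n), _ = curve_fit(decay, amplicon_bp, copies_n, p0=(1e3, 1e-3))

# Extrapolating to amplicon size 0 gives degradation-corrected copy numbers,
# so mtDNA content per nuclear genome can be compared across samples.
print(f"lambda_mtDNA = {lam_mt:.2e} per bp, lambda_nDNA = {lam_n:.2e} per bp")
print(f"corrected mtDNA/nDNA ratio = {c0_mt / c0_n:.1f}")
```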

  16. A Pragmatic Smoothing Method for Improving the Quality of the Results in Atomic Spectroscopy

    NASA Astrophysics Data System (ADS)

    Bennun, Leonardo

    2017-07-01

    A new smoothing method for improving the identification and quantification of spectral functions, based on prior knowledge of the signals that are expected to be quantified, is presented. These signals are used as weighting coefficients in the smoothing algorithm. This smoothing method was conceived to be applied in atomic and nuclear spectroscopies, preferably in techniques where net counts are proportional to acquisition time, such as particle induced X-ray emission (PIXE) and other X-ray fluorescence spectroscopic methods. This algorithm, when properly applied, does not distort the form or the intensity of the signal, so it is well suited for all kinds of spectroscopic techniques. The method is extremely effective at reducing high-frequency noise in the signal, much more so than a single rectangular smooth of the same width. As with all smoothing techniques, the proposed method improves the precision of the results, but in this case we also found a systematic improvement in the accuracy of the results. We still have to evaluate the improvement in the quality of the results when this method is applied to real experimental data. We expect better characterization of the net area quantification of the peaks, and smaller detection and quantification limits. We have applied this method to signals that obey Poisson statistics, but with the same ideas and criteria, it could be applied to time series. In the general case, when this algorithm is applied to experimental results, the sought characteristic functions required for this weighted smoothing method should be obtained from a system with strong stability. If the sought signals are not perfectly clean, this method should be applied carefully.
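
    As described above, the expected signal shape serves as the weighting kernel of the smooth. The following is a minimal sketch of that idea on a synthetic Poisson-noise spectrum; the Gaussian peak shape, its width, and all counts are assumptions for illustration, not the published algorithm or data.

```python
import numpy as np

def weighted_smooth(counts: np.ndarray, kernel: np.ndarray) -> np.ndarray:
    """Smooth a spectrum with a kernel built from the expected (characteristic)
    signal shape, normalised so total counts are preserved.
    A sketch of the idea, not the published algorithm."""
    kernel = kernel / kernel.sum()
    return np.convolve(counts, kernel, mode="same")

# Hypothetical PIXE-like spectrum: Gaussian peak on a flat background with Poisson noise
rng = np.random.default_rng(0)
channels = np.arange(512)
true = 20 + 400 * np.exp(-0.5 * ((channels - 256) / 6.0) ** 2)
spectrum = rng.poisson(true).astype(float)

# Characteristic function: expected detector peak shape (assumed Gaussian, sigma = 6 channels)
x = np.arange(-15, 16)
peak_shape = np.exp(-0.5 * (x / 6.0) ** 2)

smoothed = weighted_smooth(spectrum, peak_shape)
print(f"net peak area, raw: {spectrum[226:287].sum() - 20 * 61:.0f}, "
      f"smoothed: {smoothed[226:287].sum() - 20 * 61:.0f}")
```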

  17. Characterization and quantification of grape variety by means of shikimic acid concentration and protein fingerprint in still white wines.

    PubMed

    Chabreyrie, David; Chauvet, Serge; Guyon, François; Salagoïty, Marie-Hélène; Antinelli, Jean-François; Medina, Bernard

    2008-08-27

    Protein profiles, obtained by high-performance capillary electrophoresis (HPCE) on previously dialyzed white wines, combined with shikimic acid concentration and multivariate analysis, were used for the determination of the grape variety composition of a still white wine. Six varieties were studied through monovarietal wines produced in the laboratory: Chardonnay (24 samples), Chenin (24), Petit Manseng (7), Sauvignon (37), Semillon (24), and Ugni Blanc (9). Homemade mixtures were prepared from authentic monovarietal wines according to a Plackett-Burman sampling plan. After protein peak area normalization, a matrix was built containing the protein results of the wines (mixtures and monovarietal). Partial least-squares processing was applied to this matrix, allowing a model to be built that provided a varietal quantification precision of around 20% for most of the grape varieties studied. The model was applied to commercial samples from various geographical origins, providing encouraging results for control purposes.

  18. Direct Quantification of Solute Diffusivity in Agarose and Articular Cartilage Using Correlation Spectroscopy.

    PubMed

    Shoga, Janty S; Graham, Brian T; Wang, Liyun; Price, Christopher

    2017-10-01

    Articular cartilage is an avascular tissue; diffusive transport is critical for its homeostasis. While numerous techniques have been used to quantify diffusivity within porous, hydrated tissues and tissue-engineered constructs, these techniques have suffered from issues regarding invasiveness and spatial resolution. In the present study, we implemented and compared two separate correlation spectroscopy techniques, fluorescence correlation spectroscopy (FCS) and raster image correlation spectroscopy (RICS), for the direct and minimally invasive quantification of fluorescent solute diffusion in agarose and articular cartilage. Specifically, we quantified the diffusional properties of fluorescein and Alexa Fluor 488-conjugated dextrans (3k and 10k) in aqueous solutions, agarose gels of varying concentration (i.e. 1, 3, 5%), and in different zones of juvenile bovine articular cartilage explants (i.e. superficial, middle, and deep). In agarose, properties of solute diffusion obtained via FCS and RICS were inversely related to molecule size, gel concentration, and applied strain. In cartilage, the diffusional properties of solutes were similarly dependent upon solute size, cartilage zone, and compressive strain; findings that agree with work utilizing other quantification techniques. In conclusion, this study established the utility of FCS and RICS as simple and minimally invasive techniques for quantifying microscale solute diffusivity within agarose constructs and articular cartilage explants.

  19. Remote quantification of phycocyanin in potable water sources through an adaptive model

    NASA Astrophysics Data System (ADS)

    Song, Kaishan; Li, Lin; Tedesco, Lenore P.; Li, Shuai; Hall, Bob E.; Du, Jia

    2014-09-01

    Cyanobacterial blooms in water supply sources in both central Indiana USA (CIN) and South Australia (SA) are a cause of great concern owing to toxin production and water quality deterioration. Remote sensing provides an effective approach for quick assessment of cyanobacteria through quantification of phycocyanin (PC) concentration. In total, 363 samples spanning a large variation of optically active constituents (OACs) in CIN and SA waters were collected during 24 field surveys. Concurrently, remote sensing reflectance spectra (Rrs) were measured. A partial least squares-artificial neural network (PLS-ANN) model, an artificial neural network (ANN) and a three-band model (TBM) were developed or tuned by relating the Rrs to PC concentration. Our results indicate that the PLS-ANN model outperformed the ANN and TBM with both the original spectra and simulated ESA/Sentinel-3/Ocean and Land Color Instrument (OLCI) and EO-1/Hyperion spectra. The PLS-ANN model resulted in a high coefficient of determination (R2) for the CIN dataset (R2 = 0.92, range: 0.3-220.7 μg/L) and the SA dataset (R2 = 0.98, range: 0.2-13.2 μg/L). In comparison, the TBM model yielded an R2 = 0.77 and 0.94 for the CIN and SA datasets, respectively, while the ANN obtained an intermediate modeling accuracy (CIN: R2 = 0.86; SA: R2 = 0.95). Applying the simulated OLCI and Hyperion aggregated datasets, the PLS-ANN model still achieved good performance (OLCI: R2 = 0.84; Hyperion: R2 = 0.90); the TBM also presented acceptable performance for PC estimation (OLCI: R2 = 0.65, Hyperion: R2 = 0.70). Based on these results, the PLS-ANN is an effective modeling approach for the quantification of PC in productive water supplies owing to its ability to capture the non-linear relationships between PC and other OACs. Furthermore, our investigation indicates that the ratio of inorganic suspended matter (ISM) to PC concentration is closely related to the modeling relative errors (CIN: R2 = 0.81; SA: R2 = 0.92), indicating that ISM concentration exert
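
    For context, three-band models of the kind compared above typically combine reflectances near the PC absorption feature (~620 nm), a nearby reference band, and a near-infrared band, and are then calibrated against measured PC. A hedged sketch of that type of index follows; the band choices, coefficients and data are assumptions for illustration, not the bands or calibration used in this study.

```python
import numpy as np

def three_band_index(rrs620, rrs650, rrs700):
    """Three-band semi-analytical index for phycocyanin:
    X = (1/Rrs(620) - 1/Rrs(650)) * Rrs(700).
    The band positions are commonly used choices, assumed here."""
    return (1.0 / rrs620 - 1.0 / rrs650) * rrs700

# Hypothetical calibration of the index against measured PC concentrations (ug/L)
X = np.array([0.02, 0.10, 0.35, 0.80, 1.60])
pc = np.array([0.5, 4.0, 15.0, 42.0, 85.0])
slope, intercept = np.polyfit(X, pc, 1)          # linear calibration PC = a*X + b
print(f"PC = {slope:.1f} * X + {intercept:.1f}")
print(f"predicted PC for X=0.5: {slope * 0.5 + intercept:.1f} ug/L")
```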

  20. Quantification of metronidazole in human plasma using a highly sensitive and rugged LC-MS/MS method for a bioequivalence study.

    PubMed

    Vanol, Pravin G; Sanyal, Mallika; Shah, Priyanka A; Shrivastav, Pranav S

    2018-03-23

    A highly sensitive, selective and rugged method has been described for the quantification of metronidazole (MTZ) in human plasma by liquid chromatography-tandem mass spectrometry using metronidazole-d4 as the internal standard (IS). The analyte and the IS were extracted from 100 μL plasma by liquid-liquid extraction. The clear samples obtained were chromatographed on an ACE C18 (100 × 4.6 mm, 5 μm) column using acetonitrile and 10.0 mM ammonium formate in water, pH 4.00 (80:20, v/v) as the mobile phase. A triple quadrupole mass spectrometer equipped with a turbo ion spray source and operated in multiple reaction monitoring mode was used for the detection and quantification of MTZ. The calibration range was established from 0.01 to 10.0 μg/mL. The results of validation testing for precision and accuracy, selectivity, matrix effects, recovery and stability complied with current bioanalytical guidelines. A run time of 3.0 min permitted analysis of more than 300 samples in a day. The method was applied to a bioequivalence study with a 250 mg MTZ tablet formulation in 24 healthy Indian males. Copyright © 2018 John Wiley & Sons, Ltd.

  1. Quantification of brake creep groan in vehicle tests and its relation with stick-slip obtained in laboratory tests

    NASA Astrophysics Data System (ADS)

    Neis, P. D.; Ferreira, N. F.; Poletto, J. C.; Matozo, L. T.; Masotti, D.

    2016-05-01

    This paper describes the development of a methodology for assessing and correlating stick-slip and brake creep groan. To do so, results of tribotests are compared with data obtained in vehicle tests. A low velocity and a linear reduction in normal force were set for the tribotests. The vehicle tests consisted of subjecting a sport utility vehicle to three different ramp slopes. Creep groan events were measured by accelerometers placed on the brake calipers. The root mean square of the acceleration signal (the QRMS parameter) proved able to quantify the creep groan severity in the vehicle tests. Differences in QRMS were observed between front-rear and left-right wheels for all tested materials. Frequency spectrum analysis of the acceleration revealed that the wheel side and material type do not cause any significant shift in the creep groan frequency. QRMS measured in the vehicle tests showed good correlation with the slip power (SP) summation. For this reason, the SP summation may represent the "creep groan propensity" of brake materials. Thus, the proposed tribotest method can be utilized to predict the creep groan severity of brake materials in service.
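
    The QRMS metric is simply the root mean square of the caliper acceleration signal over a creep groan event. A minimal sketch of that computation on a synthetic signal (the signal itself is an illustrative assumption):

```python
import numpy as np

def q_rms(acceleration: np.ndarray) -> float:
    """Root mean square of an acceleration signal (the QRMS severity metric)."""
    return float(np.sqrt(np.mean(np.square(acceleration))))

# Hypothetical caliper signal: low-frequency stick-slip bursts sampled at 1 kHz
t = np.linspace(0.0, 2.0, 2000, endpoint=False)
signal = 0.8 * np.sin(2 * np.pi * 5 * t) * (np.sin(2 * np.pi * 80 * t) > 0.9)
print(f"QRMS = {q_rms(signal):.3f} m/s^2")
```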

  2. The NASA Langley Multidisciplinary Uncertainty Quantification Challenge

    NASA Technical Reports Server (NTRS)

    Crespo, Luis G.; Kenny, Sean P.; Giesy, Daniel P.

    2014-01-01

    This paper presents the formulation of an uncertainty quantification challenge problem consisting of five subproblems. These problems focus on key aspects of uncertainty characterization, sensitivity analysis, uncertainty propagation, extreme-case analysis, and robust design.

  3. Quantification of Quercetin Obtained from Allium cepa Lam. Leaves and its Effects on Streptozotocin-induced Diabetic Neuropathy.

    PubMed

    Dureshahwar, Khan; Mubashir, Mohammed; Une, Hemant Devidas

    2017-01-01

    Antioxidant potential has protective effects in diabetic neuropathy (DN); hence, the present study was designed with the objective of quantifying quercetin from shade-dried leaves of Allium cepa Lam. and studying its effects on streptozotocin (STZ)-induced chronic DN. The shade-dried leaves of A. cepa Lam. were extracted with methanol and then fractionated using ethyl acetate (ACEA). Quercetin in ACEA was quantified by high-performance thin-layer chromatography (HPTLC). STZ (40 mg/kg) was administered once a day for 3 consecutive days to Sprague-Dawley rats (180-250 g) maintained under normal housing conditions. The elevation in blood glucose was monitored periodically for 3 weeks using the flavin adenine dinucleotide-glucose dehydrogenase method with a Contour TS glucometer. Rats showing blood glucose above 250 mg/dl were selected for the study. Animals were divided into eight groups. ACEA (25, 50, and 100 mg/kg), quercetin (40 mg/kg), metformin (120 mg/kg), and gabapentin (100 mg/kg) were given orally once a day for 2 weeks. The blood glucose level was again measured at the end of treatment to assess DN. Thermal hyperalgesia, cold allodynia, motor incoordination, and neurotoxicity were studied initially and at the end of the 2-week treatment. Biochemical parameters were also evaluated after the 2-week drug treatment. The quercetin content of ACEA was 4.82% by HPTLC. All ACEA treatments reduced blood glucose levels at the end of the 2-week study and showed a significant neuroprotective effect in STZ-induced DN in the above experimental models. The quercetin present in ACEA thus proved protective in STZ-induced DN. High-performance thin-layer chromatography revealed the presence of 4.82% quercetin in the Allium cepa ethyl acetate fraction (ACEA). Its investigation against various diabetic neuropathy biomarkers showed that ACEA has significant blood glucose-reducing action and neuroprotective action in thermal hyperalgesia, motor

  4. 1H NMR quantification in very dilute toxin solutions: application to anatoxin-a analysis.

    PubMed

    Dagnino, Denise; Schripsema, Jan

    2005-08-01

    A complete procedure is described for the extraction, detection and quantification of anatoxin-a in biological samples. Anatoxin-a is extracted from biomass by a routine acid-base extraction. The extract is analysed by GC-MS, without the need for derivatization, with a detection limit of 0.5 ng. A method was developed for the accurate quantification of anatoxin-a in the standard solution to be used for the calibration of the GC analysis. 1H NMR allowed the accurate quantification of microgram quantities of anatoxin-a. The accurate quantification of compounds in standard solutions is rarely discussed, but for compounds like anatoxin-a (toxins with prices in the range of a million dollars a gram), of which generally only milligram quantities or less are available, this factor in the quantitative analysis is certainly not trivial. The method that was developed can easily be adapted for the accurate quantification of other toxins in very dilute solutions.
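
    Quantitative 1H NMR of this kind generally determines the analyte concentration by comparing its proton-normalized integral with that of an internal standard of known concentration. A hedged sketch of the generic relation; the integrals, proton counts and standard concentration below are illustrative assumptions, not values from the paper.

```python
def qnmr_concentration(integral_analyte: float, n_h_analyte: int,
                       integral_std: float, n_h_std: int,
                       conc_std_mm: float) -> float:
    """Analyte concentration (mM) from relative 1H NMR integrals:
    c_a = c_std * (I_a / N_a) / (I_std / N_std),
    where N is the number of protons contributing to each integrated signal.
    This is the generic qNMR relation, used here as an illustrative sketch."""
    return conc_std_mm * (integral_analyte / n_h_analyte) / (integral_std / n_h_std)

# Hypothetical integrals: one-proton analyte signal vs. a nine-proton internal standard signal
print(f"{qnmr_concentration(1.25, 1, 3.00, 9, 0.50):.3f} mM")
```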

  5. Optimization of the parameters for obtaining zirconia-alumina coatings, made by flame spraying from results of numerical simulation

    NASA Astrophysics Data System (ADS)

    Ferrer, M.; Vargas, F.; Peña, G.

    2017-12-01

    The K-Sommerfeld values (K) and the melting percentage (%F) obtained by numerical simulation using the Jets et Poudres software were used to find the projection parameters for zirconia-alumina coatings deposited by flame thermal spraying, in order to obtain coatings with morphological and structural properties suitable for use as thermal insulation. The experimental results show the relationship between the Sommerfeld parameter and the porosity of the zirconia-alumina coatings. It is found that the lowest porosity is obtained when the K-Sommerfeld value is close to 45 with an oxidant flame; in contrast, when superoxidant flames are used, K values are close to 52, which improves wear resistance.

  6. The use of self-quantification systems for personal health information: big data management activities and prospects

    PubMed Central

    2015-01-01

    Background Self-quantification is seen as an emerging paradigm for health care self-management. Self-quantification systems (SQS) can be used for tracking, monitoring, and quantifying health aspects including mental, emotional, physical, and social aspects in order to gain self-knowledge. However, there has been a lack of a systematic approach for conceptualising and mapping the essential activities that are undertaken by individuals who are using SQS in order to improve health outcomes. In this paper, we propose a new model of personal health information self-quantification systems (PHI-SQS). The PHI-SQS model describes two types of activities that individuals go through during their journey of health self-managed practice, which are 'self-quantification' and 'self-activation'. Objectives In this paper, we aimed to examine thoroughly the first type of activity in PHI-SQS, which is 'self-quantification'. Our objectives were to review the data management processes currently supported in a representative set of self-quantification tools and ancillary applications, and to provide a systematic approach for conceptualising and mapping these processes with the individuals' activities. Method We reviewed and compared eleven self-quantification tools and applications (Zeo Sleep Manager, Fitbit, Actipressure, MoodPanda, iBGStar, Sensaris Senspod, 23andMe, uBiome, Digifit, BodyTrack, and Wikilife) that collect three key health data types (environmental exposure, physiological patterns, genetic traits). We investigated the interaction taking place at different data flow stages between the individual user and the self-quantification technology used. Findings We found that these eleven self-quantification tools and applications represent two major tool types (primary and secondary self-quantification systems). In each type, the individuals experience different processes and activities which are substantially influenced by the technologies' data management capabilities. Conclusions

  7. Recommendations and Standardization of Biomarker Quantification Using NMR-Based Metabolomics with Particular Focus on Urinary Analysis.

    PubMed

    Emwas, Abdul-Hamid; Roy, Raja; McKay, Ryan T; Ryan, Danielle; Brennan, Lorraine; Tenori, Leonardo; Luchinat, Claudio; Gao, Xin; Zeri, Ana Carolina; Gowda, G A Nagana; Raftery, Daniel; Steinbeck, Christoph; Salek, Reza M; Wishart, David S

    2016-02-05

    NMR-based metabolomics has shown considerable promise in disease diagnosis and biomarker discovery because it allows one to nondestructively identify and quantify large numbers of novel metabolite biomarkers in both biofluids and tissues. Precise metabolite quantification is a prerequisite to move any chemical biomarker or biomarker panel from the lab to the clinic. Among the biofluids commonly used for disease diagnosis and prognosis, urine has several advantages. It is abundant, sterile, and easily obtained, needs little sample preparation, and does not require invasive medical procedures for collection. Furthermore, urine captures and concentrates many "unwanted" or "undesirable" compounds throughout the body, providing a rich source of potentially useful disease biomarkers; however, incredible variation in urine chemical concentrations makes analysis of urine and identification of useful urinary biomarkers by NMR challenging. We discuss a number of the most significant issues regarding NMR-based urinary metabolomics with specific emphasis on metabolite quantification for disease biomarker applications and propose data collection and instrumental recommendations regarding NMR pulse sequences, acceptable acquisition parameter ranges, relaxation effects on quantitation, proper handling of instrumental differences, sample preparation, and biomarker assessment.

  8. Quantification of susceptibility change at high-concentrated SPIO-labeled target by characteristic phase gradient recognition.

    PubMed

    Zhu, Haitao; Nie, Binbin; Liu, Hua; Guo, Hua; Demachi, Kazuyuki; Sekino, Masaki; Shan, Baoci

    2016-05-01

    Phase map cross-correlation detection and quantification may produce a highlighted signal at superparamagnetic iron oxide nanoparticles and distinguish them from other hypointensities. The method may quantify susceptibility change by performing least squares analysis between a theoretically generated magnetic field template and an experimentally scanned phase image. Because characteristic phase recognition requires the removal of phase wrap and phase background, the additional steps of phase unwrapping and filtering may increase the chance of computing error and enlarge the inconsistency among algorithms. To solve this problem, a phase gradient cross-correlation and quantification method was developed that recognizes the characteristic phase gradient pattern instead of the phase image, because the phase gradient operation inherently includes unwrapping and filtering functions. However, few studies have mentioned the detectable limit of currently used phase gradient calculation algorithms. This limit may lead to an underestimation of large magnetic susceptibility changes caused by highly concentrated iron accumulation. In this study, a mathematical derivation gives the maximum detectable phase gradient of the differential chain algorithm in both the spatial and Fourier domains. To break through this limit, a modified quantification method is proposed that uses unwrapped forward differentiation for phase gradient generation. The method enlarges the detectable range of phase gradient measurement and avoids the underestimation of magnetic susceptibility. Simulation and phantom experiments were used to quantitatively compare the different methods. For in vivo application, MRI scanning was performed on nude mice implanted with iron-labeled human cancer cells. The results validate the limit of detectable phase gradient and the consequent susceptibility underestimation. The results also demonstrate the advantage of unwrapped forward differentiation compared with differential chain algorithms for susceptibility
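
    The detectable-gradient limit discussed above can be seen numerically: any forward difference taken through wrapped phase (phasor) arithmetic folds the true per-pixel gradient into (-π, π], so steep gradients near concentrated SPIO are underestimated unless the phase is unwrapped before differentiation. A minimal illustration of that folding (the gradient values are arbitrary):

```python
import numpy as np

# A wrapped (phasor-based) forward difference folds any true per-pixel gradient into (-pi, pi]:
true_gradients = np.array([1.0, 2.0, 3.0, 4.0, 5.0])     # rad per pixel
measured = np.angle(np.exp(1j * true_gradients))          # what a wrapped difference returns
for g_true, g_meas in zip(true_gradients, measured):
    print(f"true {g_true:.1f} rad/px -> wrapped-difference estimate {g_meas:+.2f} rad/px")
```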

  9. Ranking Fragment Ions Based on Outlier Detection for Improved Label-Free Quantification in Data-Independent Acquisition LC-MS/MS

    PubMed Central

    Bilbao, Aivett; Zhang, Ying; Varesio, Emmanuel; Luban, Jeremy; Strambio-De-Castillia, Caterina; Lisacek, Frédérique; Hopfgartner, Gérard

    2016-01-01

    Data-independent acquisition LC-MS/MS techniques complement supervised methods for peptide quantification. However, due to the wide precursor isolation windows, these techniques are prone to interference at the fragment ion level, which in turn is detrimental to accurate quantification. The "non-outlier fragment ion" (NOFI) ranking algorithm has been developed to assign low priority to fragment ions affected by interference. By using the optimal subset of high-priority fragment ions, these interfered fragment ions are effectively excluded from quantification. NOFI represents each fragment ion as a vector of four dimensions related to chromatographic and MS fragmentation attributes and applies multivariate outlier detection techniques. Benchmarking conducted on a well-defined quantitative dataset (i.e. the SWATH Gold Standard) indicates that NOFI is, on average, able to accurately quantify 11-25% more peptides than the commonly used Top-N library intensity ranking method. The sum of the areas of the Top3-5 NOFIs produces coefficients of variation similar to those of the library intensity method but with more accurate quantification results. On a biologically relevant human dendritic cell digest dataset, NOFI properly assigns low priority ranks to 85% of annotated interferences, resulting in sensitivity values between 0.92 and 0.80, against 0.76 for the Spectronaut interference detection algorithm. PMID:26412574
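
    The abstract does not spell out the outlier detector, so the sketch below illustrates the general idea with a Mahalanobis-distance ranking over a four-dimensional feature vector per fragment ion; the feature values and the choice of Mahalanobis distance are assumptions for illustration, not the published NOFI algorithm.

```python
import numpy as np

def rank_fragment_ions(features: np.ndarray) -> np.ndarray:
    """Rank fragment ions by multivariate outlyingness (a sketch inspired by the
    NOFI idea, not the published algorithm). `features` is an (n_ions, 4) array of
    chromatographic/fragmentation attributes; ions far from the bulk rank last."""
    mu = features.mean(axis=0)
    cov = np.cov(features, rowvar=False)
    inv_cov = np.linalg.pinv(cov)
    centered = features - mu
    d2 = np.einsum("ij,jk,ik->i", centered, inv_cov, centered)  # squared Mahalanobis distance
    return np.argsort(d2)          # indices from most "typical" to most outlying

# Hypothetical attributes for 6 fragment ions (e.g. RT shift, peak-shape correlation, ...)
rng = np.random.default_rng(1)
ions = rng.normal(0, 1, size=(6, 4))
ions[5] += 4.0                     # one interfered fragment ion
order = rank_fragment_ions(ions)
top_n = order[:3]                  # quantify using the Top-3 non-outlier fragment ions
print("priority order:", order, "-> use ions", top_n)
```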

  10. Microfluidic platform for detection and quantification of magnetic markers

    NASA Astrophysics Data System (ADS)

    Kokkinis, Georgios; Cardoso, Susana; Giouroudi, Ioanna

    2017-05-01

    This paper reports on a microfluidic platform with an integrated spin valve giant magneto-resistance (GMR) sensor used for the detection and quantification of single magnetic micromarkers. A microfluidic channel containing the magnetic fluid, microconductors (MCs) for collection of the magnetic markers and a spin valve GMR sensor for detecting the presence of their magnetic stray field were integrated on a single chip. The results show that the sensor is capable of detecting a single magnetic marker with 2.8 μm diameter.

  11. Simple and Inexpensive Quantification of Ammonia in Whole Blood

    PubMed Central

    Ayyub, Omar B.; Behrens, Adam M.; Heligman, Brian T.; Natoli, Mary E.; Ayoub, Joseph J.; Cunningham, Gary; Summar, Marshall; Kofinas, Peter

    2015-01-01

    Quantification of ammonia in whole blood has applications in the diagnosis and management of many hepatic diseases, including cirrhosis and rare urea cycle disorders, amounting to more than 5 million patients in the United States. Current techniques for ammonia measurement suffer from limited range, poor resolution, false positives or large, complex sensor set-ups. Here we demonstrate a technique utilizing inexpensive reagents and simple methods for quantifying ammonia in 100 μl of whole blood. The sensor comprises a modified form of the indophenol reaction, which resists sources of destructive interference in blood, in conjunction with a cation-exchange membrane. The presented sensing scheme is selective against other amine-containing molecules such as amino acids and has a shelf life of at least 50 days. Additionally, the resulting system has high sensitivity and allows for accurate, reliable quantification of ammonia in whole human blood samples over a range of at least 25 to 500 μM, which is clinically relevant for rare hyperammonemic disorders and liver disease. Furthermore, concentrations of 50 and 100 μM ammonia could be reliably discerned with p=0.0001. PMID:25936660

  12. Large scale systematic proteomic quantification from non-metastatic to metastatic colorectal cancer

    NASA Astrophysics Data System (ADS)

    Yin, Xuefei; Zhang, Yang; Guo, Shaowen; Jin, Hong; Wang, Wenhai; Yang, Pengyuan

    2015-07-01

    A systematic proteomic quantification of formalin-fixed, paraffin-embedded (FFPE) colorectal cancer tissues from stage I to stage IIIC was performed on a large scale. 1017 proteins were identified, of which 338 showed quantitative changes, by the label-free method, while the iTRAQ method quantified 6294 proteins, of which 341 showed significant expression changes. We found that proteins related to migration increased in expression and those related to binding and adhesion decreased during colorectal cancer development, according to gene ontology (GO) annotation and ingenuity pathway analysis (IPA). We focused on integrin alpha 5 (ITA5) in the integrin family, which was consistent with the metastasis-related pathway. The expression level of ITA5 decreased in metastasis tissues, and this result was further verified by Western blotting. Another two cell migration-related proteins, vitronectin (VTN) and actin-related protein (ARP3), were also shown to be up-regulated by both mass spectrometry (MS)-based quantification and Western blotting. To date, our results constitute one of the largest datasets in colorectal cancer proteomics research. Our strategy reveals a disease-driven omics pattern for metastatic colorectal cancer.

  13. Development of a real-time PCR method for the differential detection and quantification of four solanaceae in GMO analysis: potato (Solanum tuberosum), tomato (Solanum lycopersicum), eggplant (Solanum melongena), and pepper (Capsicum annuum).

    PubMed

    Chaouachi, Maher; El Malki, Redouane; Berard, Aurélie; Romaniuk, Marcel; Laval, Valérie; Brunel, Dominique; Bertheau, Yves

    2008-03-26

    The labeling of products containing genetically modified organisms (GMO) is linked to their quantification, since a threshold for the fortuitous presence of GMOs in food has been established. This threshold is calculated from a combination of two absolute quantification values: one for the specific GMO target and the second for an endogenous reference gene specific to the taxon. Thus, the development of reliable methods to quantify GMOs using endogenous reference genes in complex matrixes such as food and feed is needed. Plant identification can be difficult in the case of closely related taxa, which moreover are subject to introgression events. Based on the homology of beta-fructosidase sequences obtained from public databases, two pairs of consensus primers were designed for the detection, quantification, and differentiation of four Solanaceae: potato (Solanum tuberosum), tomato (Solanum lycopersicum), pepper (Capsicum annuum), and eggplant (Solanum melongena). Sequence variability was studied first using lines and cultivars (intraspecies sequence variability), then using taxa involved in gene introgressions, and finally, using taxonomically close taxa (interspecies sequence variability). This study allowed us to design four highly specific TaqMan-MGB probes. A duplex real-time PCR assay was developed for simultaneous quantification of tomato and potato. For eggplant and pepper, only simplex real-time PCR tests were developed. The results demonstrated the high specificity and sensitivity of the assays. We therefore conclude that beta-fructosidase can be used as an endogenous reference gene for GMO analysis.
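
    In this kind of analysis, the GMO content of a sample is ultimately expressed as the ratio of the two absolute quantifications described above: transgene copies relative to copies of the taxon-specific endogenous reference gene. A minimal sketch of that final calculation (the copy numbers and the 0.9% threshold in the comment are illustrative assumptions):

```python
def gmo_percentage(transgene_copies: float, endogene_copies: float) -> float:
    """GMO content expressed as the ratio of transgene copies to taxon-specific
    endogenous reference gene copies (haploid genome equivalents), in percent."""
    return 100.0 * transgene_copies / endogene_copies

# Hypothetical absolute quantification results from two real-time PCR assays
print(f"GMO content: {gmo_percentage(95.0, 11800.0):.2f} %")   # ~0.81 %, e.g. below a 0.9 % labeling threshold
```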

  14. Breast density quantification with cone-beam CT: A post-mortem study

    PubMed Central

    Johnson, Travis; Ding, Huanjun; Le, Huy Q.; Ducote, Justin L.; Molloi, Sabee

    2014-01-01

    Forty post-mortem breasts were imaged with a flat-panel based cone-beam x-ray CT system at 50 kVp. The feasibility of breast density quantification has been investigated using standard histogram thresholding and an automatic segmentation method based on the fuzzy c-means algorithm (FCM). The breasts were chemically decomposed into water, lipid, and protein immediately after image acquisition was completed. The percent fibroglandular volume (%FGV) from chemical analysis was used as the gold standard for breast density comparison. Both image-based segmentation techniques showed good precision in breast density quantification with high linear coefficients between the right and left breast of each pair. When comparing with the gold standard using %FGV from chemical analysis, Pearson’s r-values were estimated to be 0.983 and 0.968 for the FCM clustering and the histogram thresholding techniques, respectively. The standard error of the estimate (SEE) was also reduced from 3.92% to 2.45% by applying the automatic clustering technique. The results of the postmortem study suggested that breast tissue can be characterized in terms of water, lipid and protein contents with high accuracy by using chemical analysis, which offers a gold standard for breast density studies comparing different techniques. In the investigated image segmentation techniques, the FCM algorithm had high precision and accuracy in breast density quantification. In comparison to conventional histogram thresholding, it was more efficient and reduced inter-observer variation. PMID:24254317
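
    The fuzzy c-means step mentioned above assigns each voxel a membership in adipose and fibroglandular classes based on its intensity, from which %FGV follows. A hedged numpy sketch of a one-dimensional FCM applied to synthetic voxel values (the intensities, cluster count and thresholding rule are assumptions for illustration, not the authors' implementation):

```python
import numpy as np

def fcm_1d(values: np.ndarray, n_clusters: int = 2, m: float = 2.0, n_iter: int = 100):
    """Minimal fuzzy c-means on 1D voxel intensities (a sketch, not the authors' code).
    Returns cluster centers and the membership matrix of shape (n_voxels, n_clusters)."""
    rng = np.random.default_rng(0)
    u = rng.random((values.size, n_clusters))
    u /= u.sum(axis=1, keepdims=True)
    for _ in range(n_iter):
        um = u ** m
        centers = (um * values[:, None]).sum(axis=0) / um.sum(axis=0)
        dist = np.abs(values[:, None] - centers[None, :]) + 1e-12
        u = 1.0 / (dist ** (2 / (m - 1)))
        u /= u.sum(axis=1, keepdims=True)
    return centers, u

# Hypothetical breast CT voxel values: adipose around -100, fibroglandular around 40
rng = np.random.default_rng(1)
voxels = np.concatenate([rng.normal(-100, 20, 7000), rng.normal(40, 20, 3000)])
centers, u = fcm_1d(voxels)
fg = int(np.argmax(centers))                         # cluster with the higher center = fibroglandular
pct_fgv = 100.0 * (u[:, fg] > 0.5).sum() / voxels.size
print(f"%FGV = {pct_fgv:.1f} %")
```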

  15. [DNA quantification of blood samples pre-treated with pyramidon].

    PubMed

    Zhu, Chuan-Hong; Zheng, Dao-Li; Ni, Rao-Zhi; Wang, Hai-Sheng; Ning, Ping; Fang, Hui; Liu, Yan

    2014-06-01

    To study DNA quantification and STR typing of samples pre-treated with pyramidon. Blood samples from ten unrelated individuals were anticoagulated with EDTA, and bloodstains were made on filter paper. The samples were divided into six experimental groups according to the storage time after pre-treatment with pyramidon: 30 min, 1 h, 3 h, 6 h, 12 h and 24 h. DNA was extracted by three methods: magnetic bead-based extraction, the QIAcube DNA purification method and the Chelex-100 method. DNA was quantified by fluorescent quantitative PCR, and STR typing was performed by fluorescent PCR-STR technology. For a given extraction method, the recovered DNA decreased gradually with time after pre-treatment with pyramidon. For a given storage time, the DNA yields obtained with the different extraction methods differed significantly. Complete sixteen-locus DNA typing was obtained in 90.56% of samples. Pyramidon pre-treatment can cause DNA degradation, but effective STR typing can be achieved within 24 h. Magnetic bead-based extraction is the best of the three methods for DNA extraction and STR profiling.

  16. GPU-Accelerated Voxelwise Hepatic Perfusion Quantification

    PubMed Central

    Wang, H; Cao, Y

    2012-01-01

    Voxelwise quantification of hepatic perfusion parameters from dynamic contrast enhanced (DCE) imaging greatly contributes to assessment of liver function in response to radiation therapy. However, the efficiency of estimating hepatic perfusion parameters voxel-by-voxel in the whole liver using a dual-input single-compartment model requires substantial improvement for routine clinical applications. In this paper, we utilize the parallel computation power of a graphics processing unit (GPU) to accelerate the computation, while maintaining the same accuracy as the conventional method. Using CUDA-GPU, the hepatic perfusion computations over multiple voxels are run across the GPU blocks concurrently but independently. At each voxel, non-linear least squares fitting of the time series of the liver DCE data to the compartmental model is distributed to multiple threads in a block, and the computations for different time points are performed simultaneously and synchronously. An efficient fast Fourier transform in a block is also developed for the convolution computation in the model. The GPU computations of the voxel-by-voxel hepatic perfusion images are compared with those by the CPU using the simulated DCE data and the experimental DCE MR images from patients. The computation speed is improved by 30 times using a NVIDIA Tesla C2050 GPU compared to a 2.67 GHz Intel Xeon CPU processor. To obtain liver perfusion maps with 626400 voxels in a patient's liver, it takes 0.9 min with the GPU-accelerated voxelwise computation, compared to 110 min with the CPU, while the perfusion parameters obtained by the two methods differ by less than 10(-6). The method will be useful for generating liver perfusion images in clinical settings. PMID:22892645

  17. Electrochemical quantification of the antioxidant capacity of medicinal plants using biosensors.

    PubMed

    Rodríguez-Sevilla, Erika; Ramírez-Silva, María-Teresa; Romero-Romo, Mario; Ibarra-Escutia, Pedro; Palomar-Pardavé, Manuel

    2014-08-08

    The working area of a screen-printed electrode, SPE, was modified with the enzyme tyrosinase (Tyr) using different immobilization methods, namely entrapment with water-soluble polyvinyl alcohol (PVA), cross-linking using glutaraldehyde (GA), and cross-linking using GA and human serum albumin (HSA); the resulting electrodes were termed SPE/Tyr/PVA, SPE/Tyr/GA and SPE/Tyr/HSA/GA, respectively. These biosensors were characterized by means of amperometry and EIS techniques. From the amperometric evaluations, the apparent Michaelis-Menten constant, Km', of each biosensor was evaluated, while the respective charge transfer resistance, Rct, was assessed from impedance measurements. It was found that the SPE/Tyr/GA had the smallest Km' (57 ± 7 µM) and Rct values. This electrode also displayed the lowest detection and quantification limits for catechol. Using the SPE/Tyr/GA, the Trolox Equivalent Antioxidant Capacity (TEAC) was determined for infusions prepared with "mirto" (Salvia microphylla), "hierba dulce" (Lippia dulcis) and "salve real" (Lippia alba), medicinal plants commonly used in Mexico.
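
    The apparent Michaelis-Menten constant reported above is typically obtained by fitting steady-state amperometric currents measured at increasing substrate concentrations to I = Imax·[S]/(Km' + [S]). A hedged sketch of such a fit; the catechol concentrations and currents are invented for illustration, not data from the study.

```python
import numpy as np
from scipy.optimize import curve_fit

def michaelis_menten(s, i_max, km):
    """Steady-state biosensor current versus substrate concentration."""
    return i_max * s / (km + s)

# Hypothetical catechol calibration data for a Tyr-modified screen-printed electrode
s_um = np.array([10, 25, 50, 100, 200, 400], dtype=float)       # catechol, uM
i_na = np.array([95, 195, 310, 430, 540, 620], dtype=float)     # current, nA

(i_max, km), _ = curve_fit(michaelis_menten, s_um, i_na, p0=(700, 60))
print(f"apparent Km' = {km:.0f} uM, Imax = {i_max:.0f} nA")
```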

  18. Automatic Segmentation and Quantification of Filamentous Structures in Electron Tomography

    PubMed Central

    Loss, Leandro A.; Bebis, George; Chang, Hang; Auer, Manfred; Sarkar, Purbasha; Parvin, Bahram

    2016-01-01

    Electron tomography is a promising technology for imaging ultrastructures at nanoscale resolutions. However, image and quantitative analyses are often hindered by high levels of noise, staining heterogeneity, and material damage either as a result of the electron beam or sample preparation. We have developed and built a framework that allows for automatic segmentation and quantification of filamentous objects in 3D electron tomography. Our approach consists of three steps: (i) local enhancement of filaments by Hessian filtering; (ii) detection and completion (e.g., gap filling) of filamentous structures through tensor voting; and (iii) delineation of the filamentous networks. Our approach allows for quantification of filamentous networks in terms of their compositional and morphological features. We first validate our approach using a set of specifically designed synthetic data. We then apply our segmentation framework to tomograms of plant cell walls that have undergone different chemical treatments for polysaccharide extraction. The subsequent compositional and morphological analyses of the plant cell walls reveal their organizational characteristics and the effects of the different chemical protocols on specific polysaccharides. PMID:28090597

  19. Automatic Segmentation and Quantification of Filamentous Structures in Electron Tomography.

    PubMed

    Loss, Leandro A; Bebis, George; Chang, Hang; Auer, Manfred; Sarkar, Purbasha; Parvin, Bahram

    2012-10-01

    Electron tomography is a promising technology for imaging ultrastructures at nanoscale resolutions. However, image and quantitative analyses are often hindered by high levels of noise, staining heterogeneity, and material damage either as a result of the electron beam or sample preparation. We have developed and built a framework that allows for automatic segmentation and quantification of filamentous objects in 3D electron tomography. Our approach consists of three steps: (i) local enhancement of filaments by Hessian filtering; (ii) detection and completion (e.g., gap filling) of filamentous structures through tensor voting; and (iii) delineation of the filamentous networks. Our approach allows for quantification of filamentous networks in terms of their compositional and morphological features. We first validate our approach using a set of specifically designed synthetic data. We then apply our segmentation framework to tomograms of plant cell walls that have undergone different chemical treatments for polysaccharide extraction. The subsequent compositional and morphological analyses of the plant cell walls reveal their organizational characteristics and the effects of the different chemical protocols on specific polysaccharides.

  20. A Spanish model for quantification and management of construction waste.

    PubMed

    Solís-Guzmán, Jaime; Marrero, Madelyn; Montes-Delgado, Maria Victoria; Ramírez-de-Arellano, Antonio

    2009-09-01

    Currently, construction and demolition waste (C&D waste) is a worldwide issue that concerns not only governments but also the building actors involved in construction activity. In Spain, a new national decree has been regulating the production and management of C&D waste since February 2008. The present work describes the waste management model that has inspired this decree: the Alcores model implemented with good results in Los Alcores Community (Seville, Spain). A detailed model is also provided to estimate the volume of waste that is expected to be generated on the building site. The quantification of C&D waste volume, from the project stage, is essential for the building actors to properly plan and control its disposal. This quantification model has been developed by studying 100 dwelling projects, especially their bill of quantities, and defining three coefficients to estimate the demolished volume (CT), the wreckage volume (CR) and the packaging volume (CE). Finally, two case studies are included to illustrate the usefulness of the model to estimate C&D waste volume in both new construction and demolition projects.
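
    One plausible reading of the quantification model is that the coefficients CT, CR and CE multiply quantities taken from the bill of quantities to yield demolished, wreckage and packaging volumes. The sketch below illustrates only that arithmetic; the work items, quantities and coefficient values are invented placeholders, not those calibrated in the Alcores model.

    ```python
    # Hedged sketch: applying per-item coefficients to a bill of quantities to
    # estimate C&D waste volumes. All values are hypothetical placeholders.
    bill_of_quantities = [
        # (work item, built quantity in m3, CT demolished, CR wreckage, CE packaging)
        ("masonry walls",      120.0, 0.05, 1.10, 0.02),
        ("concrete structure", 310.0, 0.01, 1.05, 0.01),
        ("ceramic tiling",      45.0, 0.08, 1.20, 0.06),
    ]

    demolished = sum(q * ct for _, q, ct, _, _ in bill_of_quantities)
    wreckage   = sum(q * ct * cr for _, q, ct, cr, _ in bill_of_quantities)   # bulked volume
    packaging  = sum(q * ce for _, q, _, _, ce in bill_of_quantities)

    print(f"Demolished volume: {demolished:.1f} m3")
    print(f"Wreckage volume (bulked): {wreckage:.1f} m3")
    print(f"Packaging volume: {packaging:.1f} m3")
    ```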

  1. Uncertainty Quantification and Certification Prediction of Low-Boom Supersonic Aircraft Configurations

    NASA Technical Reports Server (NTRS)

    West, Thomas K., IV; Reuter, Bryan W.; Walker, Eric L.; Kleb, Bil; Park, Michael A.

    2014-01-01

    The primary objective of this work was to develop and demonstrate a process for accurate and efficient uncertainty quantification and certification prediction of low-boom, supersonic, transport aircraft. High-fidelity computational fluid dynamics models of multiple low-boom configurations were investigated including the Lockheed Martin SEEB-ALR body of revolution, the NASA 69 Delta Wing, and the Lockheed Martin 1021-01 configuration. A nonintrusive polynomial chaos surrogate modeling approach was used for reduced computational cost of propagating mixed, inherent (aleatory) and model-form (epistemic) uncertainty from both the computational fluid dynamics model and the near-field to ground level propagation model. A methodology has also been introduced to quantify the plausibility that a design will pass certification under uncertainty. Results of this study include the analysis of each of the three configurations of interest under inviscid and fully turbulent flow assumptions. A comparison of the uncertainty outputs and sensitivity analyses between the configurations is also given. The results of this study illustrate the flexibility and robustness of the developed framework as a tool for uncertainty quantification and certification prediction of low-boom, supersonic aircraft.
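
    As a minimal, hedged illustration of the non-intrusive polynomial chaos idea (one Gaussian uncertain input, a cheap stand-in response instead of the CFD/propagation chain, regression-based coefficient estimation), the sketch below fits a probabilists' Hermite expansion by least squares and reads the mean and variance off the coefficients. All names and values are assumptions for illustration.

    ```python
    # Hedged sketch of a non-intrusive (regression) polynomial chaos surrogate for
    # one standard-normal uncertain input, using probabilists' Hermite polynomials.
    # "cheap_model" is a stand-in for the expensive CFD/propagation chain.
    import numpy as np
    from math import factorial
    from numpy.polynomial.hermite_e import hermevander

    def cheap_model(x):
        return np.exp(0.3 * x) + 0.1 * x**2          # placeholder response

    order = 4
    xi = np.random.standard_normal(200)               # samples of the Gaussian germ
    y = cheap_model(xi)

    A = hermevander(xi, order)                        # He_0(xi) ... He_order(xi)
    coeffs, *_ = np.linalg.lstsq(A, y, rcond=None)    # least-squares PCE coefficients

    # Under N(0,1), E[He_n^2] = n!, so the mean is c_0 and the variance is sum c_n^2 n!.
    norms = np.array([factorial(n) for n in range(order + 1)], dtype=float)
    mean = coeffs[0]
    variance = float(np.sum(coeffs[1:] ** 2 * norms[1:]))
    print(f"PCE mean ~ {mean:.4f}, PCE variance ~ {variance:.4f}")
    ```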

  2. New high-definition thickness data obtained at tropical glaciers: preliminary results from Antisana volcano (Ecuador) using GPR prospection

    NASA Astrophysics Data System (ADS)

    Zapata, Camilo; Andrade, Daniel; Córdova, Jorge; Maisincho, Luis; Carvajal, Juan; Calispa, Marlon; Villacís, Marcos

    2014-05-01

    The study of tropical glaciers has been a significant contribution to the understanding of glacier dynamics and climate change. Much of the data and results have been obtained by analyzing plan-view images obtained by air- and space-borne sensors, as well as depth data obtained by diverse methodologies at selected points on the glacier surface. However, the measurement of glacier thicknesses has remained an elusive task in tropical glaciers, often located in rough terrain where the application of geophysical surveys (i.e. seismic surveys) requires logistics sometimes hardly justified by the amount of obtained data. In the case of Ecuador, however, where most glaciers have developed on active volcanoes and represent sources/reservoirs of fresh water, such knowledge is fundamental for scientific research but also to better assess key aspects for society. The relatively recent but fast development of GPR technology has helped to obtain new high-definition thickness data at Antisana volcano that will be used to: 1) better understand the dynamics and fate of tropical glaciers; 2) better estimate the amount of fresh water stored in the glaciers; 3) better assess the hazards associated with the sudden widespread melting of glaciers during volcanic eruptions. The measurements have been obtained at glaciers 12 and 15 of Antisana volcano, with the help of a commercial GPR equipped with a 25 MHz antenna. A total of 30 transects have been obtained, covering a distance of more than 3 km, from the glacier ablation zone, located at ~ 4600 masl, up to the level of 5200 masl. The preliminary results show a positive correlation between altitude and glacier thickness, with maximum and minimum calculated values reaching up to 80 m, and down to 15 m, respectively. The experience gained at Antisana volcano will be used to prepare a more widespread GPR survey in the glaciers of Cotopaxi volcano, whose implications in terms of volcanic hazards

  3. Improved Strategies and Optimization of Calibration Models for Real-time PCR Absolute Quantification

    EPA Science Inventory

    Real-time PCR absolute quantification applications rely on the use of standard curves to make estimates of DNA target concentrations in unknown samples. Traditional absolute quantification approaches dictate that a standard curve must accompany each experimental run. However, t...
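
    As a hedged sketch of the standard-curve approach this record refers to, the snippet below fits Cq against log10 copy number for a dilution series and back-calculates an unknown sample; the Cq values, copy numbers and efficiency check are invented for illustration.

    ```python
    # Hedged sketch: qPCR absolute quantification from a standard curve.
    # Cq values and copy numbers below are made-up illustrations.
    import numpy as np

    std_copies = np.array([1e2, 1e3, 1e4, 1e5, 1e6])        # known standards
    std_cq     = np.array([33.1, 29.8, 26.4, 23.0, 19.7])   # measured Cq values

    slope, intercept = np.polyfit(np.log10(std_copies), std_cq, 1)
    efficiency = 10 ** (-1.0 / slope) - 1.0                  # amplification efficiency

    def copies_from_cq(cq):
        return 10 ** ((cq - intercept) / slope)

    print(f"slope = {slope:.2f}, efficiency = {efficiency:.1%}")
    print(f"unknown at Cq 27.5 ~ {copies_from_cq(27.5):.2e} copies")
    ```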

  4. Evaluation of the impact of matrix effect on quantification of pesticides in foods by gas chromatography-mass spectrometry using isotope-labeled internal standards.

    PubMed

    Yarita, Takashi; Aoyagi, Yoshie; Otake, Takamitsu

    2015-05-29

    The impact of the matrix effect in GC-MS quantification of pesticides in food using the corresponding isotope-labeled internal standards was evaluated. A spike-and-recovery study of nine target pesticides was first conducted using paste samples of corn, green soybean, carrot, and pumpkin. The observed analytical values using isotope-labeled internal standards were more accurate for most target pesticides than those obtained using the external calibration method, but were still biased from the spiked concentrations when a matrix-free calibration solution was used for calibration. The respective calibration curves for each target pesticide were also prepared using matrix-free calibration solutions and matrix-matched calibration solutions with blank soybean extract. The intensity ratio of the peaks of most target pesticides to that of the corresponding isotope-labeled internal standards was influenced by the presence of the matrix in the calibration solution; therefore, the observed slope varied. The ratio was also influenced by the type of injection method (splitless or on-column). These results indicated that matrix-matching of the calibration solution is required for very accurate quantification, even if isotope-labeled internal standards were used for calibration. Copyright © 2015 Elsevier B.V. All rights reserved.

  5. An International Collaboration To Standardize HIV-2 Viral Load Assays: Results from the 2009 ACHIEV2E Quality Control Study▿

    PubMed Central

    Damond, F.; Benard, A.; Balotta, Claudia; Böni, Jürg; Cotten, Matthew; Duque, Vitor; Ferns, Bridget; Garson, Jeremy; Gomes, Perpetua; Gonçalves, Fátima; Gottlieb, Geoffrey; Kupfer, Bernd; Ruelle, Jean; Rodes, Berta; Soriano, Vicente; Wainberg, Mark; Taieb, Audrey; Matheron, Sophie; Chene, Genevieve; Brun-Vezinet, Francoise

    2011-01-01

    Accurate HIV-2 plasma viral load quantification is crucial for adequate HIV-2 patient management and for the proper conduct of clinical trials and international cohort collaborations. This study compared the homogeneity of HIV-2 RNA quantification when using HIV-2 assays from ACHIEV2E study sites and either in-house PCR calibration standards or common viral load standards supplied to all collaborators. Each of the 12 participating laboratories quantified blinded HIV-2 samples, using its own HIV-2 viral load assay and standard as well as centrally validated and distributed common HIV-2 group A and B standards (http://www.hiv.lanl.gov/content/sequence/HelpDocs/subtypes-more.html). Aliquots of HIV-2 group A and B strains, each at 2 theoretical concentrations (2.7 and 3.7 log10 copies/ml), were tested. Intralaboratory, interlaboratory, and overall variances of quantification results obtained with both standards were compared using F tests. For HIV-2 group A quantifications, overall and interlaboratory and/or intralaboratory variances were significantly lower when using the common standard than when using in-house standards at the concentration levels of 2.7 log10 copies/ml and 3.7 log10 copies/ml, respectively. For HIV-2 group B, a high heterogeneity was observed and the variances did not differ according to the type of standard used. In this international collaboration, the use of a common standard improved the homogeneity of HIV-2 group A RNA quantification only. The diversity of HIV-2 group B, particularly in PCR primer-binding regions, may explain the heterogeneity in quantification of this strain. Development of a validated HIV-2 viral load assay that accurately quantifies distinct circulating strains is needed. PMID:21813718

  6. Comparative study between extraction techniques and column separation for the quantification of sinigrin and total isothiocyanates in mustard seed.

    PubMed

    Cools, Katherine; Terry, Leon A

    2012-07-15

    Glucosinolates are β-thioglycosides which are found naturally in Cruciferae including the genus Brassica. When enzymatically hydrolysed, glucosinolates yield isothiocyanates and give a pungent taste. Both glucosinolates and isothiocyanates have been linked with anticancer activity as well as antifungal and antibacterial properties and therefore the quantification of these compounds is scientifically important. A wide range of literature exists on glucosinolates, however the extraction and quantification procedures differ greatly resulting in discrepancies between studies. The aim of this study was therefore to compare the most popular extraction procedures to identify the most efficacious method and whether each extraction can also be used for the quantification of total isothiocyanates. Four extraction techniques were compared for the quantification of sinigrin from mustard cv. Centennial (Brassica juncea L.) seed; boiling water, boiling 50% (v/v) aqueous acetonitrile, boiling 100% methanol and 70% (v/v) aqueous methanol at 70 °C. Prior to injection into the HPLC, the extractions which involved solvents (acetonitrile or methanol) were freeze-dried and resuspended in water. To identify whether the same extract could be used to measure total isothiocyanates, a dichloromethane extraction was carried out on the sinigrin extracts. For the quantification of sinigrin alone, boiling 50% (v/v) acetonitrile was found to be the most efficacious extraction solvent of the four tested yielding 15% more sinigrin than the water extraction. However, the removal of the acetonitrile by freeze-drying had a negative impact on the isothiocyanate content. Quantification of both sinigrin and total isothiocyanates was possible when the sinigrin was extracted using boiling water. Two columns were compared for the quantification of sinigrin revealing the Zorbax Eclipse to be the best column using this particular method. Copyright © 2012 Elsevier B.V. All rights reserved.

  7. Improved correlation between CT emphysema quantification and pulmonary function test by density correction of volumetric CT data based on air and aortic density.

    PubMed

    Kim, Song Soo; Seo, Joon Beom; Kim, Namkug; Chae, Eun Jin; Lee, Young Kyung; Oh, Yeon Mok; Lee, Sang Do

    2014-01-01

    To determine the improvement of emphysema quantification with density correction and to determine the optimal site to use for air density correction on volumetric computed tomography (CT). Seventy-eight CT scans of COPD patients (GOLD II-IV, smoking history 39.2±25.3 pack-years) were obtained from several single-vendor 16-MDCT scanners. After density measurement of aorta, tracheal- and external air, volumetric CT density correction was conducted (two reference values: air, -1,000 HU/blood, +50 HU). Using in-house software, emphysema index (EI) and mean lung density (MLD) were calculated. Differences in air densities, MLD and EI prior to and after density correction were evaluated (paired t-test). Correlation between those parameters and FEV1 and FEV1/FVC were compared (age- and sex adjusted partial correlation analysis). Measured densities (HU) of tracheal- and external air differed significantly (-990 ± 14, -1016 ± 9, P<0.001). MLD and EI on original CT data, after density correction using tracheal- and external air also differed significantly (MLD: -874.9 ± 27.6 vs. -882.3 ± 24.9 vs. -860.5 ± 26.6; EI: 16.8 ± 13.4 vs. 21.1 ± 14.5 vs. 9.7 ± 10.5, respectively, P<0.001). The correlation coefficients between CT quantification indices and FEV1, and FEV1/FVC increased after density correction. The tracheal air correction showed better results than the external air correction. Density correction of volumetric CT data can improve correlations of emphysema quantification and PFT. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.
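
    The density correction described above amounts to a two-point linear rescaling of Hounsfield units so that measured air maps to -1000 HU and measured aortic blood to +50 HU. The sketch below shows only that rescaling plus a common voxel-threshold emphysema index; the measured reference values, the -950 HU threshold and the stand-in volume are illustrative assumptions, not the study's data.

    ```python
    # Hedged sketch: two-point linear density correction of CT data, rescaling so
    # that measured air maps to -1000 HU and measured aortic blood to +50 HU.
    import numpy as np

    def density_correct(hu, air_measured, blood_measured, air_ref=-1000.0, blood_ref=50.0):
        scale = (blood_ref - air_ref) / (blood_measured - air_measured)
        return (hu - air_measured) * scale + air_ref

    volume = np.random.normal(-850, 120, size=(4, 64, 64))   # stand-in CT volume (HU)
    corrected = density_correct(volume, air_measured=-990.0, blood_measured=35.0)

    # One common emphysema index definition: % of lung voxels below -950 HU.
    emphysema_index = float(np.mean(corrected < -950.0) * 100.0)
    mean_lung_density = float(corrected.mean())
    print(f"EI = {emphysema_index:.1f}%, MLD = {mean_lung_density:.1f} HU")
    ```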

  8. Quantification of penicillin G during labor and delivery by capillary electrophoresis.

    PubMed

    Thomas, Andrea; Ukpoma, Omon K; Inman, Jennifer A; Kaul, Anil K; Beeson, James H; Roberts, Kenneth P

    2008-04-24

    In this study, a capillary electrophoresis (CE) method was developed as a means to measure levels of penicillin G (PCN G) in Group B Streptococcus (GBS) positive pregnant women during labor and delivery. Volunteers for this developmental study were administered five million units of PCN G at the onset of labor. Urine, blood, and amniotic fluid samples were collected during labor and post delivery. Samples were semi-purified by solid-phase extraction (SPE) using Waters tC18 SepPak 3cc cartridges with a sodium phosphate/methanol step gradient for elution. Capillary electrophoresis or reversed-phase high-performance liquid chromatography (RP-HPLC) with diode-array absorbance detection were used to separate the samples in less than 30 min. Quantification was accomplished by establishing a calibration curve with a linear dynamic range. The tC18 SPE methodology provided substantial sample clean-up with high recovery yields of PCN G (approximately 90%). It was found that SPE was critical for maintaining the integrity of the separation column when using RP-HPLC, but was not necessary for sample analysis by CE where no stationary phase is present. Quantification results ranged from millimolar concentrations of PCN G in maternal urine to micromolar concentrations in amniotic fluid. Serum and cord blood levels of PCN G were below quantification limits, which is likely due to the prolonged delay in sample collection after antibiotic administration. These results show that CE can serve as a simple and effective means to characterize the pharmacokinetic distribution of PCN G from mother to unborn fetus during labor and delivery. It is anticipated that similar methodologies have the potential to provide a quick, simple, and cost-effective means of monitoring the clinical efficacy of PCN G and other drugs during pregnancy.

  9. Quantification of confocal images of biofilms grown on irregular surfaces

    PubMed Central

    Ross, Stacy Sommerfeld; Tu, Mai Han; Falsetta, Megan L.; Ketterer, Margaret R.; Kiedrowski, Megan R.; Horswill, Alexander R.; Apicella, Michael A.; Reinhardt, Joseph M.; Fiegel, Jennifer

    2014-01-01

    Bacterial biofilms grow on many types of surfaces, including flat surfaces such as glass and metal and irregular surfaces such as rocks, biological tissues and polymers. While laser scanning confocal microscopy can provide high-resolution images of biofilms grown on any surface, quantification of biofilm-associated bacteria is currently limited to bacteria grown on flat surfaces. This can limit researchers studying irregular surfaces to qualitative analysis or quantification of only the total bacteria in an image. In this work, we introduce a new algorithm called modified connected volume filtration (MCVF) to quantify bacteria grown on top of an irregular surface that is fluorescently labeled or reflective. Using the MCVF algorithm, two new quantification parameters are introduced. The modified substratum coverage parameter enables quantification of the connected-biofilm bacteria on top of the surface and on the imaging substratum. The utility of MCVF and the modified substratum coverage parameter were shown with Pseudomonas aeruginosa and Staphylococcus aureus biofilms grown on human airway epithelial cells. A second parameter, the percent association, provides quantified data on the colocalization of the bacteria with a labeled component, including bacteria within a labeled tissue. The utility of quantifying the bacteria associated with the cell cytoplasm was demonstrated with Neisseria gonorrhoeae biofilms grown on cervical epithelial cells. This algorithm provides more flexibility and quantitative ability to researchers studying biofilms grown on a variety of irregular substrata. PMID:24632515

  10. Quantification of protein expression in cells and cellular subcompartments on immunohistochemical sections using a computer supported image analysis system.

    PubMed

    Braun, Martin; Kirsten, Robert; Rupp, Niels J; Moch, Holger; Fend, Falko; Wernert, Nicolas; Kristiansen, Glen; Perner, Sven

    2013-05-01

    Quantification of protein expression based on immunohistochemistry (IHC) is an important step for translational research and clinical routine. Several manual ('eyeballing') scoring systems are used in order to semi-quantify protein expression based on chromogenic intensities and distribution patterns. However, manual scoring systems are time-consuming and subject to significant intra- and interobserver variability. The aim of our study was to explore whether new image analysis software proves to be sufficient as an alternative tool to quantify protein expression. For IHC experiments, one nucleus specific marker (i.e., ERG antibody), one cytoplasmic specific marker (i.e., SLC45A3 antibody), and one marker expressed in both compartments (i.e., TMPRSS2 antibody) were chosen. Stainings were applied on TMAs, containing tumor material of 630 prostate cancer patients. A pathologist visually quantified all IHC stainings in a blinded manner, applying a four-step scoring system. For digital quantification, image analysis software (Tissue Studio v.2.1, Definiens AG, Munich, Germany) was applied to obtain a continuous spectrum of average staining intensity. For each of the three antibodies we found a strong correlation of the manual protein expression score and the score of the image analysis software. Spearman's rank correlation coefficient was 0.94, 0.92, and 0.90 for ERG, SLC45A3, and TMPRSS2, respectively (p < 0.01). Our data suggest that the image analysis software Tissue Studio is a powerful tool for quantification of protein expression in IHC stainings. Further, since the digital analysis is precise and reproducible, computer supported protein quantification might help to overcome intra- and interobserver variability and increase objectivity of IHC based protein assessment.

  11. The impact of carbon-13 and deuterium on relative quantification of proteins using stable isotope diethyl labeling.

    PubMed

    Koehler, Christian J; Arntzen, Magnus Ø; Thiede, Bernd

    2015-05-15

    Stable isotopic labeling techniques are useful for quantitative proteomics. A cost-effective and convenient method for diethylation by reductive amination was established. The impact of using either carbon-13 or deuterium on quantification accuracy and precision was investigated using diethylation. We established an effective approach for stable isotope labeling by diethylation of amino groups of peptides. The approach was validated using matrix-assisted laser desorption/ionization time-of-flight (MALDI-TOF) and nanospray liquid chromatography/electrospray ionization (nanoLC/ESI)-ion trap/orbitrap for mass spectrometric analysis as well as MaxQuant for quantitative data analysis. Reaction conditions with low reagent costs, high yields and minor side reactions were established for diethylation. Furthermore, we showed that diethylation can be applied to up to sixplex labeling. For duplex experiments, we compared diethylation in the analysis of the proteome of HeLa cells using acetaldehyde-(13)C(2)/(12)C(2) and acetaldehyde-(2)H(4)/(1)H(4). Equal numbers of proteins could be identified and quantified; however, (13)C(4)/(12)C(4)-diethylation revealed a lower variance of quantitative peptide ratios within proteins, resulting in a higher precision of quantified proteins and fewer falsely regulated proteins. The results were compared with dimethylation, showing minor effects because of the lower number of deuteriums. The described approach for diethylation of primary amines is a cost-effective and accurate method for up to sixplex relative quantification of proteomes. (13)C(4)/(12)C(4)-diethylation enables duplex quantification based on chemical labeling without using deuterium, which reduces identification of false-negatives and increases the quality of the quantification results. Copyright © 2015 John Wiley & Sons, Ltd.

  12. STATISTICAL ANALYSIS OF TANK 19F FLOOR SAMPLE RESULTS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harris, S.

    2010-09-02

    Representative sampling has been completed for characterization of the residual material on the floor of Tank 19F as per the statistical sampling plan developed by Harris and Shine. Samples from eight locations have been obtained from the tank floor and two of the samples were archived as a contingency. Six samples, referred to in this report as the current scrape samples, have been submitted to and analyzed by SRNL. This report contains the statistical analysis of the floor sample analytical results to determine if further data are needed to reduce uncertainty. Included are comparisons with the prior Mantis sample results to determine if they can be pooled with the current scrape samples to estimate the upper 95% confidence limits (UCL95%) for concentration. Statistical analysis revealed that the Mantis and current scrape sample results are not compatible. Therefore, the Mantis sample results were not used to support the quantification of analytes in the residual material. Significant spatial variability among the current scrape sample results was not found. Constituent concentrations were similar between the North and South hemispheres as well as between the inner and outer regions of the tank floor. The current scrape sample results from all six samples fall within their 3-sigma limits. In view of the results from numerous statistical tests, the data were pooled from all six current scrape samples. As such, an adequate sample size was provided for quantification of the residual material on the floor of Tank 19F. The uncertainty is quantified in this report by a UCL95% on each analyte concentration. The uncertainty in analyte concentration was calculated as a function of the number of samples, the average, and the standard deviation of the analytical results. The UCL95% was based entirely on the six current scrape sample results (each averaged across three analytical determinations).
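
    The UCL95% described above is the standard one-sided upper confidence limit on a mean, mean + t(0.95, n-1)·s/√n, computed from the number of samples, their average, and their standard deviation. A minimal sketch with invented analyte values (not Tank 19F data):

    ```python
    # Hedged sketch: one-sided 95% upper confidence limit (UCL95%) on a mean
    # concentration from n sample results, using the Student t quantile.
    import numpy as np
    from scipy import stats

    results = np.array([12.1, 14.3, 11.8, 13.5, 12.9, 13.0])   # e.g. mg/kg, made up

    n = results.size
    mean = results.mean()
    s = results.std(ddof=1)
    t95 = stats.t.ppf(0.95, df=n - 1)
    ucl95 = mean + t95 * s / np.sqrt(n)
    print(f"mean = {mean:.2f}, UCL95% = {ucl95:.2f}")
    ```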

  13. Assessment of SCAR markers to design real-time PCR primers for rhizosphere quantification of Azospirillum brasilense phytostimulatory inoculants of maize.

    PubMed

    Couillerot, O; Poirier, M-A; Prigent-Combaret, C; Mavingui, P; Caballero-Mellado, J; Moënne-Loccoz, Y

    2010-08-01

    To assess the applicability of sequence characterized amplified region (SCAR) markers obtained from BOX, ERIC and RAPD fragments to design primers for real-time PCR quantification of the phytostimulatory maize inoculants Azospirillum brasilense UAP-154 and CFN-535 in the rhizosphere. Primers were designed based on strain-specific SCAR markers and were screened for successful amplification of target strain and absence of cross-reaction with other Azospirillum strains. The specificity of primers thus selected was verified under real-time PCR conditions using genomic DNA from strain collection and DNA from rhizosphere samples. The detection limit was 60 fg DNA with pure cultures and 4 x 10(3) (for UAP-154) and 4 x 10(4) CFU g(-1) (for CFN-535) in the maize rhizosphere. Inoculant quantification was effective from 10(4) to 10(8) CFU g(-1) soil. BOX-based SCAR markers were useful to find primers for strain-specific real-time PCR quantification of each A. brasilense inoculant in the maize rhizosphere. Effective root colonization is a prerequisite for successful Azospirillum phytostimulation, but cultivation-independent monitoring methods were lacking. The real-time PCR methods developed here will help understand the effect of environmental conditions on root colonization and phytostimulation by A. brasilense UAP-154 and CFN-535.

  14. Using Acoustic Structure Quantification During B-Mode Sonography for Evaluation of Hashimoto Thyroiditis.

    PubMed

    Rhee, Sun Jung; Hong, Hyun Sook; Kim, Chul-Hee; Lee, Eun Hye; Cha, Jang Gyu; Jeong, Sun Hye

    2015-12-01

    This study aimed to evaluate the usefulness of Acoustic Structure Quantification (ASQ; Toshiba Medical Systems Corporation, Nasushiobara, Japan) values in the diagnosis of Hashimoto thyroiditis using B-mode sonography and to identify a cutoff ASQ level that differentiates Hashimoto thyroiditis from normal thyroid tissue. A total of 186 thyroid lobes with Hashimoto thyroiditis and normal thyroid glands underwent sonography with ASQ imaging. The quantitative results were reported in an echo amplitude analysis (Cm(2)) histogram with average, mode, ratio, standard deviation, blue mode, and blue average values. Receiver operating characteristic curve analysis was performed to assess the diagnostic ability of the ASQ values in differentiating Hashimoto thyroiditis from normal thyroid tissue. Intraclass correlation coefficients of the ASQ values were obtained between 2 observers. Of the 186 thyroid lobes, 103 (55%) had Hashimoto thyroiditis, and 83 (45%) were normal. There was a significant difference between the ASQ values of Hashimoto thyroiditis glands and those of normal glands (P < .001). The ASQ values in patients with Hashimoto thyroiditis were significantly greater than those in patients with normal thyroid glands. The areas under the receiver operating characteristic curves for the ratio, blue average, average, blue mode, mode, and standard deviation were: 0.936, 0.902, 0.893, 0.855, 0.846, and 0.842, respectively. The ratio cutoff value of 0.27 offered the best diagnostic performance, with sensitivity of 87.38% and specificity of 95.18%. The intraclass correlation coefficients ranged from 0.86 to 0.94, which indicated substantial agreement between the observers. Acoustic Structure Quantification is a useful and promising sonographic method for diagnosing Hashimoto thyroiditis. Not only could it be a helpful tool for quantifying thyroid echogenicity, but it also would be useful for diagnosis of Hashimoto thyroiditis. © 2015 by the American Institute of

  15. Quantification of the biocontrol agent Trichoderma harzianum with real-time TaqMan PCR and its potential extrapolation to the hyphal biomass.

    PubMed

    López-Mondéjar, Rubén; Antón, Anabel; Raidl, Stefan; Ros, Margarita; Pascual, José Antonio

    2010-04-01

    The species of the genus Trichoderma are used successfully as biocontrol agents against a wide range of phytopathogenic fungi. Among them, Trichoderma harzianum is especially effective. However, to develop more effective fungal biocontrol strategies in organic substrates and soil, tools for monitoring the control agents are required. Real-time PCR is potentially an effective tool for the quantification of fungi in environmental samples. The aim of this study consisted of the development and application of a real-time PCR-based method to the quantification of T. harzianum, and the extrapolation of these data to fungal biomass values. A set of primers and a TaqMan probe for the ITS region of the fungal genome were designed and tested, and amplification was correlated to biomass measurements of colony hyphal length obtained with optical microscopy and image analysis. A correlation of 0.76 between ITS copies and biomass was obtained. The extrapolation of the quantity of ITS copies, calculated based on real-time PCR data, into quantities of fungal biomass potentially provides a more accurate estimate of the quantity of soil fungi. Copyright 2009 Elsevier Ltd. All rights reserved.

  16. Digital ELISA for the quantification of attomolar concentrations of Alzheimer's disease biomarker protein Tau in biological samples.

    PubMed

    Pérez-Ruiz, Elena; Decrop, Deborah; Ven, Karen; Tripodi, Lisa; Leirs, Karen; Rosseels, Joelle; van de Wouwer, Marlies; Geukens, Nick; De Vos, Ann; Vanmechelen, Eugeen; Winderickx, Joris; Lammertyn, Jeroen; Spasic, Dragana

    2018-07-26

    The close correlation between Tau pathology and Alzheimer's disease (AD) progression makes this protein a suitable biomarker for diagnosis and monitoring of the disorder evolution. However, the use of Tau in diagnostics has been hampered, as it currently requires collection of cerebrospinal fluid (CSF), which is an invasive clinical procedure. Although measuring Tau-levels in blood plasma would be favorable, the concentrations are below the detection limit of a conventional ELISA. In this work, we developed a digital ELISA for the quantification of attomolar protein Tau concentrations in both buffer and biological samples. Individual Tau molecules were first captured on the surface of magnetic particles using in-house developed antibodies and subsequently isolated into the femtoliter-sized wells of a 2 × 2 mm(2) microwell array. Combination of high-affinity antibodies, optimal assay conditions and a digital quantification approach resulted in a 24 ± 7 aM limit of detection (LOD) in buffer samples. Additionally, a dynamic range of 6 orders of magnitude was achieved by combining the digital readout with an analogue approach, allowing quantification from attomolar to picomolar levels of Tau using the same platform. This proves the compatibility of the presented assay with the wide range of Tau concentrations encountered in different biological samples. Next, the developed digital assay was applied to detect total Tau levels in spiked blood plasma. A similar LOD (55 ± 29 aM) was obtained compared to the buffer samples, which was 5000-fold more sensitive than commercially available ELISAs and even outperformed previously reported digital assays with 10-fold increase in sensitivity. Finally, the performance of the developed digital ELISA was assessed by quantifying protein Tau in three clinical CSF samples. Here, a high correlation (i.e. Pearson coefficient of 0.99) was found between the measured percentage of active particles and the reference protein Tau
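
    Digital assays of this kind typically convert the fraction of active beads or wells into an average number of captured molecules per partition using Poisson statistics, λ = -ln(1 - f). A minimal, hedged sketch of that conversion with invented counts (not data from the study):

    ```python
    # Hedged sketch: Poisson correction used in digital assays. The fraction of
    # active (fluorescent) partitions f gives the average number of captured
    # molecules per partition, lambda = -ln(1 - f). Counts are invented.
    import math

    total_beads = 60000
    active_beads = 1450

    f_on = active_beads / total_beads
    lam = -math.log(1.0 - f_on)                  # mean molecules per bead (AEB)
    total_molecules = lam * total_beads
    print(f"fraction on = {f_on:.4f}, AEB = {lam:.5f}, molecules ~ {total_molecules:.0f}")
    ```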

  17. Direct Quantification of Methane Emissions Across the Supply Chain: Identification of Mitigation Targets

    NASA Astrophysics Data System (ADS)

    Darzi, M.; Johnson, D.; Heltzel, R.; Clark, N.

    2017-12-01

    Researchers at West Virginia University's Center for Alternative Fuels, Engines, and Emissions have recently participated in a variety of studies targeted at direct quantification of methane emissions from across the natural gas supply chain. These studies included assessing methane emissions from heavy-duty vehicles and their fuel stations, active unconventional well sites - during both development and production, natural gas compression and storage facilities, natural gas engines - both large and small, two- and four-stroke, and low-throughput equipment associated with coal bed methane wells. Engine emissions were sampled using conventional instruments such as Fourier transform infrared spectrometers and heated flame ionization detection analyzers. However, to accurately quantify a wide range of other sources beyond the tailpipe (both leaks and losses), a full flow sampling system was developed, which included an integrated cavity-enhanced absorption spectrometer. Through these direct quantification efforts and analyses, major sources of methane emissions were identified. Technological solutions and best practices exist or could be developed to reduce methane emissions by focusing on the "lowest-hanging fruit." For example, engine crankcases from across the supply chain should employ vent mitigation systems to reduce methane and other emissions. An overview of the direct quantification system and various campaign measurement results will be presented along with the identification of other targets for additional mitigation.

  18. Highly sensitive quantification for human plasma-targeted metabolomics using an amine derivatization reagent.

    PubMed

    Arashida, Naoko; Nishimoto, Rumi; Harada, Masashi; Shimbo, Kazutaka; Yamada, Naoyuki

    2017-02-15

    Amino acids and their related metabolites play important roles in various physiological processes and have consequently become biomarkers for diseases. However, accurate quantification methods have only been established for major compounds, such as amino acids and a limited number of target metabolites. We previously reported a highly sensitive high-throughput method for the simultaneous quantification of amines using 3-aminopyridyl-N-succinimidyl carbamate as a derivatization reagent combined with liquid chromatography-tandem mass spectrometry (LC-MS/MS). Herein, we report the successful development of a practical and accurate LC-MS/MS method to analyze low concentrations of 40 physiological amines in 19 min. Thirty-five of these amines showed good linearity, limits of quantification, accuracy, precision, and recovery characteristics in plasma, with scheduled selected reaction monitoring acquisitions. Plasma samples from 10 healthy volunteers were evaluated using our newly developed method. The results revealed that 27 amines were detected in one of the samples, and that 24 of these compounds could be quantified. Notably, this new method successfully quantified metabolites with high accuracy across three orders of magnitude, with lowest and highest averaged concentrations of 31.7 nM (for spermine) and 18.3 μM (for α-aminobutyric acid), respectively. Copyright © 2016 Elsevier B.V. All rights reserved.

  19. Spatially resolved quantification of agrochemicals on plant surfaces using energy dispersive X-ray microanalysis.

    PubMed

    Hunsche, Mauricio; Noga, Georg

    2009-12-01

    In the present study the principle of energy dispersive X-ray microanalysis (EDX), i.e. the detection of elements based on their characteristic X-rays, was used to localise and quantify organic and inorganic pesticides on enzymatically isolated fruit cuticles. Pesticides could be discriminated from the plant surface because of their distinctive elemental composition. Findings confirm the close relation between net intensity (NI) and area covered by the active ingredient (AI area). Using wide and narrow concentration ranges of glyphosate and glufosinate, respectively, results showed that quantification of AI requires the selection of appropriate regression equations while considering NI, peak-to-background (P/B) ratio, and AI area. The use of selected internal standards (ISs) such as Ca(NO(3))(2) improved the accuracy of the quantification slightly but led to the formation of particular, non-typical microstructured deposits. The suitability of SEM-EDX as a general technique to quantify pesticides was evaluated additionally on 14 agrochemicals applied at diluted or regular concentration. Among the pesticides tested, spatial localisation and quantification of AI amount could be done for inorganic copper and sulfur as well for the organic agrochemicals glyphosate, glufosinate, bromoxynil and mancozeb. (c) 2009 Society of Chemical Industry.

  20. Simultaneous quantification and splenocyte-proliferating activities of nucleosides and bases in Cervi cornu Pantotrichum

    PubMed Central

    Zong, Ying; Wang, Yu; Li, Hang; Li, Na; Zhang, Hui; Sun, Jiaming; Niu, Xiaohui; Gao, Xiaochen

    2014-01-01

    Background: Cervi Cornu Pantotrichum is a well-known traditional Chinese medicine consisting of the young horn of Cervus Nippon Temminck (Hualurong: HLR). At present, the methods used for the quality control of Cervi Cornu Pantotrichum show low specificity. Objective: To describe a holistic method based on chemical characteristics and splenocyte-proliferating activities to evaluate the quality of HLR. Materials and Methods: The nucleosides and bases from HLR were identified by high performance liquid chromatography electrospray ionization mass spectrometry (HPLC-ESI-MS), and six of them were chosen for simultaneous HPLC quantification according to the results of proliferation of mouse splenocytes in vitro. Results: In this study, eight nucleosides and bases were identified. Of these, uracil, hypoxanthine, uridine, inosine, guanosine, and adenosine were chosen for simultaneous HPLC quantification. Simultaneous quantification of these six substances was performed on ten groups of HLR using a TIANHE Kromasil C18 column (5 μm, 4.6 mm × 250 mm i.d.) and a gradient elution of water and acetonitrile. Of the ten groups, the HLR with the highest total nucleoside content (TNC, sum of adenosine and uracil, 0.412 mg/g) displayed the strongest splenocyte-proliferating activity. Conclusion: These results suggest that TNC (particularly the abundant adenosine and uracil) in HLR correlates with splenocyte-proliferating activity and may be used for quality control of HLR. This comprehensive method could be applied to other traditional Chinese medicines to improve their quality control.

  1. Powder X-ray diffraction method for the quantification of cocrystals in the crystallization mixture.

    PubMed

    Padrela, Luis; de Azevedo, Edmundo Gomes; Velaga, Sitaram P

    2012-08-01

    The solid state purity of cocrystals critically affects their performance. Thus, it is important to accurately quantify the purity of cocrystals in the final crystallization product. The aim of this study was to develop a powder X-ray diffraction (PXRD) quantification method for investigating the purity of cocrystals. The method developed was employed to study the formation of indomethacin-saccharin (IND-SAC) cocrystals by mechanochemical methods. Pure IND-SAC cocrystals were geometrically mixed with 1:1 w/w mixture of indomethacin/saccharin in various proportions. An accurately measured amount (550 mg) of the mixture was used for the PXRD measurements. The most intense, non-overlapping, characteristic diffraction peak of IND-SAC was used to construct the calibration curve in the range 0-100% (w/w). This calibration model was validated and used to monitor the formation of IND-SAC cocrystals by liquid-assisted grinding (LAG). The IND-SAC cocrystal calibration curve showed excellent linearity (R(2) = 0.9996) over the entire concentration range, displaying limit of detection (LOD) and limit of quantification (LOQ) values of 1.23% (w/w) and 3.74% (w/w), respectively. Validation results showed excellent correlations between actual and predicted concentrations of IND-SAC cocrystals (R(2) = 0.9981). The accuracy and reliability of the PXRD quantification method depend on the methods of sample preparation and handling. The crystallinity of the IND-SAC cocrystals was higher when larger amounts of methanol were used in the LAG method. The PXRD quantification method is suitable and reliable for verifying the purity of cocrystals in the final crystallization product.
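
    A hedged sketch of the calibration-curve arithmetic behind LOD and LOQ values such as those reported above, using the common 3.3·σ/slope and 10·σ/slope conventions; the peak areas below are invented placeholders, not the IND-SAC data.

    ```python
    # Hedged sketch: linear calibration of cocrystal content vs. characteristic
    # PXRD peak intensity, with LOD/LOQ from the 3.3*sigma/slope and
    # 10*sigma/slope conventions. Intensities are invented placeholders.
    import numpy as np

    content_pct = np.array([0, 10, 25, 50, 75, 100], dtype=float)     # % w/w cocrystal
    peak_area   = np.array([2, 105, 262, 515, 760, 1010], dtype=float)

    slope, intercept = np.polyfit(content_pct, peak_area, 1)
    predicted = slope * content_pct + intercept
    sigma = np.sqrt(np.sum((peak_area - predicted) ** 2) / (content_pct.size - 2))

    lod = 3.3 * sigma / slope
    loq = 10.0 * sigma / slope
    print(f"slope = {slope:.2f}, LOD = {lod:.2f}% w/w, LOQ = {loq:.2f}% w/w")
    ```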

  2. Improving of enzyme immunoassay for detection and quantification of the target molecules using silver nanoparticles

    NASA Astrophysics Data System (ADS)

    Syrvatka, Vasyl J.; Slyvchuk, Yurij I.; Rozgoni, Ivan I.; Gevkan, Ivan I.; Overchuk, Marta O.

    2014-02-01

    Modern routine enzyme immunoassays for the detection and quantification of biomolecules have several disadvantages, such as high cost, insufficient sensitivity, complexity, and long execution times. The surface plasmon resonance of silver nanoparticles offers a basis for creating simple, highly sensitive, and low-cost colorimetric assays that can be applied to the detection of small molecules, DNA, proteins, and pollutants. The main aim of this study was to improve enzyme immunoassays for the detection and quantification of target molecules using silver nanoparticles. For this purpose, we developed a method for the synthesis of silver nanoparticles with hyaluronic acid and studied the possibility of using these nanoparticles both for direct determination of target molecule concentrations (in particular, proteins) and for improving enzyme immunoassays. As a model, we used conventional enzyme immunoassays for the determination of progesterone and estradiol concentrations. Using a modified synthesis procedure, we were able to produce silver nanoparticles with hyaluronan that are homogeneous in size (10-12 nm) and remain soluble and stable in water during long-term storage. The new method yields silver nanoparticles with strong optical properties at relatively high concentrations (60-90 μg/ml), with an absorbance peak at a wavelength of 400 nm. The surface plasmon resonance of silver nanoparticles with hyaluronan, measured by ultraviolet-visible spectroscopy, therefore provides an opportunity for rapid determination of target molecule concentrations (especially proteins). We also used silver nanoparticles as enzyme carriers and signal enhancers. Our preliminary data show that silver nanoparticles increase sample absorbance, which improves the upper limit of determination of estradiol and progesterone concentrations.

  3. Robustness of Fat Quantification using Chemical Shift Imaging

    PubMed Central

    Hansen, Katie H; Schroeder, Michael E; Hamilton, Gavin; Sirlin, Claude B; Bydder, Mark

    2011-01-01

    The purpose of this study was to investigate the effect of parameter changes that can potentially lead to unreliable measurements in fat quantification. Chemical shift imaging was performed using spoiled gradient echo sequences with systematic variations in the following: 2D/3D sequence, number of echoes, delta echo time, fractional echo factor, slice thickness, repetition time, flip angle, bandwidth, matrix size, flow compensation and field strength. Results indicated no significant (or significant but small) changes in fat fraction with parameter. The significant changes can be attributed to known effects of T1 bias and the two forms of noise bias. PMID:22055856
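
    As context for the fat-fraction measurements discussed above, the sketch below shows the simplest chemical-shift-based estimate, a two-point Dixon separation of water and fat from in-phase and opposed-phase signals. This is an illustrative simplification: the study's sequences use multiple echoes and are subject to the T1 and noise biases mentioned in the abstract. All values are invented.

    ```python
    # Hedged sketch: two-point Dixon fat-fraction estimate from in-phase (IP) and
    # opposed-phase (OP) magnitude images. Real fat quantification uses multi-echo
    # fitting with corrections for T1 and noise bias.
    import numpy as np

    ip = np.array([[220.0, 180.0], [150.0, 300.0]])   # stand-in in-phase signals
    op = np.array([[180.0, 100.0], [140.0, 120.0]])   # stand-in opposed-phase signals

    water = (ip + op) / 2.0
    fat = (ip - op) / 2.0
    fat_fraction = fat / (fat + water)                 # signal fat fraction, 0..1
    print(np.round(fat_fraction * 100, 1))             # percent
    ```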

  4. Geodetic results from ISAGEX data. [for obtaining center of mass coordinates for geodetic camera sites

    NASA Technical Reports Server (NTRS)

    Marsh, J. G.; Douglas, B. C.; Walls, D. M.

    1974-01-01

    Laser and camera data taken during the International Satellite Geodesy Experiment (ISAGEX) were used in dynamical solutions to obtain center-of-mass coordinates for the Astro-Soviet camera sites at Helwan, Egypt, and Oulan Bator, Mongolia, as well as the East European camera sites at Potsdam, German Democratic Republic, and Ondrejov, Czechoslovakia. The results are accurate to about 20m in each coordinate. The orbit of PEOLE (i=15) was also determined from ISAGEX data. Mean Kepler elements suitable for geodynamic investigations are presented.

  5. Quantification of meat proportions by measuring DNA contents in raw and boiled sausages using matrix-adapted calibrators and multiplex real-time PCR.

    PubMed

    Köppel, René; Eugster, Albert; Ruf, Jürg; Rentsch, Jürg

    2012-01-01

    The quantification of meat proportions in raw and boiled sausage according to the recipe was evaluated using three different calibrators. To measure the DNA contents from beef, pork, sheep (mutton), and horse, a tetraplex real-time PCR method was applied. Nineteen laboratories analyzed four meat products each made of different proportions of beef, pork, sheep, and horse meat. Three kinds of calibrators were used: raw and boiled sausages of known proportions ranging from 1 to 55% of meat, and a dilution series of DNA from muscle tissue. In general, results generated using calibration sausages were more accurate than those resulting from the use of DNA from muscle tissue, and exhibited smaller measurement uncertainties. Although differences between uses of raw and boiled calibration sausages were small, the most precise and accurate results were obtained by calibration with fine-textured boiled reference sausages.

  6. Multiplex quantification of 12 European Union authorized genetically modified maize lines with droplet digital polymerase chain reaction.

    PubMed

    Dobnik, David; Spilsberg, Bjørn; Bogožalec Košir, Alexandra; Holst-Jensen, Arne; Žel, Jana

    2015-08-18

    Presence of genetically modified organisms (GMO) in food and feed products is regulated in many countries. The European Union (EU) has implemented a threshold for labeling of products containing more than 0.9% of authorized GMOs per ingredient. As the number of GMOs has increased over time, standard-curve based simplex quantitative polymerase chain reaction (qPCR) analyses are no longer sufficiently cost-effective, despite widespread use of initial PCR based screenings. Newly developed GMO detection methods, including multiplex methods, are mostly focused on screening and detection but not quantification. On the basis of droplet digital PCR (ddPCR) technology, multiplex assays for quantification of all 12 EU authorized GM maize lines (as of 1 April 2015) were developed. Because of high sequence similarity of some of the 12 GM targets, two separate multiplex assays were needed. In both assays (4-plex and 10-plex), the transgenes were labeled with one fluorescence reporter and the endogene with another (GMO concentration = transgene/endogene ratio). It was shown that both multiplex assays produce specific results and that performance parameters such as limit of quantification, repeatability, and trueness comply with international recommendations for GMO quantification methods. Moreover, for samples containing GMOs, the throughput and cost-effectiveness are significantly improved compared to qPCR. Thus, it was concluded that the multiplex ddPCR assays could be applied for routine quantification of 12 EU authorized GM maize lines. In case of new authorizations, the events can easily be added to the existing multiplex assays. The presented principle of quantitative multiplexing can be applied to any other domain.

  7. UV photoprocessing of CO2 ice: a complete quantification of photochemistry and photon-induced desorption processes

    NASA Astrophysics Data System (ADS)

    Martín-Doménech, R.; Manzano-Santamaría, J.; Muñoz Caro, G. M.; Cruz-Díaz, G. A.; Chen, Y.-J.; Herrero, V. J.; Tanarro, I.

    2015-12-01

    Context. Ice mantles that formed on top of dust grains are photoprocessed by the secondary ultraviolet (UV) field in cold and dense molecular clouds. UV photons induce photochemistry and desorption of ice molecules. Experimental simulations dedicated to ice analogs under astrophysically relevant conditions are needed to understand these processes. Aims: We present UV-irradiation experiments of a pure CO2 ice analog. Calibration of the quadrupole mass spectrometer allowed us to quantify the photodesorption of molecules to the gas phase. This information was added to the data provided by the Fourier transform infrared spectrometer on the solid phase to obtain a complete quantitative study of the UV photoprocessing of an ice analog. Methods: Experimental simulations were performed in an ultra-high vacuum chamber. Ice samples were deposited onto an infrared transparent window at 8K and were subsequently irradiated with a microwave-discharged hydrogen flow lamp. After irradiation, ice samples were warmed up until complete sublimation was attained. Results: Photolysis of CO2 molecules initiates a network of photon-induced chemical reactions leading to the formation of CO, CO3, O2, and O3. During irradiation, photon-induced desorption of CO and, to a lesser extent, O2 and CO2 took place through a process called indirect desorption induced by electronic transitions, with maximum photodesorption yields (Ypd) of ~1.2 × 10-2 molecules incident photon-1, ~9.3 × 10-4 molecules incident photon-1, and ~1.1 × 10-4 molecules incident photon-1, respectively. Conclusions: Calibration of mass spectrometers allows a direct quantification of photodesorption yields instead of the indirect values that were obtained from infrared spectra in most previous works. Supplementary information provided by infrared spectroscopy leads to a complete quantification, and therefore a better understanding, of the processes taking place in UV-irradiated ice mantles. Appendix A is available in

  8. Automated Quantification of Volumetric Optic Disc Swelling in Papilledema Using Spectral-Domain Optical Coherence Tomography

    PubMed Central

    Wang, Jui-Kai; Kardon, Randy H.; Kupersmith, Mark J.; Garvin, Mona K.

    2012-01-01

    Purpose. To develop an automated method for the quantification of volumetric optic disc swelling in papilledema subjects using spectral-domain optical coherence tomography (SD-OCT) and to determine the extent that such volumetric measurements correlate with Frisén scale grades (from fundus photographs) and two-dimensional (2-D) peripapillary retinal nerve fiber layer (RNFL) and total retinal (TR) thickness measurements from SD-OCT. Methods. A custom image-analysis algorithm was developed to obtain peripapillary circular RNFL thickness, TR thickness, and TR volume measurements from SD-OCT volumes of subjects with papilledema. In addition, peripapillary RNFL thickness measures from the commercially available Zeiss SD-OCT machine were obtained. Expert Frisén scale grades were independently obtained from corresponding fundus photographs. Results. In 71 SD-OCT scans, the mean (± standard deviation) resulting TR volumes for Frisén scale 0 to scale 4 were 11.36 ± 0.56, 12.53 ± 1.21, 14.42 ± 2.11, 17.48 ± 2.63, and 21.81 ± 3.16 mm3, respectively. The Spearman's rank correlation coefficient was 0.737. Using 55 eyes with valid Zeiss RNFL measurements, Pearson's correlation coefficient (r) between the TR volume and the custom algorithm's TR thickness, the custom algorithm's RNFL thickness, and Zeiss' RNFL thickness was 0.980, 0.929, and 0.946, respectively. Between Zeiss' RNFL and the custom algorithm's RNFL, and the study's TR thickness, r was 0.901 and 0.961, respectively. Conclusions. Volumetric measurements of the degree of disc swelling in subjects with papilledema can be obtained from SD-OCT volumes, with the mean volume appearing to be roughly linearly related to the Frisén scale grade. Using such an approach can provide a more continuous, objective, and robust means for assessing the degree of disc swelling compared with presently available approaches. PMID:22599584

  9. Multiscale recurrence quantification analysis of order recurrence plots

    NASA Astrophysics Data System (ADS)

    Xu, Mengjia; Shang, Pengjian; Lin, Aijing

    2017-03-01

    In this paper, we propose a new method of multiscale recurrence quantification analysis (MSRQA) to analyze the structure of order recurrence plots. The MSRQA is based on order patterns over a range of time scales. Compared with conventional recurrence quantification analysis (RQA), the MSRQA can show richer and more recognizable information on the local characteristics of diverse systems which successfully describes their recurrence properties. Both synthetic series and stock market indexes exhibit their properties of recurrence at large time scales that quite differ from those at a single time scale. Some systems present more accurate recurrence patterns under large time scales. It demonstrates that the new approach is effective for distinguishing three similar stock market systems and showing some inherent differences.
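
    A minimal, hedged sketch of an order recurrence plot at a single time scale: two time points recur when their ordinal (order) patterns match, and the recurrence rate is the fraction of recurrent pairs. Coarse-graining the series before this step (averaging over non-overlapping windows) would give the multiscale variant described above; the signal and parameters below are placeholders.

    ```python
    # Hedged sketch: order-pattern recurrence plot and recurrence rate for one
    # time scale. The input signal and embedding order are illustrative only.
    import numpy as np

    def ordinal_pattern(window):
        return tuple(np.argsort(window))          # rank order of the window values

    def order_recurrence_plot(x, m=3):
        patterns = [ordinal_pattern(x[i:i + m]) for i in range(len(x) - m + 1)]
        n = len(patterns)
        return np.array([[patterns[i] == patterns[j] for j in range(n)]
                         for i in range(n)], dtype=int)

    x = np.sin(np.linspace(0, 8 * np.pi, 200)) + 0.1 * np.random.randn(200)
    rp = order_recurrence_plot(x, m=3)
    print(f"recurrence rate (m=3): {rp.mean():.3f}")
    ```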

  10. Ratiometric Raman Spectroscopy for Quantification of Protein Oxidative Damage

    PubMed Central

    Jiang, Dongping; Yanney, Michael; Zou, Sige; Sygula, Andrzej

    2009-01-01

    A novel ratiometric Raman spectroscopic (RMRS) method has been developed for quantitative determination of protein carbonyl levels. Oxidized bovine serum albumin (BSA) and oxidized lysozyme were used as model proteins to demonstrate this method. The technique involves conjugation of protein carbonyls with dinitrophenyl hydrazine (DNPH), followed by drop coating deposition Raman spectral acquisition (DCDR). The RMRS method is easy to implement as it requires only one conjugation reaction, a single spectral acquisition, and does not require sample calibration. Characteristic peaks from both protein and DNPH moieties are obtained in a single spectral acquisition, allowing the protein carbonyl level to be calculated from the peak intensity ratio. Detection sensitivity for the RMRS method is ~0.33 pmol carbonyl/measurement. Fluorescence and/or immunoassay based techniques only detect a signal from the labeling molecule and thus yield no structural or quantitative information for the modified protein while the RMRS technique provides for protein identification and protein carbonyl quantification in a single experiment. PMID:19457432

  11. Quantification of polychlorinated dibenzo-p-dioxins and dibenzofurans by direct injection of sample extract into the comprehensive multidimensional gas chromatograph/high-resolution time-of-flight mass spectrometer.

    PubMed

    Shunji, Hashimoto; Yoshikatsu, Takazawa; Akihiro, Fushimi; Hiroyasu, Ito; Kiyoshi, Tanabe; Yasuyuki, Shibata; Masa-aki, Ubukata; Akihiko, Kusai; Kazuo, Tanaka; Hideyuki, Otsuka; Katsunori, Anezaki

    2008-01-18

    Polychlorinated dibenzo-p-dioxins and dibenzofurans in crude extracts of fly ash and flue gas from municipal waste incinerators were quantified using a comprehensive multidimensional gas chromatograph (GC x GC) coupled to a high-resolution time-of-flight mass spectrometer (HR-TOFMS). For identification and quantification, we developed our own program to prepare 3D chromatograms of selected mass numbers from the data of the GC x GC/HR-TOFMS. Isolation of all congeners with a TCDD toxic equivalency factor from the other isomers by only one injection was confirmed. The instrumental detection limit of TCDD on the GC x GC/HR-TOFMS was 0.9 pg by the relative calibration method. Quantification of these substances in the crude extracts was achieved by direct injection to the GC x GC/HR-TOFMS. The results agree with the values obtained using a generic gas chromatography/high-resolution mass spectrometry (GC/HRMS) system. It was confirmed that measurement by high-resolution TOFMS and GC x GC effectively reduces interference from other chemicals.

  12. Superposition Quantification

    NASA Astrophysics Data System (ADS)

    Chang, Li-Na; Luo, Shun-Long; Sun, Yuan

    2017-11-01

    The principle of superposition is universal and lies at the heart of quantum theory. Although ever since the inception of quantum mechanics a century ago, superposition has occupied a central and pivotal place, rigorous and systematic studies of the quantification issue have attracted significant interests only in recent years, and many related problems remain to be investigated. In this work we introduce a figure of merit which quantifies superposition from an intuitive and direct perspective, investigate its fundamental properties, connect it to some coherence measures, illustrate it through several examples, and apply it to analyze wave-particle duality. Supported by Science Challenge Project under Grant No. TZ2016002, Laboratory of Computational Physics, Institute of Applied Physics and Computational Mathematics, Beijing, Key Laboratory of Random Complex Structures and Data Science, Chinese Academy of Sciences, Grant under No. 2008DP173182

  13. Critical assessment of digital PCR for the detection and quantification of genetically modified organisms.

    PubMed

    Demeke, Tigst; Dobnik, David

    2018-07-01

    The number of genetically modified organisms (GMOs) on the market is steadily increasing. Because of regulation of cultivation and trade of GMOs in several countries, there is pressure for their accurate detection and quantification. Today, DNA-based approaches are more popular for this purpose than protein-based methods, and real-time quantitative PCR (qPCR) is still the gold standard in GMO analytics. However, digital PCR (dPCR) offers several advantages over qPCR, making this new technique appealing also for GMO analysis. This critical review focuses on the use of dPCR for the purpose of GMO quantification and addresses parameters which are important for achieving accurate and reliable results, such as the quality and purity of DNA and reaction optimization. Three critical factors are explored and discussed in more depth: correct classification of partitions as positive, correctly determined partition volume, and dilution factor. This review could serve as a guide for all laboratories implementing dPCR. Most of the parameters discussed are applicable to fields other than purely GMO testing. Graphical abstract There are generally three different options for absolute quantification of genetically modified organisms (GMOs) using digital PCR: droplet- or chamber-based and droplets in chambers. All have in common the distribution of reaction mixture into several partitions, which are all subjected to PCR and scored at the end-point as positive or negative. Based on these results GMO content can be calculated.
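    The partition statistics discussed above reduce to a short calculation. The sketch below applies the standard Poisson correction used in digital PCR and scales by partition volume and dilution factor; all counts, volumes and the transgene/reference pairing are hypothetical illustrations, not data from this review.

      import math

      def dpcr_copies_per_ul(positive, total, partition_volume_ul, dilution_factor=1.0):
          """Estimate target concentration from digital PCR partition counts.

          Uses the Poisson correction lambda = -ln(1 - p), where p is the fraction
          of positive partitions, then scales by partition volume and any pre-PCR
          dilution. All numeric inputs in the example are illustrative.
          """
          p = positive / total                      # fraction of positive partitions
          lam = -math.log(1.0 - p)                  # mean target copies per partition
          return lam / partition_volume_ul * dilution_factor

      # Hypothetical droplet run: 0.85 nL droplets, sample diluted 10-fold before partitioning
      gmo = dpcr_copies_per_ul(3000, 18000, 8.5e-4, dilution_factor=10)   # transgene assay
      ref = dpcr_copies_per_ul(12000, 18000, 8.5e-4, dilution_factor=10)  # reference-gene assay
      print(f"transgene: {gmo:.0f} copies/uL, reference gene: {ref:.0f} copies/uL")
      print(f"GMO content (copy-number ratio): {100 * gmo / ref:.1f}%")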

  14. Quantification of 18F-fluorocholine kinetics in patients with prostate cancer.

    PubMed

    Verwer, Eline E; Oprea-Lager, Daniela E; van den Eertwegh, Alfons J M; van Moorselaar, Reindert J A; Windhorst, Albert D; Schwarte, Lothar A; Hendrikse, N Harry; Schuit, Robert C; Hoekstra, Otto S; Lammertsma, Adriaan A; Boellaard, Ronald

    2015-03-01

    Choline kinase is upregulated in prostate cancer, resulting in increased (18)F-fluoromethylcholine uptake. This study used pharmacokinetic modeling to validate the use of simplified methods for quantification of (18)F-fluoromethylcholine uptake in a routine clinical setting. Forty-minute dynamic PET/CT scans were acquired after injection of 204 ± 9 MBq of (18)F-fluoromethylcholine, from 8 patients with histologically proven metastasized prostate cancer. Plasma input functions were obtained using continuous arterial blood-sampling as well as using image-derived methods. Manual arterial blood samples were used for calibration and correction for plasma-to-blood ratio and metabolites. Time-activity curves were derived from volumes of interest in all visually detectable lymph node metastases. (18)F-fluoromethylcholine kinetics were studied by nonlinear regression fitting of several single- and 2-tissue plasma input models to the time-activity curves. Model selection was based on the Akaike information criterion and measures of robustness. In addition, the performance of several simplified methods, such as standardized uptake value (SUV), was assessed. Best fits were obtained using an irreversible compartment model with blood volume parameter. Parent fractions were 0.12 ± 0.4 after 20 min, necessitating individual metabolite corrections. Correspondence between venous and arterial parent fractions was low as determined by the intraclass correlation coefficient (0.61). Results for image-derived input functions that were obtained from volumes of interest in blood-pool structures distant from tissues of high (18)F-fluoromethylcholine uptake yielded good correlation to those for the blood-sampling input functions (R(2) = 0.83). SUV showed poor correlation to parameters derived from full quantitative kinetic analysis (R(2) < 0.34). In contrast, lesion activity concentration normalized to the integral of the blood activity concentration over time (SUVAUC) showed good
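    The simplified measures compared in this study can be written compactly: SUV normalizes lesion activity to injected dose per body weight, and SUVAUC normalizes it to the time integral of the blood activity concentration. The sketch below encodes only these generic definitions; the input numbers and the toy blood curve are assumptions for illustration.

      import numpy as np

      def suv(lesion_kbq_per_ml, injected_dose_mbq, body_weight_kg):
          """Body-weight SUV: tissue activity divided by injected dose per gram of body weight."""
          dose_kbq = injected_dose_mbq * 1e3
          return lesion_kbq_per_ml / (dose_kbq / (body_weight_kg * 1e3))  # 1 g ~ 1 ml assumed

      def suv_auc(lesion_kbq_per_ml, t_min, blood_kbq_per_ml):
          """Lesion activity normalized to the integral of blood activity over time."""
          auc = np.trapz(blood_kbq_per_ml, t_min)          # kBq*min/ml
          return lesion_kbq_per_ml / auc

      # Illustrative inputs: 40-min toy blood curve, 204 MBq injected, 80 kg patient
      t = np.linspace(0, 40, 41)                           # minutes
      blood = 50 * np.exp(-0.15 * t) + 5                   # kBq/ml (made-up input function)
      print(f"SUV     = {suv(12.0, 204, 80):.2f}")
      print(f"SUV_AUC = {suv_auc(12.0, t, blood):.4f} 1/min")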

  15. Model Uncertainty Quantification Methods In Data Assimilation

    NASA Astrophysics Data System (ADS)

    Pathiraja, S. D.; Marshall, L. A.; Sharma, A.; Moradkhani, H.

    2017-12-01

    Data Assimilation involves utilising observations to improve model predictions in a seamless and statistically optimal fashion. Its applications are wide-ranging; from improving weather forecasts to tracking targets such as in the Apollo 11 mission. The use of Data Assimilation methods in high dimensional complex geophysical systems is an active area of research, where there exists many opportunities to enhance existing methodologies. One of the central challenges is in model uncertainty quantification; the outcome of any Data Assimilation study is strongly dependent on the uncertainties assigned to both observations and models. I focus on developing improved model uncertainty quantification methods that are applicable to challenging real world scenarios. These include developing methods for cases where the system states are only partially observed, where there is little prior knowledge of the model errors, and where the model error statistics are likely to be highly non-Gaussian.
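    As a concrete anchor for the model-uncertainty discussion above, the sketch below implements a minimal stochastic ensemble Kalman filter update for a linear observation operator. It is a generic textbook data assimilation building block, not the methods developed in this work; all dimensions and numbers are illustrative.

      import numpy as np

      def enkf_update(ensemble, obs, obs_operator, obs_error_std, rng):
          """Stochastic EnKF analysis step.

          ensemble: (n_members, n_state) forecast ensemble
          obs: (n_obs,) observation vector
          obs_operator: (n_obs, n_state) linear observation operator H
          obs_error_std: observation error standard deviation (iid)
          """
          n_members = ensemble.shape[0]
          Hx = ensemble @ obs_operator.T                      # predicted observations
          X = ensemble - ensemble.mean(axis=0)                # state anomalies
          Y = Hx - Hx.mean(axis=0)                            # observation-space anomalies
          P_xy = X.T @ Y / (n_members - 1)
          P_yy = Y.T @ Y / (n_members - 1) + obs_error_std**2 * np.eye(len(obs))
          K = P_xy @ np.linalg.inv(P_yy)                      # Kalman gain
          perturbed_obs = obs + rng.normal(0, obs_error_std, size=(n_members, len(obs)))
          return ensemble + (perturbed_obs - Hx) @ K.T

      rng = np.random.default_rng(0)
      prior = rng.normal(1.0, 0.5, size=(100, 3))             # 100 members, 3 state variables
      H = np.array([[1.0, 0.0, 0.0]])                          # observe only the first variable
      posterior = enkf_update(prior, obs=np.array([1.4]), obs_operator=H,
                              obs_error_std=0.2, rng=rng)
      print(prior[:, 0].mean(), posterior[:, 0].mean())        # mean moves toward the observation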

  16. Real-time quantitative PCR for retrovirus-like particle quantification in CHO cell culture.

    PubMed

    de Wit, C; Fautz, C; Xu, Y

    2000-09-01

    Chinese hamster ovary (CHO) cells have been widely used to manufacture recombinant proteins intended for human therapeutic uses. Retrovirus-like particles, which are apparently defective and non-infectious, have been detected in all CHO cells by electron microscopy (EM). To assure viral safety of CHO cell-derived biologicals, quantification of retrovirus-like particles in production cell culture and demonstration of sufficient elimination of such retrovirus-like particles by the downstream purification process are required for product market registration worldwide. EM, with a detection limit of 1x10(6) particles/ml, is the standard retrovirus-like particle quantification method. The whole process, which requires a large amount of sample (3-6 litres), is labour intensive, time-consuming, expensive, and subject to significant assay variability. In this paper, a novel real-time quantitative PCR assay (TaqMan assay) has been developed for the quantification of retrovirus-like particles. Each retrovirus particle contains two copies of the viral genomic particle RNA (pRNA) molecule. Therefore, quantification of retrovirus particles can be achieved by quantifying the pRNA copy number, i.e. every two copies of retroviral pRNA are equivalent to one retrovirus-like particle. The TaqMan assay takes advantage of the 5'-->3' exonuclease activity of Taq DNA polymerase and utilizes the PRISM 7700 Sequence Detection System of PE Applied Biosystems (Foster City, CA, U.S.A.) for automated pRNA quantification through a dual-labelled fluorogenic probe. The TaqMan quantification technique is highly comparable to the EM analysis. In addition, it offers significant advantages over the EM analysis, such as a higher sensitivity of less than 600 particles/ml, greater accuracy and reliability, higher sample throughput, more flexibility and lower cost. Therefore, the TaqMan assay should be used as a substitute for EM analysis for retrovirus-like particle quantification in CHO cell
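    The pRNA-to-particle conversion described above is straightforward to encode. In the sketch below, a Ct value is converted to pRNA copies through a hypothetical log-linear standard curve and halved to give particles; the slope, intercept, Ct and reaction volume are illustrative assumptions, not assay constants from this paper.

      def prna_copies_from_ct(ct, slope=-3.32, intercept=38.0):
          """Interpolate pRNA copy number from a log-linear qPCR standard curve.

          Assumes Ct = slope * log10(copies) + intercept; slope and intercept
          here are hypothetical placeholders, not values from the assay.
          """
          return 10 ** ((ct - intercept) / slope)

      def particles_per_ml(ct, reaction_volume_ml):
          """Each retrovirus-like particle carries two pRNA copies."""
          copies = prna_copies_from_ct(ct)
          return copies / 2.0 / reaction_volume_ml

      # Hypothetical measurement: Ct of 27.5 from 0.005 ml of processed culture fluid
      print(f"{particles_per_ml(27.5, 0.005):.2e} particles/ml")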

  17. qRT-PCR quantification of the biological control agent Trichoderma harzianum in peat and compost-based growing media.

    PubMed

    Beaulieu, Robert; López-Mondéjar, Rubén; Tittarelli, Fabio; Ros, Margarita; Pascual, José Antonio

    2011-02-01

    To ensure proper use of Trichoderma harzianum in agriculture, accurate data must be obtained in population monitoring. The effectiveness of qRT-PCR to quantify T. harzianum in different growing media was compared to the commonly used techniques of colony counting and qPCR. Results showed that plate counting and qPCR offered similar T. harzianum quantification patterns of an initial rapid increase in fungal population that decreased over time. However, data from qRT-PCR showed a population curve of active T. harzianum with a delayed onset of initial growth which then increased throughout the experiment. Results demonstrated that T. harzianum can successfully grow in these media and that qRT-PCR can offer a more distinct representation of active T. harzianum populations. Additionally, compost amended with T. harzianum exhibited a lower Fusarium oxysporum infection rate (67%) and lower percentage of fresh weight loss (11%) in comparison to amended peat (90% infection rate, 23% fresh weight loss). Copyright © 2010 Elsevier Ltd. All rights reserved.

  18. Person-generated Data in Self-quantification. A Health Informatics Research Program.

    PubMed

    Gray, Kathleen; Martin-Sanchez, Fernando J; Lopez-Campos, Guillermo H; Almalki, Manal; Merolli, Mark

    2017-01-09

    The availability of internet-connected mobile, wearable and ambient consumer technologies, direct-to-consumer e-services and peer-to-peer social media sites far outstrips evidence about the efficiency, effectiveness and efficacy of using them in healthcare applications. The aim of this paper is to describe one approach to build a program of health informatics research, so as to generate rich and robust evidence about health data and information processing in self-quantification and associated healthcare and health outcomes. The paper summarises relevant health informatics research approaches in the literature and presents an example of developing a program of research in the Health and Biomedical Informatics Centre (HaBIC) at the University of Melbourne. The paper describes this program in terms of research infrastructure, conceptual models, research design, research reporting and knowledge sharing. The paper identifies key outcomes from integrative and multiple-angle approaches to investigating the management of information and data generated by use of this Centre's collection of wearable, mobiles and other devices in health self-monitoring experiments. These research results offer lessons for consumers, developers, clinical practitioners and biomedical and health informatics researchers. Health informatics is increasingly called upon to make sense of emerging self-quantification and other digital health phenomena that are well beyond the conventions of healthcare in which the field of informatics originated and consolidated. To make a substantial contribution to optimise the aims, processes and outcomes of health self-quantification needs further work at scale in multi-centre collaborations for this Centre and for health informatics researchers generally.

  19. High-performance Thin-layer Chromatographic-densitometric Quantification and Recovery of Bioactive Compounds for Identification of Elite Chemotypes of Gloriosa superba L. Collected from Sikkim Himalayas (India)

    PubMed Central

    Misra, Ankita; Shukla, Pushpendra Kumar; Kumar, Bhanu; Chand, Jai; Kushwaha, Poonam; Khalid, Md.; Singh Rawat, Ajay Kumar; Srivastava, Sharad

    2017-01-01

    Background: Gloriosa superba L. (Colchicaceae) is used as adjuvant therapy in gout for its potential antimitotic activity due to its high colchicine alkaloid content. Objective: This study aimed to develop an easy, cheap, precise, and accurate high-performance thin-layer chromatographic (HPTLC) validated method for simultaneous quantification of bioactive alkaloids (colchicine and gloriosine) in G. superba L. and to identify its elite chemotype(s) from the Sikkim Himalayas (India). Methods: The HPTLC chromatographic method was developed using a mobile phase of chloroform: acetone: diethyl amine (5:4:1) at λmax of 350 nm. Results: Five germplasms were collected from the targeted region, and on morpho-anatomical inspection, no significant variation was observed among them. Quantification data reveal that the content of colchicine (Rf: 0.72) ranges from 0.035% to 0.150% and that of gloriosine (Rf: 0.61) from 0.006% to 0.032% (dry wt. basis). Linearity of the method was obtained in the concentration range of 100–400 ng/spot of marker(s), exhibiting regression coefficients of 0.9987 (colchicine) and 0.9983 (gloriosine) with optimum recoveries of 97.79% ± 3.86% and 100.023% ± 0.01%, respectively. The limits of detection and quantification were 6.245 and 18.926 ng for colchicine and 8.024 and 24.316 ng for gloriosine, respectively. Two germplasms, namely NBG-27 and NBG-26, were found to be elite chemotypes for both the markers. Conclusion: The developed method is validated in terms of accuracy, recovery, and precision studies as per the ICH guidelines (2005) and can be adopted for the simultaneous quantification of colchicine and gloriosine in phytopharmaceuticals. In addition, this study is relevant to explore the chemotypic variability in metabolite content for commercial and medicinal purposes. SUMMARY An easy, cheap, precise, and accurate high performance thin layer chromatographic (HPTLC) validated method for simultaneous quantification of bioactive alkaloids (colchicine and gloriosine) in G. superba L. Five germplasms were
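    LOD and LOQ values of this kind are typically obtained with the ICH approach of scaling the calibration residual standard deviation by the slope (3.3x for LOD, 10x for LOQ). A minimal sketch with invented densitometric peak-area data, not the paper's calibration:

      import numpy as np

      def lod_loq_from_calibration(conc_ng, response):
          """ICH-style LOD/LOQ: 3.3*sigma/S and 10*sigma/S from a linear calibration."""
          slope, intercept = np.polyfit(conc_ng, response, 1)
          residuals = response - (slope * conc_ng + intercept)
          sigma = residuals.std(ddof=2)          # residual SD of the regression
          return 3.3 * sigma / slope, 10 * sigma / slope

      # Hypothetical HPTLC calibration: 100-400 ng/spot vs densitometric peak area
      conc = np.array([100, 150, 200, 250, 300, 350, 400], dtype=float)
      area = np.array([1010, 1490, 2030, 2480, 3050, 3490, 4010], dtype=float)
      lod, loq = lod_loq_from_calibration(conc, area)
      print(f"LOD = {lod:.1f} ng/spot, LOQ = {loq:.1f} ng/spot")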

  20. Main results and experience obtained on Mir space station and experiment program for Russian segment of ISS.

    PubMed

    Utkin, V F; Lukjashchenko, V I; Borisov, V V; Suvorov, V V; Tsymbalyuk, M M

    2003-07-01

    This article presents the main scientific and practical results obtained in the course of scientific and applied research and experiments on the Mir space station. Based on the Mir experience, the processes of research program formation for the Russian Segment of the ISS are briefly described. The major trends of activities planned within these programs, as well as preliminary results of increment research program implementation in the first ISS missions, are also presented. Copyright © 2003 Elsevier Science Ltd. All rights reserved.

  1. Identification and quantification of VOCs by proton transfer reaction time of flight mass spectrometry: An experimental workflow for the optimization of specificity, sensitivity, and accuracy

    PubMed Central

    Hanna, George B.

    2018-01-01

    Abstract Proton transfer reaction time of flight mass spectrometry (PTR‐ToF‐MS) is a direct injection MS technique, allowing for the sensitive and real‐time detection, identification, and quantification of volatile organic compounds. When aiming to employ PTR‐ToF‐MS for targeted volatile organic compound analysis, some methodological questions must be addressed, such as the need to correctly identify product ions and to evaluate quantitation accuracy. This work proposes a workflow for PTR‐ToF‐MS method development, addressing the main issues affecting the reliable identification and quantification of target compounds. We determined the fragmentation patterns of 13 selected compounds (aldehydes, fatty acids, phenols). Experiments were conducted under breath‐relevant conditions (100% humid air), and within an extended range of reduced electric field values (E/N = 48–144 Td), obtained by changing drift tube voltage. Reactivity was inspected using H3O+, NO+, and O2+ as primary ions. The results show that a relatively low (<90 Td) E/N often permits reduced fragmentation, enhancing sensitivity and identification capabilities, particularly in the case of aldehydes using NO+, where a 4‐fold increase in sensitivity is obtained by means of drift voltage reduction. We developed a novel calibration methodology, relying on diffusion tubes used as gravimetric standards. For each of the tested compounds, it was possible to define suitable conditions whereby the experimental error, defined as the difference between gravimetric measurements and calculated concentrations, was 8% or lower. PMID:29336521
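    Gravimetric standards such as diffusion tubes yield a reference gas-phase concentration from the measured mass-loss rate and the dilution flow. The conversion below uses the common ideal-gas form (24.45 L/mol at 25 °C and 1 atm); the compound, emission rate and flow are illustrative assumptions, not values from this study.

      def mixing_ratio_ppb(mass_loss_ng_per_min, molar_mass_g_per_mol,
                           dilution_flow_ml_per_min, molar_volume_l=24.45):
          """Convert a gravimetric emission rate into a volume mixing ratio (ppbV)."""
          mol_per_min = mass_loss_ng_per_min * 1e-9 / molar_mass_g_per_mol
          analyte_ml_per_min = mol_per_min * molar_volume_l * 1e3
          return analyte_ml_per_min / dilution_flow_ml_per_min * 1e9

      # Hypothetical hexanal tube (M = 100.16 g/mol) losing 250 ng/min into 2 L/min of air
      print(f"{mixing_ratio_ppb(250, 100.16, 2000):.1f} ppbV")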

  2. Matrix suppression as a guideline for reliable quantification of peptides by matrix-assisted laser desorption ionization.

    PubMed

    Ahn, Sung Hee; Bae, Yong Jin; Moon, Jeong Hee; Kim, Myung Soo

    2013-09-17

    We propose to divide matrix suppression in matrix-assisted laser desorption ionization into two parts, normal and anomalous. In quantification of peptides, the normal effect can be accounted for by constructing the calibration curve in the form of peptide-to-matrix ion abundance ratio versus concentration. The anomalous effect forbids reliable quantification and is noticeable when matrix suppression is larger than 70%. With this 70% rule, matrix suppression becomes a guideline for reliable quantification, rather than a nuisance. A peptide in a complex mixture can be quantified even in the presence of large amounts of contaminants, as long as matrix suppression is below 70%. The theoretical basis for the quantification method using a peptide as an internal standard is presented together with its weaknesses. A systematic method to improve quantification of high concentration analytes has also been developed.
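    The workflow implied by the abstract, a calibration of the peptide-to-matrix ion abundance ratio gated by the 70% suppression rule, can be sketched as follows. The suppression value is treated as an already-measured input, and every number is invented for illustration.

      import numpy as np

      def calibrate_ratio(concentrations, peptide_counts, matrix_counts):
          """Fit a calibration of peptide-to-matrix ion abundance ratio vs concentration."""
          ratios = np.asarray(peptide_counts) / np.asarray(matrix_counts)
          slope, intercept = np.polyfit(concentrations, ratios, 1)
          return slope, intercept

      def quantify(ratio, slope, intercept, matrix_suppression):
          """Apply the 70% rule before reading concentration off the calibration."""
          if matrix_suppression > 0.70:
              raise ValueError("Matrix suppression above 70%: quantification unreliable")
          return (ratio - intercept) / slope

      # Hypothetical calibration (pmol) and measurement
      conc = [0.5, 1.0, 2.0, 4.0]
      slope, intercept = calibrate_ratio(conc, [120, 260, 500, 980], [1000, 990, 1005, 995])
      print(f"{quantify(0.37, slope, intercept, matrix_suppression=0.55):.2f} pmol")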

  3. Detection and quantification of beef and pork materials in meat products by duplex droplet digital PCR.

    PubMed

    Cai, Yicun; He, Yuping; Lv, Rong; Chen, Hongchao; Wang, Qiang; Pan, Liangwen

    2017-01-01

    Meat products often consist of meat from multiple animal species, and food product adulteration and inaccurate labeling can negatively affect consumers. Therefore, a cost-effective and reliable method for identification and quantification of animal species in meat products is required. In this study, we developed a duplex droplet digital PCR (dddPCR) detection and quantification system to simultaneously identify and quantify the source of meat in samples containing a mixture of beef (Bos taurus) and pork (Sus scrofa) in a single digital PCR reaction tube. Mixed meat samples of known composition were used to test the accuracy and applicability of this method. The limit of detection (LOD) and the limit of quantification (LOQ) of this detection and quantification system were also determined. We conclude that our dddPCR detection and quantification system is suitable for quality control and routine analyses of meat products.

  4. Mathematical and Computational Foundations of Recurrence Quantifications

    NASA Astrophysics Data System (ADS)

    Marwan, Norbert; Webber, Charles L.

    Real-world systems possess deterministic trajectories, phase singularities and noise. Dynamic trajectories have been studied in temporal and frequency domains, but these are linear approaches. Basic to the field of nonlinear dynamics is the representation of trajectories in phase space. A variety of nonlinear tools such as the Lyapunov exponent, Kolmogorov-Sinai entropy, correlation dimension, etc. have successfully characterized trajectories in phase space, provided the systems studied were stationary in time. Ubiquitous in nature, however, are systems that are nonlinear and nonstationary, existing in noisy environments all of which are assumption breaking to otherwise powerful linear tools. What has been unfolding over the last quarter of a century, however, is the timely discovery and practical demonstration that the recurrences of system trajectories in phase space can provide important clues to the system designs from which they derive. In this chapter we will introduce the basics of recurrence plots (RP) and their quantification analysis (RQA). We will begin by summarizing the concept of phase space reconstructions. Then we will provide the mathematical underpinnings of recurrence plots followed by the details of recurrence quantifications. Finally, we will discuss computational approaches that have been implemented to make recurrence strategies feasible and useful. As computers become faster and computer languages advance, younger generations of researchers will be stimulated and encouraged to capture nonlinear recurrence patterns and quantification in even better formats. This particular branch of nonlinear dynamics remains wide open for the definition of new recurrence variables and new applications untouched to date.
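    As a companion to this outline, the sketch below builds a recurrence matrix from a time-delay embedding and computes two standard RQA measures, recurrence rate (RR) and determinism (DET). The embedding dimension, delay, threshold and test signal are arbitrary illustrative choices.

      import numpy as np

      def embed(x, dim=3, delay=2):
          """Time-delay embedding of a 1-D series into phase space."""
          n = len(x) - (dim - 1) * delay
          return np.column_stack([x[i * delay:i * delay + n] for i in range(dim)])

      def recurrence_matrix(x, dim=3, delay=2, eps=0.2):
          pts = embed(np.asarray(x, float), dim, delay)
          dists = np.linalg.norm(pts[:, None, :] - pts[None, :, :], axis=-1)
          return (dists <= eps).astype(int)

      def rqa(R, lmin=2):
          """Recurrence rate and determinism from a recurrence matrix."""
          rr = R.mean()                          # RR (includes the line of identity in this sketch)
          diag_points, det_points = 0, 0
          n = R.shape[0]
          for k in range(-(n - 1), n):
              if k == 0:
                  continue                       # skip the line of identity
              d = np.diagonal(R, k)
              runs = np.split(d, np.where(d == 0)[0])       # runs of consecutive recurrences
              lengths = [r.sum() for r in runs if r.sum() > 0]
              diag_points += sum(lengths)
              det_points += sum(L for L in lengths if L >= lmin)
          det = det_points / diag_points if diag_points else 0.0
          return rr, det

      x = np.sin(np.linspace(0, 12 * np.pi, 400)) + 0.05 * np.random.default_rng(1).normal(size=400)
      rr, det = rqa(recurrence_matrix(x))
      print(f"RR = {rr:.3f}, DET = {det:.3f}")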

  5. TRAP: automated classification, quantification and annotation of tandemly repeated sequences.

    PubMed

    Sobreira, Tiago José P; Durham, Alan M; Gruber, Arthur

    2006-02-01

    TRAP, the Tandem Repeats Analysis Program, is a Perl program that provides a unified set of analyses for the selection, classification, quantification and automated annotation of tandemly repeated sequences. TRAP uses the results of the Tandem Repeats Finder program to perform a global analysis of the satellite content of DNA sequences, permitting researchers to easily assess the tandem repeat content for both individual sequences and whole genomes. The results can be generated in convenient formats such as HTML and comma-separated values. TRAP can also be used to automatically generate annotation data in the format of feature table and GFF files.

  6. Experimental rill erosion research vs. model concepts - quantification of the hydraulic and erosional efficiency of rills

    NASA Astrophysics Data System (ADS)

    Wirtz, Stefan

    2014-05-01

    In soil erosion research, rills are believed to be one of the most efficient erosion forms. They act as preferential flow paths for overland flow and hence become the most efficient sediment sources in a catchment. However, their fraction of the overall detachment in a given area, compared to other soil erosion processes, is contentious. A prerequisite for addressing this subject is the standardization of the measurement methods used for rill erosion quantification. Only by using a standardized method do the results of different studies become comparable and can they be synthesized into one overall statement. In rill erosion research, such a standardized field method has been missing until now. Hence, the first aim of this study is to present an experimental setup that enables us to obtain comparable data about process dynamics in eroding rills under standardized conditions in the field. Using this rill experiment, the runoff efficiency of rills (second aim) and the fraction of rill erosion in total soil loss (third aim) in a catchment are quantified. The erosion rate [g m-2] in the rills is between twenty and sixty times higher than on the interrill areas, and the specific discharge [L s-1 m-2] in the rills is about 2000 times higher. The identification and quantification of different rill erosion processes are the fourth aim within this project. Gravitative processes like side wall failure and headcut and knickpoint retreat provide up to 94% of the detached sediment quantity. In soil erosion models, only the incision into the rill's bottom is considered; hence, the modelled results are unsatisfactory. Due to the low quality of soil erosion model results, the fifth aim of the study is to review two basic physical assumptions using the rill experiments. In contrast to the model assumptions, there is no clear linear correlation between any hydraulic parameter and the detachment rate, and the transport rate is capable of exceeding the transport capacity. In conclusion, the results clearly

  7. An international collaboration to standardize HIV-2 viral load assays: results from the 2009 ACHI(E)V(2E) quality control study.

    PubMed

    Damond, F; Benard, A; Balotta, Claudia; Böni, Jürg; Cotten, Matthew; Duque, Vitor; Ferns, Bridget; Garson, Jeremy; Gomes, Perpetua; Gonçalves, Fátima; Gottlieb, Geoffrey; Kupfer, Bernd; Ruelle, Jean; Rodes, Berta; Soriano, Vicente; Wainberg, Mark; Taieb, Audrey; Matheron, Sophie; Chene, Genevieve; Brun-Vezinet, Francoise

    2011-10-01

    Accurate HIV-2 plasma viral load quantification is crucial for adequate HIV-2 patient management and for the proper conduct of clinical trials and international cohort collaborations. This study compared the homogeneity of HIV-2 RNA quantification when using HIV-2 assays from ACHI(E)V(2E) study sites and either in-house PCR calibration standards or common viral load standards supplied to all collaborators. Each of the 12 participating laboratories quantified blinded HIV-2 samples, using its own HIV-2 viral load assay and standard as well as centrally validated and distributed common HIV-2 group A and B standards (http://www.hiv.lanl.gov/content/sequence/HelpDocs/subtypes-more.html). Aliquots of HIV-2 group A and B strains, each at 2 theoretical concentrations (2.7 and 3.7 log(10) copies/ml), were tested. Intralaboratory, interlaboratory, and overall variances of quantification results obtained with both standards were compared using F tests. For HIV-2 group A quantifications, overall and interlaboratory and/or intralaboratory variances were significantly lower when using the common standard than when using in-house standards at the concentration levels of 2.7 log(10) copies/ml and 3.7 log(10) copies/ml, respectively. For HIV-2 group B, a high heterogeneity was observed and the variances did not differ according to the type of standard used. In this international collaboration, the use of a common standard improved the homogeneity of HIV-2 group A RNA quantification only. The diversity of HIV-2 group B, particularly in PCR primer-binding regions, may explain the heterogeneity in quantification of this strain. Development of a validated HIV-2 viral load assay that accurately quantifies distinct circulating strains is needed.

  8. Quantification of vitamin B6 vitamers in human cerebrospinal fluid by ultra performance liquid chromatography-tandem mass spectrometry.

    PubMed

    van der Ham, M; Albersen, M; de Koning, T J; Visser, G; Middendorp, A; Bosma, M; Verhoeven-Duif, N M; de Sain-van der Velden, M G M

    2012-01-27

    Since vitamin B6 is essential for normal functioning of the central nervous system, there is growing need for sensitive analysis of B6 vitamers in cerebrospinal fluid (CSF). This manuscript describes the development and validation of a rapid, sensitive and accurate method for quantification of the vitamin B6 vitamers pyridoxal (PL), pyridoxamine (PM), pyridoxine (PN), pyridoxic acid (PA), pyridoxal 5'-phosphate (PLP), pyridoxamine 5'-phosphate (PMP) and pyridoxine 5'-phosphate (PNP) in human CSF. The method is based on ultra performance liquid chromatography-tandem mass spectrometry (UPLC-MS/MS) with a simple sample preparation procedure of protein precipitation using 50 g L(-1) trichloroacetic acid containing stable isotope labeled internal standards: PL-D(3) for PL and PM, PN-(13)C(4) for PN, PA-D(2) for PA and PLP-D(3) for the phosphorylated vitamers. B6 vitamers were separated (Acquity HSS-T3 UPLC column) with a buffer containing acetic acid, heptafluorobutyric acid and acetonitrile. Positive electrospray ionization was used to monitor transitions m/z 168.1→150.1 (PL), 169.1→134.1 (PM), 170.1→134.1 (PN), 184.1→148.1 (PA), 248.1→150.1 (PLP), 249.1→232.1 (PMP) and 250.1→134.1 (PNP). The method was validated at three concentration levels for each B6 vitamer in CSF. Recoveries of the internal standards were between 93% and 96%. Intra- and inter-assay variations were below 20%. Accuracy tests showed deviations from 3% (PN) to 39% (PMP). Limits of quantification were in the range of 0.03-5.37 nM. Poor results were obtained for quantification of PNP. The method was applied to CSF samples of 20 subjects and two patients on pyridoxine supplementation. Using minimal CSF volumes this method is suitable for implementation in a routine diagnostic setting. Copyright © 2011 Elsevier B.V. All rights reserved.

  9. 25 CFR 162.539 - Must I obtain a WEEL before obtaining a WSR lease?

    Code of Federal Regulations, 2014 CFR

    2014-04-01

    ... AND PERMITS Wind and Solar Resource Leases Wsr Leases § 162.539 Must I obtain a WEEL before obtaining... direct result of energy resource information gathered from a WEEL activity, obtaining a WEEL is not a...

  10. 25 CFR 162.539 - Must I obtain a WEEL before obtaining a WSR lease?

    Code of Federal Regulations, 2013 CFR

    2013-04-01

    ... AND PERMITS Wind and Solar Resource Leases Wsr Leases § 162.539 Must I obtain a WEEL before obtaining... direct result of energy resource information gathered from a WEEL activity, obtaining a WEEL is not a...

  11. Quantification of shape and cell polarity reveals a novel mechanism underlying malformations resulting from related FGF mutations during facial morphogenesis

    PubMed Central

    Li, Xin; Young, Nathan M.; Tropp, Stephen; Hu, Diane; Xu, Yanhua; Hallgrímsson, Benedikt; Marcucio, Ralph S.

    2013-01-01

    Fibroblast growth factor (FGF) signaling mutations are a frequent contributor to craniofacial malformations including midfacial anomalies and craniosynostosis. FGF signaling has been shown to control cellular mechanisms that contribute to facial morphogenesis and growth such as proliferation, survival, migration and differentiation. We hypothesized that FGF signaling not only controls the magnitude of growth during facial morphogenesis but also regulates the direction of growth via cell polarity. To test this idea, we infected migrating neural crest cells of chicken embryos with  replication-competent avian sarcoma virus expressing either FgfR2C278F, a receptor mutation found in Crouzon syndrome or the ligand Fgf8. Treated embryos exhibited craniofacial malformations resembling facial dysmorphologies in craniosynostosis syndrome. Consistent with our hypothesis, ectopic activation of FGF signaling resulted in decreased cell proliferation, increased expression of the Sprouty class of FGF signaling inhibitors, and repressed phosphorylation of ERK/MAPK. Furthermore, quantification of cell polarity in facial mesenchymal cells showed that while orientation of the Golgi body matches the direction of facial prominence outgrowth in normal cells, in FGF-treated embryos this direction is randomized, consistent with aberrant growth that we observed. Together, these data demonstrate that FGF signaling regulates cell proliferation and cell polarity and that these cell processes contribute to facial morphogenesis. PMID:23906837

  12. QUANTIFICATION OF NUCLEOLAR CHANNEL SYSTEMS: UNIFORM PRESENCE THROUGHOUT THE UPPER ENDOMETRIAL CAVITY

    PubMed Central

    Szmyga, Michael J.; Rybak, Eli A.; Nejat, Edward J.; Banks, Erika H.; Whitney, Kathleen D.; Polotsky, Alex J.; Heller, Debra S.; Meier, U. Thomas

    2014-01-01

    Objective To determine the prevalence of nucleolar channel systems (NCSs) by uterine region applying continuous quantification. Design Prospective clinical study. Setting Tertiary care academic medical center. Patients 42 naturally cycling women who underwent hysterectomy for benign indications. Intervention NCS presence was quantified by a novel method in six uterine regions, fundus, left cornu, right cornu, anterior body, posterior body, and lower uterine segment (LUS), using indirect immunofluorescence. Main Outcome Measures Percent of endometrial epithelial cells (EECs) with NCSs per uterine region. Results NCS quantification was observer-independent (intraclass correlation coefficient [ICC] = 0.96) and its intra-sample variability low (coefficient of variability [CV] = 0.06). 11/42 hysterectomy specimens were midluteal, 10 of which were analyzable with 9 containing over 5% EECs with NCSs in at least one region. The percent of EECs with NCSs varied significantly between the lower uterine segment (6.1%; IQR = 3.0-9.9) and the upper five regions (16.9%; IQR = 12.7-23.4), with fewer NCSs in the basal layer of the endometrium (17% ± 6%) versus the middle (46% ± 9%) and luminal layers (38% ± 9%) of all six regions. Conclusions NCS quantification during the midluteal phase demonstrates uniform presence throughout the endometrial cavity, excluding the LUS, with a preference for the functional, luminal layers. Our quantitative NCS evaluation provides a benchmark for future studies and further supports NCS presence as a potential marker for the window of implantation. PMID:23137760

  13. Quantification of birefringence readily measures the level of muscle damage in zebrafish

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Berger, Joachim, E-mail: Joachim.Berger@Monash.edu; Sztal, Tamar; Currie, Peter D.

    2012-07-13

    Highlights: • Report of an unbiased quantification of the birefringence of muscle of fish larvae. • Quantification method readily identifies the level of overall muscle damage. • Compares zebrafish muscle mutants for level of phenotype severity. • Proposed tool to survey treatments that aim to ameliorate muscular dystrophy. -- Abstract: Muscular dystrophies are a group of genetic disorders that progressively weaken and degenerate muscle. Many zebrafish models for human muscular dystrophies have been generated and analysed, including dystrophin-deficient zebrafish mutants dmd that model Duchenne Muscular Dystrophy. Under polarised light the zebrafish muscle can be detected as a bright area in an otherwise dark background. This light effect, called birefringence, results from the diffraction of polarised light through the pseudo-crystalline array of the muscle sarcomeres. Muscle damage, as seen in zebrafish models for muscular dystrophies, can readily be detected by a reduction in the birefringence. Therefore, birefringence is a very sensitive indicator of overall muscle integrity within larval zebrafish. Unbiased documentation of the birefringence followed by densitometric measurement enables the quantification of the birefringence of zebrafish larvae. Thereby, the overall level of muscle integrity can be detected, allowing the identification and categorisation of zebrafish muscle mutants. In addition, we propose that the established protocol can be used to analyse treatments aimed at ameliorating dystrophic zebrafish models.

  14. Quantification of thymosin beta(4) in human cerebrospinal fluid using matrix-assisted laser desorption/ionization time-of-flight mass spectrometry.

    PubMed

    Urso, Elena; Le Pera, Maria; Bossio, Sabrina; Sprovieri, Teresa; Qualtieri, Antonio

    2010-07-01

    Matrix-assisted laser desorption/ionization time-of-flight mass spectrometry (MALDI-TOF-MS) has been applied to the analysis of a wide range of biomolecules. To date, there are two specific areas of application where MALDI-TOF-MS is viewed as impractical: analysis of low-mass analytes and relative quantitative applications. However, these limitations can be overcome and quantification can be routine. Increased levels of thymosin beta(4) (TB4) have been recently found in cerebrospinal fluid (CSF) from Creutzfeldt-Jakob disease (CJD) patients. Our objective was to apply a label-free quantitative application of MALDI-TOF-MS to measure TB4 levels in human CSF by adding the oxidized form of TB4 as an internal standard. The relative peak area or peak height ratios of the native TB4 to the added oxidized form were evaluated. Considering the relative peak area ratios, healthy individuals showed a mean value of 40.8+/-21.27 ng/ml, whereas CJD patients showed high values with a mean of 154+/-59.07 ng/ml, in agreement with the previous observation found in CJD patients. Similar results were obtained considering peak height ratios. The proposed method may provide a simple and rapid screening method for quantification on CSF of TB4 levels suitable for diagnostic purposes. 2010 Elsevier Inc. All rights reserved.
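    The label-free scheme described, native TB4 referenced to a spiked oxidized-TB4 internal standard, reduces to a simple ratio calculation. The sketch below assumes (hypothetically) an equal instrument response for both forms; the peak areas and the spiked concentration are invented.

      def tb4_concentration(native_peak, internal_std_peak, internal_std_ng_per_ml,
                            response_factor=1.0):
          """Concentration from the peak ratio of native TB4 to the oxidized internal
          standard, assuming (hypothetically) equal MALDI response for both forms."""
          return native_peak / internal_std_peak * internal_std_ng_per_ml / response_factor

      # Hypothetical CSF measurement: peak areas 8.2e4 (native) vs 6.0e4 (oxidized IS),
      # internal standard spiked at 30 ng/ml
      print(f"{tb4_concentration(8.2e4, 6.0e4, 30):.1f} ng/ml")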

  15. Quantification of prairie restoration for phytostability at a remediated defense plant.

    PubMed

    Franson, Raymond L; Scholes, Chad M

    2011-01-01

    In June 2008 and 2009, cover, density, and species diversity were measured on two areas of the prairie at the U. S. Department of Energy Weldon Spring Site to begin quantification of the prairie establishment and the effects of a prairie burn. Sampling began by testing for the most appropriate transect length (cover) and quadrat size (density) for quantification of vegetation. Total cover increased in the first growing season after burning. Conversely, total cover decreased in the unburned area in one year. The trend in litter cover is the opposite with litter decreasing after burning, but increasing in one year in the unburned area. Bare ground decreased in one year in the unburned area, but was unchanged after burning. Species diversity tripled after fire, but was unchanged in one year in the unburned area. The results show that litter and fire both affect plant cover. If land reclamation activities are to be an integral part of hazardous waste remediation at contaminated sites, then the success of reclamation efforts needs to be quantified along with success criteria for waste remediation of the sites. The results show that plant cover can be easily quantified, but that density measures are more biased which makes it more difficult to achieve adequate sample size for plant density.

  16. Droplet digital PCR (ddPCR) vs quantitative real-time PCR (qPCR) approach for detection and quantification of Merkel cell polyomavirus (MCPyV) DNA in formalin fixed paraffin embedded (FFPE) cutaneous biopsies.

    PubMed

    Arvia, Rosaria; Sollai, Mauro; Pierucci, Federica; Urso, Carmelo; Massi, Daniela; Zakrzewska, Krystyna

    2017-08-01

    Merkel cell polyomavirus (MCPyV) is associated with Merkel cell carcinoma, and a high viral load in the skin has been proposed as a risk factor for the occurrence of this tumour. MCPyV DNA was detected, with lower frequency, in different skin cancers, but since the viral load was usually low, the real prevalence of viral DNA could be underestimated. To evaluate the performance of two assays (qPCR and ddPCR) for MCPyV detection and quantification in formalin fixed paraffin embedded (FFPE) tissue samples. Both assays were designed for simultaneous detection and quantification of both MCPyV and house-keeping DNA in clinical samples. The performance of MCPyV quantification was investigated using serial dilutions of cloned target DNA. We also evaluated the applicability of both tests for the analysis of 76 FFPE cutaneous biopsies. The two approaches proved equivalent with regard to reproducibility and repeatability and showed a high degree of linearity in the dynamic range tested in the present study. Moreover, qPCR was able to quantify ≥10(5) copies per reaction, while the upper limit of ddPCR was 10(4) copies. There was no significant difference between the viral loads measured by the two methods. The detection limit of both tests was 0.15 copies per reaction; however, the number of positive samples obtained by ddPCR was higher than that obtained by qPCR (45% and 37%, respectively). The ddPCR represents a better method for detection of MCPyV in FFPE biopsies, especially those containing low copy numbers of the viral genome. Copyright © 2017 Elsevier B.V. All rights reserved.

  17. Quantification of L-Citrulline and other physiologic amino acids in watermelon and selected cucurbits

    USDA-ARS?s Scientific Manuscript database

    High performance liquid chromatography of dabsyl derivatives of amino acids was employed for quantification of physiologic amino acids in cucurbits. This method is particularly useful because the dabsyl derivatives of glutamine and citrulline are sufficiently separated to allow quantification of ea...

  18. Stringlike Pulse Quantification Study by Pulse Wave in 3D Pulse Mapping

    PubMed Central

    Chung, Yu-Feng; Yeh, Cheng-Chang; Si, Xiao-Chen; Chang, Chien-Chen; Hu, Chung-Shing; Chu, Yu-Wen

    2012-01-01

    Abstract Background A stringlike pulse is highly related to hypertension, and many classification approaches have been proposed in which the differentiation pulse wave (dPW) can effectively classify the stringlike pulse indicating hypertension. Unfortunately, the dPW method cannot distinguish the spring stringlike pulse from the stringlike pulse so labeled by physicians in clinics. Design By using a Bi-Sensing Pulse Diagnosis Instrument (BSPDI), this study proposed a novel Plain Pulse Wave (PPW) to classify a stringlike pulse based on an array of pulse signals, mimicking a Traditional Chinese Medicine physician's finger-reading skill. Results In comparison to PPWs at different pulse-taking positions, phase delay Δθ and correlation coefficient r can be elucidated as the quantification parameters of stringlike pulse. As a result, the recognition rates of a hypertensive stringlike pulse, spring stringlike pulse, and non–stringlike pulse are 100%, 100%, 77% for PPW and 70%, 0%, 59% for dPW, respectively. Conclusions Integrating dPW and PPW can unify the classification of stringlike pulse including hypertensive stringlike pulse and spring stringlike pulse. Hence, the proposed novel method, PPW, enhances quantification of stringlike pulse. PMID:23057481
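    The two quantification parameters named here, the phase delay Δθ and the correlation coefficient r between pulse waves recorded at different positions, can be estimated from sampled signals as in the sketch below; the synthetic waveforms, sampling rate and sign convention are illustrative assumptions.

      import numpy as np

      def pulse_similarity(sig_a, sig_b, fs_hz, pulse_rate_hz):
          """Correlation coefficient and phase delay (degrees) between two pulse waves."""
          a = sig_a - sig_a.mean()
          b = sig_b - sig_b.mean()
          r = np.corrcoef(a, b)[0, 1]
          xcorr = np.correlate(a, b, mode="full")
          lag_samples = (len(a) - 1) - np.argmax(xcorr)    # positive when sig_b lags sig_a
          delta_theta = 360.0 * pulse_rate_hz * lag_samples / fs_hz
          return r, delta_theta

      # Synthetic 1.2 Hz pulse recorded at two positions, the second delayed by 30 ms
      fs = 500.0
      t = np.arange(0, 5, 1 / fs)
      proximal = np.sin(2 * np.pi * 1.2 * t) ** 3
      distal = np.sin(2 * np.pi * 1.2 * (t - 0.03)) ** 3
      r, dtheta = pulse_similarity(proximal, distal, fs, pulse_rate_hz=1.2)
      print(f"r = {r:.3f}, phase delay = {dtheta:.1f} deg")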

  19. Recommendations and Standardization of Biomarker Quantification Using NMR-Based Metabolomics with Particular Focus on Urinary Analysis

    PubMed Central

    2016-01-01

    NMR-based metabolomics has shown considerable promise in disease diagnosis and biomarker discovery because it allows one to nondestructively identify and quantify large numbers of novel metabolite biomarkers in both biofluids and tissues. Precise metabolite quantification is a prerequisite to move any chemical biomarker or biomarker panel from the lab to the clinic. Among the biofluids commonly used for disease diagnosis and prognosis, urine has several advantages. It is abundant, sterile, and easily obtained, needs little sample preparation, and does not require invasive medical procedures for collection. Furthermore, urine captures and concentrates many “unwanted” or “undesirable” compounds throughout the body, providing a rich source of potentially useful disease biomarkers; however, incredible variation in urine chemical concentrations makes analysis of urine and identification of useful urinary biomarkers by NMR challenging. We discuss a number of the most significant issues regarding NMR-based urinary metabolomics with specific emphasis on metabolite quantification for disease biomarker applications and propose data collection and instrumental recommendations regarding NMR pulse sequences, acceptable acquisition parameter ranges, relaxation effects on quantitation, proper handling of instrumental differences, sample preparation, and biomarker assessment. PMID:26745651

  20. Antibiotic Resistome: Improving Detection and Quantification Accuracy for Comparative Metagenomics.

    PubMed

    Elbehery, Ali H A; Aziz, Ramy K; Siam, Rania

    2016-04-01

    The unprecedented rise of life-threatening antibiotic resistance (AR), combined with the unparalleled advances in DNA sequencing of genomes and metagenomes, has pushed the need for in silico detection of the resistance potential of clinical and environmental metagenomic samples through the quantification of AR genes (i.e., genes conferring antibiotic resistance). Therefore, determining an optimal methodology to quantitatively and accurately assess AR genes in a given environment is pivotal. Here, we optimized and improved existing AR detection methodologies from metagenomic datasets to properly consider AR-generating mutations in antibiotic target genes. Through comparative metagenomic analysis of previously published AR gene abundance in three publicly available metagenomes, we illustrate how mutation-generated resistance genes are either falsely assigned or neglected, which alters the detection and quantitation of the antibiotic resistome. In addition, we inspected factors influencing the outcome of AR gene quantification using metagenome simulation experiments, and identified that genome size, AR gene length, total number of metagenomics reads and selected sequencing platforms had pronounced effects on the level of detected AR. In conclusion, our proposed improvements in the current methodologies for accurate AR detection and resistome assessment show reliable results when tested on real and simulated metagenomic datasets.
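    One factor flagged above, the dependence of apparent AR gene abundance on gene length and sequencing depth, is commonly handled with a length- and depth-normalized abundance (an RPKM-style value). The sketch below is a generic illustration with invented read counts, not the authors' pipeline.

      def normalized_abundance(mapped_reads, gene_length_bp, total_reads):
          """Reads per kilobase of AR gene per million metagenomic reads (RPKM-style)."""
          return mapped_reads / (gene_length_bp / 1e3) / (total_reads / 1e6)

      # Hypothetical comparison of one beta-lactamase gene across two metagenomes
      samples = {
          "surface_water": {"mapped": 420, "total": 25_000_000},
          "hospital_effluent": {"mapped": 3900, "total": 61_000_000},
      }
      GENE_LENGTH = 861   # bp, illustrative
      for name, s in samples.items():
          print(name, round(normalized_abundance(s["mapped"], GENE_LENGTH, s["total"]), 2))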

  1. Nitric Oxide Analyzer Quantification of Plant S-Nitrosothiols.

    PubMed

    Hussain, Adil; Yun, Byung-Wook; Loake, Gary J

    2018-01-01

    Nitric oxide (NO) is a small diatomic molecule that regulates multiple physiological processes in animals, plants, and microorganisms. In animals, it is involved in vasodilation and neurotransmission and is present in exhaled breath. In plants, it regulates both plant immune function and numerous developmental programs. The high reactivity and short half-life of NO and the cross-reactivity of its various derivatives make its quantification difficult. Different methods based on colorimetric, fluorometric, and chemiluminescent detection of NO and its derivatives are available, but all of them have significant limitations. Here we describe a method for the chemiluminescence-based quantification of NO using ozone-chemiluminescence technology in plants. This approach provides a sensitive, robust, and flexible means of determining the levels of NO and its signaling products, protein S-nitrosothiols.

  2. Disease quantification on PET/CT images without object delineation

    NASA Astrophysics Data System (ADS)

    Tong, Yubing; Udupa, Jayaram K.; Odhner, Dewey; Wu, Caiyun; Fitzpatrick, Danielle; Winchell, Nicole; Schuster, Stephen J.; Torigian, Drew A.

    2017-03-01

    The derivation of quantitative information from images to make quantitative radiology (QR) clinically practical continues to face a major image analysis hurdle because of image segmentation challenges. This paper presents a novel approach to disease quantification (DQ) via positron emission tomography/computed tomography (PET/CT) images that explores how to decouple DQ methods from explicit dependence on object segmentation through the use of only object recognition results to quantify disease burden. The concept of an object-dependent disease map is introduced to express disease severity without performing explicit delineation and partial volume correction of either objects or lesions. The parameters of the disease map are estimated from a set of training image data sets. The idea is illustrated on 20 lung lesions and 20 liver lesions derived from 18F-2-fluoro-2-deoxy-D-glucose (FDG)-PET/CT scans of patients with various types of cancers and also on 20 NEMA PET/CT phantom data sets. Our preliminary results show that, on phantom data sets, "disease burden" can be estimated to within 2% of known absolute true activity. Notwithstanding the difficulty in establishing true quantification on patient PET images, our results achieve 8% deviation from "true" estimates, with slightly larger deviations for small and diffuse lesions where establishing ground truth becomes really questionable, and smaller deviations for larger lesions where ground truth set up becomes more reliable. We are currently exploring extensions of the approach to include fully automated body-wide DQ, extensions to just CT or magnetic resonance imaging (MRI) alone, to PET/CT performed with radiotracers other than FDG, and other functional forms of disease maps.

  3. Electrochemical Quantification of the Antioxidant Capacity of Medicinal Plants Using Biosensors

    PubMed Central

    Rodríguez-Sevilla, Erika; Ramírez-Silva, María-Teresa; Romero-Romo, Mario; Ibarra-Escutia, Pedro; Palomar-Pardavé, Manuel

    2014-01-01

    The working area of a screen-printed electrode, SPE, was modified with the enzyme tyrosinase (Tyr) using different immobilization methods, namely entrapment with water-soluble polyvinyl alcohol (PVA), cross-linking using glutaraldehyde (GA), and cross-linking using GA and human serum albumin (HSA); the resulting electrodes were termed SPE/Tyr/PVA, SPE/Tyr/GA and SPE/Tyr/HSA/GA, respectively. These biosensors were characterized by means of amperometry and EIS techniques. From amperometric evaluations, the apparent Michaelis-Menten constant, Km′, of each biosensor was evaluated while the respective charge transfer resistance, Rct, was assessed from impedance measurements. It was found that the SPE/Tyr/GA had the smallest Km′ (57 ± 7) μM and Rct values. This electrode also displayed both the lowest detection and quantification limits for catechol quantification. Using the SPE/Tyr/GA, the Trolox Equivalent Antioxidant Capacity (TEAC) was determined from infusions prepared with "mirto" (Salvia microphylla), "hierba dulce" (Lippia dulcis) and "salve real" (Lippia alba), medicinal plants commonly used in Mexico. PMID:25111237
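    An apparent Michaelis-Menten constant of the kind reported here is typically obtained by fitting the steady-state amperometric response to I = Imax[S]/(Km' + [S]). A minimal fit with invented catechol calibration data, not measurements from this work:

      import numpy as np
      from scipy.optimize import curve_fit

      def michaelis_menten(s, i_max, km):
          """Steady-state biosensor current as a function of substrate concentration."""
          return i_max * s / (km + s)

      # Hypothetical catechol calibration: concentration (uM) vs steady-state current (uA)
      s_um = np.array([10, 25, 50, 100, 200, 400], dtype=float)
      i_ua = np.array([0.31, 0.64, 1.02, 1.41, 1.73, 1.95])
      (i_max, km_app), _ = curve_fit(michaelis_menten, s_um, i_ua, p0=[2.0, 60.0])
      print(f"apparent Km = {km_app:.0f} uM, Imax = {i_max:.2f} uA")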

  4. Simultaneous Quantification of Forskolin and Iso-Forskolin in Coleus forskohlii (Wild.) Briq. and Identification of Elite Chemotype, Collected from Eastern Ghats (India)

    PubMed Central

    Shukla, Pushpendra Kumar; Misra, Ankita; Kumar, Manish; Jaichand; Singh, Kuldeep; Akhtar, Juber; Srivastava, Sharad; Agrawal, Pawan K; Singh Rawat, Ajay K

    2017-01-01

    Background: Coleus forskohlii is a well-known, industrially important medicinal plant valued for its high forskolin content. Objective: A simple, selective, and sensitive high-performance thin layer chromatography (HPTLC) method was developed and validated for simultaneous quantification of forskolin and iso-forskolin in C. forskohlii germplasm collected from the Eastern Ghats, India. Materials and Methods: Chromatographic separation of the targeted marker(s) was obtained on precoated silica plates using toluene: ethyl acetate: methanol (90:30:0.5, v/v/v) as the mobile phase. Results: Densitometric quantification of forskolin and iso-forskolin was carried out at 545 nm. Forskolin and iso-forskolin were identified by comparing the ultraviolet spectra of standard and sample track at Rf of 0.64 ± 0.02 and 0.36 ± 0.01, after derivatization with anisaldehyde sulfuric acid reagent. The linearity of both the analytes was obtained in the range of 300–1200 ng/spot with regression coefficients (R2) of 0.991 and 0.986. Recovery of analyte(s) at three levels, namely, 100, 150, and 200 ng/spot, was found to be 100.46% ± 0.29%, 99.64% ± 0.33%, 100.02% ± 0.76% and 99.76% ± 0.62%, 99.56% ± 0.35%, 100.02% ± 0.22%, respectively, for forskolin and iso-forskolin. The content of forskolin and iso-forskolin varies from 0.046% to 0.187% and 0.002% to 0.077%, respectively (dry weight basis); the maximum content of both the markers was found in NBC-31, from Thakurwada, Maharashtra. Conclusion: The developed HPTLC method was linear, accurate, and reliable as per the International Council for Harmonisation of Technical Requirements for Registration of Pharmaceuticals for Human Use guidelines. The study aids in the identification of elite chemotypes for commercial prospection of an industrially viable medicinal crop. SUMMARY 12 samples were collected from different locations of the Eastern Ghats region. Quantification of two major markers, forskolin and iso-forskolin. The maximum content of both

  5. Parsing and Quantification of Raw Orbitrap Mass Spectrometer Data Using RawQuant.

    PubMed

    Kovalchik, Kevin A; Moggridge, Sophie; Chen, David D Y; Morin, Gregg B; Hughes, Christopher S

    2018-06-01

    Effective analysis of protein samples by mass spectrometry (MS) requires careful selection and optimization of a range of experimental parameters. As the output from the primary detection device, the "raw" MS data file can be used to gauge the success of a given sample analysis. However, the closed-source nature of the standard raw MS file can complicate effective parsing of the data contained within. To ease and increase the range of analyses possible, the RawQuant tool was developed to enable parsing of raw MS files derived from Thermo Orbitrap instruments to yield meta and scan data in an openly readable text format. RawQuant can be commanded to export user-friendly files containing MS1, MS2, and MS3 metadata as well as matrices of quantification values based on isobaric tagging approaches. In this study, the utility of RawQuant is demonstrated in several scenarios: (1) reanalysis of shotgun proteomics data for the identification of the human proteome, (2) reanalysis of experiments utilizing isobaric tagging for whole-proteome quantification, and (3) analysis of a novel bacterial proteome and synthetic peptide mixture for assessing quantification accuracy when using isobaric tags. Together, these analyses successfully demonstrate RawQuant for the efficient parsing and quantification of data from raw Thermo Orbitrap MS files acquired in a range of common proteomics experiments. In addition, the individual analyses using RawQuant highlight parametric considerations in the different experimental sets and suggest targetable areas to improve depth of coverage in identification-focused studies and quantification accuracy when using isobaric tags.

  6. Interference-free spectrofluorometric quantification of aristolochic acid I and aristololactam I in five Chinese herbal medicines using chemical derivatization enhancement and second-order calibration methods

    NASA Astrophysics Data System (ADS)

    Hu, Yong; Wu, Hai-Long; Yin, Xiao-Li; Gu, Hui-Wen; Xiao, Rong; Wang, Li; Fang, Huan; Yu, Ru-Qin

    2017-03-01

    A rapid interference-free spectrofluorometric method, combining excitation-emission matrix fluorescence with second-order calibration methods based on the alternating penalty trilinear decomposition (APTLD) and the self-weighted alternating trilinear decomposition (SWATLD) algorithms, was proposed for the simultaneous determination of nephrotoxic aristolochic acid I (AA-I) and aristololactam I (AL-I) in five Chinese herbal medicines. The method was based on a chemical derivatization that converts the non-fluorescent AA-I to highly fluorescent AL-I, achieving highly sensitive and simultaneous quantification of the analytes. The variables of the derivatization reaction, which was conducted using zinc powder in acetose methanol aqueous solution, were studied and optimized for the best quantification results of AA-I and AL-I. Satisfactory results of AA-I and AL-I for the spiked recovery assay were achieved, with average recoveries in the range of 100.4-103.8% and RMSEPs < 0.78 ng mL-1, which validate the accuracy and reliability of the proposed method. The contents of AA-I and AL-I in five herbal medicines obtained from the proposed method were also in good accordance with those of the validated LC-MS/MS method. In light of the highly sensitive fluorescence detection, the limits of detection (LODs) of AA-I and AL-I for the proposed method compare favorably with those of the LC-MS/MS method, with LODs < 0.35 and 0.29 ng mL-1, respectively. The proposed strategy, based on the APTLD and SWATLD algorithms and exploiting the "second-order advantage", can be considered an attractive and green alternative for the quantification of AA-I and AL-I in complex herbal medicine matrices without any prior separation and clean-up processes.

  7. Nondestructive Detection and Quantification of Blueberry Bruising using Near-infrared (NIR) Hyperspectral Reflectance Imaging

    NASA Astrophysics Data System (ADS)

    Jiang, Yu; Li, Changying; Takeda, Fumiomi

    2016-10-01

    Currently, blueberry bruising is evaluated by either human visual/tactile inspection or firmness measurement instruments. These methods are destructive, time-consuming, and subjective. The goal of this paper was to develop a non-destructive approach for blueberry bruising detection and quantification. Experiments were conducted on 300 samples of southern highbush blueberry (Camellia, Rebel, and Star) and on 1500 samples of northern highbush blueberry (Bluecrop, Jersey, and Liberty) for hyperspectral imaging analysis, firmness measurement, and human evaluation. An algorithm was developed to automatically calculate a bruise ratio index (ratio of bruised to whole fruit area) for bruise quantification. The spectra of bruised and healthy tissues were statistically separated and the separation was independent of cultivars. Support vector machine (SVM) classification of the spectra from the regions of interest (ROIs) achieved over 94%, 92%, and 96% accuracy on the training set, independent testing set, and combined set, respectively. The statistical results showed that the bruise ratio index was equivalent to the measured firmness but better than the predicted firmness in regard to effectiveness of bruise quantification, and the bruise ratio index had a strong correlation with human assessment (R2 = 0.78 - 0.83). Therefore, the proposed approach and the bruise ratio index are effective to non-destructively detect and quantify blueberry bruising.
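    The two computational steps described, SVM classification of ROI spectra and a bruise ratio index, can be sketched generically with scikit-learn. The random "spectra" below stand in for real hyperspectral pixels, and every parameter is an assumption for illustration, not the authors' trained model.

      import numpy as np
      from sklearn.svm import SVC

      rng = np.random.default_rng(0)
      n_bands = 150

      # Stand-in training spectra for healthy vs bruised tissue ROIs (not real data)
      healthy = rng.normal(0.60, 0.05, size=(200, n_bands))
      bruised = rng.normal(0.45, 0.05, size=(200, n_bands))
      X = np.vstack([healthy, bruised])
      y = np.array([0] * 200 + [1] * 200)          # 0 = healthy, 1 = bruised

      clf = SVC(kernel="rbf", C=1.0, gamma="scale").fit(X, y)

      # Classify every pixel spectrum of one fruit and compute the bruise ratio index
      fruit_pixels = rng.normal(0.55, 0.08, size=(5000, n_bands))
      labels = clf.predict(fruit_pixels)
      bruise_ratio_index = labels.mean()           # bruised pixel area / whole fruit area
      print(f"bruise ratio index = {bruise_ratio_index:.2f}")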

  8. Nondestructive Detection and Quantification of Blueberry Bruising using Near-infrared (NIR) Hyperspectral Reflectance Imaging.

    PubMed

    Jiang, Yu; Li, Changying; Takeda, Fumiomi

    2016-10-21

    Currently, blueberry bruising is evaluated by either human visual/tactile inspection or firmness measurement instruments. These methods are destructive, time-consuming, and subjective. The goal of this paper was to develop a non-destructive approach for blueberry bruising detection and quantification. Experiments were conducted on 300 samples of southern highbush blueberry (Camellia, Rebel, and Star) and on 1500 samples of northern highbush blueberry (Bluecrop, Jersey, and Liberty) for hyperspectral imaging analysis, firmness measurement, and human evaluation. An algorithm was developed to automatically calculate a bruise ratio index (ratio of bruised to whole fruit area) for bruise quantification. The spectra of bruised and healthy tissues were statistically separated and the separation was independent of cultivars. Support vector machine (SVM) classification of the spectra from the regions of interest (ROIs) achieved over 94%, 92%, and 96% accuracy on the training set, independent testing set, and combined set, respectively. The statistical results showed that the bruise ratio index was equivalent to the measured firmness but better than the predicted firmness in regard to effectiveness of bruise quantification, and the bruise ratio index had a strong correlation with human assessment (R2 = 0.78 - 0.83). Therefore, the proposed approach and the bruise ratio index are effective to non-destructively detect and quantify blueberry bruising.

  9. Adaptive polynomial chaos techniques for uncertainty quantification of a gas cooled fast reactor transient

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Perko, Z.; Gilli, L.; Lathouwers, D.

    2013-07-01

    Uncertainty quantification plays an increasingly important role in the nuclear community, especially with the rise of Best Estimate Plus Uncertainty methodologies. Sensitivity analysis, surrogate models, Monte Carlo sampling and several other techniques can be used to propagate input uncertainties. In recent years, however, polynomial chaos expansion has become a popular alternative, providing high accuracy at affordable computational cost. This paper presents such polynomial chaos (PC) methods using adaptive sparse grids and adaptive basis set construction, together with an application to a Gas Cooled Fast Reactor transient. Comparison is made between a new sparse grid algorithm and the traditionally used technique proposed by Gerstner. An adaptive basis construction method is also introduced and is shown to be advantageous from both an accuracy and a computational point of view. As a demonstration, the uncertainty quantification of a 50% loss-of-flow transient in the GFR2400 Gas Cooled Fast Reactor design was performed using the CATHARE code system. The results are compared to direct Monte Carlo sampling and show the superior convergence and high accuracy of the polynomial chaos expansion. Since PC techniques are easy to implement, they can offer an attractive alternative to traditional techniques for the uncertainty quantification of large scale problems. (authors)
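
    As a rough illustration of the non-intrusive polynomial chaos idea described above (not the adaptive sparse-grid algorithm of the paper), the sketch below fits a 1-D probabilists'-Hermite expansion to a model response by regression and recovers the output mean and variance from the coefficients. The model function, truncation order, and sample size are arbitrary assumptions.

```python
import numpy as np
from math import factorial
from numpy.polynomial import hermite_e as He

def model(xi):
    # Placeholder for an expensive code response (e.g., a transient output).
    return np.exp(0.3 * xi) + 0.1 * xi**2

order = 6
rng = np.random.default_rng(0)
xi = rng.standard_normal(500)          # standard normal germ
y = model(xi)

# Regression on probabilists' Hermite polynomials He_0..He_order.
V = He.hermevander(xi, order)          # design matrix of basis evaluations
coef, *_ = np.linalg.lstsq(V, y, rcond=None)

# For standard normal xi: E[He_j * He_k] = k! * delta_jk,
# so mean = c_0 and variance = sum_{k>=1} c_k^2 * k!.
facts = np.array([factorial(k) for k in range(order + 1)], dtype=float)
pce_mean = coef[0]
pce_var = np.sum(coef[1:] ** 2 * facts[1:])

print(f"PCE mean {pce_mean:.4f} vs MC {y.mean():.4f}")
print(f"PCE var  {pce_var:.4f} vs MC {y.var():.4f}")
```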

  10. Uncertainty Quantification in Alchemical Free Energy Methods.

    PubMed

    Bhati, Agastya P; Wan, Shunzhou; Hu, Yuan; Sherborne, Brad; Coveney, Peter V

    2018-06-12

    Alchemical free energy methods have gained much importance recently from several reports of improved ligand-protein binding affinity predictions based on their implementation using molecular dynamics simulations. A large number of variants of such methods implementing different accelerated sampling techniques and free energy estimators are available, each claimed to be better than the others in its own way. However, the key features of reproducibility and quantification of associated uncertainties in such methods have barely been discussed. Here, we apply a systematic protocol for uncertainty quantification to a number of popular alchemical free energy methods, covering both absolute and relative free energy predictions. We show that a reliable measure of error estimation is provided by ensemble simulation (an ensemble of independent MD simulations), which applies irrespective of the free energy method. The need to use ensemble methods is fundamental and holds regardless of the duration of the molecular dynamics simulations performed.
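
    A minimal sketch of the ensemble idea, under the assumption that each replica yields one independent free energy estimate: report the ensemble mean with its standard error. The replica values below are placeholders, not data from the study.

```python
import numpy as np

def ensemble_estimate(replica_values):
    """Mean and standard error over independent replica simulations."""
    values = np.asarray(replica_values, dtype=float)
    n = values.size
    mean = values.mean()
    # Standard error of the mean from the unbiased sample standard deviation.
    sem = values.std(ddof=1) / np.sqrt(n)
    return mean, sem

# Hypothetical binding free energies (kcal/mol) from 5 independent replicas.
dG_replicas = [-7.9, -8.3, -8.0, -8.6, -8.2]
mean, sem = ensemble_estimate(dG_replicas)
print(f"dG = {mean:.2f} +/- {sem:.2f} kcal/mol (n={len(dG_replicas)})")
```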

  11. Simple and rapid quantification of serotonin transporter binding using [11C]DASB bolus plus constant infusion.

    PubMed

    Gryglewski, G; Rischka, L; Philippe, C; Hahn, A; James, G M; Klebermass, E; Hienert, M; Silberbauer, L; Vanicek, T; Kautzky, A; Berroterán-Infante, N; Nics, L; Traub-Weidinger, T; Mitterhauser, M; Wadsak, W; Hacker, M; Kasper, S; Lanzenberger, R

    2017-04-01

    .74-0.87; BPND-120 had ICCs of 0.73-0.90. Low-binding cortical regions and cerebellar gray matter showed a positive bias of ~8% and ICCs of 0.57-0.68 at VT-90. Cortical BPND suffered from high variability and bias; the best results were obtained for the olfactory cortex and anterior cingulate cortex, with ICC = 0.74-0.75 for BPND-90. The high-density regions amygdala and midbrain had negative biases of -5.5% and -22.5% at VT-90, with ICCs of 0.70 and 0.63, respectively. We have optimized the equilibrium method with [11C]DASB bolus plus constant infusion and demonstrated good inter-method reliability with accepted standard methods for SERT quantification using both VT and BPND in a range of different brain regions. With as little as 10-15 min of scanning, valid estimates of SERT VT and BPND in thalamus, amygdala, striatal and high-binding cortical regions could be obtained. Blood sampling seems vital for valid quantification of SERT in low-binding cortical regions. These methods allow the investigation of up to three subjects with a single radiosynthesis. Copyright © 2017 Elsevier Inc. All rights reserved.

  12. Stirling Convertor Fasteners Reliability Quantification

    NASA Technical Reports Server (NTRS)

    Shah, Ashwin R.; Korovaichuk, Igor; Kovacevich, Tiodor; Schreiber, Jeffrey G.

    2006-01-01

    Onboard Radioisotope Power Systems (RPS) being developed for NASA's deep-space science and exploration missions require reliable operation for up to 14 years and beyond. Stirling power conversion is a candidate for use in an RPS because it offers a multifold increase in the conversion efficiency of heat to electric power and a reduced inventory of radioactive material. Structural fasteners are responsible for maintaining the structural integrity of the Stirling power convertor, which is critical to ensure reliable performance during the entire mission. The design of fasteners involves variables related to fabrication, manufacturing, the material behavior of the fasteners and joined parts, the structural geometry of the joined components, the size and spacing of fasteners, mission loads, boundary conditions, etc. These variables have inherent uncertainties, which need to be accounted for in the reliability assessment. This paper describes these uncertainties along with a methodology to quantify the reliability, and provides results of the analysis in terms of quantified reliability and the sensitivity of Stirling power conversion reliability to the design variables. Quantification of the reliability includes both structural and functional aspects of the joined components. Based on the results, the paper also describes guidelines to improve reliability and verification testing.
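
    A minimal Monte Carlo sketch of reliability quantification under uncertain design variables (the limit state, distributions, and numbers are illustrative assumptions, not the fastener model from the paper): sample the uncertain inputs, evaluate a limit-state function, and estimate reliability as the fraction of samples with positive margin.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 100_000

# Hypothetical uncertain variables: joint capacity and applied load.
capacity = rng.normal(loc=12.0, scale=1.0, size=n)   # e.g., kN
load = rng.normal(loc=8.0, scale=1.5, size=n)        # e.g., kN

# Limit state g = capacity - load; failure when g <= 0.
g = capacity - load
p_fail = np.mean(g <= 0.0)
reliability = 1.0 - p_fail

# A crude sensitivity indicator: correlation of each input with the margin.
sens_capacity = np.corrcoef(capacity, g)[0, 1]
sens_load = np.corrcoef(load, g)[0, 1]

print(f"reliability ~= {reliability:.5f}")
print(f"sensitivities: capacity {sens_capacity:+.2f}, load {sens_load:+.2f}")
```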

  13. MR/PET quantification tools: Registration, segmentation, classification, and MR-based attenuation correction

    PubMed Central

    Fei, Baowei; Yang, Xiaofeng; Nye, Jonathon A.; Aarsvold, John N.; Raghunath, Nivedita; Cervo, Morgan; Stark, Rebecca; Meltzer, Carolyn C.; Votaw, John R.

    2012-01-01

    Purpose: Combined MR/PET is a relatively new, hybrid imaging modality. A human MR/PET prototype system consisting of a Siemens 3T Trio MR and brain PET insert was installed and tested at our institution. Its present design does not offer measured attenuation correction (AC) using traditional transmission imaging. This study describes the development of quantification tools, including MR-based AC, for combined MR/PET brain imaging. Methods: The developed quantification tools include image registration, segmentation, classification, and MR-based AC. These components were integrated into a single scheme for processing MR/PET data. The segmentation method is multiscale and based on the Radon transform of brain MR images. It was developed to segment the skull on T1-weighted MR images. A modified fuzzy C-means classification scheme was developed to classify brain tissue into gray matter, white matter, and cerebrospinal fluid. Classified tissue is assigned an attenuation coefficient so that AC factors can be generated. PET emission data are then reconstructed using a three-dimensional ordered sets expectation maximization method with the MR-based AC map. Ten subjects had separate MR and PET scans. The PET data with [11C]PIB were acquired using a high-resolution research tomograph (HRRT). MR-based AC was compared with transmission (TX)-based AC on the HRRT. Seventeen volumes of interest were drawn manually on each subject image to compare the PET activities between the MR-based and TX-based AC methods. Results: For skull segmentation, the overlap ratio between our segmented results and the ground truth is 85.2 ± 2.6%. Attenuation correction results from the ten subjects show that the difference between the MR and TX-based methods was <6.5%. Conclusions: MR-based AC compared favorably with conventional transmission-based AC. Quantitative tools including registration, segmentation, classification, and MR-based AC have been developed for use in combined MR/PET.

  14. Phenotypic feature quantification of patient derived 3D cancer spheroids in fluorescence microscopy image

    NASA Astrophysics Data System (ADS)

    Kang, Mi-Sun; Rhee, Seon-Min; Seo, Ji-Hyun; Kim, Myoung-Hee

    2017-03-01

    Patients' responses to a drug differ at the cellular level. Here, we present an image-based cell phenotypic feature quantification method for predicting the responses of patient-derived glioblastoma cells to a particular drug. We used high-content imaging to understand the features of patient-derived cancer cells. A 3D spheroid culture resembles the in vivo environment more closely than 2D adherent cultures do, and it allows for the observation of cellular aggregate characteristics. However, cell analysis at the individual level is more challenging. In this paper, we demonstrate image-based phenotypic screening of the nuclei of patient-derived cancer cells. We first stitched the images of each well of the 384-well plate imaged under the same conditions. We then used intensity information to detect the colonies. The nuclear intensity and morphological characteristics were used for the segmentation of individual nuclei. Next, we calculated the position of each nucleus, which reflects the spatial pattern of cells in the well environment. Finally, we compared the results obtained using 3D spheroid culture cells with those obtained using 2D adherent culture cells from the same patient treated with the same drugs. This technique could be applied for image-based phenotypic screening of cells to determine the patient's response to the drug.
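
    A minimal sketch of per-nucleus segmentation and position extraction of the kind described above, assuming simple global thresholding and connected-component labeling (the actual pipeline also uses morphological criteria not reproduced here); the image and threshold are toy values.

```python
import numpy as np
from scipy import ndimage as ndi

def nuclei_positions(image, threshold):
    """Segment bright nuclei by thresholding and return their centroids.

    image: 2-D array of nuclear-stain intensities.
    threshold: intensity cutoff separating nuclei from background.
    """
    mask = image > threshold
    labels, n = ndi.label(mask)                              # connected components
    centroids = ndi.center_of_mass(mask, labels, range(1, n + 1))
    return np.array(centroids), n

# Toy image with two bright blobs standing in for nuclei.
img = np.zeros((20, 20))
img[3:6, 3:6] = 10.0
img[12:16, 10:14] = 8.0
centers, count = nuclei_positions(img, threshold=5.0)
print(count, centers)
```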

  15. Generic method for the absolute quantification of glutathione S-conjugates: Application to the conjugates of acetaminophen, clozapine and diclofenac.

    PubMed

    den Braver, Michiel W; Vermeulen, Nico P E; Commandeur, Jan N M

    2017-03-01

    Modification of cellular macromolecules by reactive drug metabolites is considered to play an important role in the initiation of tissue injury by many drugs. Detection and identification of reactive intermediates is often performed by analyzing the conjugates formed after trapping by glutathione (GSH). Although the sensitivity of modern mass spectrometric methods is extremely high, absolute quantification of GSH-conjugates is critically dependent on the availability of authentic references. Although 1H NMR is currently the method of choice for quantification of metabolites formed biosynthetically, its intrinsically low sensitivity can be a limiting factor in quantification of GSH-conjugates, which generally are formed at low levels. In the present study, a simple but sensitive and generic method for absolute quantification of GSH-conjugates is presented. The method is based on quantitative alkaline hydrolysis of GSH-conjugates and subsequent quantification of glutamic acid and glycine by HPLC after precolumn derivatization with o-phthaldialdehyde/N-acetylcysteine (OPA/NAC). Because of the lower stability of the glycine OPA/NAC derivative, quantification of the glutamic acid OPA/NAC derivative appeared most suitable for quantification of GSH-conjugates. The novel method was used to quantify the concentrations of GSH-conjugates of diclofenac, clozapine and acetaminophen, and quantification was consistent with 1H NMR, but with a more than 100-fold lower detection limit for absolute quantification. Copyright © 2017. Published by Elsevier B.V.

  16. Uncertainty quantification for complex systems with very high dimensional response using Grassmann manifold variations

    NASA Astrophysics Data System (ADS)

    Giovanis, D. G.; Shields, M. D.

    2018-07-01

    This paper addresses uncertainty quantification (UQ) for problems where scalar (or low-dimensional vector) response quantities are insufficient and, instead, full-field (very high-dimensional) responses are of interest. To do so, an adaptive stochastic simulation-based methodology is introduced that refines the probability space based on Grassmann manifold variations. The proposed method has a multi-element character discretizing the probability space into simplex elements using a Delaunay triangulation. For every simplex, the high-dimensional solutions corresponding to its vertices (sample points) are projected onto the Grassmann manifold. The pairwise distances between these points are calculated using appropriately defined metrics and the elements with large total distance are sub-sampled and refined. As a result, regions of the probability space that produce significant changes in the full-field solution are accurately resolved. An added benefit is that an approximation of the solution within each element can be obtained by interpolation on the Grassmann manifold. The method is applied to study the probability of shear band formation in a bulk metallic glass using the shear transformation zone theory.
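
    A minimal sketch of the multi-element idea described above: triangulate sample points in the probability space with a Delaunay triangulation and flag for refinement the simplices whose vertex responses differ most. Here a plain difference between scalar responses stands in for the Grassmann-manifold metric on full-field solutions used in the paper; the points, response function, and refinement threshold are arbitrary assumptions.

```python
import numpy as np
from scipy.spatial import Delaunay

rng = np.random.default_rng(1)
points = rng.uniform(size=(12, 2))                          # samples in a 2-D probability space
responses = np.sin(4 * points[:, 0]) + points[:, 1] ** 2    # placeholder scalar responses

tri = Delaunay(points)

def simplex_spread(simplex):
    """Total pairwise distance between the vertex responses of one simplex."""
    r = responses[simplex]
    return sum(abs(r[i] - r[j]) for i in range(len(r)) for j in range(i + 1, len(r)))

spreads = np.array([simplex_spread(s) for s in tri.simplices])
to_refine = tri.simplices[spreads > np.percentile(spreads, 75)]
print(f"{len(to_refine)} of {len(tri.simplices)} simplices flagged for refinement")
```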

  17. Detection and quantification of Plasmodium falciparum in blood samples using quantitative nucleic acid sequence-based amplification.

    PubMed

    Schoone, G J; Oskam, L; Kroon, N C; Schallig, H D; Omar, S A

    2000-11-01

    A quantitative nucleic acid sequence-based amplification (QT-NASBA) assay for the detection of Plasmodium parasites has been developed. Primers and probes were selected on the basis of the sequence of the small-subunit rRNA gene. Quantification was achieved by coamplification of the RNA in the sample with one modified in vitro RNA as a competitor in a single-tube NASBA reaction. Parasite densities ranging from 10 to 10^8 Plasmodium falciparum parasites per ml could be demonstrated and quantified in whole blood. This is approximately 1,000 times more sensitive than conventional microscopy analysis of thick blood smears. Comparison of the parasite densities obtained by microscopy and QT-NASBA with 120 blood samples from Kenyan patients with clinical malaria revealed that for 112 of 120 (93%) of the samples results were within a 1-log difference. QT-NASBA may be especially useful for the detection of low parasite levels in patients with early-stage malaria and for the monitoring of the efficacy of drug treatment.

  18. Quantification of silver nanoparticle uptake and distribution within individual human macrophages by FIB/SEM slice and view.

    PubMed

    Guehrs, Erik; Schneider, Michael; Günther, Christian M; Hessing, Piet; Heitz, Karen; Wittke, Doreen; López-Serrano Oliver, Ana; Jakubowski, Norbert; Plendl, Johanna; Eisebitt, Stefan; Haase, Andrea

    2017-03-21

    Quantification of nanoparticle (NP) uptake in cells or tissues is very important for safety assessment. Often, electron microscopy based approaches are used for this purpose, which allow imaging at very high resolution. However, precise quantification of NP numbers in cells and tissues remains challenging. The aim of this study was to present a novel approach, that combines precise quantification of NPs in individual cells together with high resolution imaging of their intracellular distribution based on focused ion beam/scanning electron microscopy (FIB/SEM) slice and view approaches. We quantified cellular uptake of 75 nm diameter citrate stabilized silver NPs (Ag75Cit) into an individual human macrophage derived from monocytic THP-1 cells using a FIB/SEM slice and view approach. Cells were treated with 10 μg/ml for 24 h. We investigated a single cell and found in total 3138 ± 722 silver NPs inside this cell. Most of the silver NPs were located in large agglomerates, only a few were found in clusters of fewer than five NPs. Furthermore, we cross-checked our results by using inductively coupled plasma mass spectrometry and could confirm the FIB/SEM results. Our approach based on FIB/SEM slice and view is currently the only one that allows the quantification of the absolute dose of silver NPs in individual cells and at the same time to assess their intracellular distribution at high resolution. We therefore propose to use FIB/SEM slice and view to systematically analyse the cellular uptake of various NPs as a function of size, concentration and incubation time.

  19. Ferromagnetic resonance for the quantification of superparamagnetic iron oxide nanoparticles in biological materials

    PubMed Central

    Gamarra, Lionel F; daCosta-Filho, Antonio J; Mamani, Javier B; de Cassia Ruiz, Rita; Pavon, Lorena F; Sibov, Tatiana T; Vieira, Ernanni D; Silva, André C; Pontuschka, Walter M; Amaro, Edson

    2010-01-01

    The aim of the present work is the presentation of a quantification methodology for the control of the amount of superparamagnetic iron oxide nanoparticles (SPIONs) administered in biological materials by means of the ferromagnetic resonance technique (FMR) applied to studies both in vivo and in vitro. The in vivo study consisted in the analysis of the elimination and biodistribution kinetics of SPIONs after intravenous administration in Wistar rats. The results were corroborated by X-ray fluorescence. For the in vitro study, a quantitative analysis of the concentration of SPIONs bound to the specific AC133 monoclonal antibodies was carried out in order to detect the expression of the antigenic epitopes (CD133) in stem cells from human umbilical cord blood. In both studies FMR has proven to be an efficient technique for the SPIONs quantification per volume unit (in vivo) or per labeled cell (in vitro). PMID:20463936

  20. Quantification of Histidine-Rich Protein 3 of Plasmodium falciparum.

    PubMed

    Palani, Balraj

    2018-04-01

    Malaria is a life-threatening infectious disease and continues to be a major public health crisis in many parts of the tropical world. Plasmodium falciparum is responsible for the majority of mortality and morbidity associated with malaria. During the intraerythrocytic cycle, P. falciparum releases three proteins with high histidine content: histidine-rich protein 1 (HRP1), histidine-rich protein 2 (HRP2), and histidine-rich protein 3 (HRP3). Currently, most diagnostic tests for P. falciparum infection target HRP2, and a number of monoclonal antibodies (mAbs) against HRP2 have been developed for use in HRP2 detection and quantification. When parasites have HRP2 deletions, the detection of HRP3 could augment the sensitivity of the detection system. The combination of both HRP2 and HRP3 mAbs in the detection system will enhance the test sensitivity. In the HRP quantitative enzyme-linked immunosorbent assay (ELISA), both HRP2 and HRP3 contribute to the result, but their relative contributions could not be investigated because no HRP3-specific antibody ELISA was available. Hence, an ELISA test system based on HRP3 is also essential for detection and quantification. Little is documented in the literature on the HRP3 antigen and on HRP3-specific mAbs and polyclonal antibodies (pAbs). In the present study, recombinant HRP3 was expressed in Escherichia coli and purified with a Ni-NTA agarose column. The purified rHRP3 was used for the generation and characterization of monoclonal and polyclonal antibodies. The purification of the monoclonal and polyclonal antibodies was done using a mixed-mode chromatography sorbent, phenylpropylamine HyperCel™. With the purified antibodies, a sandwich ELISA was developed. The sandwich ELISA method was explored to detect and quantify HRP3 of P. falciparum in the spent medium. The generated mAbs could potentially be used for the detection and quantification of P. falciparum HRP3.

  1. Quantification of fatal helium exposure following self-administration.

    PubMed

    Malbranque, S; Mauillon, D; Turcant, A; Rouge-Maillart, C; Mangin, P; Varlet, V

    2016-11-01

    Helium is nontoxic at standard conditions, plays no biological role, and is found in trace amounts in human blood. Helium can be dangerous if inhaled in excess, since it is a simple asphyxiant that displaces the oxygen needed for normal respiration, leading to tissue hypoxia. This report presents a fatal case of a middle-aged male victim who died from self-administered helium exposure. For the first time, the quantification of the helium levels in gastric and lung air and in blood samples was achieved using gas chromatography-mass spectrometry after airtight sampling. The results of the toxicological investigation showed that death was caused directly by helium exposure. However, based on the pathomorphological changes detected during the forensic autopsy, we suppose that the fatal outcome resulted from a lack of oxygen after inhalation.

  2. Bone marrow cells obtained from cirrhotic rats do not improve function or reduce fibrosis in a chronic liver disease model.

    PubMed

    Mannheimer, Elida Gripp; Quintanilha, Luiz Fernando; Carvalho, Adriana Bastos; Paredes, Bruno Diaz; Gonçalves de Carvalho, Felipe; Takyia, Cristina Maeda; Resende, Célia Maria Coelho; Ferreira da Motta Rezende, Guilherme; Campos de Carvalho, Antonio Carlos; Schanaider, Alberto; dos Santos Goldenberg, Regina Coeli

    2011-01-01

    The objective of this study was to evaluate the therapeutic potential of bone marrow cells (BMCs) obtained from cirrhotic donors in a model of chronic liver disease. Chronic liver injury was induced in female Wistar rats by the association of an alcoholic diet with intraperitoneal injections of carbon tetrachloride. BMCs obtained from cirrhotic donors or placebo were injected through the portal vein. Blood analysis of alanine aminotransferase (ALT) and albumin levels, ultrasound assessment including the measurement of the portal vein diameter (PVD) and liver echogenicity, histologic evaluation with hematoxylin and eosin and Sirius red staining, and quantification of collagen deposition were performed. ALT and albumin blood levels showed no significant differences between the experimental groups two months after injection. Additionally, no significant variation in PVD and liver echogenicity was found. Histological analysis also showed no significant variation in collagen deposition two months after placebo or BMC injection. This study suggests that, even though BMC therapy using cells from healthy donors has previously shown to be effective, this is not the case when BMCs are obtained from cirrhotic animals. This result has major clinical implications when considering the use of autologous BMCs from patients with chronic liver diseases. © 2009 John Wiley & Sons A/S.

  3. ARFI-based tissue elasticity quantification and kidney graft dysfunction: first clinical experiences.

    PubMed

    Stock, K F; Klein, B S; Cong, M T Vo; Regenbogen, C; Kemmner, S; Büttner, M; Wagenpfeil, S; Matevossian, E; Renders, L; Heemann, U; Küchle, C

    2011-01-01

    Beyond the medical history, the clinical exam and lab findings, non-invasive ultrasound parameters such as kidney size and Doppler values (e.g. the resistive index) are important tools assisting clinical decision making in the monitoring of renal allografts. The gold standard for the diagnosis of renal allograft dysfunction remains the renal biopsy; although invasive, its necessity is justified by its definitive nature, which goes beyond what the synopsis of all non-invasive tools can provide. "Acoustic Radiation Force Impulse Imaging" (ARFI) quantification is a novel ultrasound-based technology for measuring tissue elasticity properties. So far, experience with this new method in renal transplant follow-up has not been reported. The purpose of this study was to evaluate changes in ARFI measurements between clinically stable renal allografts and biopsy-proven transplant dysfunction. We employed "Virtual Touch™ tissue quantification" (Siemens Acuson, S2000) for the quantitative measurement of tissue stiffness in the cortex of transplant kidneys. We performed initial baseline and later disease-evaluative ultrasound examinations in 8 renal transplant patients in a prospective study design. Patients were first examined during stable allograft function with a routine post-transplant renal ultrasound protocol. A second follow-up examination was carried out on subsequent presentation with transplant dysfunction, prior to allograft biopsy and histological evaluation. All patients were examined using ARFI quantification (15 measurements/kidney). Resistive indices (RI) were calculated using pulsed-wave Doppler ultrasound, and transplant kidney size was measured on B-mode ultrasound images. All biopsies were evaluated histologically by a reference nephropathologist unaware of the results of the ultrasound studies. Histopathological diagnoses were based on biopsy results, taking clinical and laboratory findings into account. Finally, we calculated the relative

  4. A multicenter study benchmarks software tools for label-free proteome quantification.

    PubMed

    Navarro, Pedro; Kuharev, Jörg; Gillet, Ludovic C; Bernhardt, Oliver M; MacLean, Brendan; Röst, Hannes L; Tate, Stephen A; Tsou, Chih-Chiang; Reiter, Lukas; Distler, Ute; Rosenberger, George; Perez-Riverol, Yasset; Nesvizhskii, Alexey I; Aebersold, Ruedi; Tenzer, Stefan

    2016-11-01

    Consistent and accurate quantification of proteins by mass spectrometry (MS)-based proteomics depends on the performance of instruments, acquisition methods and data analysis software. In collaboration with the software developers, we evaluated OpenSWATH, SWATH 2.0, Skyline, Spectronaut and DIA-Umpire, five of the most widely used software methods for processing data from sequential window acquisition of all theoretical fragment-ion spectra (SWATH)-MS, which uses data-independent acquisition (DIA) for label-free protein quantification. We analyzed high-complexity test data sets from hybrid proteome samples of defined quantitative composition acquired on two different MS instruments using different SWATH isolation-window setups. For consistent evaluation, we developed LFQbench, an R package, to calculate metrics of precision and accuracy in label-free quantitative MS and report the identification performance, robustness and specificity of each software tool. Our reference data sets enabled developers to improve their software tools. After optimization, all tools provided highly convergent identification and reliable quantification performance, underscoring their robustness for label-free quantitative proteomics.

  5. A rapid Fourier-transform infrared (FTIR) spectroscopic method for direct quantification of paracetamol content in solid pharmaceutical formulations

    NASA Astrophysics Data System (ADS)

    Mallah, Muhammad Ali; Sherazi, Syed Tufail Hussain; Bhanger, Muhammad Iqbal; Mahesar, Sarfaraz Ahmed; Bajeer, Muhammad Ashraf

    2015-04-01

    A transmission FTIR spectroscopic method was developed for the direct, inexpensive and fast quantification of paracetamol content in solid pharmaceutical formulations. In this method, paracetamol content is analyzed directly, without solvent extraction. KBr pellets were prepared for the acquisition of FTIR spectra in transmission mode. Two chemometric models, simple Beer's law and partial least squares, were employed over the spectral region of 1800-1000 cm-1 for the quantification of paracetamol content, with a regression coefficient (R2) of 0.999. The limits of detection and quantification using FTIR spectroscopy were 0.005 mg g-1 and 0.018 mg g-1, respectively. An interference study was also performed to check the effect of the excipients; there was no significant interference from the sample matrix. The results clearly showed the sensitivity of the transmission FTIR spectroscopic method for pharmaceutical analysis. This method is green in the sense that it does not require large volumes of hazardous solvents or long run times and avoids prior sample preparation.
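
    A minimal sketch of a simple Beer's-law calibration with the common 3.3σ/slope and 10σ/slope estimates of LOD and LOQ; the absorbance values are made up for illustration, and the abstract does not state which LOD/LOQ convention the authors actually used.

```python
import numpy as np

# Hypothetical calibration: paracetamol content (mg/g) vs. band absorbance.
conc = np.array([0.0, 0.5, 1.0, 2.0, 4.0])
absorbance = np.array([0.002, 0.101, 0.198, 0.402, 0.799])

slope, intercept = np.polyfit(conc, absorbance, 1)
pred = slope * conc + intercept
residual_sd = np.sqrt(np.sum((absorbance - pred) ** 2) / (len(conc) - 2))
r2 = 1 - np.sum((absorbance - pred) ** 2) / np.sum((absorbance - absorbance.mean()) ** 2)

lod = 3.3 * residual_sd / slope    # common ICH-style estimate (assumed convention)
loq = 10.0 * residual_sd / slope

print(f"slope={slope:.4f}, R2={r2:.4f}, LOD={lod:.4f} mg/g, LOQ={loq:.4f} mg/g")
```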

  6. Quantified Differentiation of Surface Topography for Nano-materials As-Obtained from Atomic Force Microscopy Images

    NASA Astrophysics Data System (ADS)

    Gupta, Mousumi; Chatterjee, Somenath

    2018-04-01

    Surface texture is an important issue for understanding the nature (crests and troughs) of surfaces. Atomic force microscopy (AFM) imaging is a key analysis technique for surface topography. However, at the nano-scale, the nature (i.e., deflections or cracks) as well as the quantification (i.e., height or depth) of deposited layers is essential information for materials scientists. In this paper, a gradient-based K-means algorithm is used to differentiate the layered surfaces depending on their color contrast in the as-obtained AFM images. A transformation using wavelet decomposition is applied to extract information about deflections or cracks on the material surfaces from the same images. Z-axis depth analysis from the wavelet coefficients provides information about cracks present in the material. Using the above methods, the corresponding surface information for the material is obtained. In addition, a Gaussian filter is applied to remove the unwanted lines that occur during AFM scanning. A few known samples are taken as input, and the validity of the above approaches is demonstrated.
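
    A minimal sketch of the two ingredients named above on a synthetic image: K-means clustering of per-pixel height/gradient features to separate layered regions, and a 2-D wavelet decomposition whose detail coefficients respond to sharp steps (crack- or deflection-like features). The library choices (scikit-learn, PyWavelets), wavelet, and synthetic data are assumptions, not the paper's implementation.

```python
import numpy as np
import pywt
from sklearn.cluster import KMeans

# Synthetic "AFM" height map: two layers plus a sharp groove (crack-like feature).
img = np.zeros((64, 64))
img[:, 32:] = 5.0
img[20:22, :] -= 3.0

# Gradient-augmented K-means: cluster pixels on (height, gradient magnitude).
gy, gx = np.gradient(img)
features = np.column_stack([img.ravel(), np.hypot(gx, gy).ravel()])
labels = KMeans(n_clusters=3, n_init=10, random_state=0).fit_predict(features)
segmented = labels.reshape(img.shape)

# One-level 2-D wavelet decomposition; detail bands highlight abrupt depth changes.
cA, (cH, cV, cD) = pywt.dwt2(img, "haar")
print("segment sizes:", np.bincount(segmented.ravel()))
print("max |horizontal detail| (step/crack indicator):", np.abs(cH).max())
```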

  7. Effect of windowing on lithosphere elastic thickness estimates obtained via the coherence method: Results from northern South America

    NASA Astrophysics Data System (ADS)

    Ojeda, GermáN. Y.; Whitman, Dean

    2002-11-01

    The effective elastic thickness (Te) of the lithosphere is a parameter that describes the flexural strength of a plate. A method routinely used to quantify this parameter is to calculate the coherence between the two-dimensional gravity and topography spectra. Prior to spectra calculation, data grids must be "windowed" in order to avoid edge effects. We investigated the sensitivity of Te estimates obtained via the coherence method to mirroring, Hanning and multitaper windowing techniques on synthetic data as well as on data from northern South America. These analyses suggest that the choice of windowing technique plays an important role in Te estimates and may result in discrepancies of several kilometers depending on the selected windowing method. Te results from mirrored grids tend to be greater than those from Hanning smoothed or multitapered grids. Results obtained from mirrored grids are likely to be over-estimates. This effect may be due to artificial long wavelengths introduced into the data at the time of mirroring. Coherence estimates obtained from three subareas in northern South America indicate that the average effective elastic thickness is in the range of 29-30 km, according to Hanning and multitaper windowed data. Lateral variations across the study area could not be unequivocally determined from this study. We suggest that the resolution of the coherence method does not permit evaluation of small (i.e., ˜5 km), local Te variations. However, the efficiency and robustness of the coherence method in rendering continent-scale estimates of elastic thickness has been confirmed.

  8. Theoretical limitations of quantification for noncompetitive sandwich immunoassays.

    PubMed

    Woolley, Christine F; Hayes, Mark A; Mahanti, Prasun; Douglass Gilman, S; Taylor, Tom

    2015-11-01

    Immunoassays exploit the highly selective interaction between antibodies and antigens to provide a vital method for biomolecule detection at low concentrations. Developers and practitioners of immunoassays have long known that non-specific binding often restricts immunoassay limits of quantification (LOQs). Aside from non-specific binding, most efforts by analytical chemists to reduce the LOQ for these techniques have focused on improving the signal amplification methods and minimizing the limitations of the detection system. However, with detection technology now capable of sensing single-fluorescence molecules, this approach is unlikely to lead to dramatic improvements in the future. Here, fundamental interactions based on the law of mass action are analytically connected to signal generation, replacing the four- and five-parameter fittings commercially used to approximate sigmoidal immunoassay curves and allowing quantitative consideration of non-specific binding and statistical limitations in order to understand the ultimate detection capabilities of immunoassays. The restrictions imposed on limits of quantification by instrumental noise, non-specific binding, and counting statistics are discussed based on equilibrium relations for a sandwich immunoassay. Understanding the maximal capabilities of immunoassays for each of these regimes can greatly assist in the development and evaluation of immunoassay platforms. While many studies suggest that single molecule detection is possible through immunoassay techniques, here, it is demonstrated that the fundamental limit of quantification (precision of 10 % or better) for an immunoassay is approximately 131 molecules and this limit is based on fundamental and unavoidable statistical limitations.
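
    To make the counting-statistics part of the argument concrete: if quantification ultimately relies on counting N captured label molecules, Poisson statistics alone give a relative precision of 1/sqrt(N), so a 10% precision target already demands on the order of 100 counted molecules; the paper's figure of ~131 also folds in further statistical terms not reproduced in this tiny sketch.

```python
import math

def poisson_relative_precision(n_molecules):
    """Relative standard deviation from counting statistics alone."""
    return 1.0 / math.sqrt(n_molecules)

def molecules_for_precision(target_rel_sd):
    """Smallest count whose Poisson relative SD meets the target."""
    return math.ceil(1.0 / target_rel_sd ** 2)

print(poisson_relative_precision(131))   # ~0.087, i.e. better than 10 %
print(molecules_for_precision(0.10))     # 100, from counting statistics alone
```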

  9. Self-aliquoting micro-grooves in combination with laser ablation-ICP-mass spectrometry for the analysis of challenging liquids: quantification of lead in whole blood.

    PubMed

    Nischkauer, Winfried; Vanhaecke, Frank; Limbeck, Andreas

    2016-08-01

    We present a technique for the fast screening of the lead concentration in whole blood samples using laser ablation-inductively coupled plasma-mass spectrometry (LA-ICP-MS). The whole blood sample is deposited on a polymeric surface and wiped across a set of micro-grooves previously engraved into the surface. The engraving of the micro-grooves was accomplished with the same laser system used for LA-ICP-MS analysis. In each groove, a part of the liquid blood is trapped, and thus, the sample is divided into sub-aliquots. These aliquots dry quasi instantly and are then investigated by means of LA-ICP-MS. For quantification, external calibration against aqueous standard solutions was relied on, with iron as an internal standard to account for varying volumes of the sample aliquots. The (208)Pb/(57)Fe nuclide ratio used for quantification was obtained via a data treatment protocol so far only used in the context of isotope ratio determination involving transient signals. The method presented here was shown to provide reliable results for Recipe ClinChek® Whole Blood Control levels I-III (nos. 8840-8842), with a repeatability of typically 3 % relative standard deviation (n = 6, for Pb at 442 μg L(-1)). Spiked and non-spiked real whole blood was analysed as well, and the results were compared with those obtained via dilution and sectorfield ICP-MS. A good agreement between both methods was observed. The detection limit (3 s) for lead in whole blood was established to be 10 μg L(-1) for the laser ablation method presented here. Graphical Abstract Micro-grooves are filled with whole blood, dried, and analyzed by laser ablation ICP-mass spectrometry. Notice that the laser moves in perpendicular direction with regard to the micro-grooves.
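
    A minimal sketch of internal-standard quantification as described above: calibrate the analyte/internal-standard signal ratio against aqueous standards, then read unknowns off the calibration line. The numbers are illustrative, and the transient-signal treatment of the (208)Pb/(57)Fe ratio used in the paper is not reproduced here.

```python
import numpy as np

# Calibration standards: known Pb concentration vs. measured Pb/Fe signal ratio
# (illustrative values only).
pb_std = np.array([50.0, 100.0, 200.0, 400.0])     # ug/L
ratio_std = np.array([0.11, 0.21, 0.43, 0.85])     # Pb/Fe intensity ratio

slope, intercept = np.polyfit(pb_std, ratio_std, 1)

def quantify(ratio_sample):
    """Convert a measured Pb/Fe ratio to a Pb concentration via the calibration."""
    return (ratio_sample - intercept) / slope

print(f"sample at ratio 0.95 -> {quantify(0.95):.0f} ug/L Pb")
```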

  10. Lesion detection and quantification performance of the Tachyon-I time-of-flight PET scanner: phantom and human studies.

    PubMed

    Zhang, Xuezhu; Peng, Qiyu; Zhou, Jian; Huber, Jennifer S; Moses, William W; Qi, Jinyi

    2018-03-16

    The first generation Tachyon PET (Tachyon-I) is a demonstration single-ring PET scanner that reaches a coincidence timing resolution of 314 ps using LSO scintillator crystals coupled to conventional photomultiplier tubes. The objective of this study was to quantify the improvement in both lesion detection and quantification performance resulting from the improved time-of-flight (TOF) capability of the Tachyon-I scanner. We developed a quantitative TOF image reconstruction method for the Tachyon-I and evaluated its TOF gain for lesion detection and quantification. Scans of either a standard NEMA torso phantom or healthy volunteers were used as the normal background data. Separately scanned point source and sphere data were superimposed onto the phantom or human data after accounting for the object attenuation. We used the bootstrap method to generate multiple independent noisy datasets with and without a lesion present. The signal-to-noise ratio (SNR) of a channelized hotelling observer (CHO) was calculated for each lesion size and location combination to evaluate the lesion detection performance. The bias versus standard deviation trade-off of each lesion uptake was also calculated to evaluate the quantification performance. The resulting CHO-SNR measurements showed improved performance in lesion detection with better timing resolution. The detection performance was also dependent on the lesion size and location, in addition to the background object size and shape. The results of bias versus noise trade-off showed that the noise (standard deviation) reduction ratio was about 1.1-1.3 over the TOF 500 ps and 1.5-1.9 over the non-TOF modes, similar to the SNR gains for lesion detection. In conclusion, this Tachyon-I PET study demonstrated the benefit of improved time-of-flight capability on lesion detection and ROI quantification for both phantom and human subjects.
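
    A minimal sketch of the detection-SNR idea: given test-statistic values from bootstrap replicates with and without the lesion, the observer SNR is the difference of the means divided by the pooled standard deviation. This is only the generic SNR definition; the channelized Hotelling observer itself (channel templates, covariance estimation) is not reproduced, and the numbers are hypothetical.

```python
import numpy as np

def observer_snr(t_present, t_absent):
    """Detection SNR from test statistics of lesion-present / lesion-absent trials."""
    t_present = np.asarray(t_present, dtype=float)
    t_absent = np.asarray(t_absent, dtype=float)
    pooled_var = 0.5 * (t_present.var(ddof=1) + t_absent.var(ddof=1))
    return (t_present.mean() - t_absent.mean()) / np.sqrt(pooled_var)

# Hypothetical bootstrap test statistics.
rng = np.random.default_rng(7)
t_with = rng.normal(3.0, 1.0, 200)
t_without = rng.normal(0.0, 1.0, 200)
print(f"observer SNR ~= {observer_snr(t_with, t_without):.2f}")
```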

  11. Lesion detection and quantification performance of the Tachyon-I time-of-flight PET scanner: phantom and human studies

    NASA Astrophysics Data System (ADS)

    Zhang, Xuezhu; Peng, Qiyu; Zhou, Jian; Huber, Jennifer S.; Moses, William W.; Qi, Jinyi

    2018-03-01

    The first generation Tachyon PET (Tachyon-I) is a demonstration single-ring PET scanner that reaches a coincidence timing resolution of 314 ps using LSO scintillator crystals coupled to conventional photomultiplier tubes. The objective of this study was to quantify the improvement in both lesion detection and quantification performance resulting from the improved time-of-flight (TOF) capability of the Tachyon-I scanner. We developed a quantitative TOF image reconstruction method for the Tachyon-I and evaluated its TOF gain for lesion detection and quantification. Scans of either a standard NEMA torso phantom or healthy volunteers were used as the normal background data. Separately scanned point source and sphere data were superimposed onto the phantom or human data after accounting for the object attenuation. We used the bootstrap method to generate multiple independent noisy datasets with and without a lesion present. The signal-to-noise ratio (SNR) of a channelized hotelling observer (CHO) was calculated for each lesion size and location combination to evaluate the lesion detection performance. The bias versus standard deviation trade-off of each lesion uptake was also calculated to evaluate the quantification performance. The resulting CHO-SNR measurements showed improved performance in lesion detection with better timing resolution. The detection performance was also dependent on the lesion size and location, in addition to the background object size and shape. The results of bias versus noise trade-off showed that the noise (standard deviation) reduction ratio was about 1.1–1.3 over the TOF 500 ps and 1.5–1.9 over the non-TOF modes, similar to the SNR gains for lesion detection. In conclusion, this Tachyon-I PET study demonstrated the benefit of improved time-of-flight capability on lesion detection and ROI quantification for both phantom and human subjects.

  12. An Update on Phased Array Results Obtained on the GE Counter-Rotating Open Rotor Model

    NASA Technical Reports Server (NTRS)

    Podboy, Gary; Horvath, Csaba; Envia, Edmane

    2013-01-01

    Beamform maps have been generated from 1) simulated data generated by the LINPROP code and 2) actual experimental phased array data obtained on the GE counter-rotating open rotor model. The beamform maps show that many of the tones in the experimental data come from their corresponding Mach radius. If the phased array points to the Mach radius associated with a tone, then it is likely that the tone is a result of the loading and thickness noise on the blades. In this case, the phased array correctly points to where the noise is coming from and indicates the axial location of the loudest source in the image, but not necessarily the correct vertical location. If the phased array does not point to the Mach radius associated with a tone, then some mechanism other than loading and thickness noise may control the amplitude of the tone. In this case, the phased array may or may not point to the actual source. If the source is not rotating, it is likely that the phased array points to the source. If the source is rotating, it is likely that the phased array indicates the axial location of the loudest source but not necessarily the correct vertical location. These results indicate that you have to be careful in how you interpret phased array data obtained on an open rotor, since they may show the tones coming from a location other than the source location. With a subsonic tip speed open rotor the tones can come from locations outboard of the blade tips. This has implications regarding noise shielding.

  13. Linear-array-based photoacoustic tomography for label-free high-throughput detection and quantification of circulating melanoma tumor cell clusters

    NASA Astrophysics Data System (ADS)

    Hai, Pengfei; Zhou, Yong; Zhang, Ruiying; Ma, Jun; Li, Yang; Wang, Lihong V.

    2017-03-01

    Circulating tumor cell (CTC) clusters arise from multicellular grouping in the primary tumor and elevate the metastatic potential by 23- to 50-fold compared to single CTCs. High-throughput detection and quantification of CTC clusters is critical for understanding the tumor metastasis process and improving cancer therapy. In this work, we report a linear-array-based photoacoustic tomography (LA-PAT) system capable of label-free high-throughput CTC cluster detection and quantification in vivo. LA-PAT detects CTC clusters and quantifies the number of cells in them based on the contrast-to-noise ratios (CNRs) of photoacoustic signals. The feasibility of LA-PAT was first demonstrated by imaging CTC clusters ex vivo. LA-PAT detected CTC clusters in the blood-filled microtubes and computed the number of cells in the clusters. The size distribution of the CTC clusters measured by LA-PAT agreed well with that obtained by optical microscopy. We demonstrated the ability of LA-PAT to detect and quantify CTC clusters in vivo by imaging injected CTC clusters in rat tail veins. LA-PAT detected CTC clusters immediately after injection as well as when they were circulating in the rat bloodstream. Similarly, the numbers of cells in the clusters were computed based on the CNRs of the photoacoustic signals. The data showed that larger CTC clusters disappear faster than smaller ones. The results prove the potential of LA-PAT as a promising tool for both preclinical tumor metastasis studies and clinical cancer therapy evaluation.

  14. Absolute quantification by droplet digital PCR versus analog real-time PCR

    PubMed Central

    Hindson, Christopher M; Chevillet, John R; Briggs, Hilary A; Gallichotte, Emily N; Ruf, Ingrid K; Hindson, Benjamin J; Vessella, Robert L; Tewari, Muneesh

    2014-01-01

    Nanoliter-sized droplet technology paired with digital PCR (ddPCR) holds promise for highly precise, absolute nucleic acid quantification. Our comparison of microRNA quantification by ddPCR and real-time PCR revealed greater precision (coefficients of variation decreased by 37–86%) and improved day-to-day reproducibility (by a factor of seven) of ddPCR but with comparable sensitivity. When we applied ddPCR to serum microRNA biomarker analysis, this translated to superior diagnostic performance for identifying individuals with cancer. PMID:23995387
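
    Although the abstract does not spell it out, droplet digital PCR derives absolute concentrations from the fraction of positive partitions via the standard Poisson correction. A minimal sketch of that generic calculation is shown below; the droplet count and the ~0.85 nL droplet volume are illustrative assumptions typical of one commercial system, not values from the study.

```python
import math

def ddpcr_concentration(positive, total, droplet_volume_ul=0.00085):
    """Copies per microliter from the fraction of positive droplets.

    Poisson correction: mean copies per droplet lambda = -ln(1 - p),
    where p is the fraction of positive droplets. The droplet volume
    (here ~0.85 nL) is an assumed, system-dependent constant.
    """
    p = positive / total
    lam = -math.log(1.0 - p)
    return lam / droplet_volume_ul

# Example: 4,000 positive droplets out of 15,000 accepted droplets.
print(f"{ddpcr_concentration(4000, 15000):.0f} copies/uL")
```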

  15. GoIFISH: a system for the quantification of single cell heterogeneity from IFISH images.

    PubMed

    Trinh, Anne; Rye, Inga H; Almendro, Vanessa; Helland, Aslaug; Russnes, Hege G; Markowetz, Florian

    2014-08-26

    Molecular analysis has revealed extensive intra-tumor heterogeneity in human cancer samples, but cannot identify cell-to-cell variations within the tissue microenvironment. In contrast, in situ analysis can identify genetic aberrations in phenotypically defined cell subpopulations while preserving tissue-context specificity. GoIFISH is a widely applicable, user-friendly system tailored for the objective and semi-automated visualization, detection and quantification of genomic alterations and protein expression obtained from fluorescence in situ analysis. In a sample set of HER2-positive breast cancers, GoIFISH is highly robust in visual analysis and its accuracy compares favorably to other leading image analysis methods. GoIFISH is freely available at www.sourceforge.net/projects/goifish/.

  16. Carbon Nanotubes Released from an Epoxy-Based Nanocomposite: Quantification and Particle Toxicity.

    PubMed

    Schlagenhauf, Lukas; Buerki-Thurnherr, Tina; Kuo, Yu-Ying; Wichser, Adrian; Nüesch, Frank; Wick, Peter; Wang, Jing

    2015-09-01

    Studies combining both the quantification of free nanoparticle release and the toxicological investigations of the released particles from actual nanoproducts in a real-life exposure scenario are urgently needed, yet very rare. Here, a new measurement method was established to quantify the amount of free-standing and protruding multiwalled carbon nanotubes (MWCNTs) in the respirable fraction of particles abraded from a MWCNT-epoxy nanocomposite. The quantification approach involves the prelabeling of MWCNTs with lead ions, nanocomposite production, abrasion and collection of the inhalable particle fraction, and quantification of free-standing and protruding MWCNTs by measuring the concentration of released lead ions. In vitro toxicity studies for genotoxicity, reactive oxygen species formation, and cell viability were performed using A549 human alveolar epithelial cells and THP-1 monocyte-derived macrophages. The quantification experiment revealed that in the respirable fraction of the abraded particles, approximately 4000 ppm of the MWCNTs were released as exposed MWCNTs (which could contact lung cells upon inhalation) and approximately 40 ppm as free-standing MWCNTs in the worst-case scenario. The release of exposed MWCNTs was lower for nanocomposites containing agglomerated MWCNTs. The toxicity tests revealed that the abraded particles did not induce any acute cytotoxic effects.

  17. A Bayes network approach to uncertainty quantification in hierarchically developed computational models

    DOE PAGES

    Urbina, Angel; Mahadevan, Sankaran; Paez, Thomas L.

    2012-03-01

    Here, performance assessment of complex systems is ideally accomplished through system-level testing, but because they are expensive, such tests are seldom performed. On the other hand, for economic reasons, data from tests on individual components that are parts of complex systems are more readily available. The lack of system-level data leads to a need to build computational models of systems and use them for performance prediction in lieu of experiments. Because of their complexity, models are sometimes built in a hierarchical manner, starting with simple components, progressing to collections of components, and finally, to the full system. Quantification of uncertainty in the predicted response of a system model is required in order to establish confidence in the representation of actual system behavior. This paper proposes a framework for the complex, but very practical problem of quantification of uncertainty in system-level model predictions. It is based on Bayes networks and uses the available data at multiple levels of complexity (i.e., components, subsystem, etc.). Because epistemic sources of uncertainty were shown to be secondary in this application, only aleatoric uncertainty is included in the present uncertainty quantification. An example showing the application of the techniques to the uncertainty quantification of measures of response of a real, complex aerospace system is included.

  18. Volatile organic silicon compounds in biogases: development of sampling and analytical methods for total silicon quantification by ICP-OES.

    PubMed

    Chottier, Claire; Chatain, Vincent; Julien, Jennifer; Dumont, Nathalie; Lebouil, David; Germain, Patrick

    2014-01-01

    Current waste management policies favor the valorization of biogases (digester gases (DGs) and landfill gases (LFGs)), which is becoming part of energy policy. However, volatile organic silicon compounds (VOSiCs) contained in DGs/LFGs severely damage combustion engines and endanger the conversion into electricity by power plants, resulting in the requirement of a high purification level. Assessing treatment efficiency is still difficult. No consensus has been reached on a standardized method for the sampling and quantification of VOSiCs in gases because of their diversity, their physicochemical properties, and the omnipresence of silicon in analytical chains. Usually, sampling is done by adsorption or absorption, and quantification by gas chromatography-mass spectrometry (GC-MS) or inductively coupled plasma-optical emission spectrometry (ICP-OES). To this end, this paper presents and discusses the optimization of a patented method consisting of VOSiC sampling by absorption in 100% ethanol and quantification of total Si by ICP-OES.

  19. Uncertainty quantification in fission cross section measurements at LANSCE

    DOE PAGES

    Tovesson, F.

    2015-01-09

    Neutron-induced fission cross sections have been measured for several isotopes of uranium and plutonium at the Los Alamos Neutron Science Center (LANSCE) over a wide range of incident neutron energies. The total uncertainties in these measurements are in the range of 3-5% above 100 keV of incident neutron energy and result from uncertainties in the target, neutron source, and detector system. The individual sources of uncertainty are assumed to be uncorrelated; however, correlations in the cross section across neutron energy bins are considered. The quantification of the uncertainty contributions is described here.
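
    A minimal sketch of how uncorrelated relative uncertainty components (target, neutron source, detector) combine in quadrature into a total per-bin cross-section uncertainty; the percentages are placeholders, and the cross-energy-bin correlations treated in the paper are not modeled here.

```python
import numpy as np

# Hypothetical relative uncertainty components (%) per energy bin.
components = {
    "target":       np.array([2.0, 2.0, 2.0]),
    "neutron_flux": np.array([1.5, 2.5, 3.0]),
    "detector":     np.array([1.0, 1.0, 1.5]),
}

# Uncorrelated components add in quadrature.
total = np.sqrt(sum(u ** 2 for u in components.values()))
print("total relative uncertainty per bin (%):", np.round(total, 2))
```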

  20. Quantification, morphology, and viability of equine preantral follicles obtained via the Biopsy Pick-Up method.

    PubMed

    Haag, K T; Magalhães-Padilha, D M; Fonseca, G R; Wischral, A; Gastal, M O; King, S S; Jones, K L; Figueiredo, J R; Gastal, E L

    2013-03-01

    A Biopsy Pick-Up (BPU) method was tested to determine the feasibility of retrieving preantral follicles from mare ovaries in vivo. A total of 33 ovarian biopsy procedures were performed on 18 mares during the breeding season. Mares were 5 to 21 years old and biopsies were performed during the estrous and/or diestrous phase, as confirmed by transrectal ultrasonography. Follicles were mechanically isolated using a tissue chopper, counted, and classified as normal or abnormal and primordial or primary. Viability of isolated follicles was determined by Trypan Blue dye. A total of 256 biopsy attempts were made resulting in 185 successful tissue sample collections (72% success rate). The mean weight of ovarian tissue collected per procedure was 25.0 ± 1.6 mg. Overall, 620 preantral follicles were collected and isolated (95% primordial and 5% primary). The mean (±SEM) number of follicles isolated per biopsy procedure was 18.8 ± 1.9. Primordial and primary follicles had an average diameter of 31.3 ± 6.2 and 41.1 ± 6.6 μm, respectively. Viability rate was higher (P < 0.001) for primordial follicles (91%) compared with primary follicles (50%). Primordial follicles tended (P < 0.06) to have a higher rate of morphological normality (96%) compared with primary follicles (80%). The total number of follicles isolated, amount of tissue harvested, and number of follicles per mg of tissue did not differ (P > 0.05) according to phase of the estrous cycle. Younger mares (5 to 7 years old) had more (P < 0.05) follicles isolated per procedure than older mares (14 to 21 years old). The length of the interovulatory interval was not affected (P > 0.05) by any biopsy procedure, and there were no adverse effects on cyclicity or general reproductive health. In conclusion, the BPU method provided large numbers of normal and viable preantral follicles for the study of early follicular development in mares. The BPU method might be used in the future to obtain preantral follicles for in

  1. Quantification of Structural Isomers via Mode-Selective Irmpd

    NASA Astrophysics Data System (ADS)

    Polfer, Nicolas C.

    2016-06-01

    Mixtures of structural isomers can pose a challenge for vibrational ion spectroscopy. In cases where particular structures display diagnostic vibrations, these structures can be selectively "burned away". In ion traps, the ion population can be subjected to multiple laser shots in order to fully deplete a particular structure, in effect allowing a quantification of this structure. Protonated para-amino benzoic acid (PABA) serves as an illustrative example. PABA is known to preferentially exist in the N-protonated (N-prot) form in solution, but in the gas phase the O-protonated (O-prot) form is energetically favorable. As shown in Figure 1, the N-prot structure can be kinetically trapped in the gas phase when sprayed from non-protic solvent, whereas the O-prot structure is obtained when sprayed from protic solvents, analogous to results by others [1,2]. By parking the light source on the diagnostic mode at 3440 cm-1, the percentage of the O-prot structure can be determined, and by default the remainder is assumed to adopt the N-prot structure. It will be shown that the relative percentages of O-prot vs N-prot are highly dependent on the solvent mixture, going from close to 0% O-prot in non-protic solvents to 99% in protic solvents. Surprisingly, water behaves much more like a non-protic solvent than methanol. It is observed that the capillary temperature, which aids droplet desolvation by black-body radiation in the ESI source, is critical to promote the appearance of O-prot structures. These results are consistent with the picture that a protic bridge mechanism is at play to facilitate proton transfer, and thus allow conversion from N-prot to O-prot, but that this mechanism is subject to appreciable kinetic barriers on the timescale of solvent evaporation. 1. J. Phys. Chem. A 2011, 115, 7625. 2. Anal. Chem. 2012, 84, 7857.

  2. Quantification of Triacylglycerol Molecular Species in Edible Fats and Oils by Gas Chromatography-Flame Ionization Detector Using Correction Factors.

    PubMed

    Yoshinaga, Kazuaki; Obi, Junji; Nagai, Toshiharu; Iioka, Hiroyuki; Yoshida, Akihiko; Beppu, Fumiaki; Gotoh, Naohiro

    2017-03-01

    In the present study, the resolution parameters and correction factors (CFs) of triacylglycerol (TAG) standards were estimated by gas chromatography-flame ionization detector (GC-FID) to achieve the precise quantification of the TAG composition in edible fats and oils. Forty seven TAG standards comprising capric acid, lauric acid, myristic acid, pentadecanoic acid, palmitic acid, palmitoleic acid, stearic acid, oleic acid, linoleic acid, and/or linolenic acid were analyzed, and the CFs of these TAGs were obtained against tripentadecanoyl glycerol as the internal standard. The capillary column was Ultra ALLOY + -65 (30 m × 0.25 mm i.d., 0.10 μm thickness) and the column temperature was programmed to rise from 250°C to 360°C at 4°C/min and then hold for 25 min. The limit of detection (LOD) and limit of quantification (LOQ) values of the TAG standards were > 0.10 mg and > 0.32 mg per 100 mg fat and oil, respectively, except for LnLnLn, and the LOD and LOQ values of LnLnLn were 0.55 mg and 1.84 mg per 100 mg fat and oil, respectively. The CFs of TAG standards decreased with increasing total acyl carbon number and degree of desaturation of TAG molecules. Also, there were no remarkable differences in the CFs between TAG positional isomers such as 1-palmitoyl-2-oleoyl-3-stearoyl-rac-glycerol, 1-stearoyl-2-palmitoyl-3-oleoyl-rac-glycerol, and 1-palmitoyl-2-stearoyl-3-oleoyl-rac-glycerol, which cannot be separated by GC-FID. Furthermore, this method was able to predict the CFs of heterogeneous (AAB- and ABC-type) TAGs from the CFs of homogenous (AAA-, BBB-, and CCC-type) TAGs. In addition, the TAG composition in cocoa butter, palm oil, and canola oil was determined using CFs, and the results were found to be in good agreement with those reported in the literature. Therefore, the GC-FID method using CFs can be successfully used for the quantification of TAG molecular species in natural fats and oils.
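
    A minimal sketch of internal-standard quantification with response correction factors as described above: the amount of each TAG species is obtained from its peak area relative to the tripentadecanoyl glycerol internal standard, scaled by its CF. The areas, CF values, names, and the exact form of the CF relation are illustrative assumptions rather than values from the paper.

```python
# Illustrative GC-FID peak areas and correction factors (CFs).
internal_standard = {"name": "tripentadecanoyl glycerol (IS)", "area": 1.00e6, "amount_mg": 1.0}

peaks = [
    {"name": "TAG_A", "area": 2.40e6, "cf": 0.95},
    {"name": "TAG_B", "area": 0.80e6, "cf": 0.88},
]

def quantify(peak, istd):
    """Assumed relation: amount = (area_analyte / area_IS) * CF * amount_IS."""
    return (peak["area"] / istd["area"]) * peak["cf"] * istd["amount_mg"]

for p in peaks:
    print(f'{p["name"]}: {quantify(p, internal_standard):.2f} mg')
```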

  3. Chemical Composition and Antioxidant Properties of Powders Obtained from Different Plum Juice Formulations.

    PubMed

    Michalska, Anna; Wojdyło, Aneta; Łysiak, Grzegorz P; Figiel, Adam

    2017-01-17

    Among popular crops, plum (Prunus domestica L.) has received special attention due to its health-promoting properties. The seasonality of this fruit makes it impossible to consume it throughout the year, so new products in a powder form may offer an alternative to fresh consumption and may be used as high-quality natural food ingredients. A 100% plum (cultivar "Valor") juice was mixed with three different concentrations of maltodextrin or subjected to sugar removal on an Amberlite XAD column, and dried using freeze, spray, and vacuum (40, 60, and 80 °C) drying techniques. The identification and quantification of phenolic acids, flavonols, and anthocyanins in the plum powders were performed by LC-MS QTof and UPLC-PDA, respectively. l-Ascorbic acid and hydroxymethylfurfural contents were also measured, and antioxidant capacity was assessed by the Trolox equivalent antioxidant capacity (TEAC) ABTS and ferric reducing antioxidant potential (FRAP) methods, in order to compare the influence of the drying methods on product quality. The results indicated that the profile of polyphenolic compounds in the plum juice powders differed significantly from that of the whole plum powders. Drying of a sugar-free plum extract resulted in a higher content of polyphenolic compounds and l-ascorbic acid and a higher antioxidant capacity, but a lower content of hydroxymethylfurfural, regardless of the drying method applied. Thus, the formulation of plum juice before drying and the drying method should be carefully selected in order to obtain high-quality powders.

  4. Emphysema quantification from CT scans using novel application of diaphragm curvature estimation: comparison with standard quantification methods and pulmonary function data

    NASA Astrophysics Data System (ADS)

    Keller, Brad M.; Reeves, Anthony P.; Yankelevitz, David F.; Henschke, Claudia I.; Barr, R. Graham

    2009-02-01

    Emphysema is a disease of the lungs that destroys the alveolar air sacs and induces long-term respiratory dysfunction. CT scans allow for the imaging of the anatomical basis of emphysema and quantification of the underlying disease state. Several measures have been introduced for the quantification of emphysema directly from CT data; most, however, are based on the analysis of density information provided by the CT scans, which varies by scanner and can be hard to standardize across sites and time. Given that one of the anatomical variations associated with the progression of emphysema is the flattening of the diaphragm due to the loss of elasticity in the lung parenchyma, curvature analysis of the diaphragm should provide information about emphysema from CT. Therefore, we propose a new, non-density-based measure of the curvature of the diaphragm intended to allow robust quantification. To evaluate the new method, 24 whole-lung scans were analyzed using the ratios of lung height and of diaphragm width to diaphragm height as curvature estimates, with the emphysema index used for comparison. Pearson correlation coefficients showed a strong trend for several of the proposed diaphragm curvature measures to have higher correlations, of up to r=0.57, with DLCO% and VA than did the emphysema index. Furthermore, we found the emphysema index to have only a 0.27 correlation with the proposed measures, indicating that the proposed measures evaluate different aspects of the disease.
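    A minimal sketch of the ratio-based curvature estimates and their correlation with pulmonary function, assuming the lung and diaphragm dimensions have already been extracted from the scans (all numbers are invented; the image-analysis pipeline itself is not shown):

      # Sketch: ratio-based diaphragm curvature estimates and their Pearson
      # correlation with a pulmonary function variable (hypothetical data).
      import numpy as np
      from scipy.stats import pearsonr

      # One row per scan: [lung_height_mm, diaphragm_width_mm, diaphragm_height_mm]
      geometry = np.array([
          [240.0, 220.0, 35.0],
          [255.0, 230.0, 28.0],
          [230.0, 215.0, 40.0],
          [260.0, 235.0, 22.0],
          [245.0, 225.0, 30.0],
      ])
      dlco_percent = np.array([78.0, 62.0, 85.0, 55.0, 70.0])  # hypothetical DLCO%

      # Flatter diaphragms (smaller diaphragm height) give larger ratios.
      lung_height_ratio = geometry[:, 0] / geometry[:, 2]
      diaphragm_width_ratio = geometry[:, 1] / geometry[:, 2]

      for name, measure in [("lung height / diaphragm height", lung_height_ratio),
                            ("diaphragm width / diaphragm height", diaphragm_width_ratio)]:
          r, p = pearsonr(measure, dlco_percent)
          print(f"{name}: r = {r:.2f} (p = {p:.3f})")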

  5. Integrating Internal Standards into Disposable Capillary Electrophoresis Devices To Improve Quantification

    PubMed Central

    2017-01-01

    To improve point-of-care quantification using microchip capillary electrophoresis (MCE), the chip-to-chip variabilities inherent in disposable, single-use devices must be addressed. This work proposes to integrate an internal standard (ISTD) into the microchip by adding it to the background electrolyte (BGE) instead of the sample—thus eliminating the need for the additional sample manipulation, microchip redesigns, and/or system expansions required for traditional ISTD usage. Cs and Li ions were added as integrated ISTDs to the BGE, and their effects on the reproducibility of Na quantification were explored. Results were then compared to the conclusions of our previous publication, which used Cs and Li as traditional ISTDs. The in-house fabricated microchips, electrophoretic protocols, and solution matrixes were kept constant, allowing the proposed method to be reliably compared to the traditional method. Using the integrated ISTDs, both Cs and Li improved the Na peak area reproducibility approximately 2-fold, to final RSD values of 2.2–4.7% (n = 900). In contrast (to previous work), Cs as a traditional ISTD resulted in final RSDs of 2.5–8.8%, while the traditional Li ISTD performed poorly with RSDs of 6.3–14.2%. These findings suggest integrated ISTDs are a viable method to improve the precision of disposable MCE devices—giving matched or superior results relative to the traditional method in this study while increasing neither system cost nor complexity. PMID:28192985
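    The effect of an ISTD added to the BGE can be sketched with simulated peak areas: chip-to-chip variability that affects the analyte and ISTD peaks in common cancels out when their ratio is taken (the noise levels below are invented and do not reproduce the study's data):

      # Sketch: ISTD normalization of Na peak areas across disposable chips and
      # the resulting relative standard deviation (RSD), using simulated data.
      import numpy as np

      rng = np.random.default_rng(0)
      n_chips = 900

      # Chip-to-chip injection/detection variability affects both peaks similarly.
      chip_factor = rng.normal(1.0, 0.08, n_chips)
      na_area = 1000.0 * chip_factor * rng.normal(1.0, 0.02, n_chips)
      istd_area = 500.0 * chip_factor * rng.normal(1.0, 0.02, n_chips)

      def rsd(x):
          return 100.0 * np.std(x, ddof=1) / np.mean(x)

      print(f"raw Na peak area RSD:        {rsd(na_area):.1f}%")
      print(f"ISTD-normalized (Na/Cs) RSD: {rsd(na_area / istd_area):.1f}%")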

  6. MetaQuant: a tool for the automatic quantification of GC/MS-based metabolome data.

    PubMed

    Bunk, Boyke; Kucklick, Martin; Jonas, Rochus; Münch, Richard; Schobert, Max; Jahn, Dieter; Hiller, Karsten

    2006-12-01

    MetaQuant is a Java-based program for the automatic and accurate quantification of GC/MS-based metabolome data. In contrast to other programs, MetaQuant is able to quantify hundreds of substances simultaneously with minimal manual intervention. An integrated automatic calibration function allows fast calibration for several metabolites in parallel. Finally, MetaQuant is able to import GC/MS data in the common NetCDF format and to export the results of the quantification into Systems Biology Markup Language (SBML), Comma Separated Values (CSV) or Microsoft Excel (XLS) format. MetaQuant is written in Java and is available under an open source license. Precompiled packages for installation on Windows or Linux operating systems are freely available for download. The source code as well as the installation packages are available at http://bioinformatics.org/metaquant
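    MetaQuant itself is a Java program; purely to illustrate the calibration-then-quantification idea it automates (not its actual implementation), a sketch with invented standards and peak areas:

      # Sketch of per-metabolite calibration: fit peak area vs. known concentration
      # for calibration standards, then invert the fit for unknown samples.
      # All numbers are illustrative only.
      import numpy as np

      calibration = {
          # metabolite: (known concentrations [uM], measured peak areas)
          "alanine": (np.array([5.0, 10.0, 25.0, 50.0]),
                      np.array([1.1e4, 2.2e4, 5.4e4, 1.08e5])),
          "glucose": (np.array([10.0, 50.0, 100.0, 200.0]),
                      np.array([3.0e4, 1.6e5, 3.1e5, 6.3e5])),
      }

      unknown_areas = {"alanine": 3.7e4, "glucose": 2.4e5}

      for met, (conc, area) in calibration.items():
          slope, intercept = np.polyfit(conc, area, 1)   # linear calibration curve
          estimate = (unknown_areas[met] - intercept) / slope
          print(f"{met}: ~{estimate:.1f} uM")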

  7. Quantification of the memory imprint effect for a charged particle environment

    NASA Technical Reports Server (NTRS)

    Bhuva, B. L.; Johnson, R. L., Jr.; Gyurcsik, R. S.; Kerns, S. E.; Fernald, K. W.

    1987-01-01

    The effects of total accumulated dose on the single-event vulnerability of NMOS resistive-load SRAMs are investigated. The bias-dependent shifts in device parameters can imprint the memory state present during exposure or erase the imprinted state. Analysis of these effects is presented along with an analytic model developed for the quantification of these effects. The results indicate that the imprint effect is dominated by the difference in the threshold voltage of the n-channel devices.

  8. Surface Enhanced Raman Spectroscopy (SERS) methods for endpoint and real-time quantification of miRNA assays

    NASA Astrophysics Data System (ADS)

    Restaino, Stephen M.; White, Ian M.

    2017-03-01

    Surface Enhanced Raman spectroscopy (SERS) provides significant improvements over conventional methods for single and multianalyte quantification. Specifically, the spectroscopic fingerprint provided by Raman scattering allows for a direct multiplexing potential far beyond that of fluorescence and colorimetry. Additionally, SERS carries a comparatively low financial and spatial footprint compared with common fluorescence-based systems. Despite these advantages, SERS has remained largely an academic pursuit. In the field of biosensing, techniques to apply SERS to molecular diagnostics are constantly under development but, most often, assay protocols are redesigned around the use of SERS as a quantification method, which ultimately complicates existing protocols. Our group has sought to rethink common SERS methodologies in order to produce translational technologies capable of allowing SERS to compete in the evolving, yet often inflexible, biosensing field. This work will discuss the development of two techniques for quantification of microRNA, a promising biomarker for homeostatic and disease conditions ranging from cancer to HIV. First, an inkjet-printed paper SERS sensor has been developed to allow on-demand production of a customizable and multiplexable single-step lateral flow assay for miRNA quantification. Second, as miRNAs commonly exist at relatively low concentrations, amplification methods (e.g. PCR) are required to facilitate quantification. This work presents a novel miRNA assay alongside a novel technique for quantification of nuclease-driven nucleic acid amplification strategies that will allow SERS to be used directly with common amplification strategies for quantification of miRNA and other nucleic acid biomarkers.

  9. Dried blood spot assay for the quantification of phenytoin using Liquid Chromatography-Mass Spectrometry.

    PubMed

    Villanelli, Fabio; Giocaliere, Elisa; Malvagia, Sabrina; Rosati, Anna; Forni, Giulia; Funghini, Silvia; Shokry, Engy; Ombrone, Daniela; Della Bona, Maria Luisa; Guerrini, Renzo; la Marca, Giancarlo

    2015-02-02

    Phenytoin (PHT) is one of the most commonly used anticonvulsant drugs for the treatment of epilepsy and bipolar disorders. The large amount of plasma required by conventional methods for drug quantification makes mass spectrometry combined with dried blood spot (DBS) sampling crucial for pediatric patients, where therapeutic drug monitoring or pharmacokinetic studies may be difficult to realize. DBS represents a new convenient sampling support requiring minimally invasive blood drawing and providing long-term stability of samples and less expensive shipment and storage. The aim of this study was to develop a LC-MS/MS method for the quantification of PHT on DBS. This analytical method was validated and gave good linearity (r(2)=0.999) in the range of 0-100 mg/l. LOQ and LOD were 1.0 mg/l and 0.3 mg/l, respectively. The drug extraction from paper was performed in a few minutes using a mixture composed of 80% organic solvent. The recovery ranged from 85 to 90%; PHT in DBS was shown to be stable at different storage temperatures for one month. A good correlation was also obtained between PHT plasma and DBS concentrations. This method is both precise and accurate and appears to be particularly suitable to monitor treatment with a simple and convenient sample collection procedure. Copyright © 2014 Elsevier B.V. All rights reserved.

  10. Quantification of Carbohydrates in Grape Tissues Using Capillary Zone Electrophoresis

    PubMed Central

    Zhao, Lu; Chanon, Ann M.; Chattopadhyay, Nabanita; Dami, Imed E.; Blakeslee, Joshua J.

    2016-01-01

    Soluble sugars play an important role in freezing tolerance in both herbaceous and woody plants, functioning in both the reduction of freezing-induced dehydration and the cryoprotection of cellular constituents. The quantification of soluble sugars in plant tissues is, therefore, essential in understanding freezing tolerance. While a number of analytical techniques and methods have been used to quantify sugars, most of these are expensive and time-consuming due to complex sample preparation procedures which require the derivatization of the carbohydrates being analyzed. Analysis of soluble sugars using capillary zone electrophoresis (CZE) under alkaline conditions with direct UV detection has previously been used to quantify simple sugars in fruit juices. However, it was unclear whether CZE-based methods could be successfully used to quantify the broader range of sugars present in complex plant extracts. Here, we present the development of an optimized CZE method capable of separating and quantifying mono-, di-, and tri-saccharides isolated from plant tissues. This optimized CZE method employs a column electrolyte buffer containing 130 mM NaOH, pH 13.0, creating a current of 185 μA when a separation voltage of 10 kV is employed. The optimized CZE method provides limits-of-detection (an average of 1.5 ng/μL) for individual carbohydrates comparable or superior to those obtained using gas chromatography–mass spectrometry, and allows resolution of non-structural sugars and cell wall components (structural sugars). The optimized CZE method was successfully used to quantify sugars from grape leaves and buds, and is a robust tool for the quantification of plant sugars found in vegetative and woody tissues. The increased analytical efficiency of this CZE method makes it ideal for use in high-throughput metabolomics studies designed to quantify plant sugars. PMID:27379118

  11. SU-D-218-05: Material Quantification in Spectral X-Ray Imaging: Optimization and Validation.

    PubMed

    Nik, S J; Thing, R S; Watts, R; Meyer, J

    2012-06-01

    To develop and validate a multivariate statistical method to optimize scanning parameters for material quantification in spectral x-ray imaging. An optimization metric was constructed by extensively sampling the thickness space for the expected number of counts for m (two or three) materials. This resulted in an m-dimensional confidence region of material quantities, e.g. thicknesses. Minimization of the ellipsoidal confidence region leads to the optimization of energy bins. For the given spectrum, the minimum counts required for effective material separation can be determined by predicting the signal-to-noise ratio (SNR) of the quantification. A Monte Carlo (MC) simulation framework using BEAM was developed to validate the metric. Projection data of the m materials was generated and material decomposition was performed for combinations of iodine, calcium and water by minimizing the z-score between the expected spectrum and binned measurements. The mean square error (MSE) and variance were calculated to measure the accuracy and precision of this approach, respectively. The minimum MSE corresponds to the optimal energy bins in the BEAM simulations. In the optimization metric, this is equivalent to the smallest confidence region. The SNR of the simulated images was also compared to the predictions from the metric. The MSE was dominated by the variance for the given material combinations, which demonstrates accurate material quantifications. The BEAM simulations revealed that the optimization of energy bins was accurate to within 1 keV. The SNRs predicted by the optimization metric yielded satisfactory agreement but were expectedly higher for the BEAM simulations due to the inclusion of scattered radiation. The validation showed that the multivariate statistical method provides accurate material quantification, correct location of optimal energy bins and adequate prediction of image SNR. The BEAM code system is suitable for generating spectral x-ray imaging simulations.
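    A toy sketch of the z-score-based material decomposition step described above, under strong simplifying assumptions (two materials, known per-bin attenuation coefficients, Poisson counting noise); all numerical values are invented:

      # Toy sketch: two-material decomposition from binned spectral measurements by
      # minimizing a z-score between expected and measured counts.
      import numpy as np
      from scipy.optimize import minimize

      n0 = np.array([2.0e5, 1.5e5, 1.0e5])        # incident counts per energy bin
      mu_iodine = np.array([3.0, 1.8, 1.0])        # 1/cm per bin (illustrative)
      mu_water = np.array([0.25, 0.20, 0.18])      # 1/cm per bin (illustrative)

      def expected_counts(t):
          t_i, t_w = t
          return n0 * np.exp(-(mu_iodine * t_i + mu_water * t_w))

      def z_score_sq(t, measured):
          exp = expected_counts(t)
          return np.sum((measured - exp) ** 2 / exp)   # Poisson variance ~= mean

      rng = np.random.default_rng(1)
      true_thickness = np.array([0.05, 10.0])          # cm iodine, cm water
      measured = rng.poisson(expected_counts(true_thickness)).astype(float)

      fit = minimize(z_score_sq, x0=[0.01, 5.0], args=(measured,),
                     bounds=[(0.0, None), (0.0, None)])
      print("estimated thicknesses (cm):", np.round(fit.x, 3))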

  12. Archetypal Analysis for Sparse Representation-Based Hyperspectral Sub-Pixel Quantification

    NASA Astrophysics Data System (ADS)

    Drees, L.; Roscher, R.

    2017-05-01

    This paper focuses on the quantification of land cover fractions in an urban area of Berlin, Germany, using simulated hyperspectral EnMAP data with a spatial resolution of 30 m × 30 m. For this, sparse representation is applied, where each pixel with unknown surface characteristics is expressed by a weighted linear combination of elementary spectra with known land cover class. The elementary spectra are determined from image reference data using simplex volume maximization, which is a fast heuristic technique for archetypal analysis. In the experiments, the estimation of class fractions based on the archetypal spectral library is compared to the estimation obtained with a manually designed spectral library by means of reconstruction error, mean absolute error of the fraction estimates, sum of fractions and the number of elementary spectra used. We will show that a collection of archetypes can be an adequate and efficient alternative to the manually designed spectral library with respect to the mentioned criteria.
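    The fraction-estimation step (not the simplex-volume-maximization archetype selection itself) can be sketched as a non-negative least-squares unmixing of one pixel against a small library of elementary spectra; the spectra and fractions below are synthetic:

      # Sketch: estimate land-cover fractions for one pixel as a non-negative
      # combination of library spectra (archetypes or manually chosen endmembers).
      import numpy as np
      from scipy.optimize import nnls

      # Columns = elementary spectra with known class (4 spectral bands here).
      library = np.array([
          [0.10, 0.40, 0.35],
          [0.12, 0.45, 0.30],
          [0.30, 0.50, 0.20],
          [0.45, 0.35, 0.15],
      ])
      classes = ["vegetation", "roof", "soil"]

      true_fractions = np.array([0.6, 0.3, 0.1])
      pixel = library @ true_fractions + np.random.default_rng(2).normal(0, 0.005, 4)

      weights, residual = nnls(library, pixel)
      fractions = weights / weights.sum()            # normalize to sum to one
      for name, f in zip(classes, fractions):
          print(f"{name}: {f:.2f}")
      print(f"reconstruction error: {residual:.4f}")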

  13. Detection and quantification of extracellular microRNAs in murine biofluids

    PubMed Central

    2014-01-01

    Background MicroRNAs (miRNAs) are short RNA molecules which regulate gene expression in eukaryotic cells, and are abundant and stable in biofluids such as blood serum and plasma. As such, there has been heightened interest in the utility of extracellular miRNAs as minimally invasive biomarkers for diagnosis and monitoring of a wide range of human pathologies. However, quantification of extracellular miRNAs is subject to a number of specific challenges, including the relatively low RNA content of biofluids, the possibility of contamination with serum proteins (including RNases and PCR inhibitors), hemolysis, platelet contamination/activation, a lack of well-established reference miRNAs and the biochemical properties of miRNAs themselves. Protocols for the detection and quantification of miRNAs in biofluids are therefore of high interest. Results The following protocol was validated by quantifying miRNA abundance in C57 (wild-type) and dystrophin-deficient (mdx) mice. Important differences in miRNA abundance were observed depending on whether blood was taken from the jugular or tail vein. Furthermore, efficiency of miRNA recovery was reduced when sample volumes greater than 50 μl were used. Conclusions Here we describe robust and novel procedures to harvest murine serum/plasma, extract biofluid RNA, amplify specific miRNAs by RT-qPCR and analyze the resulting data, enabling the determination of relative and absolute miRNA abundance in extracellular biofluids with high accuracy, specificity and sensitivity. PMID:24629058

  14. Quantification of Kryptofix 2.2.2 in [18F]fluorine-labelled radiopharmaceuticals by rapid-resolution liquid chromatography.

    PubMed

    Lao, Yexing; Yang, Cuiping; Zou, Wei; Gan, Manquan; Chen, Ping; Su, Weiwei

    2012-05-01

    The cryptand Kryptofix 2.2.2 is used extensively as a phase-transfer reagent in the preparation of [18F]fluoride-labelled radiopharmaceuticals. However, it has considerable acute toxicity. The aim of this study was to develop and validate a method for rapid (within 1 min), specific and sensitive quantification of Kryptofix 2.2.2 at trace levels. Chromatographic separations were carried out by rapid-resolution liquid chromatography (Agilent ZORBAX SB-C18 rapid-resolution column, 2.1 × 30 mm, 3.5 μm). Tandem mass spectra were acquired using a triple quadrupole mass spectrometer equipped with an electrospray ionization interface. Quantitative mass spectrometric analysis was conducted in positive ion mode and multiple reaction monitoring mode for the m/z 377.3 → 114.1 transition for Kryptofix 2.2.2. The external standard method was used for quantification. The method met the precision and efficiency requirements for PET radiopharmaceuticals, providing satisfactory results for specificity, matrix effect, stability, linearity (0.5-100 ng/ml, r(2)=0.9975), precision (coefficient of variation < 5%), accuracy (relative error < ± 3%), sensitivity (lower limit of quantification=0.5 ng/ml) and detection time (<1 min). Fluorodeoxyglucose (n=6) was analysed, and the Kryptofix 2.2.2 content was found to be well below the maximum permissible levels approved by the US Food and Drug Administration. The developed method has a short analysis time (<1 min) and high sensitivity (lower limit of quantification=0.5 ng/ml) and can be successfully applied to rapid quantification of Kryptofix 2.2.2 at trace levels in fluorodeoxyglucose. This method could also be applied to other [18F]fluorine-labelled radiopharmaceuticals that use Kryptofix 2.2.2 as a phase-transfer reagent.

  15. Event-specific real-time detection and quantification of genetically modified Roundup Ready soybean.

    PubMed

    Huang, Chia-Chia; Pan, Tzu-Ming

    2005-05-18

    The event-specific real-time detection and quantification of Roundup Ready soybean (RRS) using an ABI PRISM 7700 sequence detection system with light upon extension (LUX) primer was developed in this study. The event-specific primers were designed, targeting the junction of the RRS 5' integration site and the endogenous gene lectin1. Then, a standard reference plasmid was constructed that carried both of the targeted sequences for quantitative analysis. The detection limit of the LUX real-time PCR system was 0.05 ng of 100% RRS genomic DNA, which was equal to 20.5 copies. The range of quantification was from 0.1 to 100%. The sensitivity and range of quantification successfully met the requirement of the labeling rules in the European Union and Taiwan.
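    The quoted relation between DNA mass and copy number (0.05 ng ≈ 20.5 copies) can be reproduced approximately from the soybean genome size; the 2C genome size and mean base-pair mass used below are assumed round values, so the result is only an estimate:

      # Back-of-the-envelope check of "0.05 ng of genomic DNA ~= 20.5 copies".
      # Assumes a diploid (2C) soybean genome of ~2.2e9 bp and ~660 g/mol per
      # base pair; both values are approximations for illustration.
      AVOGADRO = 6.022e23          # molecules per mole
      BP_MOLAR_MASS = 660.0        # g/mol per base pair (double-stranded average)
      diploid_genome_bp = 2.2e9    # assumed 2C genome size of soybean

      genome_mass_g = diploid_genome_bp * BP_MOLAR_MASS / AVOGADRO
      dna_mass_g = 0.05e-9         # 0.05 ng

      copies = dna_mass_g / genome_mass_g
      print(f"~{copies:.1f} genome copies in 0.05 ng")   # roughly 21 copies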

  16. A reliable and rapid tool for plasma quantification of 18 psychotropic drugs by ESI tandem mass spectrometry.

    PubMed

    Vecchione, Gennaro; Casetta, Bruno; Chiapparino, Antonella; Bertolino, Alessandro; Tomaiuolo, Michela; Cappucci, Filomena; Gatta, Raffaella; Margaglione, Maurizio; Grandone, Elvira

    2012-01-01

    A simple liquid chromatographic tandem mass spectrometry (LC-MS/MS) method has been developed for the simultaneous analysis of 17 basic and one acidic psychotropic drug in human plasma. The method relies on a protein precipitation step for sample preparation and offers high sensitivity and wide linearity without interference from endogenous matrix components. Chromatography was run on a reversed-phase column with an acetonitrile-H₂O mixture. The quantification of target compounds was performed in multiple reaction monitoring (MRM) mode and by switching the ionization polarity within the analytical run. A further sensitivity increase was obtained by implementing the "scheduled multiple reaction monitoring" (sMRM) functionality offered by the recent version of the software package managing the instrument. The overall injection interval was less than 5.5 min. Regression coefficients of the calibration curves and limits of quantification (LOQ) showed good coverage of the over-therapeutic, therapeutic and sub-therapeutic ranges. Recovery rates, measured as the percentage of recovery of spiked plasma samples, were ≥ 94%. Precision and accuracy data were satisfactory for a therapeutic drug monitoring (TDM) service managing plasma samples from patients receiving psychopharmacological treatment. Copyright © 2012 Elsevier B.V. All rights reserved.

  17. Second-order advantage obtained from standard addition first-order instrumental data and multivariate curve resolution-alternating least squares. Calculation of the feasible bands of results.

    PubMed

    Mohseni, Naimeh; Bahram, Morteza; Olivieri, Alejandro C

    2014-03-25

    In order to achieve the second-order advantage, second-order data per sample are usually required, e.g., kinetic-spectrophotometric data. In this study, instead of monitoring the time evolution of spectra (and collecting kinetic-spectrophotometric data), replicate spectra are used to build a virtual second-order data set. This data matrix (replicate mode × λ) is rank-deficient. Augmentation of these data with standard addition data [or standard sample(s)] breaks the rank deficiency, making the quantification of the analyte of interest possible. The MCR-ALS algorithm was applied for the resolution and quantitation of the analyte in both simulated and experimental data sets. In order to evaluate the rotational ambiguity in the retrieved solutions, the MCR-BANDS algorithm was employed. It has been shown that the reliability of the quantitative results significantly depends on the amount of spectral overlap in the spectral region of occurrence of the compound of interest and the remaining constituent(s). Copyright © 2013 Elsevier B.V. All rights reserved.

  18. GC-MS quantification of suspected volatile allergens in fragrances. 2. Data treatment strategies and method performances.

    PubMed

    Bassereau, Maud; Chaintreau, Alain; Duperrex, Stéphanie; Joulain, Daniel; Leijs, Hans; Loesing, Gerd; Owen, Neil; Sherlock, Alan; Schippa, Christine; Thorel, Pierre-Jean; Vey, Matthias

    2007-01-10

    The performances of the GC-MS determination of suspected allergens in fragrance concentrates have been investigated. The limit of quantification was experimentally determined (10 mg/L), and the variability was investigated for three different data treatment strategies: (1) two columns and three quantification ions; (2) two columns and one quantification ion; and (3) one column and three quantification ions. The first strategy best minimizes the risk of determination bias due to coelutions. This risk was evaluated by calculating the probability of coeluting a suspected allergen with perfume constituents exhibiting ions in common. For hydroxycitronellal, when using a two-column strategy, this may statistically occur more than once every 36 analyses for one ion or once every 144 analyses for three ions in common.

  19. PET Quantification of the Norepinephrine Transporter in Human Brain with (S,S)-18F-FMeNER-D2.

    PubMed

    Moriguchi, Sho; Kimura, Yasuyuki; Ichise, Masanori; Arakawa, Ryosuke; Takano, Harumasa; Seki, Chie; Ikoma, Yoko; Takahata, Keisuke; Nagashima, Tomohisa; Yamada, Makiko; Mimura, Masaru; Suhara, Tetsuya

    2017-07-01

    Norepinephrine transporter (NET) in the brain plays important roles in human cognition and the pathophysiology of psychiatric disorders. Two radioligands, (S,S)-11C-MRB and (S,S)-18F-FMeNER-D2, have been used for imaging NETs in the thalamus and midbrain (including locus coeruleus) using PET in humans. However, NET density in the equally important cerebral cortex has not been well quantified because of unfavorable kinetics with (S,S)-11C-MRB and defluorination with (S,S)-18F-FMeNER-D2, which can complicate NET quantification in the cerebral cortex adjacent to the skull containing defluorinated 18F radioactivity. In this study, we have established analysis methods for quantification of NET density in the brain, including the cerebral cortex, using (S,S)-18F-FMeNER-D2 PET. Methods: We analyzed our previous (S,S)-18F-FMeNER-D2 PET data of 10 healthy volunteers dynamically acquired for 240 min with arterial blood sampling. The effect of defluorination on NET quantification in the superficial cerebral cortex was evaluated by establishing the time stability of NET density estimations with an arterial input 2-tissue-compartment model, which guided the less-invasive reference tissue model and area under the time-activity curve methods to accurately quantify NET density in all brain regions including the cerebral cortex. Results: Defluorination of (S,S)-18F-FMeNER-D2 became prominent toward the latter half of the 240-min scan. Total distribution volumes in the superficial cerebral cortex increased with scan durations beyond 120 min. We verified that 90-min dynamic scans provided a sufficient amount of data for quantification of NET density unaffected by defluorination. Reference tissue model binding potential values from the 90-min scan data and area under the time-activity curve ratios of 70- to 90-min data allowed for the accurate quantification of NET density in the cerebral cortex. Conclusion: We have established
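    A minimal sketch of the area-under-the-time-activity-curve ratio mentioned above, computed with the trapezoidal rule over the 70-90 min window for a target and a reference region (the time-activity values are invented):

      # Sketch: AUC ratio between a target and a reference region over 70-90 min
      # of a dynamic PET scan, using the trapezoidal rule (hypothetical values).
      import numpy as np

      def auc(times, activity):
          """Trapezoidal area under a time-activity curve."""
          times, activity = np.asarray(times), np.asarray(activity)
          return float(np.sum(np.diff(times) * (activity[1:] + activity[:-1]) / 2.0))

      times_min = [70.0, 75.0, 80.0, 85.0, 90.0]
      target_tac = [12.0, 11.5, 11.1, 10.8, 10.5]   # e.g. thalamus, kBq/mL (invented)
      reference_tac = [9.0, 8.7, 8.5, 8.3, 8.1]     # reference region, kBq/mL (invented)

      ratio = auc(times_min, target_tac) / auc(times_min, reference_tac)
      print(f"AUC(70-90 min) ratio: {ratio:.2f}")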

  20. Quantification In Situ of Crystalline Cholesterol and Calcium Phosphate Hydroxyapatite in Human Atherosclerotic Plaques by Solid-State Magic Angle Spinning NMR

    PubMed Central

    Guo, Wen; Morrisett, Joel D.; DeBakey, Michael E.; Lawrie, Gerald M.; Hamilton, James A.

    2010-01-01

    Because of renewed interest in the progression, stabilization, and regression of atherosclerotic plaques, it has become important to develop methods for characterizing structural features of plaques in situ and noninvasively. We present a nondestructive method for ex vivo quantification of 2 solid-phase components of plaques: crystalline cholesterol and calcium phosphate salts. Magic angle spinning (MAS) nuclear magnetic resonance (NMR) spectra of human carotid endarterectomy plaques revealed 13C resonances of crystalline cholesterol monohydrate and a 31P resonance of calcium phosphate hydroxyapatite (CPH). The spectra were obtained under conditions in which there was little or no interference from other chemical components and were suitable for quantification in situ of the crystalline cholesterol and CPH. Carotid atherosclerotic plaques showed a wide variation in their crystalline cholesterol content. The calculated molar ratio of liquid-crystalline cholesterol to phospholipid ranged from 1.1 to 1.7, demonstrating different capabilities of the phospholipids to reduce crystallization of cholesterol. The spectral properties of the phosphate groups in CPH in carotid plaques were identical to those of CPH in bone. 31P MAS NMR is a simple, rapid method for quantification of calcium phosphate salts in tissue without extraction and time-consuming chemical analysis. Crystalline phases in intact atherosclerotic plaques (ex vivo) can be quantified accurately by solid-state 13C and 31P MAS NMR spectroscopy. PMID:10845882

  1. [Comparison between the LightCycler CMV Quant Kit (Roche Diagnostics) with a standardized in-house Taqman assay for cytomegalovirus blood viral load quantification].

    PubMed

    Alain, S; Lachaise, V; Hantz, S; Denis, F

    2010-04-01

    The broad use of cytomegalovirus (CMV) viral load quantification in blood to follow immunosuppressed patients requires standardized assays. The choice of whole blood allows follow-up for several viruses and simplifies pretreatment and storage of samples. We therefore evaluated the LightCycler CMV Quant Kit (Roche Diagnostics) assay on whole blood after a manual extraction (High Pure viral nucleic acid kit, Roche Diagnostics), using as a reference an in-house Taqman assay (LC1UL83) which has been validated in various clinical situations. A panel obtained by serial dilutions of a virion stock in CMV whole blood, a commercial plasma quality control (VQC, Argène, France) crude or diluted in whole blood, infected-cell extracts and 46 clinical samples from transplanted patients were tested simultaneously by both techniques. For the plasma quality controls, the two PCR assays were well correlated (R(2)=0.93). On whole blood or infected-cell dilutions, the correlation shows an overestimation by the LC1UL83 assay (mean 1.2 log copies/ml) over 3 logs, although R(2)=0.94. Results with the CMV Quant Kit are closer to the expected values. Results on clinical samples are close to those for the quality controls, with a lower variation of quantification (0.76 log copies/ml). The CMV Quant Kit performs well when compared with a clinically validated PCR. Quality control results showed discrepancies between plasma and whole blood, demonstrating the need for whole-blood standardized panels to compare the methods. This underlines the need to follow a patient with the same technique throughout follow-up. Copyright 2009 Elsevier Masson SAS. All rights reserved.

  2. A Probabilistic Framework for Peptide and Protein Quantification from Data-Dependent and Data-Independent LC-MS Proteomics Experiments

    PubMed Central

    Richardson, Keith; Denny, Richard; Hughes, Chris; Skilling, John; Sikora, Jacek; Dadlez, Michał; Manteca, Angel; Jung, Hye Ryung; Jensen, Ole Nørregaard; Redeker, Virginie; Melki, Ronald; Langridge, James I.; Vissers, Johannes P.C.

    2013-01-01

    A probability-based quantification framework is presented for the calculation of relative peptide and protein abundance in label-free and label-dependent LC-MS proteomics data. The results are accompanied by credible intervals and regulation probabilities. The algorithm takes into account data uncertainties via Poisson statistics modified by a noise contribution that is determined automatically during an initial normalization stage. Protein quantification relies on assignments of component peptides to the acquired data. These assignments are generally of variable reliability and may not be present across all of the experiments comprising an analysis. It is also possible for a peptide to be identified to more than one protein in a given mixture. For these reasons the algorithm accepts a prior probability of peptide assignment for each intensity measurement. The model is constructed in such a way that outliers of any type can be automatically reweighted. Two discrete normalization methods can be employed. The first method is based on a user-defined subset of peptides, while the second method relies on the presence of a dominant background of endogenous peptides for which the concentration is assumed to be unaffected. Normalization is performed using the same computational and statistical procedures employed by the main quantification algorithm. The performance of the algorithm will be illustrated on example data sets, and its utility demonstrated for typical proteomics applications. The quantification algorithm supports relative protein quantification based on precursor and product ion intensities acquired by means of data-dependent methods, originating from all common isotopically-labeled approaches, as well as label-free ion intensity-based data-independent methods. PMID:22871168

  3. Precision and accuracy of clinical quantification of myocardial blood flow by dynamic PET: A technical perspective.

    PubMed

    Moody, Jonathan B; Lee, Benjamin C; Corbett, James R; Ficaro, Edward P; Murthy, Venkatesh L

    2015-10-01

    A number of exciting advances in PET/CT technology and improvements in methodology have recently converged to enhance the feasibility of routine clinical quantification of myocardial blood flow and flow reserve. Recent promising clinical results are pointing toward an important role for myocardial blood flow in the care of patients. Absolute blood flow quantification can be a powerful clinical tool, but its utility will depend on maintaining precision and accuracy in the face of numerous potential sources of methodological errors. Here we review recent data and highlight the impact of PET instrumentation, image reconstruction, and quantification methods, and we emphasize (82)Rb cardiac PET which currently has the widest clinical application. It will be apparent that more data are needed, particularly in relation to newer PET technologies, as well as clinical standardization of PET protocols and methods. We provide recommendations for the methodological factors considered here. At present, myocardial flow reserve appears to be remarkably robust to various methodological errors; however, with greater attention to and more detailed understanding of these sources of error, the clinical benefits of stress-only blood flow measurement may eventually be more fully realized.

  4. A multifractal approach to space-filling recovery for PET quantification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Willaime, Julien M. Y., E-mail: julien.willaime@siemens.com; Aboagye, Eric O.; Tsoumpas, Charalampos

    2014-11-01

    Purpose: A new image-based methodology is developed for estimating the apparent space-filling properties of an object of interest in PET imaging without need for a robust segmentation step and used to recover accurate estimates of total lesion activity (TLA). Methods: A multifractal approach and the fractal dimension are proposed to recover the apparent space-filling index of a lesion (tumor volume, TV) embedded in nonzero background. A practical implementation is proposed, and the index is subsequently used with mean standardized uptake value (SUVmean) to correct TLA estimates obtained from approximate lesion contours. The methodology is illustrated on fractal and synthetic objects contaminated by partial volume effects (PVEs), validated on realistic 18F-fluorodeoxyglucose PET simulations and tested for its robustness using a clinical 18F-fluorothymidine PET test–retest dataset. Results: TLA estimates were stable for a range of resolutions typical in PET oncology (4–6 mm). By contrast, the space-filling index and intensity estimates were resolution dependent. TLA was generally recovered within 15% of ground truth on postfiltered PET images affected by PVEs. Volumes were recovered within 15% variability in the repeatability study. Results indicated that TLA is a more robust index than other traditional metrics such as SUVmean or TV measurements across imaging protocols. Conclusions: The fractal procedure reported here is proposed as a simple and effective computational alternative to existing methodologies which require the incorporation of image preprocessing steps (i.e., partial volume correction and automatic segmentation) prior to quantification.

  5. Rapid quantification of plant-powdery mildew interactions by qPCR and conidiospore counts.

    PubMed

    Weßling, Ralf; Panstruga, Ralph

    2012-08-31

    The powdery mildew disease represents a valuable patho-system to study the interaction between plant hosts and obligate biotrophic fungal pathogens. Numerous discoveries have been made on the basis of the quantitative evaluation of plant-powdery mildew interactions, especially in the context of hyper-susceptible and/or resistant plant mutants. However, the presently available methods to score the pathogenic success of powdery mildew fungi are laborious and thus not well suited for medium- to high-throughput analysis. Here we present two new protocols that allow the rapid quantitative assessment of powdery mildew disease development. One procedure depends on quantitative polymerase chain reaction (qPCR)-based evaluation of fungal biomass, while the other relies on the quantification of fungal conidiospores. We validated both techniques using the powdery mildew pathogen Golovinomyces orontii on a set of hyper-susceptible and resistant Arabidopsis thaliana mutants and found that they cover wide dynamic ranges of one to two (qPCR) and four to five (quantification of conidia) orders of magnitude, respectively. The two approaches yield reproducible results and are easy to perform without specialized equipment. The qPCR and spore count assays rapidly and reproducibly quantify powdery mildew pathogenesis. Our methods are performed at later stages of infection and discern mutant phenotypes accurately. The assays therefore complement currently used procedures of powdery mildew quantification and can overcome some of their limitations. In addition, they can easily be adapted to other plant-powdery mildew patho-systems.
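    The abstract does not give the exact calculation behind the qPCR biomass readout; one common convention for expressing fungal biomass relative to a plant reference gene is the 2^-ΔΔCt approach, sketched here with invented threshold-cycle values (this is an assumption for illustration, not necessarily the authors' analysis):

      # Sketch: relative fungal biomass from qPCR threshold cycles (Ct) using the
      # 2^-ddCt convention, normalized to a plant reference gene. All Ct values
      # are invented; this is not necessarily the exact analysis used in the paper.
      def relative_biomass(ct_fungus, ct_plant, ct_fungus_ref, ct_plant_ref):
          """Fungal signal normalized to plant DNA, relative to a reference sample."""
          d_ct_sample = ct_fungus - ct_plant
          d_ct_reference = ct_fungus_ref - ct_plant_ref
          return 2.0 ** -(d_ct_sample - d_ct_reference)

      # Hypothetical: hyper-susceptible mutant vs. wild-type reference plant.
      print(f"mutant vs. wild type: {relative_biomass(22.1, 18.0, 25.3, 18.2):.1f}-fold")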

  6. High sensitivity mass spectrometric quantification of serum growth hormone by amphiphilic peptide conjugation

    NASA Astrophysics Data System (ADS)

    Arsene, Cristian G.; Schulze, Dirk; Kratzsch, Jürgen; Henrion, André

    2012-12-01

    Amphiphilic peptide conjugation affords a significant increase in sensitivity for protein quantification by electrospray-ionization mass spectrometry. This has been demonstrated here for human growth hormone in serum using N-(3-iodopropyl)-N,N,N-dimethyloctylammonium iodide (IPDOA-iodide) as the derivatizing reagent. The signal enhancement achieved in comparison to the method without derivatization enables extension of the applicable concentration range down to the very low concentrations encountered with clinical glucose suppression tests for patients with acromegaly. The method has been validated using a set of serum samples spiked with known amounts of recombinant 22 kDa growth hormone in the range of 0.48 to 7.65 μg/L. The coefficient of variation (CV), calculated from the deviation of the results from the expected concentrations, was 3.5%, and the limit of quantification (LoQ) was determined as 0.4 μg/L. The potential of the method as a tool in clinical practice has been demonstrated with patient samples of about 1 μg/L.

  7. Strawberry: Fast and accurate genome-guided transcript reconstruction and quantification from RNA-Seq.

    PubMed

    Liu, Ruolin; Dickerson, Julie

    2017-11-01

    We propose a novel method and software tool, Strawberry, for transcript reconstruction and quantification from RNA-Seq data under the guidance of genome alignment and independent of gene annotation. Strawberry consists of two modules: assembly and quantification. The novelty of Strawberry is that the two modules use different optimization frameworks but utilize the same data graph structure, which allows a highly efficient, expandable and accurate algorithm for dealing with large data. The assembly module parses aligned reads into splicing graphs, and uses network flow algorithms to select the most likely transcripts. The quantification module uses a latent class model to assign read counts from the nodes of splicing graphs to transcripts. Strawberry simultaneously estimates the transcript abundances and corrects for sequencing bias through an EM algorithm. Based on simulations, Strawberry outperforms Cufflinks and StringTie in terms of both assembly and quantification accuracies. Under the evaluation of a real data set, the transcript expression estimated by Strawberry has the highest correlation with Nanostring probe counts, an independent experimental measure of transcript expression. Strawberry is written in C++14, and is available as open source software at https://github.com/ruolin/strawberry under the MIT license.

  8. An expert system for the quantification of fault rates in construction fall accidents.

    PubMed

    Talat Birgonul, M; Dikmen, Irem; Budayan, Cenk; Demirel, Tuncay

    2016-01-01

    Expert witness reports, prepared with the aim of quantifying fault rates among parties, play an important role in a court's final decision. However, conflicting fault rates assigned by different expert witness boards lead to iterative objections raised by the related parties. This unfavorable situation mainly originates from the subjectivity of expert judgments and the unavailability of objective information about the causes of accidents. As a solution to this shortcoming, a rule-based expert system, DsSafe, was developed for the quantification of fault rates in construction fall accidents. The aim of developing DsSafe is to decrease the subjectivity inherent in expert witness reports. Eighty-four inspection reports prepared by official and authorized inspectors were examined and the root causes of construction fall accidents in Turkey were identified. Using this information, an evaluation form was designed and submitted to experts. The experts were asked to evaluate the importance level of the factors that govern fall accidents and to determine the fault rates under different scenarios. Based on the expert judgments, the rule-based expert system was developed. The accuracy and reliability of DsSafe were tested with real data obtained from finalized court cases. DsSafe gives satisfactory results.

  9. Reductive amination derivatization for the quantification of garlic components by isotope dilution analysis.

    PubMed

    Lin, Yi-Reng; Huang, Mei-Fang; Wu, You-Ying; Liu, Meng-Chieh; Huang, Jing-Heng; Chen, Ziyu; Shiue, Yow-Ling; Wu, Chia-En; Liang, Shih-Shin

    2017-09-01

    In this work, we synthesized internal standards for four garlic organosulfur compounds (OSCs) by reductive amination with 13C,D2-formaldehyde, and developed an isotope dilution analysis method to quantitate these organosulfur components in garlic samples. Internal standards were synthesized for the absolute quantification of S-allylcysteine (SAC), S-allylcysteine sulfoxide (alliin), S-methylcysteine (SMC), and S-ethylcysteine (SEC). We used multiple reaction monitoring (MRM) to detect the 13C,D2-formaldehyde-modified OSCs by ultrahigh-performance liquid chromatography coupled with tandem mass spectrometry (UHPLC-MS/MS) and obtained MS spectra showing different ratios of 13C,D2-formaldehyde-modified and H2-formaldehyde-modified compounds. The resulting labeled and unlabeled OSCs exhibited correlation coefficients (R2) ranging from 0.9989 to 0.9994. The average recoveries for the four OSCs at three concentration levels ranged from 89% to 105%. Using 13C,D2-formaldehyde and sodium cyanoborohydride, the reductive amination-based method can be employed to generate novel internal standards for isotope dilution and to extend its quantitative applications. Copyright © 2017 Elsevier Ltd. All rights reserved.

  10. HPAEC-PAD quantification of Haemophilus influenzae type b polysaccharide in upstream and downstream samples.

    PubMed

    van der Put, Robert M F; de Haan, Alex; van den IJssel, Jan G M; Hamidi, Ahd; Beurret, Michel

    2015-11-27

    Due to the rapidly increasing introduction of Haemophilus influenzae type b (Hib) and other conjugate vaccines worldwide during the last decade, reliable and robust analytical methods are needed for the quantitative monitoring of intermediate samples generated during fermentation (upstream processing, USP) and purification (downstream processing, DSP) of polysaccharide vaccine components. This study describes the quantitative characterization of in-process control (IPC) samples generated during the fermentation and purification of the capsular polysaccharide (CPS), polyribosyl-ribitol-phosphate (PRP), derived from Hib. Reliable quantitative methods are necessary for all stages of production; otherwise, accurate process monitoring and validation are not possible. Prior to the availability of high performance anion exchange chromatography methods, this polysaccharide was predominantly quantified either with immunochemical methods or with the colorimetric orcinol method, which shows interference from fermentation medium components and reagents used during purification. In addition to an improved high performance anion exchange chromatography-pulsed amperometric detection (HPAEC-PAD) method using a modified gradient elution, both the orcinol assay and high performance size exclusion chromatography (HPSEC) analyses were evaluated. For DSP samples, it was found that the correlation between the results obtained by HPAEC-PAD specific quantification of the PRP monomeric repeat unit released by alkaline hydrolysis and those from the orcinol method was high (R(2)=0.8762), and that it was lower between HPAEC-PAD and HPSEC results. Additionally, HPSEC analysis of USP samples yielded surprisingly comparable results to those obtained by HPAEC-PAD. In the early part of the fermentation, medium components interfered with the different types of analysis, but quantitative HPSEC data could still be obtained, although lacking the specificity of the HPAEC-PAD method. Thus, the HPAEC

  11. Identification and quantification of virulence factors of enterotoxigenic Escherichia coli by high-resolution melting curve quantitative PCR.

    PubMed

    Wang, Weilan; Zijlstra, Ruurd T; Gänzle, Michael G

    2017-05-15

    Diagnosis of enterotoxigenic E. coli (ETEC) associated diarrhea is complicated by the diversity of E. coli virulence factors. This study developed a multiplex quantitative PCR assay based on high-resolution melting curve analysis (HRM-qPCR) to identify and quantify genes encoding five ETEC fimbriae related to diarrhea in swine, i.e. K99, F41, F18, F6 and K88. Five fimbriae expressed by ETEC were amplified in multiple HRM-qPCR reactions to allow simultaneous identification and quantification of the five target genes. The assay was calibrated to allow quantification of the most abundant target gene, and validated by analysis of 30 samples obtained from piglets with diarrhea and healthy controls, and comparison to standard qPCR detection. The five amplicons, with melting temperatures (Tm) ranging from 74.7 ± 0.06 to 80.5 ± 0.15 °C, were well separated by HRM-qPCR. The area of amplicons under the melting peak correlated linearly with the proportion of the template in the calibration mixture when the proportion exceeded 4.8% for K88, or less than 1% for all other amplicons. The suitability of the method was evaluated using 30 samples from weaned pigs aged 6-7 weeks; 14 of these animals suffered from diarrhea as a consequence of poor sanitary conditions. Genes encoding fimbriae and enterotoxins were quantified by HRM-qPCR and/or qPCR. The multiplex HRM-qPCR allowed accurate analysis when the total gene copy number of targets was more than 1 × 10(5)/g wet feces, and the HRM curves were able to simultaneously distinguish fimbriae genes in the fecal samples. The relative quantification of the most abundant F18 based on melting peak area was highly correlated (P < 0.001; r(2) = 0.956) with the individual qPCR result, but the correlation for less abundant fimbriae was much lower. The multiplex HRM assay identifies ETEC virulence factors specifically and efficiently. It correctly indicated the predominant fimbriae type and additionally provides information on the presence/absence of

  12. Literacy and Language Education: The Quantification of Learning

    ERIC Educational Resources Information Center

    Gibb, Tara

    2015-01-01

    This chapter describes international policy contexts of adult literacy and language assessment and the shift toward standardization through measurement tools. It considers the implications the quantification of learning outcomes has for pedagogy and practice and for the social inclusion of transnational migrants.

  13. Nondestructive Detection and Quantification of Blueberry Bruising using Near-infrared (NIR) Hyperspectral Reflectance Imaging

    PubMed Central

    Jiang, Yu; Li, Changying; Takeda, Fumiomi

    2016-01-01

    Currently, blueberry bruising is evaluated by either human visual/tactile inspection or firmness measurement instruments. These methods are destructive, time-consuming, and subjective. The goal of this paper was to develop a non-destructive approach for blueberry bruising detection and quantification. Experiments were conducted on 300 samples of southern highbush blueberry (Camellia, Rebel, and Star) and on 1500 samples of northern highbush blueberry (Bluecrop, Jersey, and Liberty) for hyperspectral imaging analysis, firmness measurement, and human evaluation. An algorithm was developed to automatically calculate a bruise ratio index (ratio of bruised to whole fruit area) for bruise quantification. The spectra of bruised and healthy tissues were statistically separated and the separation was independent of cultivars. Support vector machine (SVM) classification of the spectra from the regions of interest (ROIs) achieved over 94%, 92%, and 96% accuracy on the training set, independent testing set, and combined set, respectively. The statistical results showed that the bruise ratio index was equivalent to the measured firmness but better than the predicted firmness in regard to effectiveness of bruise quantification, and the bruise ratio index had a strong correlation with human assessment (R2 = 0.78 − 0.83). Therefore, the proposed approach and the bruise ratio index are effective to non-destructively detect and quantify blueberry bruising. PMID:27767050
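    A schematic sketch of the classification-plus-ratio idea (an SVM labels pixel spectra as bruised or healthy, and the bruise ratio index is the fraction of bruised pixels among all fruit pixels); the spectra below are simulated Gaussian stand-ins, and the real hyperspectral calibration, segmentation, and validation steps are omitted:

      # Sketch: classify pixel spectra as bruised vs. healthy with an SVM, then
      # report a bruise ratio index (bruised pixels / all fruit pixels).
      import numpy as np
      from sklearn.svm import SVC

      rng = np.random.default_rng(3)
      n_bands = 50
      healthy = rng.normal(0.60, 0.05, (200, n_bands))   # simulated reflectance
      bruised = rng.normal(0.45, 0.05, (200, n_bands))
      X = np.vstack([healthy, bruised])
      y = np.array([0] * 200 + [1] * 200)                 # 0 = healthy, 1 = bruised

      clf = SVC(kernel="rbf").fit(X, y)

      # Pretend these are all fruit pixels from one berry.
      fruit_pixels = np.vstack([rng.normal(0.60, 0.05, (120, n_bands)),
                                rng.normal(0.45, 0.05, (40, n_bands))])
      labels = clf.predict(fruit_pixels)
      bruise_ratio_index = labels.mean()                  # fraction of bruised pixels
      print(f"bruise ratio index: {bruise_ratio_index:.2f}")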

  14. Quantification of Humic Substances in Natural Water Using Nitrogen-Doped Carbon Dots.

    PubMed

    Guan, Yan-Fang; Huang, Bao-Cheng; Qian, Chen; Yu, Han-Qing

    2017-12-19

    Dissolved organic matter (DOM) is ubiquitous in aqueous environments and plays a significant role in pollutant mitigation, transformation and organic geochemical circulation. DOM is also capable of forming carcinogenic byproducts in the disinfection processes used in drinking water treatment. Thus, efficient methods for DOM quantification are highly desired. In this work, a novel sensor for rapid and selective detection of humic substances (HS), a key component of DOM, based on fluorescence quenching of nitrogen-doped carbon quantum dots was developed. The experimental results show that the HS detection range could be broadened to 100 mg/L with a detection limit of 0.2 mg/L. Moreover, the detection was effective within a wide pH range of 3.0 to 12.0, and the interference of ions with the HS measurement was negligible. A good detection result for real surface water samples further validated the feasibility of the developed detection method. Furthermore, a nonradiative electron transfer mechanism for the quenching of the nitrogen-doped carbon-dot fluorescence by HS was elucidated. In addition, we prepared a test paper and demonstrated its effectiveness. This work provides a more sensitive method for HS quantification, with a wider detection range, than the frequently used modified Lowry method.

  15. AQuA: An Automated Quantification Algorithm for High-Throughput NMR-Based Metabolomics and Its Application in Human Plasma.

    PubMed

    Röhnisch, Hanna E; Eriksson, Jan; Müllner, Elisabeth; Agback, Peter; Sandström, Corine; Moazzami, Ali A

    2018-02-06

    A key limiting step for high-throughput NMR-based metabolomics is the lack of rapid and accurate tools for absolute quantification of many metabolites. We developed, implemented, and evaluated an algorithm, AQuA (Automated Quantification Algorithm), for targeted metabolite quantification from complex 1H NMR spectra. AQuA operates based on spectral data extracted from a library consisting of one standard calibration spectrum for each metabolite. It uses one preselected NMR signal per metabolite for determining absolute concentrations and does so by effectively accounting for interferences caused by other metabolites. AQuA was implemented and evaluated using experimental NMR spectra from human plasma. The accuracy of AQuA was tested and confirmed in comparison with a manual spectral fitting approach using the ChenomX software, in which 61 out of 67 metabolites quantified in 30 human plasma spectra showed a goodness-of-fit (r2) close to or exceeding 0.9 between the two approaches. In addition, three quality indicators generated by AQuA, namely, occurrence, interference, and positional deviation, were studied. These quality indicators permit evaluation of the results each time the algorithm is operated. The efficiency was tested and confirmed by implementing AQuA for quantification of 67 metabolites in a large data set comprising 1342 experimental spectra from human plasma, in which the whole computation took less than 1 s.

  16. STATISTICAL ANALYSIS OF TANK 18F FLOOR SAMPLE RESULTS

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Harris, S.

    2010-09-02

    Representative sampling has been completed for characterization of the residual material on the floor of Tank 18F as per the statistical sampling plan developed by Shine [1]. Samples from eight locations have been obtained from the tank floor and two of the samples were archived as a contingency. Six samples, referred to in this report as the current scrape samples, have been submitted to and analyzed by SRNL [2]. This report contains the statistical analysis of the floor sample analytical results to determine if further data are needed to reduce uncertainty. Included are comparisons with the prior Mantis sample results [3] to determine if they can be pooled with the current scrape samples to estimate the upper 95% confidence limits (UCL95%) for concentration. Statistical analysis revealed that the Mantis and current scrape sample results are not compatible. Therefore, the Mantis sample results were not used to support the quantification of analytes in the residual material. Significant spatial variability among the current sample results was not found. Constituent concentrations were similar between the North and South hemispheres as well as between the inner and outer regions of the tank floor. The current scrape sample results from all six samples fall within their 3-sigma limits. In view of the results from numerous statistical tests, the data were pooled from all six current scrape samples. As such, an adequate sample size was provided for quantification of the residual material on the floor of Tank 18F. The uncertainty is quantified in this report by an upper 95% confidence limit (UCL95%) on each analyte concentration. The uncertainty in analyte concentration was calculated as a function of the number of samples, the average, and the standard deviation of the analytical results. The UCL95% was based entirely on the six current scrape sample results (each averaged across three analytical determinations).
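    The report describes the UCL95% as a function of the number of samples, the average, and the standard deviation; the conventional one-sided Student-t upper confidence limit has exactly that form and is sketched below with invented concentration results (this is the standard formula, not necessarily the report's exact calculation):

      # Sketch: one-sided 95% upper confidence limit (UCL95) on a mean analyte
      # concentration from n sample results, using the Student-t distribution.
      # Concentrations are invented for illustration.
      import numpy as np
      from scipy.stats import t

      concentrations = np.array([1.8, 2.1, 1.6, 2.4, 1.9, 2.0])   # e.g. mg/kg
      n = concentrations.size
      mean = concentrations.mean()
      std = concentrations.std(ddof=1)

      ucl95 = mean + t.ppf(0.95, df=n - 1) * std / np.sqrt(n)
      print(f"mean = {mean:.2f}, UCL95 = {ucl95:.2f}")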

  17. Experiences with an identification and quantification program for inhibitor-positive milk samples.

    PubMed

    Kress, Claudia; Seidler, Caroline; Kerp, Bianca; Schneider, Elisabeth; Usleber, Ewald

    2007-03-14

    Beta-lactam antibiotics (penicillins, cephalosporins) are still the most commonly used antibiotics for dairy cows in Germany. In routine milk testing, according to the German milk quality regulation, a positive result obtained for bulk tank milk by microbiological inhibitor tests needs no further confirmation, but results in reduced milk payment of 0.05 euros kg(-1) for one month. In some cases, however, further identification of the causative agent can be of interest, either if antimicrobial drugs have not knowingly been used recently, or if improper use of such drugs is denied. As a service for milk producers, our laboratory offers further analyses of violative milk samples, aiming at the identification and quantification of the inhibitor(s). In this program, a panel of microbiological inhibitor tests, receptor tests, and enzyme immunoassays (EIA) is used in a step-by-step analysis, which primarily focusses on beta-lactams, but also includes other compounds such as sulfonamides or tetracyclines, respectively. Here we report results for violative milk samples (n=63) analysed between 2003 and 2005. In most cases (95%), beta-lactam antibiotics could be identified, although not always at levels exceeding the respective MRL values. Penicillin G (mostly together with benzylpenicilloyl metabolites) could be identified in 74.6% of all samples. Other compounds identified were, in decreasing order, ceftiofur (11%), ampicillin/amoxicillin (6.3%), isoxazolyl penicillins (3.2%), and sulfonamides (1.6%). The results indicate that penicillin G is still the predominant antibiotic responsible for violative bulk tank milk samples as detected during regulatory control.

  18. [Performance evaluation of Abbott RealTime HBV Quantification Kit for HBV viral load by real-time PCR].

    PubMed

    Kim, Myeong Hee; Cha, Choong Hwan; An, Dongheui; Choi, Sung Eun; Oh, Heung Bum

    2008-04-01

    Hepatitis B virus (HBV) DNA quantification is necessary for starting and monitoring antiviral therapy in patients with chronic hepatitis B. This study assessed the clinical performance of the Abbott RealTime HBV Quantification kit (Abbott Laboratories, USA). Performance was evaluated in terms of precision, linearity, detection sensitivity, cross-reactivity, and carry-over. Correlation with the Real-Q HBV Quantification kit (BioSewoom Inc., Korea) was also examined using serum samples from 64 patients diagnosed with chronic hepatitis B who underwent lamivudine therapy at Asan Medical Center. We verified the trueness of the system by comparing its outputs with the assigned values of the BBI panel (BBI Diagnostics, USA). Within-run and between-run coefficients of variation (CV) were 3.56-4.71% and 3.03-4.98%, respectively. Linearity was demonstrated from 53 to 10(9) copies/mL, and the detection sensitivity was verified to be 51 copies/mL. Hepatitis C virus showed no cross-reactivity. No cross-contamination occurred when negative and positive samples were placed alternately in a row. Results correlated well with the Real-Q HBV kit (r(2)=0.9609), and results for the BBI panel agreed well with the assigned values (r(2)=0.9933). The performance of the Abbott RealTime HBV Quantification kit was excellent; thus, it is suitable for starting and monitoring antiviral therapy in Korean patients with chronic hepatitis B.
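
    For readers unfamiliar with the precision and correlation figures quoted above, the sketch below shows how a coefficient of variation and an r(2) between two assays are typically computed; the replicate and paired values are illustrative, not data from this evaluation.

```python
# Minimal sketch of the precision and correlation metrics reported above:
# within-run CV (%) and r^2 between two assays on log10 copies/mL.
# All values below are illustrative, not data from the evaluation.
import numpy as np

def cv_percent(values):
    values = np.asarray(values, dtype=float)
    return 100.0 * values.std(ddof=1) / values.mean()

def r_squared(x, y):
    return np.corrcoef(x, y)[0, 1] ** 2

within_run = [5.21, 5.30, 5.18, 5.25]    # log10 copies/mL, replicates in one run
assay_a = [2.1, 3.4, 4.8, 6.2, 7.5]      # log10 copies/mL, kit under evaluation
assay_b = [2.0, 3.5, 4.7, 6.4, 7.3]      # log10 copies/mL, comparison kit
print(f"within-run CV = {cv_percent(within_run):.2f}%")
print(f"r^2 = {r_squared(assay_a, assay_b):.4f}")
```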

  19. Quantification of integrated HIV DNA by repetitive-sampling Alu-HIV PCR on the basis of Poisson statistics.

    PubMed

    De Spiegelaere, Ward; Malatinkova, Eva; Lynch, Lindsay; Van Nieuwerburgh, Filip; Messiaen, Peter; O'Doherty, Una; Vandekerckhove, Linos

    2014-06-01

    Quantification of integrated proviral HIV DNA by repetitive-sampling Alu-HIV PCR is a candidate virological tool to monitor the HIV reservoir in patients. However, the experimental procedures and data analysis of the assay are complex and hinder its widespread use. Here, we provide an improved and simplified data analysis method by adopting binomial and Poisson statistics. A modified analysis method based on Poisson statistics was used to analyze the binomial data of positive and negative reactions from a 42-replicate Alu-HIV PCR applied to dilutions of an integration standard and to samples from 57 HIV-infected patients. Results were compared with the quantitative output of the previously described Alu-HIV PCR method. Poisson-based quantification of the Alu-HIV PCR was linearly correlated with the standard dilution series, indicating that absolute quantification with the Poisson method is a valid approach for analyzing repetitive-sampling Alu-HIV PCR data. Quantitative outputs of patient samples assessed by the Poisson method correlated with the previously described Alu-HIV PCR analysis, indicating that this method is a valid alternative for quantifying integrated HIV DNA. Poisson-based analysis of the Alu-HIV PCR data enables absolute quantification without the need for a standard dilution curve. Implementation of confidence interval (CI) estimation permits improved qualitative analysis of the data and provides a statistical basis for the required minimal number of technical replicates. © 2014 The American Association for Clinical Chemistry.
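
    The Poisson analysis rests on a standard relation: if a fraction of replicate reactions is negative, the mean number of templates per reaction is the negative natural log of that fraction. The sketch below illustrates this calculation; the replicate counts, cell input, and function name are illustrative assumptions, not the published assay parameters.

```python
# Minimal sketch of Poisson-based absolute quantification from the binomial
# output of a replicate PCR: with N replicates and n_neg negatives, the mean
# number of integrated templates per reaction is lambda = -ln(n_neg / N).
# Replicate counts and cell input below are illustrative, not patient data.
import math

def poisson_quantify(n_positive, n_total, cells_per_reaction):
    n_neg = n_total - n_positive
    if n_neg == 0:
        raise ValueError("all replicates positive: above the quantifiable range")
    lam = -math.log(n_neg / n_total)   # mean templates per reaction
    return lam / cells_per_reaction    # templates per cell

# Example: 28 of 42 replicates positive, 10,000 cell equivalents per reaction
copies_per_cell = poisson_quantify(28, 42, 10_000)
print(f"{copies_per_cell * 1e6:.0f} integrated copies per 1e6 cells")
```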

  20. Self-digitization microfluidic chip for absolute quantification of mRNA in single cells.

    PubMed

    Thompson, Alison M; Gansen, Alexander; Paguirigan, Amy L; Kreutz, Jason E; Radich, Jerald P; Chiu, Daniel T

    2014-12-16

    Quantification of mRNA in single cells provides direct insight into how intercellular heterogeneity plays a role in disease progression and outcomes. Quantitative polymerase chain reaction (qPCR), the current gold standard for evaluating gene expression, is insufficient for providing absolute measurement of single-cell mRNA transcript abundance. Challenges include difficulties in handling small sample volumes and the high variability in measurements. Microfluidic digital PCR provides far better sensitivity for minute quantities of genetic material, but the typical format of this assay does not allow counting of the absolute number of mRNA transcripts in samples taken from single cells. Furthermore, a large fraction of the sample is often lost during sample handling in microfluidic digital PCR. Here, we report the absolute quantification of single-cell mRNA transcripts by digital, one-step reverse transcription PCR in a simple microfluidic array device called the self-digitization (SD) chip. By performing the reverse transcription step in digitized volumes, we find that the assay exhibits a linear signal across a wide range of total RNA concentrations and agrees well with standard-curve qPCR. The SD chip is found to digitize a high percentage (86.7%) of the sample for single-cell experiments. Moreover, quantification of transferrin receptor mRNA in single cells agrees well with single-molecule fluorescence in situ hybridization experiments. The SD platform for absolute quantification of single-cell mRNA can be optimized for other genes and may be useful as an independent control method for the validation of mRNA quantification techniques.
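
    Absolute counting in a digital format such as the SD chip typically relies on a Poisson correction of the positive-partition fraction. The sketch below illustrates that calculation, using the 86.7% digitization figure quoted above; the partition counts and function name are illustrative assumptions rather than the chip's actual specifications.

```python
# Minimal sketch of absolute transcript counting in digital (RT-)PCR:
# with a fraction p of positive partitions, the Poisson-corrected mean is
# -ln(1 - p) copies per partition; multiplying by the partition count and
# dividing by the digitized sample fraction gives total copies in the input.
# Partition counts are illustrative; 86.7% digitization follows the text.
import math

def transcripts_in_input(positive, total_partitions, digitized_fraction=0.867):
    p = positive / total_partitions
    copies_per_partition = -math.log(1.0 - p)     # Poisson correction
    copies_digitized = copies_per_partition * total_partitions
    return copies_digitized / digitized_fraction  # scale back to full input

print(f"{transcripts_in_input(positive=300, total_partitions=1024):.0f} transcripts")
```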