Selective Distance-Based K+ Quantification on Paper-Based Microfluidics.
Gerold, Chase T; Bakker, Eric; Henry, Charles S
2018-04-03
In this study, paper-based microfluidic devices (μPADs) capable of K+ quantification in aqueous samples, as well as in human serum, using both colorimetric and distance-based methods are described. A lipophilic phase containing potassium ionophore I (valinomycin) was utilized to achieve highly selective quantification of K+ in the presence of Na+, Li+, and Mg2+ ions. Successful addition of a suspended lipophilic phase to a wax-printed paper-based device is described and offers a solution to current approaches that rely on organic solvents, which damage wax barriers. The approach provides an avenue for future alkali/alkaline quantification utilizing μPADs. Colorimetric spot tests allowed for K+ quantification from 0.1-5.0 mM using only 3.00 μL of sample solution. Selective distance-based quantification required small sample volumes (6.00 μL) and gave responses sensitive enough to distinguish between 1.0 and 2.5 mM of sample K+. μPADs using distance-based methods were also capable of differentiating between 4.3 and 6.9 mM K+ in human serum samples. Distance-based methods required no digital analysis, electronic hardware, or pumps; any steps required for quantification could be carried out using the naked eye.
A phase quantification method based on EBSD data for a continuously cooled microalloyed steel
DOE Office of Scientific and Technical Information (OSTI.GOV)
Zhao, H.; Wynne, B.P.; Palmiere, E.J.
2017-01-15
Mechanical properties of steels depend on the phase constitutions of the final microstructures which can be related to the processing parameters. Therefore, accurate quantification of different phases is necessary to investigate the relationships between processing parameters, final microstructures and mechanical properties. Point counting on micrographs observed by optical or scanning electron microscopy is widely used as a phase quantification method, and different phases are discriminated according to their morphological characteristics. However, it is difficult to differentiate some of the phase constituents with similar morphology. Differently, for EBSD based phase quantification methods, besides morphological characteristics, other parameters derived from the orientation information can also be used for discrimination. In this research, a phase quantification method based on EBSD data in the unit of grains was proposed to identify and quantify the complex phase constitutions of a microalloyed steel subjected to accelerated coolings. Characteristics of polygonal ferrite/quasi-polygonal ferrite, acicular ferrite and bainitic ferrite on grain averaged misorientation angles, aspect ratios, high angle grain boundary fractions and grain sizes were analysed and used to develop the identification criteria for each phase. Comparing the results obtained by this EBSD based method and point counting, it was found that this EBSD based method can provide accurate and reliable phase quantification results for microstructures with relatively slow cooling rates. - Highlights: •A phase quantification method based on EBSD data in the unit of grains was proposed. •The critical grain area above which GAM angles are valid parameters was obtained. •Grain size and grain boundary misorientation were used to identify acicular ferrite. •High cooling rates deteriorate the accuracy of this EBSD based method.
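To make the grain-wise identification idea above concrete, here is a minimal Python sketch that classifies individual grains from per-grain EBSD metrics; the threshold values and field names are illustrative placeholders, not the criteria established in the paper.

```python
# Minimal sketch of grain-wise phase identification from EBSD-derived grain
# parameters. The thresholds below are illustrative placeholders, NOT the
# identification criteria reported in the paper.

def classify_grain(gam_deg, aspect_ratio, hagb_fraction, size_um):
    """Assign a ferrite type to one grain from simple per-grain metrics."""
    if gam_deg < 0.8 and aspect_ratio < 2.0:
        return "polygonal/quasi-polygonal ferrite"
    if hagb_fraction > 0.5 and size_um < 10.0:
        return "acicular ferrite"
    return "bainitic ferrite"

grains = [
    {"gam_deg": 0.5, "aspect_ratio": 1.4, "hagb_fraction": 0.7, "size_um": 18.0},
    {"gam_deg": 1.6, "aspect_ratio": 3.1, "hagb_fraction": 0.6, "size_um": 6.0},
]
for g in grains:
    print(classify_grain(**g))
```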
NASA Astrophysics Data System (ADS)
Lee, Hyun-Seok; Heun Kim, Sook; Jeong, Ji-Seon; Lee, Yong-Moon; Yim, Yong-Hyeon
2015-10-01
An element-based reductive approach provides an effective means of realizing International System of Units (SI) traceability for high-purity biological standards. Here, we develop an absolute protein quantification method using double isotope dilution (ID) inductively coupled plasma mass spectrometry (ICP-MS) combined with microwave-assisted acid digestion for the first time. We validated the method and applied it to certify the candidate protein certified reference material (CRM) of human growth hormone (hGH). The concentration of hGH was determined by analysing the total amount of sulfur in hGH. Next, size-exclusion chromatography was used with ICP-MS to characterize and quantify sulfur-containing impurities. By subtracting the contribution of sulfur-containing impurities from the total sulfur content in the hGH CRM, we obtained an SI-traceable certification value. The quantification result obtained with the present method based on sulfur analysis was in excellent agreement with the result determined via a well-established protein quantification method based on amino acid analysis using conventional acid hydrolysis combined with ID liquid chromatography-tandem mass spectrometry. The element-based protein quantification method developed here can be generally used for SI-traceable absolute quantification of proteins, especially pure-protein standards.
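The subtraction step described above reduces to a simple sulfur mass balance; written generically (the symbols below are illustrative and not the paper's notation):

```latex
% c_S^tot : total sulfur amount concentration from double ID ICP-MS
% c_S^imp : sulfur bound in sulfur-containing impurities (SEC-ICP-MS)
% n_S     : sulfur atoms per protein molecule (Cys + Met residues)
% M_p     : molar mass of the protein
c_{\mathrm{protein}} \;=\; \frac{c_S^{\mathrm{tot}} - c_S^{\mathrm{imp}}}{n_S}\, M_p
```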
Yan, Xiaowen; Yang, Limin; Wang, Qiuquan
2013-07-01
Much progress has been made in identification of the proteins in proteomes, and quantification of these proteins has attracted much interest. In addition to popular tandem mass spectrometric methods based on soft ionization, inductively coupled plasma mass spectrometry (ICPMS), a typical example of mass spectrometry based on hard ionization, usually used for analysis of elements, has unique advantages in absolute quantification of proteins by determination of an element with a definite stoichiometry in a protein or attached to the protein. In this Trends article, we briefly describe state-of-the-art ICPMS-based methods for quantification of proteins, emphasizing protein-labeling and element-tagging strategies developed on the basis of chemically selective reactions and/or biospecific interactions. Recent progress from protein to cell quantification by use of ICPMS is also discussed, and the possibilities and challenges of ICPMS-based protein quantification for universal, selective, or targeted quantification of proteins and cells in a biological sample are also discussed critically. We believe ICPMS-based protein quantification will become ever more important in targeted quantitative proteomics and bioanalysis in the near future.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Li, Zhou; Adams, Rachel M; Chourey, Karuna
2012-01-01
A variety of quantitative proteomics methods have been developed, including label-free, metabolic labeling, and isobaric chemical labeling using iTRAQ or TMT. Here, these methods were compared in terms of the depth of proteome coverage, quantification accuracy, precision, and reproducibility using a high-performance hybrid mass spectrometer, LTQ Orbitrap Velos. Our results show that (1) the spectral counting method provides the deepest proteome coverage for identification, but its quantification performance is worse than labeling-based approaches, especially the quantification reproducibility; (2) metabolic labeling and isobaric chemical labeling are capable of accurate, precise, and reproducible quantification and provide deep proteome coverage for quantification. Isobaric chemical labeling surpasses metabolic labeling in terms of quantification precision and reproducibility; (3) iTRAQ and TMT perform similarly in all aspects compared in the current study using a CID-HCD dual scan configuration. Based on the unique advantages of each method, we provide guidance for selection of the appropriate method for a quantitative proteomics study.
2014-01-01
Background Various computer-based methods exist for the detection and quantification of protein spots in two dimensional gel electrophoresis images. Area-based methods are commonly used for spot quantification: an area is assigned to each spot and the sum of the pixel intensities in that area, the so-called volume, is used as a measure for spot signal. Other methods use the optical density, i.e. the intensity of the most intense pixel of a spot, or calculate the volume from the parameters of a fitted function. Results In this study we compare the performance of different spot quantification methods using synthetic and real data. We propose a ready-to-use algorithm for spot detection and quantification that uses fitting of two dimensional Gaussian function curves for the extraction of data from two dimensional gel electrophoresis (2-DE) images. The algorithm implements fitting using logical compounds and is computationally efficient. The applicability of the compound fitting algorithm was evaluated for various simulated data and compared with other quantification approaches. We provide evidence that even if an incorrect bell-shaped function is used, the fitting method is superior to other approaches, especially when spots overlap. Finally, we validated the method with experimental data of urea-based 2-DE of Aβ peptides and re-analyzed published data sets. Our methods showed higher precision and accuracy than other approaches when applied to exposure time series and standard gels. Conclusion Compound fitting as a quantification method for 2-DE spots shows several advantages over other approaches and could be combined with various spot detection methods. The algorithm was scripted in MATLAB (Mathworks) and is available as a supplemental file. PMID:24915860
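As an illustration of Gaussian-based spot quantification of this kind, the following Python sketch fits a single 2D Gaussian to a synthetic spot and reports its analytical volume; it is a simplified stand-in for the compound-fitting algorithm, which the authors implemented in MATLAB.

```python
# Minimal sketch of spot quantification by fitting a 2D Gaussian and taking
# its analytical volume 2*pi*A*sigma_x*sigma_y; simplified, not the authors' code.
import numpy as np
from scipy.optimize import curve_fit

def gauss2d(xy, amp, x0, y0, sx, sy, offset):
    x, y = xy
    return (amp * np.exp(-((x - x0) ** 2 / (2 * sx ** 2)
                           + (y - y0) ** 2 / (2 * sy ** 2))) + offset).ravel()

# Synthetic spot on a 40x40 pixel patch with additive noise.
x, y = np.meshgrid(np.arange(40), np.arange(40))
img = gauss2d((x, y), 500.0, 20, 18, 3.0, 4.0, 50.0).reshape(40, 40)
img += np.random.default_rng(0).normal(0, 5, img.shape)

p0 = (img.max() - img.min(), 20, 20, 2.0, 2.0, img.min())
popt, _ = curve_fit(gauss2d, (x, y), img.ravel(), p0=p0)
amp, _, _, sx, sy, _ = popt
volume = 2 * np.pi * amp * abs(sx) * abs(sy)   # spot "volume" above background
print(f"fitted spot volume: {volume:.1f}")
```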
Federal Register 2010, 2011, 2012, 2013, 2014
2013-08-27
... DEPARTMENT OF AGRICULTURE [Docket Number: USDA-2013-0003] Science-Based Methods for Entity-Scale Quantification of Greenhouse Gas Sources and Sinks From Agriculture and Forestry Practices AGENCY: Office of the... of Agriculture (USDA) has prepared a report containing methods for quantifying entity-scale...
NASA Astrophysics Data System (ADS)
Bourgeat, Pierrick; Dore, Vincent; Fripp, Jurgen; Villemagne, Victor L.; Rowe, Chris C.; Salvado, Olivier
2015-03-01
With the advances of PET tracers for β-Amyloid (Aβ) detection in neurodegenerative diseases, automated quantification methods are desirable. For clinical use, there is a great need for a PET-only quantification method, as MR images are not always available. In this paper, we validate a previously developed PET-only quantification method against MR-based quantification using 6 tracers: 18F-Florbetaben (N=148), 18F-Florbetapir (N=171), 18F-NAV4694 (N=47), 18F-Flutemetamol (N=180), 11C-PiB (N=381) and 18F-FDG (N=34). The results show an overall mean absolute percentage error of less than 5% for each tracer. The method has been implemented as a remote service called CapAIBL (http://milxcloud.csiro.au/capaibl). PET images are uploaded to a cloud platform where they are spatially normalised to a standard template and quantified. A report containing global as well as local quantification, along with surface projection of the β-Amyloid deposition, is automatically generated at the end of the pipeline and emailed to the user.
Alves, L P S; Almeida, A T; Cruz, L M; Pedrosa, F O; de Souza, E M; Chubatsu, L S; Müller-Santos, M; Valdameri, G
2017-01-16
The conventional method for quantification of polyhydroxyalkanoates based on whole-cell methanolysis and gas chromatography (GC) is laborious and time-consuming. In this work, a method based on flow cytometry of Nile red stained bacterial cells was established to quantify poly-3-hydroxybutyrate (PHB) production by the diazotrophic and plant-associated bacteria, Herbaspirillum seropedicae and Azospirillum brasilense. The method consists of three steps: i) cell permeabilization, ii) Nile red staining, and iii) analysis by flow cytometry. The method was optimized step-by-step and can be carried out in less than 5 min. The final results indicated a high correlation coefficient (R2=0.99) compared to a standard method based on methanolysis and GC. This method was successfully applied to the quantification of PHB in epiphytic bacteria isolated from rice roots.
RNA-Skim: a rapid method for RNA-Seq quantification at transcript level
Zhang, Zhaojun; Wang, Wei
2014-01-01
Motivation: RNA-Seq technique has been demonstrated as a revolutionary means for exploring transcriptome because it provides deep coverage and base pair-level resolution. RNA-Seq quantification is proven to be an efficient alternative to the microarray technique in gene expression studies, and it is a critical component in RNA-Seq differential expression analysis. Most existing RNA-Seq quantification tools require the alignments of fragments to either a genome or a transcriptome, entailing a time-consuming and intricate alignment step. To improve the performance of RNA-Seq quantification, an alignment-free method, Sailfish, has been recently proposed to quantify transcript abundances using all k-mers in the transcriptome, demonstrating the feasibility of designing an efficient alignment-free method for transcriptome quantification. Even though Sailfish is substantially faster than alternative alignment-dependent methods such as Cufflinks, using all k-mers in the transcriptome quantification impedes the scalability of the method. Results: We propose a novel RNA-Seq quantification method, RNA-Skim, which partitions the transcriptome into disjoint transcript clusters based on sequence similarity, and introduces the notion of sig-mers, which are a special type of k-mers uniquely associated with each cluster. We demonstrate that the sig-mer counts within a cluster are sufficient for estimating transcript abundances with accuracy comparable with any state-of-the-art method. This enables RNA-Skim to perform transcript quantification on each cluster independently, reducing a complex optimization problem into smaller optimization tasks that can be run in parallel. As a result, RNA-Skim uses <4% of the k-mers and <10% of the CPU time required by Sailfish. It is able to finish transcriptome quantification in <10 min per sample by using just a single thread on a commodity computer, which represents a >100-fold speedup over the state-of-the-art alignment-based methods, while delivering comparable or higher accuracy. Availability and implementation: The software is available at http://www.csbio.unc.edu/rs. Contact: weiwang@cs.ucla.edu Supplementary information: Supplementary data are available at Bioinformatics online. PMID:24931995
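A toy sketch of the sig-mer counting idea follows; it only illustrates counting cluster-specific k-mers in reads and using the counts as a crude abundance signal, and is not the RNA-Skim algorithm or its estimator.

```python
# Toy illustration of the sig-mer idea: count cluster-specific k-mers in reads
# and treat the counts as a crude abundance signal. Didactic simplification
# only; sig-mers, reads, and clusters below are made up.
from collections import Counter

K = 5
# Hypothetical sig-mers, each uniquely associated with one transcript cluster.
sigmers = {"ATGCA": "cluster1", "GGTAC": "cluster1", "TTACG": "cluster2"}

reads = ["AATGCATTACGGA", "GGGTACCATGCAA", "TTTTACGTTACGT"]
counts = Counter()
for read in reads:
    for i in range(len(read) - K + 1):
        kmer = read[i:i + K]
        if kmer in sigmers:
            counts[sigmers[kmer]] += 1

total = sum(counts.values())
for cluster, c in counts.items():
    print(cluster, c, f"{c / total:.2f}")
```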
den Braver, Michiel W; Vermeulen, Nico P E; Commandeur, Jan N M
2017-03-01
Modification of cellular macromolecules by reactive drug metabolites is considered to play an important role in the initiation of tissue injury by many drugs. Detection and identification of reactive intermediates is often performed by analyzing the conjugates formed after trapping by glutathione (GSH). Although the sensitivity of modern mass spectrometric methods is extremely high, absolute quantification of GSH-conjugates is critically dependent on the availability of authentic references. Although 1H NMR is currently the method of choice for quantification of metabolites formed biosynthetically, its intrinsically low sensitivity can be a limiting factor in quantification of GSH-conjugates, which are generally formed at low levels. In the present study, a simple but sensitive and generic method for absolute quantification of GSH-conjugates is presented. The method is based on quantitative alkaline hydrolysis of GSH-conjugates and subsequent quantification of glutamic acid and glycine by HPLC after precolumn derivatization with o-phthaldialdehyde/N-acetylcysteine (OPA/NAC). Because of the lower stability of the glycine OPA/NAC derivative, quantification of the glutamic acid OPA/NAC derivative appeared most suitable for quantification of GSH-conjugates. The novel method was used to quantify the concentrations of GSH-conjugates of diclofenac, clozapine and acetaminophen, and quantification was consistent with 1H NMR, but with a more than 100-fold lower detection limit for absolute quantification.
Shimizu, Eri; Kato, Hisashi; Nakagawa, Yuki; Kodama, Takashi; Futo, Satoshi; Minegishi, Yasutaka; Watanabe, Takahiro; Akiyama, Hiroshi; Teshima, Reiko; Furui, Satoshi; Hino, Akihiro; Kitta, Kazumi
2008-07-23
A novel type of quantitative competitive polymerase chain reaction (QC-PCR) system for the detection and quantification of the Roundup Ready soybean (RRS) was developed. This system was designed to take advantage of a fully validated real-time PCR method used for the quantification of RRS in Japan. A plasmid was constructed as a competitor plasmid for the detection and quantification of genetically modified soy, RRS. The plasmid contained the construct-specific sequence of RRS and the taxon-specific sequence of lectin1 (Le1), each carrying a 21 bp oligonucleotide insertion in its sequence. The plasmid DNA was used as a reference molecule instead of ground seeds, which enabled us to precisely and stably adjust the copy number of targets. The present study demonstrated that the novel plasmid-based QC-PCR method could be a simple and feasible alternative to the real-time PCR method used for the quantification of genetically modified organism contents.
Quantifying construction and demolition waste: An analytical review
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wu, Zezhou; Yu, Ann T.W., E-mail: bsannyu@polyu.edu.hk; Shen, Liyin
2014-09-15
Highlights: • Prevailing C and D waste quantification methodologies are identified and compared. • One specific methodology cannot fulfill all waste quantification scenarios. • A relevance tree for appropriate quantification methodology selection is proposed. • More attention should be paid to civil and infrastructural works. • Classified information is suggested for making an effective waste management plan. - Abstract: Quantifying construction and demolition (C and D) waste generation is regarded as a prerequisite for the implementation of successful waste management. In literature, various methods have been employed to quantify the C and D waste generation at both regional and project levels. However, an integrated review that systemically describes and analyses all the existing methods has yet to be conducted. To bridge this research gap, an analytical review is conducted. Fifty-seven papers are retrieved based on a set of rigorous procedures. The characteristics of the selected papers are classified according to the following criteria - waste generation activity, estimation level and quantification methodology. Six categories of existing C and D waste quantification methodologies are identified, including site visit method, waste generation rate method, lifetime analysis method, classification system accumulation method, variables modelling method and other particular methods. A critical comparison of the identified methods is given according to their characteristics and implementation constraints. Moreover, a decision tree is proposed for aiding the selection of the most appropriate quantification method in different scenarios. Based on the analytical review, limitations of previous studies and recommendations of potential future research directions are further suggested.
Alves, L.P.S.; Almeida, A.T.; Cruz, L.M.; Pedrosa, F.O.; de Souza, E.M.; Chubatsu, L.S.; Müller-Santos, M.; Valdameri, G.
2017-01-01
The conventional method for quantification of polyhydroxyalkanoates based on whole-cell methanolysis and gas chromatography (GC) is laborious and time-consuming. In this work, a method based on flow cytometry of Nile red stained bacterial cells was established to quantify poly-3-hydroxybutyrate (PHB) production by the diazotrophic and plant-associated bacteria, Herbaspirillum seropedicae and Azospirillum brasilense. The method consists of three steps: i) cell permeabilization, ii) Nile red staining, and iii) analysis by flow cytometry. The method was optimized step-by-step and can be carried out in less than 5 min. The final results indicated a high correlation coefficient (R2=0.99) compared to a standard method based on methanolysis and GC. This method was successfully applied to the quantification of PHB in epiphytic bacteria isolated from rice roots. PMID:28099582
Wang, Hanghang; Muehlbauer, Michael J.; O’Neal, Sara K.; Newgard, Christopher B.; Hauser, Elizabeth R.; Shah, Svati H.
2017-01-01
The field of metabolomics as applied to human disease and health is rapidly expanding. In recent efforts of metabolomics research, greater emphasis has been placed on quality control and method validation. In this study, we report an experience with quality control and a practical application of method validation. Specifically, we sought to identify and modify steps in gas chromatography-mass spectrometry (GC-MS)-based, non-targeted metabolomic profiling of human plasma that could influence metabolite identification and quantification. Our experimental design included two studies: (1) a limiting-dilution study, which investigated the effects of dilution on analyte identification and quantification; and (2) a concentration-specific study, which compared the optimal plasma extract volume established in the first study with the volume used in the current institutional protocol. We confirmed that contaminants, concentration, repeatability and intermediate precision are major factors influencing metabolite identification and quantification. In addition, we established methods for improved metabolite identification and quantification, which were summarized to provide recommendations for experimental design of GC-MS-based non-targeted profiling of human plasma. PMID:28841195
Lamb Wave Damage Quantification Using GA-Based LS-SVM.
Sun, Fuqiang; Wang, Ning; He, Jingjing; Guan, Xuefei; Yang, Jinsong
2017-06-12
Lamb waves have been reported to be an efficient tool for non-destructive evaluations (NDE) for various application scenarios. However, accurate and reliable damage quantification using the Lamb wave method is still a practical challenge, due to the complex underlying mechanism of Lamb wave propagation and damage detection. This paper presents a Lamb wave damage quantification method using a least square support vector machine (LS-SVM) and a genetic algorithm (GA). Three damage sensitive features, namely, normalized amplitude, phase change, and correlation coefficient, were proposed to describe changes of Lamb wave characteristics caused by damage. In view of commonly used data-driven methods, the GA-based LS-SVM model using the proposed three damage sensitive features was implemented to evaluate the crack size. The GA method was adopted to optimize the model parameters. The results of GA-based LS-SVM were validated using coupon test data and lap joint component test data with naturally developed fatigue cracks. Cases of different loading and manufacturer were also included to further verify the robustness of the proposed method for crack quantification.
Lamb Wave Damage Quantification Using GA-Based LS-SVM
Sun, Fuqiang; Wang, Ning; He, Jingjing; Guan, Xuefei; Yang, Jinsong
2017-01-01
Lamb waves have been reported to be an efficient tool for non-destructive evaluations (NDE) for various application scenarios. However, accurate and reliable damage quantification using the Lamb wave method is still a practical challenge, due to the complex underlying mechanism of Lamb wave propagation and damage detection. This paper presents a Lamb wave damage quantification method using a least square support vector machine (LS-SVM) and a genetic algorithm (GA). Three damage sensitive features, namely, normalized amplitude, phase change, and correlation coefficient, were proposed to describe changes of Lamb wave characteristics caused by damage. In view of commonly used data-driven methods, the GA-based LS-SVM model using the proposed three damage sensitive features was implemented to evaluate the crack size. The GA method was adopted to optimize the model parameters. The results of GA-based LS-SVM were validated using coupon test data and lap joint component test data with naturally developed fatigue cracks. Cases of different loading and manufacturer were also included to further verify the robustness of the proposed method for crack quantification. PMID:28773003
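The following Python sketch illustrates the general GA-plus-kernel-regression recipe on synthetic damage-sensitive features; KernelRidge with an RBF kernel is used as a stand-in for LS-SVM, and none of the data, parameter ranges, or GA settings come from the paper.

```python
# Minimal sketch of GA-tuned kernel regression for damage-size estimation from
# three Lamb-wave features (normalized amplitude, phase change, correlation
# coefficient). KernelRidge (RBF) stands in for LS-SVM; data are synthetic.
import numpy as np
from sklearn.kernel_ridge import KernelRidge
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(1)
X = rng.uniform(0, 1, size=(60, 3))                       # damage-sensitive features
y = 2.0 * X[:, 0] + 0.5 * X[:, 1] - X[:, 2] + rng.normal(0, 0.05, 60)  # crack size

def fitness(log_alpha, log_gamma):
    model = KernelRidge(kernel="rbf", alpha=10 ** log_alpha, gamma=10 ** log_gamma)
    return cross_val_score(model, X, y, cv=5, scoring="neg_mean_squared_error").mean()

# Very small genetic algorithm over (log10 alpha, log10 gamma).
pop = rng.uniform([-4, -2], [1, 2], size=(20, 2))
for _ in range(15):
    scores = np.array([fitness(a, g) for a, g in pop])
    parents = pop[np.argsort(scores)[-10:]]                 # selection
    children = (parents[rng.integers(0, 10, 10)]
                + parents[rng.integers(0, 10, 10)]) / 2.0   # crossover
    children += rng.normal(0, 0.2, children.shape)          # mutation
    pop = np.vstack([parents, children])

best = pop[np.argmax([fitness(a, g) for a, g in pop])]
print("best (log10 alpha, log10 gamma):", best)
```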
Recent application of quantification II in Japanese medical research.
Suzuki, T; Kudo, A
1979-01-01
Hayashi's Quantification II is a method of multivariate discrimination analysis to manipulate attribute data as predictor variables. It is very useful in the medical research field for estimation, diagnosis, prognosis, evaluation of epidemiological factors, and other problems based on multiplicity of attribute data. In Japan, this method is so well known that most of the computer program packages include the Hayashi Quantification, but the method still seems to be unfamiliar to researchers outside Japan. In view of this situation, we introduced 19 selected articles of recent applications of Quantification II in Japanese medical research. In reviewing these papers, special attention is paid to how satisfied the researchers were with the findings provided by the method. At the same time, some recommendations are made about terminology and program packages. Also a brief discussion of the background of the quantification methods is given with special reference to the Behaviormetric Society of Japan. PMID:540587
Richardson, Keith; Denny, Richard; Hughes, Chris; Skilling, John; Sikora, Jacek; Dadlez, Michał; Manteca, Angel; Jung, Hye Ryung; Jensen, Ole Nørregaard; Redeker, Virginie; Melki, Ronald; Langridge, James I.; Vissers, Johannes P.C.
2013-01-01
A probability-based quantification framework is presented for the calculation of relative peptide and protein abundance in label-free and label-dependent LC-MS proteomics data. The results are accompanied by credible intervals and regulation probabilities. The algorithm takes into account data uncertainties via Poisson statistics modified by a noise contribution that is determined automatically during an initial normalization stage. Protein quantification relies on assignments of component peptides to the acquired data. These assignments are generally of variable reliability and may not be present across all of the experiments comprising an analysis. It is also possible for a peptide to be identified to more than one protein in a given mixture. For these reasons the algorithm accepts a prior probability of peptide assignment for each intensity measurement. The model is constructed in such a way that outliers of any type can be automatically reweighted. Two discrete normalization methods can be employed. The first method is based on a user-defined subset of peptides, while the second method relies on the presence of a dominant background of endogenous peptides for which the concentration is assumed to be unaffected. Normalization is performed using the same computational and statistical procedures employed by the main quantification algorithm. The performance of the algorithm will be illustrated on example data sets, and its utility demonstrated for typical proteomics applications. The quantification algorithm supports relative protein quantification based on precursor and product ion intensities acquired by means of data-dependent methods, originating from all common isotopically-labeled approaches, as well as label-free ion intensity-based data-independent methods. PMID:22871168
Rauniyar, Navin
2015-01-01
The parallel reaction monitoring (PRM) assay has emerged as an alternative method of targeted quantification. The PRM assay is performed in a high resolution and high mass accuracy mode on a mass spectrometer. This review presents the features that make PRM a highly specific and selective method for targeted quantification using quadrupole-Orbitrap hybrid instruments. In addition, this review discusses the label-based and label-free methods of quantification that can be performed with the targeted approach. PMID:26633379
Liu, Junyan; Liu, Yang; Gao, Mingxia; Zhang, Xiangmin
2012-08-01
A facile proteomic quantification method, fluorescent labeling absolute quantification (FLAQ), was developed. Instead of using MS for quantification, the FLAQ method is a chromatography-based quantification in combination with MS for identification. Multidimensional liquid chromatography (MDLC) with laser-induced fluorescence (LIF) detection with high accuracy and a tandem MS system were employed for FLAQ. Several requirements should be met for fluorescent labeling in MS identification: labeling completeness, minimum side-reactions, simple MS spectra, and no extra tandem MS fragmentations for structure elucidation. A fluorescent dye, 5-iodoacetamidofluorescein, was finally chosen to label proteins on all cysteine residues. The fluorescent dye was compatible with the process of trypsin digestion and MALDI MS identification. Quantitative labeling was achieved with optimization of reaction conditions. A synthesized peptide and model proteins, BSA (35 cysteines) and OVA (five cysteines), were used for verifying the completeness of labeling. Proteins were separated through MDLC and quantified based on fluorescent intensities, followed by MS identification. High accuracy (RSD% < 1.58) and wide linearity of quantification (1-10^5) were achieved by LIF detection. The limit of quantitation for the model protein was as low as 0.34 amol. Parts of proteins in the human liver proteome were quantified and demonstrated using FLAQ.
Quantitative proteome analysis using isobaric peptide termini labeling (IPTL).
Arntzen, Magnus O; Koehler, Christian J; Treumann, Achim; Thiede, Bernd
2011-01-01
The quantitative comparison of proteome level changes across biological samples has become an essential feature in proteomics that remains challenging. We have recently introduced isobaric peptide termini labeling (IPTL), a novel strategy for isobaric quantification based on the derivatization of peptide termini with complementary isotopically labeled reagents. Unlike non-isobaric quantification methods, sample complexity at the MS level is not increased, providing improved sensitivity and protein coverage. The distinguishing feature of IPTL when comparing it to more established isobaric labeling methods (iTRAQ and TMT) is the presence of quantification signatures in all sequence-determining ions in MS/MS spectra, not only in the low mass reporter ion region. This makes IPTL a quantification method that is accessible to mass spectrometers with limited capabilities in the low mass range. Also, the presence of several quantification points in each MS/MS spectrum increases the robustness of the quantification procedure.
MRI-based methods for quantification of the cerebral metabolic rate of oxygen
Rodgers, Zachary B; Detre, John A
2016-01-01
The brain depends almost entirely on oxidative metabolism to meet its significant energy requirements. As such, the cerebral metabolic rate of oxygen (CMRO2) represents a key measure of brain function. Quantification of CMRO2 has helped elucidate brain functional physiology and holds potential as a clinical tool for evaluating neurological disorders including stroke, brain tumors, Alzheimer’s disease, and obstructive sleep apnea. In recent years, a variety of magnetic resonance imaging (MRI)-based CMRO2 quantification methods have emerged. Unlike positron emission tomography – the current “gold standard” for measurement and mapping of CMRO2 – MRI is non-invasive, relatively inexpensive, and ubiquitously available in modern medical centers. All MRI-based CMRO2 methods are based on modeling the effect of paramagnetic deoxyhemoglobin on the magnetic resonance signal. The various methods can be classified in terms of the MRI contrast mechanism used to quantify CMRO2: T2*, T2′, T2, or magnetic susceptibility. This review article provides an overview of MRI-based CMRO2 quantification techniques. After a brief historical discussion motivating the need for improved CMRO2 methodology, current state-of-the-art MRI-based methods are critically appraised in terms of their respective tradeoffs between spatial resolution, temporal resolution, and robustness, all of critical importance given the spatially heterogeneous and temporally dynamic nature of brain energy requirements. PMID:27089912
Chen, Xing; Pavan, Matteo; Heinzer-Schweizer, Susanne; Boesiger, Peter; Henning, Anke
2012-01-01
This report describes our efforts on quantification of tissue metabolite concentrations in mM by nuclear Overhauser enhanced and proton decoupled 13C magnetic resonance spectroscopy and the Electric Reference To access In vivo Concentrations (ERETIC) method. Previous work showed that a calibrated synthetic magnetic resonance spectroscopy-like signal transmitted through an optical fiber and inductively coupled into a transmit/receive coil represents a reliable reference standard for in vivo 1H magnetic resonance spectroscopy quantification on a clinical platform. In this work, we introduce a related implementation that enables simultaneous proton decoupling and ERETIC-based metabolite quantification and hence extends the applicability of the ERETIC method to nuclear Overhauser enhanced and proton decoupled in vivo 13C magnetic resonance spectroscopy. In addition, ERETIC signal stability under the influence of simultaneous proton decoupling is investigated. The proposed quantification method was cross-validated against internal and external reference standards on human skeletal muscle. The ERETIC signal intensity stability was 100.65 ± 4.18% over 3 months including measurements with and without proton decoupling. Glycogen and unsaturated fatty acid concentrations measured with the ERETIC method were in excellent agreement with internal creatine and external phantom reference methods, showing a difference of 1.85 ± 1.21% for glycogen and 1.84 ± 1.00% for unsaturated fatty acid between ERETIC and creatine-based quantification, whereas the deviations between external reference and creatine-based quantification are 6.95 ± 9.52% and 3.19 ± 2.60%, respectively.
Surface smoothness: cartilage biomarkers for knee OA beyond the radiologist
NASA Astrophysics Data System (ADS)
Tummala, Sudhakar; Dam, Erik B.
2010-03-01
Fully automatic imaging biomarkers may allow quantification of patho-physiological processes that a radiologist would not be able to assess reliably. This can introduce new insight but is problematic to validate due to lack of meaningful ground truth expert measurements. Rather than quantification accuracy, such novel markers must therefore be validated against clinically meaningful end-goals such as the ability to allow correct diagnosis. We present a method for automatic cartilage surface smoothness quantification in the knee joint. The quantification is based on a curvature flow method used on tibial and femoral cartilage compartments resulting from an automatic segmentation scheme. These smoothness estimates are validated for their ability to diagnose osteoarthritis and compared to smoothness estimates based on manual expert segmentations and to conventional cartilage volume quantification. We demonstrate that the fully automatic markers eliminate the time required for radiologist annotations, and in addition provide a diagnostic marker superior to the evaluated semi-manual markers.
[DNA quantification of blood samples pre-treated with pyramidon].
Zhu, Chuan-Hong; Zheng, Dao-Li; Ni, Rao-Zhi; Wang, Hai-Sheng; Ning, Ping; Fang, Hui; Liu, Yan
2014-06-01
To study DNA quantification and STR typing of samples pre-treated with pyramidon. The blood samples of ten unrelated individuals were anticoagulated in EDTA, and blood stains were made on filter paper. The samples were divided into six groups according to the storage time after pre-treatment with pyramidon: 30 min, 1 h, 3 h, 6 h, 12 h and 24 h. DNA was extracted by three methods: magnetic bead-based extraction, QIAcube DNA purification and the Chelex-100 method. DNA was quantified by fluorescent quantitative PCR, and STR typing was detected by PCR-STR fluorescent technology. With the same DNA extraction method, the sample DNA decreased gradually with time after pre-treatment with pyramidon. For the same storage time, the DNA quantities obtained with the different extraction methods showed significant differences. Sixteen-locus DNA typing was detected in 90.56% of samples. Pyramidon pre-treatment could cause DNA degradation, but effective STR typing can be achieved within 24 h. Magnetic bead-based extraction is the best method for STR profiling and DNA extraction.
A Cyber-Attack Detection Model Based on Multivariate Analyses
NASA Astrophysics Data System (ADS)
Sakai, Yuto; Rinsaka, Koichiro; Dohi, Tadashi
In the present paper, we propose a novel cyber-attack detection model that applies two multivariate-analysis methods to the audit data observed on a host machine. The statistical techniques used here are the well-known Hayashi's quantification method IV and the cluster analysis method. We quantify the observed qualitative audit event sequence via quantification method IV, and collect similar audit event sequences into the same groups based on the cluster analysis. It is shown in simulation experiments that our model can improve the cyber-attack detection accuracy in some realistic cases where both normal and attack activities are intermingled.
Quantification of Training and Competition Loads in Endurance Sports: Methods and Applications.
Mujika, Iñigo
2017-04-01
Training quantification is basic to evaluate an endurance athlete's responses to training loads, ensure adequate stress/recovery balance, and determine the relationship between training and performance. Quantifying both external and internal workload is important, because external workload does not measure the biological stress imposed by the exercise sessions. Generally used quantification methods include retrospective questionnaires, diaries, direct observation, and physiological monitoring, often based on the measurement of oxygen uptake, heart rate, and blood lactate concentration. Other methods in use in endurance sports include speed measurement and the measurement of power output, made possible by recent technological advances such as power meters in cycling and triathlon. Among subjective methods of quantification, rating of perceived exertion stands out because of its wide use. Concurrent assessments of the various quantification methods allow researchers and practitioners to evaluate stress/recovery balance, adjust individual training programs, and determine the relationships between external load, internal load, and athletes' performance. This brief review summarizes the most relevant external- and internal-workload-quantification methods in endurance sports and provides practical examples of their implementation to adjust the training programs of elite athletes in accordance with their individualized stress/recovery balance.
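As a simple worked example of a subjective internal-load quantification of the kind mentioned above (session-RPE), the sketch below multiplies the rating of perceived exertion by session duration; the session values are illustrative only.

```python
# Minimal sketch of session-RPE internal-load quantification:
# load = RPE x duration (arbitrary units). Sessions below are illustrative.
sessions = [
    {"day": "Mon", "rpe": 6, "minutes": 90},
    {"day": "Wed", "rpe": 8, "minutes": 60},
    {"day": "Sat", "rpe": 4, "minutes": 120},
]
loads = [s["rpe"] * s["minutes"] for s in sessions]
weekly_load = sum(loads)
print("session loads:", loads, "| weekly load:", weekly_load)
```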
Liquid Chromatography-Mass Spectrometry-based Quantitative Proteomics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Xie, Fang; Liu, Tao; Qian, Weijun
2011-07-22
Liquid chromatography-mass spectrometry (LC-MS)-based quantitative proteomics has become increasingly applied for a broad range of biological applications due to growing capabilities for broad proteome coverage and good accuracy in quantification. Herein, we review the current LC-MS-based quantification methods with respect to their advantages and limitations, and highlight their potential applications.
New approach for the quantification of processed animal proteins in feed using light microscopy.
Veys, P; Baeten, V
2010-07-01
A revision of the European Union's total feed ban on animal proteins in feed will need robust quantification methods, especially for control analyses, if tolerance levels are to be introduced, as for fishmeal in ruminant feed. In 2006, a study conducted by the Community Reference Laboratory for Animal Proteins in feedstuffs (CRL-AP) demonstrated the deficiency of the official quantification method based on light microscopy. The study concluded that the method had to be revised. This paper puts forward an improved quantification method based on three elements: (1) the preparation of permanent slides with an optical adhesive preserving all morphological markers of bones necessary for accurate identification and precision counting; (2) the use of a counting grid eyepiece reticle; and (3) new definitions for correction factors for the estimated portions of animal particles in the sediment. This revised quantification method was tested on feeds adulterated at different levels with bovine meat and bone meal (MBM) and fishmeal, and it proved to be effortless to apply. The results obtained were very close to the expected values of contamination levels for both types of adulteration (MBM or fishmeal). Calculated values were not only replicable, but also reproducible. The advantages of the new approach, including the benefits of the optical adhesive used for permanent slide mounting and the experimental conditions that need to be met to implement the new method correctly, are discussed.
Chai, Liuying; Zhang, Jianwei; Zhang, Lili; Chen, Tongsheng
2015-03-01
Spectral measurement of fluorescence resonance energy transfer (FRET), spFRET, is a widely used FRET quantification method in living cells today. We set up a spectrometer-microscope platform that consists of a miniature fiber optic spectrometer and a widefield fluorescence microscope for the spectral measurement of absolute FRET efficiency (E) and acceptor-to-donor concentration ratio (RC) in single living cells. The microscope was used for guiding cells and the spectra were simultaneously detected by the miniature fiber optic spectrometer. Moreover, our platform has independent excitation and emission controllers, so different excitations can share the same emission channel. In addition, we developed a modified spectral FRET quantification method (mlux-FRET) for the multiple-donor, multiple-acceptor FRET construct (mD∼nA) sample, and we also developed a spectra-based 2-channel acceptor-sensitized FRET quantification method (spE-FRET). We implemented these modified FRET quantification methods on our platform to measure the absolute E and RC values of tandem constructs with different acceptor/donor stoichiometries in single living Huh-7 cells.
Jiménez-Carvelo, Ana M; González-Casado, Antonio; Cuadros-Rodríguez, Luis
2017-03-01
A new analytical method for the quantification of olive oil and palm oil in blends with other vegetable edible oils (canola, safflower, corn, peanut, seeds, grapeseed, linseed, sesame and soybean) using normal phase liquid chromatography and applying chemometric tools was developed. The procedure for obtaining the chromatographic fingerprint of the methyl-transesterified fraction of each blend is described. The multivariate quantification methods used were Partial Least Squares Regression (PLS-R) and Support Vector Regression (SVR). The quantification results were evaluated by several parameters such as the Root Mean Square Error of Validation (RMSEV), Mean Absolute Error of Validation (MAEV) and Median Absolute Error of Validation (MdAEV). Notably, with the newly proposed analytical method the chromatographic analysis takes only eight minutes, and the results obtained showed the potential of this method and allowed quantification of mixtures of olive oil and palm oil with other vegetable oils.
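A minimal Python sketch of the PLS-R step is shown below, fitting synthetic chromatographic fingerprints to a known olive-oil fraction and reporting an RMSE on held-out blends; the data and model settings are illustrative, not those of the study.

```python
# Minimal sketch of PLS regression on chromatographic fingerprints to predict
# the olive-oil fraction of a blend; synthetic data, illustrative only.
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.metrics import mean_squared_error

rng = np.random.default_rng(0)
n_samples, n_points = 40, 300                       # 300-point fingerprint
olive_frac = rng.uniform(0, 1, n_samples)
base_olive = np.sin(np.linspace(0, 6, n_points))    # fake "pure olive" profile
base_other = np.cos(np.linspace(0, 6, n_points))    # fake "other oil" profile
X = (np.outer(olive_frac, base_olive) + np.outer(1 - olive_frac, base_other)
     + rng.normal(0, 0.02, (n_samples, n_points)))

pls = PLSRegression(n_components=3)
pls.fit(X[:30], olive_frac[:30])
pred = pls.predict(X[30:]).ravel()
rmsev = np.sqrt(mean_squared_error(olive_frac[30:], pred))
print(f"RMSEV on held-out blends: {rmsev:.3f}")
```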
Artifacts Quantification of Metal Implants in MRI
NASA Astrophysics Data System (ADS)
Vrachnis, I. N.; Vlachopoulos, G. F.; Maris, T. G.; Costaridou, L. I.
2017-11-01
The presence of materials with different magnetic properties, such as metal implants, causes distortion of the magnetic field locally, resulting in signal voids and pile ups, i.e. susceptibility artifacts in MRI. Quantitative and unbiased measurement of the artifact is prerequisite for optimization of acquisition parameters. In this study an image gradient based segmentation method is proposed for susceptibility artifact quantification. The method captures abrupt signal alterations by calculation of the image gradient. Then the artifact is quantified in terms of its extent by an automated cross entropy thresholding method as image area percentage. The proposed method for artifact quantification was tested in phantoms containing two orthopedic implants with significantly different magnetic permeabilities. The method was compared against a method proposed in the literature, considered as a reference, demonstrating moderate to good correlation (Spearman’s rho = 0.62 and 0.802 in case of titanium and stainless steel implants). The automated character of the proposed quantification method seems promising towards MRI acquisition parameter optimization.
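A minimal Python sketch of the gradient-plus-threshold pipeline described above follows; it applies Li's minimum cross-entropy threshold to the gradient magnitude of a synthetic image and reports the artifact extent as an area percentage. It is illustrative only, not the authors' implementation.

```python
# Minimal sketch of gradient-based artifact quantification: compute the image
# gradient magnitude, threshold it with Li's minimum cross-entropy method, and
# report artifact extent as an area percentage. Synthetic image, illustrative.
import numpy as np
from scipy.ndimage import sobel
from skimage.filters import threshold_li

rng = np.random.default_rng(0)
img = rng.normal(100, 5, (128, 128))
img[40:60, 40:60] = 0                      # fake signal void around an implant
grad = np.hypot(sobel(img, axis=0), sobel(img, axis=1))

t = threshold_li(grad)
artifact_mask = grad > t
artifact_area_pct = 100.0 * artifact_mask.mean()
print(f"artifact extent: {artifact_area_pct:.1f}% of image area")
```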
Simple, Fast, and Sensitive Method for Quantification of Tellurite in Culture Media
Molina, Roberto C.; Burra, Radhika; Pérez-Donoso, José M.; Elías, Alex O.; Muñoz, Claudia; Montes, Rebecca A.; Chasteen, Thomas G.; Vásquez, Claudio C.
2010-01-01
A fast, simple, and reliable chemical method for tellurite quantification is described. The procedure is based on the NaBH4-mediated reduction of TeO3^2- followed by the spectrophotometric determination of elemental tellurium in solution. The method is highly reproducible, is stable at different pH values, and exhibits linearity over a broad range of tellurite concentrations. PMID:20525868
De Spiegelaere, Ward; Malatinkova, Eva; Lynch, Lindsay; Van Nieuwerburgh, Filip; Messiaen, Peter; O'Doherty, Una; Vandekerckhove, Linos
2014-06-01
Quantification of integrated proviral HIV DNA by repetitive-sampling Alu-HIV PCR is a candidate virological tool to monitor the HIV reservoir in patients. However, the experimental procedures and data analysis of the assay are complex and hinder its widespread use. Here, we provide an improved and simplified data analysis method by adopting binomial and Poisson statistics. A modified analysis method on the basis of Poisson statistics was used to analyze the binomial data of positive and negative reactions from a 42-replicate Alu-HIV PCR by use of dilutions of an integration standard and on samples of 57 HIV-infected patients. Results were compared with the quantitative output of the previously described Alu-HIV PCR method. Poisson-based quantification of the Alu-HIV PCR was linearly correlated with the standard dilution series, indicating that absolute quantification with the Poisson method is a valid alternative for data analysis of repetitive-sampling Alu-HIV PCR data. Quantitative outputs of patient samples assessed by the Poisson method correlated with the previously described Alu-HIV PCR analysis, indicating that this method is a valid alternative for quantifying integrated HIV DNA. Poisson-based analysis of the Alu-HIV PCR data enables absolute quantification without the need of a standard dilution curve. Implementation of the CI estimation permits improved qualitative analysis of the data and provides a statistical basis for the required minimal number of technical replicates.
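The core Poisson estimate from replicate positive/negative reactions can be sketched in a few lines of Python; the replicate counts are made up, and the Wilson-interval-based CI shown here is a generic approximation rather than necessarily the interval construction used in the paper.

```python
# Minimal sketch of Poisson-based quantification from replicate PCR data:
# with k negative reactions out of n replicates, the mean number of events
# per reaction is estimated as lambda = -ln(k/n). Counts are illustrative.
import math

n_replicates = 42
n_negative = 12

lam = -math.log(n_negative / n_replicates)      # events per reaction

# 95% CI on the negative fraction (Wilson score), propagated through -ln(p).
p = n_negative / n_replicates
z = 1.96
denom = 1 + z ** 2 / n_replicates
centre = (p + z ** 2 / (2 * n_replicates)) / denom
half = z * math.sqrt(p * (1 - p) / n_replicates
                     + z ** 2 / (4 * n_replicates ** 2)) / denom
ci = (-math.log(centre + half), -math.log(centre - half))
print(f"lambda = {lam:.3f}, 95% CI ~ ({ci[0]:.3f}, {ci[1]:.3f})")
```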
Eriksen, Jane N; Madsen, Pia L; Dragsted, Lars O; Arrigoni, Eva
2017-02-01
An improved UHPLC-DAD-based method was developed and validated for quantification of major carotenoids present in spinach, serum, chylomicrons, and feces. Separation was achieved with gradient elution within 12.5 min for six dietary carotenoids and the internal standard, echinenone. The proposed method provides, for all standard components, resolution > 1.1, linearity covering the target range (R > 0.99), LOQ < 0.035 mg/L, and intraday and interday RSDs < 2 and 10%, respectively. Suitability of the method was tested on biological matrices. Method precision (RSD%) for carotenoid quantification in serum, chylomicrons, and feces was below 10% for intra- and interday analysis, except for lycopene. Method accuracy was consistent with mean recoveries ranging from 78.8 to 96.9% and from 57.2 to 96.9% for all carotenoids, except for lycopene, in serum and feces, respectively. Additionally, an interlaboratory validation study on spinach at two institutions showed no significant differences in lutein or β-carotene content, when evaluated on four occasions.
Al Feteisi, Hajar; Achour, Brahim; Rostami-Hodjegan, Amin; Barber, Jill
2015-01-01
Drug-metabolizing enzymes and transporters play an important role in drug absorption, distribution, metabolism and excretion and, consequently, they influence drug efficacy and toxicity. Quantification of drug-metabolizing enzymes and transporters in various tissues is therefore essential for comprehensive elucidation of drug absorption, distribution, metabolism and excretion. Recent advances in liquid chromatography coupled with tandem mass spectrometry (LC-MS/MS) have improved the quantification of pharmacologically relevant proteins. This report presents an overview of mass spectrometry-based methods currently used for the quantification of drug-metabolizing enzymes and drug transporters, mainly focusing on applications and cost associated with various quantitative strategies based on stable isotope-labeled standards (absolute quantification peptide standards, quantification concatemers, protein standards for absolute quantification) and label-free analysis. In mass spectrometry, there is no simple relationship between signal intensity and analyte concentration. Proteomic strategies are therefore complex and several factors need to be considered when selecting the most appropriate method for an intended application, including the number of proteins and samples. Quantitative strategies require appropriate mass spectrometry platforms, yet choice is often limited by the availability of appropriate instrumentation. Quantitative proteomics research requires specialist practical skills and there is a pressing need to dedicate more effort and investment to training personnel in this area. Large-scale multicenter collaborations are also needed to standardize quantitative strategies in order to improve physiologically based pharmacokinetic models.
Quantitative Proteomics via High Resolution MS Quantification: Capabilities and Limitations
Higgs, Richard E.; Butler, Jon P.; Han, Bomie; Knierman, Michael D.
2013-01-01
Recent improvements in the mass accuracy and resolution of mass spectrometers have led to renewed interest in label-free quantification using data from the primary mass spectrum (MS1) acquired from data-dependent proteomics experiments. The capacity for higher specificity quantification of peptides from samples enriched for proteins of biological interest offers distinct advantages for hypothesis generating experiments relative to immunoassay detection methods or prespecified peptide ions measured by multiple reaction monitoring (MRM) approaches. Here we describe an evaluation of different methods to post-process peptide level quantification information to support protein level inference. We characterize the methods by examining their ability to recover a known dilution of a standard protein in background matrices of varying complexity. Additionally, the MS1 quantification results are compared to a standard, targeted, MRM approach on the same samples under equivalent instrument conditions. We show the existence of multiple peptides with MS1 quantification sensitivity similar to the best MRM peptides for each of the background matrices studied. Based on these results we provide recommendations on preferred approaches to leveraging quantitative measurements of multiple peptides to improve protein level inference. PMID:23710359
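One common way to post-process peptide-level MS1 intensities into a protein-level estimate is a top-N roll-up; the Python sketch below shows a generic top-3 version with made-up intensities, not the specific roll-up methods evaluated in the paper.

```python
# Minimal sketch of a peptide-to-protein roll-up for MS1 label-free data:
# summarize each protein by the mean of its top-3 most intense peptides.
# Generic illustration only; intensities below are made up.
from collections import defaultdict

peptide_intensities = [            # (protein, peptide, intensity)
    ("P1", "pepA", 1.2e6), ("P1", "pepB", 8.0e5), ("P1", "pepC", 3.0e5),
    ("P1", "pepD", 1.0e5), ("P2", "pepE", 4.0e5), ("P2", "pepF", 2.5e5),
]

by_protein = defaultdict(list)
for protein, _, intensity in peptide_intensities:
    by_protein[protein].append(intensity)

protein_abundance = {p: sum(sorted(v, reverse=True)[:3]) / min(len(v), 3)
                     for p, v in by_protein.items()}
print(protein_abundance)
```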
Chen, Xiaoxia; Zhao, Jing; Chen, Tianshu; Gao, Tao; Zhu, Xiaoli; Li, Genxi
2018-01-01
Comprehensive analysis of the expression level and location of tumor-associated membrane proteins (TMPs) is of vital importance for the profiling of tumor cells. Currently, two kinds of independent techniques, i.e. ex situ detection and in situ imaging, are usually required for the quantification and localization of TMPs respectively, resulting in some inevitable problems. Methods: Herein, based on a well-designed and fluorophore-labeled DNAzyme, we develop an integrated and facile method, in which imaging and quantification of TMPs in situ are achieved simultaneously in a single system. The labeled DNAzyme not only produces localized fluorescence for the visualization of TMPs but also catalyzes the cleavage of a substrate to produce quantitative fluorescent signals that can be collected from solution for the sensitive detection of TMPs. Results: Results from the DNAzyme-based in situ imaging and quantification of TMPs match well with traditional immunofluorescence and western blotting. In addition to the advantage of two-in-one, the DNAzyme-based method is highly sensitive, allowing the detection of TMPs in only 100 cells. Moreover, the method is nondestructive. Cells after analysis could retain their physiological activity and could be cultured for other applications. Conclusion: The integrated system provides solid results for both imaging and quantification of TMPs, making it a competitive method over some traditional techniques for the analysis of TMPs, which offers potential application as a toolbox in the future.
Laurie, Matthew T; Bertout, Jessica A; Taylor, Sean D; Burton, Joshua N; Shendure, Jay A; Bielas, Jason H
2013-08-01
Due to the high cost of failed runs and suboptimal data yields, quantification and determination of fragment size range are crucial steps in the library preparation process for massively parallel sequencing (or next-generation sequencing). Current library quality control methods commonly involve quantification using real-time quantitative PCR and size determination using gel or capillary electrophoresis. These methods are laborious and subject to a number of significant limitations that can make library calibration unreliable. Herein, we propose and test an alternative method for quality control of sequencing libraries using droplet digital PCR (ddPCR). By exploiting a correlation we have discovered between droplet fluorescence and amplicon size, we achieve the joint quantification and size determination of target DNA with a single ddPCR assay. We demonstrate the accuracy and precision of applying this method to the preparation of sequencing libraries.
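A minimal sketch of the joint read-out is given below: the concentration follows from standard ddPCR Poisson statistics, while the size estimate uses a hypothetical linear fluorescence-size calibration; the droplet counts, droplet volume, and calibration constants are illustrative only.

```python
# Minimal sketch of ddPCR-based library QC: concentration from Poisson
# statistics of positive/negative droplets, and a size read-out from a
# hypothetical linear calibration of droplet fluorescence vs amplicon length.
# Only the Poisson formula is standard; all numbers below are made up.
import math

n_droplets, n_negative = 15000, 9000
droplet_volume_ul = 0.00085                      # roughly 0.85 nL per droplet
copies_per_ul = -math.log(n_negative / n_droplets) / droplet_volume_ul

mean_fluorescence = 6200.0                       # arbitrary units
slope, intercept = 5.0, 4000.0                   # hypothetical calibration
estimated_size_bp = (mean_fluorescence - intercept) / slope
print(f"{copies_per_ul:.0f} copies/uL, ~{estimated_size_bp:.0f} bp")
```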
Deng, Yong; Luo, Zhaoyang; Jiang, Xu; Xie, Wenhao; Luo, Qingming
2015-07-01
We propose a method based on a decoupled fluorescence Monte Carlo model for constructing fluorescence Jacobians to enable accurate quantification of fluorescence targets within turbid media. The effectiveness of the proposed method is validated using two cylindrical phantoms enclosing fluorescent targets within homogeneous and heterogeneous background media. The results demonstrate that our method can recover relative concentrations of the fluorescent targets with higher accuracy than the perturbation fluorescence Monte Carlo method. This suggests that our method is suitable for quantitative fluorescence diffuse optical tomography, especially for in vivo imaging of fluorophore targets for diagnosis of different diseases and abnormalities.
Source separation on hyperspectral cube applied to dermatology
NASA Astrophysics Data System (ADS)
Mitra, J.; Jolivot, R.; Vabres, P.; Marzani, F. S.
2010-03-01
This paper proposes a method of quantification of the components underlying the human skin that are supposed to be responsible for the effective reflectance spectrum of the skin over the visible wavelength. The method is based on independent component analysis assuming that the epidermal melanin and the dermal haemoglobin absorbance spectra are independent of each other. The method extracts the source spectra that correspond to the ideal absorbance spectra of melanin and haemoglobin. The noisy melanin spectrum is fixed using a polynomial fit and the quantifications associated with it are reestimated. The results produce feasible quantifications of each source component in the examined skin patch.
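The following Python sketch illustrates the source-separation step with FastICA on synthetic absorbance spectra; the "melanin-like" and "haemoglobin-like" source shapes and the per-pixel mixing are fabricated for illustration and do not reproduce the paper's data or preprocessing.

```python
# Minimal sketch of blind source separation on absorbance spectra with FastICA,
# in the spirit of the melanin/haemoglobin separation described above.
# Synthetic mixtures only; illustrative, not the paper's pipeline.
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(0)
wavelengths = np.linspace(450, 700, 120)
melanin_like = np.exp(-0.005 * (wavelengths - 450))          # fake source 1
haemoglobin_like = np.exp(-((wavelengths - 560) / 25) ** 2)  # fake source 2
S = np.vstack([melanin_like, haemoglobin_like])

A = rng.uniform(0.2, 1.0, size=(200, 2))      # per-pixel mixing (quantification)
X = A @ S + rng.normal(0, 0.01, (200, 120))   # observed absorbance spectra

ica = FastICA(n_components=2, random_state=0)
sources_est = ica.fit_transform(X.T)          # estimated source spectra
mixing_est = ica.mixing_                      # per-pixel loadings (up to scale/sign)
print(sources_est.shape, mixing_est.shape)
```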
Fei, Baowei; Yang, Xiaofeng; Nye, Jonathon A.; Aarsvold, John N.; Raghunath, Nivedita; Cervo, Morgan; Stark, Rebecca; Meltzer, Carolyn C.; Votaw, John R.
2012-01-01
Purpose: Combined MR/PET is a relatively new, hybrid imaging modality. A human MR/PET prototype system consisting of a Siemens 3T Trio MR and brain PET insert was installed and tested at our institution. Its present design does not offer measured attenuation correction (AC) using traditional transmission imaging. This study describes the development of quantification tools, including MR-based AC, for quantification in combined MR/PET brain imaging. Methods: The developed quantification tools include image registration, segmentation, classification, and MR-based AC. These components were integrated into a single scheme for processing MR/PET data. The segmentation method is multiscale and based on the Radon transform of brain MR images. It was developed to segment the skull on T1-weighted MR images. A modified fuzzy C-means classification scheme was developed to classify brain tissue into gray matter, white matter, and cerebrospinal fluid. Classified tissue is assigned an attenuation coefficient so that AC factors can be generated. PET emission data are then reconstructed using a three-dimensional ordered sets expectation maximization method with the MR-based AC map. Ten subjects had separate MR and PET scans. The PET with [11C]PIB was acquired using a high-resolution research tomography (HRRT) PET. MR-based AC was compared with transmission (TX)-based AC on the HRRT. Seventeen volumes of interest were drawn manually on each subject image to compare the PET activities between the MR-based and TX-based AC methods. Results: For skull segmentation, the overlap ratio between our segmented results and the ground truth is 85.2 ± 2.6%. Attenuation correction results from the ten subjects show that the difference between the MR and TX-based methods was <6.5%. Conclusions: MR-based AC compared favorably with conventional transmission-based AC. Quantitative tools including registration, segmentation, classification, and MR-based AC have been developed for use in combined MR/PET. PMID:23039679
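As a rough illustration of the tissue-classification step, the Python sketch below runs a plain fuzzy C-means on synthetic voxel intensities and then maps each class to a placeholder attenuation coefficient; it is not the modified FCM scheme or the coefficient assignment used in the study.

```python
# Minimal sketch of a fuzzy C-means step for MR-based attenuation correction:
# cluster voxel intensities into tissue classes, then assign each class a
# linear attenuation coefficient. Intensities and coefficients are placeholders.
import numpy as np

rng = np.random.default_rng(0)
intensities = np.concatenate([rng.normal(m, 10, 300) for m in (60, 110, 160)])

def fuzzy_c_means(x, n_clusters=3, m=2.0, n_iter=100):
    centers = np.quantile(x, np.linspace(0.1, 0.9, n_clusters))
    for _ in range(n_iter):
        dist = np.abs(x[:, None] - centers[None, :]) + 1e-9
        u = 1.0 / (dist ** (2 / (m - 1)))
        u /= u.sum(axis=1, keepdims=True)                # membership matrix
        centers = (u ** m * x[:, None]).sum(0) / (u ** m).sum(0)
    return centers, u

centers, memberships = fuzzy_c_means(intensities)
mu_placeholder = {0: 0.095, 1: 0.097, 2: 0.099}   # made-up per-class coefficients
labels = memberships.argmax(axis=1)
class_of = {c: i for i, c in enumerate(np.argsort(centers))}  # sort by intensity
mu_map = np.array([mu_placeholder[class_of[l]] for l in labels])
print(np.sort(centers).round(1), mu_map[:5])
```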
Prado, Marta; Boix, Ana; von Holst, Christoph
2012-07-01
The development of DNA-based methods for the identification and quantification of fish in food and feed samples is frequently focused on a specific fish species and/or on the detection of mitochondrial DNA of fish origin. However, a quantitative method for the most common fish species used by the food and feed industry is needed for official control purposes, and such a method should rely on the use of a single-copy nuclear DNA target owing to its more stable copy number in different tissues. In this article, we report on the development of a real-time PCR method based on the use of a nuclear gene as a target for the simultaneous detection of fish DNA from different species and on the evaluation of its quantification potential. The method was tested in 22 different fish species, including those most commonly used by the food and feed industry, and in negative control samples, which included 15 animal species and nine feed ingredients. The results show that the method reported here complies with the requirements concerning specificity and with the criteria required for real-time PCR methods with high sensitivity.
Ermacora, Alessia; Hrnčiřík, Karel
2014-01-01
Substantial progress has been recently made in the development and optimisation of analytical methods for the quantification of 2-MCPD, 3-MCPD and glycidyl esters in oils and fats, and there are a few methods currently available that allow a reliable quantification of these contaminants in bulk oils and fats. On the other hand, no standard method for the analysis of foodstuffs has yet been established. The aim of this study was the development and validation of a new method for the simultaneous quantification of 2-MCPD, 3-MCPD and glycidyl esters in oil-based food products. The developed protocol includes a first step of liquid-liquid extraction and purification of the lipophilic substances of the sample, followed by the application of a previously developed procedure based on acid transesterification, for the indirect quantification of these contaminants in oils and fats. The method validation was carried out on food products (fat-based spreads, creams, margarine, mayonnaise) manufactured in-house, in order to control the manufacturing process and account for any food matrix-analyte interactions (the sample spiking was carried out on the single components used for the formulations rather than the final products). The method showed good accuracy (the recoveries ranged from 97% to 106% for bound 3-MCPD and 2-MCPD and from 88% to 115% for bound glycidol) and sensitivity (the LOD was 0.04 and 0.05 mg kg(-1) for bound MCPD and glycidol, respectively). Repeatability and reproducibility were satisfactory (RSD below 2% and 5%, respectively) for all analytes. The levels of salts and surface-active compounds in the formulation were found to have no impact on the accuracy and the other parameters of the method.
Liquid chromatography tandem-mass spectrometry (LC-MS/MS)- based methods such as isobaric tags for relative and absolute quantification (iTRAQ) and tandem mass tags (TMT) have been shown to provide overall better quantification accuracy and reproducibility over other LC-MS/MS techniques. However, large scale projects like the Clinical Proteomic Tumor Analysis Consortium (CPTAC) require comparisons across many genomically characterized clinical specimens in a single study and often exceed the capability of traditional iTRAQ-based quantification.
Plasma protein absolute quantification by nano-LC Q-TOF UDMSE for clinical biomarker verification
ILIES, MARIA; IUGA, CRISTINA ADELA; LOGHIN, FELICIA; DHOPLE, VISHNU MUKUND; HAMMER, ELKE
2017-01-01
Background and aims Proteome-based biomarker studies are targeting proteins that could serve as diagnostic, prognosis, and prediction molecules. In the clinical routine, immunoassays are currently used for the absolute quantification of such biomarkers, with the major limitation that only one molecule can be targeted per assay. The aim of our study was to test a mass spectrometry based absolute quantification method for the verification of plasma protein sets which might serve as reliable biomarker panels for the clinical practice. Methods Six EDTA plasma samples were analyzed after tryptic digestion using a high throughput data independent acquisition nano-LC Q-TOF UDMSE proteomics approach. Synthetic Escherichia coli standard peptides were spiked in each sample for the absolute quantification. Data analysis was performed using ProgenesisQI v2.0 software (Waters Corporation). Results Our method ensured absolute quantification of 242 non-redundant plasma proteins in a single run analysis. The dynamic range covered was 10^5. 86% were represented by classical plasma proteins. The overall median coefficient of variation was 0.36, while a set of 63 proteins was found to be highly stable. Absolute protein concentrations strongly correlated with values reviewed in the literature. Conclusions Nano-LC Q-TOF UDMSE proteomic analysis can be used for a simple and rapid determination of absolute amounts of plasma proteins. A large number of plasma proteins could be analyzed, while a wide dynamic range was covered with low coefficient of variation at protein level. The method proved to be a reliable tool for the quantification of protein panel for biomarker verification in the clinical practice. PMID:29151793
Loziuk, Philip L.; Sederoff, Ronald R.; Chiang, Vincent L.; Muddiman, David C.
2014-01-01
Quantitative mass spectrometry has become central to the field of proteomics and metabolomics. Selected reaction monitoring is a widely used method for the absolute quantification of proteins and metabolites. This method renders high specificity using several product ions measured simultaneously. With growing interest in quantification of molecular species in complex biological samples, confident identification and quantitation has been of particular concern. A method to confirm purity or contamination of product ion spectra has become necessary for achieving accurate and precise quantification. Ion abundance ratio assessments were introduced to alleviate some of these issues. Ion abundance ratios are based on the consistent relative abundance (RA) of specific product ions with respect to the total abundance of all product ions. To date, no standardized method of implementing ion abundance ratios has been established. Thresholds by which product ion contamination is confirmed vary widely and are often arbitrary. This study sought to establish criteria by which the relative abundance of product ions can be evaluated in an absolute quantification experiment. These findings suggest that evaluation of the absolute ion abundance for any given transition is necessary in order to effectively implement RA thresholds. Overall, the variation of the RA value was observed to be relatively constant beyond an absolute threshold ion abundance. Finally, these RA values were observed to fluctuate significantly over a 3 year period, suggesting that these values should be assessed as close as possible to the time at which data is collected for quantification. PMID:25154770
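A minimal sketch of how such an ion abundance ratio check could be implemented is given below; the transitions, reference ratios, tolerance and abundance threshold are illustrative assumptions, not values from the study:

    # Hypothetical product-ion abundances for one peptide (transition -> counts)
    reference = {"y7": 0.52, "y6": 0.30, "y4": 0.18}      # expected relative abundances
    observed = {"y7": 81000.0, "y6": 44000.0, "y4": 30000.0}

    MIN_TOTAL_ABUNDANCE = 5e4   # below this, RA values are too noisy to evaluate
    TOLERANCE = 0.10            # allowed absolute deviation in relative abundance

    total = sum(observed.values())
    if total < MIN_TOTAL_ABUNDANCE:
        print("Signal too low for a reliable ion-ratio assessment")
    else:
        for ion, counts in observed.items():
            ra = counts / total
            flag = "OK" if abs(ra - reference[ion]) <= TOLERANCE else "possible interference"
            print(f"{ion}: RA={ra:.2f} (expected {reference[ion]:.2f}) -> {flag}")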
Microfluidics-based digital quantitative PCR for single-cell small RNA quantification.
Yu, Tian; Tang, Chong; Zhang, Ying; Zhang, Ruirui; Yan, Wei
2017-09-01
Quantitative analyses of small RNAs at the single-cell level have been challenging because of limited sensitivity and specificity of conventional real-time quantitative PCR methods. A digital quantitative PCR (dqPCR) method for miRNA quantification has been developed, but it requires the use of proprietary stem-loop primers and only applies to miRNA quantification. Here, we report a microfluidics-based dqPCR (mdqPCR) method, which takes advantage of the Fluidigm BioMark HD system for both template partition and the subsequent high-throughput dqPCR. Our mdqPCR method demonstrated excellent sensitivity and reproducibility suitable for quantitative analyses of not only miRNAs but also all other small RNA species at the single-cell level. Using this method, we discovered that each sperm has a unique miRNA profile. © The Authors 2017. Published by Oxford University Press on behalf of Society for the Study of Reproduction. All rights reserved. For permissions, please e-mail: journals.permissions@oup.com.
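Digital PCR quantification generally rests on Poisson statistics over partitions. The sketch below illustrates that calculation with hypothetical partition counts and an assumed partition volume; it is not the Fluidigm BioMark workflow itself:

    import math

    # Hypothetical partition counts from one digital PCR panel
    total_partitions = 770
    positive_partitions = 312
    partition_volume_nl = 0.85          # assumed partition volume in nanolitres

    # Poisson correction: mean copies per partition from the fraction of negatives
    p_negative = 1 - positive_partitions / total_partitions
    copies_per_partition = -math.log(p_negative)

    # Convert to copies per microlitre of the partitioned reaction mix
    copies_per_ul = copies_per_partition / (partition_volume_nl * 1e-3)
    print(f"{copies_per_partition:.3f} copies/partition ≈ {copies_per_ul:.0f} copies/µL")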
Gallo-Oller, Gabriel; Ordoñez, Raquel; Dotor, Javier
2018-06-01
Since its first description, Western blot has been widely used in molecular labs. It constitutes a multistep method that allows the detection and/or quantification of proteins from simple to complex protein mixtures. The quantification step is critical for obtaining accurate and reproducible Western blot results. Because of the technical knowledge required for densitometry analysis and the limited availability of resources, standard office scanners are often used for image acquisition of developed Western blot films. Furthermore, the use of semi-quantitative software such as ImageJ (Java-based image-processing and analysis software) is clearly increasing in different scientific fields. In this work, we describe the use of an office scanner coupled with the ImageJ software, together with a new image background subtraction method, for accurate Western blot quantification. The proposed method represents an affordable, accurate and reproducible approach that could be used where resources are limited. Copyright © 2018 Elsevier B.V. All rights reserved.
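A minimal sketch of densitometric band quantification with local background subtraction is shown below; the image array, band positions and background strategy are hypothetical and do not reproduce the authors' published algorithm:

    import numpy as np

    rng = np.random.default_rng(1)
    # Hypothetical 8-bit scan of a blot: dark bands on a lighter background
    scan = np.full((200, 400), 200.0) + rng.normal(0, 3, (200, 400))
    scan[80:120, 50:120] -= 90       # simulated band 1
    scan[80:120, 200:270] -= 45      # simulated band 2

    def band_volume(img, row0, row1, col0, col1, margin=10):
        """Integrated band intensity after subtracting a local background estimate."""
        roi = img[row0:row1, col0:col1]
        # Background from a frame of pixels around the ROI (simple local estimate)
        frame = img[row0 - margin:row1 + margin, col0 - margin:col1 + margin].copy()
        frame[margin:-margin, margin:-margin] = np.nan
        background = np.nanmedian(frame)
        # Densitometry convention: signal is how much darker the band is than background
        return float(np.clip(background - roi, 0, None).sum())

    v1 = band_volume(scan, 80, 120, 50, 120)
    v2 = band_volume(scan, 80, 120, 200, 270)
    print(f"band2 / band1 = {v2 / v1:.2f}")   # relative quantification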
1H NMR quantification in very dilute toxin solutions: application to anatoxin-a analysis.
Dagnino, Denise; Schripsema, Jan
2005-08-01
A complete procedure is described for the extraction, detection and quantification of anatoxin-a in biological samples. Anatoxin-a is extracted from biomass by a routine acid-base extraction. The extract is analysed by GC-MS, without the need of derivatization, with a detection limit of 0.5 ng. A method was developed for the accurate quantification of anatoxin-a in the standard solution to be used for the calibration of the GC analysis. 1H NMR allowed the accurate quantification of microgram quantities of anatoxin-a. The accurate quantification of compounds in standard solutions is rarely discussed, but for compounds like anatoxin-a (toxins with prices on the order of a million dollars per gram), of which generally only milligram quantities or less are available, this factor in the quantitative analysis is certainly not trivial. The method that was developed can easily be adapted for the accurate quantification of other toxins in very dilute solutions.
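Quantitative 1H NMR against an internal standard reduces to an integral ratio scaled by proton counts and molar masses. The sketch below shows that arithmetic with made-up integrals and a hypothetical internal standard:

    # Hypothetical qNMR data: analyte quantified against an internal standard
    integral_analyte = 1.85        # integral of a resolved analyte signal
    protons_analyte = 2            # number of protons giving rise to that signal
    mw_analyte = 165.23            # g/mol, anatoxin-a free base (approximate)

    integral_std = 1.00            # integral of the internal-standard signal
    protons_std = 4                # protons of the standard signal (hypothetical standard)
    mass_std_mg = 1.20             # weighed mass of internal standard in the tube
    mw_std = 172.18                # g/mol, hypothetical internal standard

    # Classical qNMR relation: moles scale with (integral / number of protons)
    moles_std = mass_std_mg / 1000 / mw_std
    moles_analyte = moles_std * (integral_analyte / protons_analyte) / (integral_std / protons_std)
    mass_analyte_ug = moles_analyte * mw_analyte * 1e6
    print(f"analyte in tube: {mass_analyte_ug:.1f} µg")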
Pedersen, S N; Lindholst, C
1999-12-09
Extraction methods were developed for quantification of the xenoestrogens 4-tert.-octylphenol (tOP) and bisphenol A (BPA) in water and in liver and muscle tissue from the rainbow trout (Oncorhynchus mykiss). The extraction of tOP and BPA from tissue samples was carried out using microwave-assisted solvent extraction (MASE) followed by solid-phase extraction (SPE). Water samples were extracted using only SPE. For the quantification of tOP and BPA, liquid chromatography mass spectrometry (LC-MS) equipped with an atmospheric pressure chemical ionisation interface (APCI) was applied. The combined methods for tissue extraction allow the use of small sample amounts of liver or muscle (typically 1 g), low volumes of solvent (20 ml), and short extraction times (25 min). Limits of quantification of tOP in tissue samples were found to be approximately 10 ng/g in muscle and 50 ng/g in liver (both based on 1 g of fresh tissue). The corresponding values for BPA were approximately 50 ng/g in both muscle and liver tissue. In water, the limit of quantification for tOP and BPA was approximately 0.1 microg/l (based on 100 ml sample size).
Assessment of cardiac fibrosis: a morphometric method comparison for collagen quantification.
Schipke, Julia; Brandenberger, Christina; Rajces, Alexandra; Manninger, Martin; Alogna, Alessio; Post, Heiner; Mühlfeld, Christian
2017-04-01
Fibrotic remodeling of the heart is a frequent condition linked to various diseases and cardiac dysfunction. Collagen quantification is an important objective in cardiac fibrosis research; however, a variety of different histological methods are currently used that may differ in accuracy. Here, frequently applied collagen quantification techniques were compared. A porcine model of early stage heart failure with preserved ejection fraction was used as an example. Semiautomated threshold analyses were imprecise, mainly due to inclusion of noncollagen structures or failure to detect certain collagen deposits. In contrast, collagen assessment by automated image analysis and light microscopy (LM)-stereology was more sensitive. Depending on the quantification method, the amount of estimated collagen varied and influenced intergroup comparisons. PicroSirius Red, Masson's trichrome, and Azan staining protocols yielded similar results, whereas the measured collagen area increased with increasing section thickness. Whereas none of the LM-based methods showed significant differences between the groups, electron microscopy (EM)-stereology revealed a significant collagen increase between cardiomyocytes in the experimental group, but not at other localizations. In conclusion, in contrast to the staining protocol, section thickness and the quantification method being used directly influence the estimated collagen content and thus, possibly, intergroup comparisons. EM in combination with stereology is a precise and sensitive method for collagen quantification if certain prerequisites are considered. For subtle fibrotic alterations, consideration of collagen localization may be necessary. Among LM methods, LM-stereology and automated image analysis are appropriate to quantify fibrotic changes, the latter depending on careful control of algorithm and comparable section staining. NEW & NOTEWORTHY Direct comparison of frequently applied histological fibrosis assessment techniques revealed a distinct relation of measured collagen and utilized quantification method as well as section thickness. Besides electron microscopy-stereology, which was precise and sensitive, light microscopy-stereology and automated image analysis proved to be appropriate for collagen quantification. Moreover, consideration of collagen localization might be important in revealing minor fibrotic changes. Copyright © 2017 the American Physiological Society.
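The semiautomated threshold analyses compared above ultimately compute an area fraction over a thresholded mask. A generic sketch of that calculation (synthetic image, arbitrary threshold, not any of the evaluated software packages) follows:

    import numpy as np

    rng = np.random.default_rng(2)
    # Hypothetical single-channel image where collagen staining appears bright
    image = rng.normal(60, 10, size=(512, 512))
    image[100:160, :] += 80          # simulated collagen-rich band

    # User-chosen intensity threshold: the subjective step that makes
    # semi-automated threshold analyses imprecise, as discussed above
    threshold = 110.0
    collagen_mask = image > threshold
    tissue_mask = np.ones_like(image, dtype=bool)   # assume the whole field is tissue

    area_fraction = collagen_mask.sum() / tissue_mask.sum()
    print(f"collagen area fraction: {100 * area_fraction:.1f} %")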
Taylor, Jonathan Christopher; Fenner, John Wesley
2017-11-29
Semi-quantification methods are well established in the clinic for assisted reporting of (I-123) ioflupane images. Arguably, these are limited diagnostic tools. Recent research has demonstrated the potential for improved classification performance offered by machine learning algorithms. A direct comparison between methods is required to establish whether a move towards widespread clinical adoption of machine learning algorithms is justified. This study compared three machine learning algorithms with a range of semi-quantification methods, using the Parkinson's Progression Markers Initiative (PPMI) research database and a locally derived clinical database for validation. Machine learning algorithms were based on support vector machine classifiers with three different sets of features: (1) voxel intensities; (2) principal components of image voxel intensities; and (3) striatal binding ratios from the putamen and caudate. Semi-quantification methods were based on striatal binding ratios (SBRs) from both putamina, with and without consideration of the caudates. Normal limits for the SBRs were defined through four different methods: (1) the minimum of age-matched controls; (2) the mean minus 1/1.5/2 standard deviations of age-matched controls; (3) linear regression of normal patient data against age (minus 1/1.5/2 standard errors); and (4) selection of the optimum operating point on the receiver operating characteristic curve from normal and abnormal training data. Each machine learning and semi-quantification technique was evaluated with stratified, nested 10-fold cross-validation, repeated 10 times. The mean accuracy of the semi-quantitative methods for classification of local data into Parkinsonian and non-Parkinsonian groups varied from 0.78 to 0.87, contrasting with 0.89 to 0.95 for classifying PPMI data into healthy controls and Parkinson's disease groups. The machine learning algorithms gave mean accuracies between 0.88 and 0.92 and between 0.95 and 0.97 for local and PPMI data, respectively. Classification performance was lower for the local database than the research database for both semi-quantitative and machine learning algorithms. However, for both databases, the machine learning methods generated equal or higher mean accuracies (with lower variance) than any of the semi-quantification approaches. The gain in performance from using machine learning algorithms as compared to semi-quantification was relatively small and may be insufficient, when considered in isolation, to offer significant advantages in the clinical context.
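A minimal sketch of the support-vector-machine approach on striatal binding ratio features, using synthetic data and a plain (non-nested) stratified cross-validation rather than the full protocol described above, could look like this:

    import numpy as np
    from sklearn.svm import SVC
    from sklearn.model_selection import StratifiedKFold, cross_val_score
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler

    rng = np.random.default_rng(3)
    # Hypothetical SBR features: [left putamen, right putamen, left caudate, right caudate]
    normals = rng.normal(loc=[2.8, 2.8, 3.0, 3.0], scale=0.4, size=(60, 4))
    patients = rng.normal(loc=[1.4, 1.5, 2.2, 2.3], scale=0.4, size=(60, 4))
    X = np.vstack([normals, patients])
    y = np.array([0] * 60 + [1] * 60)

    clf = make_pipeline(StandardScaler(), SVC(kernel="linear", C=1.0))
    cv = StratifiedKFold(n_splits=10, shuffle=True, random_state=0)
    scores = cross_val_score(clf, X, y, cv=cv)
    print(f"mean accuracy: {scores.mean():.3f} ± {scores.std():.3f}")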
Rapid and Easy Protocol for Quantification of Next-Generation Sequencing Libraries.
Hawkins, Steve F C; Guest, Paul C
2018-01-01
The emergence of next-generation sequencing (NGS) over the last 10 years has increased the efficiency of DNA sequencing in terms of speed, ease, and price. However, the exact quantification of an NGS library is crucial in order to obtain good data on sequencing platforms developed by the current market leader Illumina. Different approaches for DNA quantification are currently available, and the most commonly used are based on analysis of the physical properties of the DNA through spectrophotometric or fluorometric methods. Although these methods are technically simple, they do not allow exact quantification as can be achieved using a real-time quantitative PCR (qPCR) approach. A qPCR protocol for DNA quantification with applications in NGS library preparation studies is presented here. This can be applied in various fields of study such as medical disorders resulting from nutritional programming disturbances.
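A hedged sketch of qPCR-based library quantification against a standard dilution curve is shown below; the standard concentrations, Cq values and dilution factor are invented for illustration:

    import numpy as np

    # Hypothetical standard curve: known library standards (pM) and their Cq values
    standard_conc_pm = np.array([20.0, 2.0, 0.2, 0.02])
    standard_cq = np.array([12.1, 15.5, 18.8, 22.2])

    # Fit Cq = slope * log10(conc) + intercept
    slope, intercept = np.polyfit(np.log10(standard_conc_pm), standard_cq, 1)
    efficiency = 10 ** (-1 / slope) - 1          # amplification efficiency from the slope

    # Quantify an unknown library diluted 1:1000 before the qPCR run (hypothetical numbers)
    unknown_cq = 16.9
    dilution_factor = 1000
    conc_pm = 10 ** ((unknown_cq - intercept) / slope) * dilution_factor
    print(f"efficiency ≈ {100 * efficiency:.0f} %, library ≈ {conc_pm:.0f} pM")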
Accurate proteome-wide protein quantification from high-resolution 15N mass spectra
2011-01-01
In quantitative mass spectrometry-based proteomics, the metabolic incorporation of a single source of 15N-labeled nitrogen has many advantages over using stable isotope-labeled amino acids. However, the lack of a robust computational framework for analyzing the resulting spectra has impeded wide use of this approach. We have addressed this challenge by introducing a new computational methodology for analyzing 15N spectra in which quantification is integrated with identification. Application of this method to an Escherichia coli growth transition reveals significant improvement in quantification accuracy over previous methods. PMID:22182234
Enhancing sparsity of Hermite polynomial expansions by iterative rotations
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yang, Xiu; Lei, Huan; Baker, Nathan A.
2016-02-01
Compressive sensing has become a powerful addition to uncertainty quantification in recent years. This paper identifies new bases for the random variables through linear mappings such that the representation of the quantity of interest is sparser in the basis functions associated with the new random variables. This sparsity increases both the efficiency and accuracy of the compressive sensing-based uncertainty quantification method. Specifically, we consider rotation-based linear mappings which are determined iteratively for Hermite polynomial expansions. We demonstrate the effectiveness of the new method with applications in solving stochastic partial differential equations and high-dimensional (O(100)) problems.
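The iterative rotation itself is beyond a short example, but the underlying compressive-sensing step (recovering sparse Hermite expansion coefficients from limited samples via l1 regularisation) can be sketched as follows; the target function, sample size and regularisation strength are assumptions, and in practice the polynomial basis is far larger than the sample set:

    import numpy as np
    from numpy.polynomial.hermite_e import hermeval
    from sklearn.linear_model import Lasso

    rng = np.random.default_rng(4)
    max_degree = 6                      # truncation order of the 1D Hermite expansion
    n_samples = 40                      # limited number of model evaluations

    # Hypothetical quantity of interest with a sparse Hermite representation
    def qoi(x):
        return 0.8 * hermeval(x, [0, 0, 1]) - 0.3 * hermeval(x, [0, 0, 0, 0, 1])

    x = rng.standard_normal(n_samples)          # standard Gaussian inputs
    y = qoi(x) + 0.01 * rng.standard_normal(n_samples)

    # Measurement matrix: probabilists' Hermite polynomials He_0..He_6 evaluated at x
    Psi = np.column_stack([hermeval(x, np.eye(max_degree + 1)[k])
                           for k in range(max_degree + 1)])

    # l1-regularised recovery of the sparse coefficient vector
    coef = Lasso(alpha=1e-3, fit_intercept=False, max_iter=50000).fit(Psi, y).coef_
    print(np.round(coef, 2))    # dominant entries should sit at degrees 2 and 4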
Neiens, Patrick; De Simone, Angela; Ramershoven, Anna; Höfner, Georg; Allmendinger, Lars; Wanner, Klaus T
2018-03-03
MS Binding Assays represent a label-free alternative to radioligand binding assays. In this study, we present an LC-ESI-MS/MS method for the quantification of (R,R)-4-(2-benzhydryloxyethyl)-1-(4-fluorobenzyl)piperidin-3-ol [(R,R)-D-84, (R,R)-1], (S,S)-reboxetine [(S,S)-2], and (S)-citalopram [(S)-3] employed as highly selective nonlabeled reporter ligands in MS Binding Assays addressing the dopamine [DAT, (R,R)-D-84], norepinephrine [NET, (S,S)-reboxetine] and serotonin transporter [SERT, (S)-citalopram], respectively. The developed LC-ESI-MS/MS method uses a pentafluorphenyl stationary phase in combination with a mobile phase composed of acetonitrile and ammonium formate buffer for chromatography and a triple quadrupole mass spectrometer in the multiple reaction monitoring mode for mass spectrometric detection. Quantification is based on deuterated derivatives of all three analytes serving as internal standards. The established LC-ESI-MS/MS method enables fast, robust, selective and highly sensitive quantification of all three reporter ligands in a single chromatographic run. The method was validated according to the Center for Drug Evaluation and Research (CDER) guideline for bioanalytical method validation regarding selectivity, accuracy, precision, calibration curve and sensitivity. Finally, filtration-based MS Binding Assays were performed for all three monoamine transporters based on this LC-ESI-MS/MS quantification method as read out. The affinities determined in saturation experiments for (R,R)-D-84 toward hDAT, for (S,S)-reboxetine toward hNET, and for (S)-citalopram toward hSERT, respectively, were in good accordance with results from literature, clearly demonstrating that the established MS Binding Assays have the potential to be an efficient alternative to radioligand binding assays widely used for this purpose so far. Copyright © 2018 John Wiley & Sons, Ltd.
Stable isotope labelling methods in mass spectrometry-based quantitative proteomics.
Chahrour, Osama; Cobice, Diego; Malone, John
2015-09-10
Mass-spectrometry based proteomics has evolved as a promising technology over the last decade and is undergoing a dramatic development in a number of different areas, such as; mass spectrometric instrumentation, peptide identification algorithms and bioinformatic computational data analysis. The improved methodology allows quantitative measurement of relative or absolute protein amounts, which is essential for gaining insights into their functions and dynamics in biological systems. Several different strategies involving stable isotope labels (ICAT, ICPL, IDBEST, iTRAQ, TMT, IPTL, SILAC), label-free statistical assessment approaches (MRM, SWATH) and absolute quantification methods (AQUA) are possible, each having specific strengths and weaknesses. Inductively coupled plasma mass spectrometry (ICP-MS), which is still widely recognised as an elemental detector, has recently emerged as a complementary technique to the previous methods. The new application area for ICP-MS is targeting the fast growing field of proteomics related research, allowing absolute protein quantification using suitable elemental based tags. This document describes the different stable isotope labelling methods which incorporate metabolic labelling in live cells, ICP-MS based detection and post-harvest chemical label tagging for protein quantification, in addition to summarising their pros and cons. Copyright © 2015 Elsevier B.V. All rights reserved.
Bostijn, N; Hellings, M; Van Der Veen, M; Vervaet, C; De Beer, T
2018-07-12
Ultraviolet (UV) spectroscopy was evaluated as an innovative Process Analytical Technology (PAT) - tool for the in-line and real-time quantitative determination of low-dosed active pharmaceutical ingredients (APIs) in a semi-solid (gel) and a liquid (suspension) pharmaceutical formulation during their batch production process. The performance of this new PAT-tool (i.e., UV spectroscopy) was compared with an already more established PAT-method based on Raman spectroscopy. In-line UV measurements were carried out with an immersion probe while for the Raman measurements a non-contact PhAT probe was used. For both studied formulations, an in-line API quantification model was developed and validated per spectroscopic technique. The known API concentrations (Y) were correlated with the corresponding in-line collected preprocessed spectra (X) through a Partial Least Squares (PLS) regression. Each developed quantification method was validated by calculating the accuracy profile on the basis of the validation experiments. Furthermore, the measurement uncertainty was determined based on the data generated for the determination of the accuracy profiles. From the accuracy profile of the UV- and Raman-based quantification method for the gel, it was concluded that at the target API concentration of 2% (w/w), 95 out of 100 future routine measurements given by the Raman method will not deviate more than 10% (relative error) from the true API concentration, whereas for the UV method the acceptance limits of 10% were exceeded. For the liquid formulation, the Raman method was not able to quantify the API in the low-dosed suspension (0.09% (w/w) API). In contrast, the in-line UV method was able to adequately quantify the API in the suspension. This study demonstrated that UV spectroscopy can be adopted as a novel in-line PAT-technique for low-dose quantification purposes in pharmaceutical processes. Importantly, neither of the two spectroscopic techniques was superior to the other for both formulations: the Raman method was more accurate in quantifying the API in the gel (2% (w/w) API), while the UV method performed better for API quantification in the suspension (0.09% (w/w) API). Copyright © 2018 Elsevier B.V. All rights reserved.
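A minimal sketch of the calibration step (partial least squares regression of preprocessed spectra against known API concentrations) is given below with simulated spectra; the band shapes, concentration range and component count are hypothetical:

    import numpy as np
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.model_selection import cross_val_predict

    rng = np.random.default_rng(5)
    wavelengths = np.linspace(220, 400, 181)
    pure_api = np.exp(-((wavelengths - 270) ** 2) / 400)     # hypothetical API band
    excipient = np.exp(-((wavelengths - 330) ** 2) / 900)    # hypothetical interferent

    # Calibration set: known API concentrations (% w/w) and simulated preprocessed spectra
    y = rng.uniform(1.6, 2.4, size=40)                       # around a 2 % target
    X = (np.outer(y, pure_api)
         + np.outer(rng.uniform(0.5, 1.5, 40), excipient)
         + 0.002 * rng.standard_normal((40, wavelengths.size)))

    pls = PLSRegression(n_components=2)
    y_cv = cross_val_predict(pls, X, y, cv=10).ravel()
    rmsecv = np.sqrt(np.mean((y - y_cv) ** 2))
    print(f"RMSECV ≈ {rmsecv:.3f} % w/w")

    # Fit on all calibration data and predict a new in-line spectrum
    pls.fit(X, y)
    new_spectrum = 2.05 * pure_api + 1.0 * excipient
    print(f"predicted API ≈ {pls.predict(new_spectrum.reshape(1, -1)).ravel()[0]:.2f} % w/w")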
Verant, Michelle L; Bohuski, Elizabeth A; Lorch, Jeffery M; Blehert, David S
2016-03-01
The continued spread of white-nose syndrome and its impacts on hibernating bat populations across North America has prompted nationwide surveillance efforts and the need for high-throughput, noninvasive diagnostic tools. Quantitative real-time polymerase chain reaction (qPCR) analysis has been increasingly used for detection of the causative fungus, Pseudogymnoascus destructans, in both bat- and environment-associated samples and provides a tool for quantification of fungal DNA useful for research and monitoring purposes. However, precise quantification of nucleic acid from P. destructans is dependent on effective and standardized methods for extracting nucleic acid from various relevant sample types. We describe optimized methodologies for extracting fungal nucleic acids from sediment, guano, and swab-based samples using commercial kits together with a combination of chemical, enzymatic, and mechanical modifications. Additionally, we define modifications to a previously published intergenic spacer-based qPCR test for P. destructans to refine quantification capabilities of this assay. © 2016 The Author(s).
Quaternary ammonium isobaric tag for a relative and absolute quantification of peptides.
Setner, Bartosz; Stefanowicz, Piotr; Szewczuk, Zbigniew
2018-02-01
Isobaric labeling quantification of peptides has become a method of choice for mass spectrometry-based proteomics studies. However, despite the wide variety of commercially available isobaric tags, none of the currently available methods offers a significant improvement in detection sensitivity during the MS experiment. Recently, many strategies have been applied to increase the ionization efficiency of peptides, involving chemical modifications that introduce a quaternary ammonium fixed charge. Here, we present a novel quaternary ammonium-based isobaric tag for relative and absolute quantification of peptides (QAS-iTRAQ 2-plex). Upon collisional activation, the new stable benzylic-type cationic reporter ion is liberated from the tag. Deuterium atoms were used to offset the differential masses of the reporter group. We tested the applicability of the QAS-iTRAQ 2-plex reagent on a series of model peptides as well as a bovine serum albumin tryptic digest. The results obtained suggest the usefulness of this isobaric ionization tag for relative and absolute quantification of peptides. Copyright © 2017 John Wiley & Sons, Ltd.
Quantifying construction and demolition waste: an analytical review.
Wu, Zezhou; Yu, Ann T W; Shen, Liyin; Liu, Guiwen
2014-09-01
Quantifying construction and demolition (C&D) waste generation is regarded as a prerequisite for the implementation of successful waste management. In literature, various methods have been employed to quantify the C&D waste generation at both regional and project levels. However, an integrated review that systemically describes and analyses all the existing methods has yet to be conducted. To bridge this research gap, an analytical review is conducted. Fifty-seven papers are retrieved based on a set of rigorous procedures. The characteristics of the selected papers are classified according to the following criteria - waste generation activity, estimation level and quantification methodology. Six categories of existing C&D waste quantification methodologies are identified, including site visit method, waste generation rate method, lifetime analysis method, classification system accumulation method, variables modelling method and other particular methods. A critical comparison of the identified methods is given according to their characteristics and implementation constraints. Moreover, a decision tree is proposed for aiding the selection of the most appropriate quantification method in different scenarios. Based on the analytical review, limitations of previous studies and recommendations of potential future research directions are further suggested. Copyright © 2014 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Restaino, Stephen M.; White, Ian M.
2017-03-01
Surface Enhanced Raman spectroscopy (SERS) provides significant improvements over conventional methods for single and multianalyte quantification. Specifically, the spectroscopic fingerprint provided by Raman scattering allows for a direct multiplexing potential far beyond that of fluorescence and colorimetry. Additionally, SERS generates a comparatively low financial and spatial footprint compared with common fluorescence-based systems. Despite the advantages of SERS, it has remained largely an academic pursuit. In the field of biosensing, techniques to apply SERS to molecular diagnostics are constantly under development but, most often, assay protocols are redesigned around the use of SERS as a quantification method and ultimately complicate existing protocols. Our group has sought to rethink common SERS methodologies in order to produce translational technologies capable of allowing SERS to compete in the evolving, yet often inflexible biosensing field. This work will discuss the development of two techniques for quantification of microRNA, a promising biomarker for homeostatic and disease conditions ranging from cancer to HIV. First, an inkjet-printed paper SERS sensor has been developed to allow on-demand production of a customizable and multiplexable single-step lateral flow assay for miRNA quantification. Second, as miRNAs commonly exist at relatively low concentrations, amplification methods (e.g., PCR) are required to facilitate quantification. This work presents a novel miRNA assay alongside a novel technique for quantification of nuclease-driven nucleic acid amplification strategies that will allow SERS to be used directly with common amplification strategies for quantification of miRNA and other nucleic acid biomarkers.
Simple and rapid quantification of brominated vegetable oil in commercial soft drinks by LC–MS
Chitranshi, Priyanka; da Costa, Gonçalo Gamboa
2016-01-01
We report here a simple and rapid method for the quantification of brominated vegetable oil (BVO) in soft drinks based upon liquid chromatography–electrospray ionization mass spectrometry. Unlike previously reported methods, this novel method does not require hydrolysis, extraction or derivatization steps, but rather a simple “dilute and shoot” sample preparation. The quantification is conducted by mass spectrometry in selected ion recording mode and a single point standard addition procedure. The method was validated in the range of 5–25 μg/mL BVO, encompassing the legal limit of 15 μg/mL established by the US FDA for fruit-flavored beverages in the US market. The method was characterized by excellent intra- and inter-assay accuracy (97.3–103.4%) and very low imprecision [0.5–3.6% (RSD)]. The direct nature of the quantification, simplicity, and excellent statistical performance of this methodology constitute clear advantages in relation to previously published methods for the analysis of BVO in soft drinks. PMID:27451219
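Single-point standard addition reduces to a simple ratio of responses. The sketch below shows that arithmetic with hypothetical peak areas, spike level and dilution factor:

    # Hypothetical SIR peak areas for a "dilute and shoot" standard-addition measurement
    area_sample = 3.0e5           # diluted soft drink alone
    area_spiked = 8.0e5           # same dilution plus a known BVO addition
    added_conc = 10.0             # µg/mL of BVO added to the spiked aliquot
    dilution_factor = 2.0         # hypothetical dilution applied to the drink

    # Single-point standard addition: response assumed linear through the origin
    conc_in_vial = added_conc * area_sample / (area_spiked - area_sample)
    conc_in_drink = conc_in_vial * dilution_factor
    print(f"BVO ≈ {conc_in_drink:.1f} µg/mL (US FDA limit for comparison: 15 µg/mL)")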
Nitric Oxide Analyzer Quantification of Plant S-Nitrosothiols.
Hussain, Adil; Yun, Byung-Wook; Loake, Gary J
2018-01-01
Nitric oxide (NO) is a small diatomic molecule that regulates multiple physiological processes in animals, plants, and microorganisms. In animals, it is involved in vasodilation and neurotransmission and is present in exhaled breath. In plants, it regulates both plant immune function and numerous developmental programs. The high reactivity and short half-life of NO and cross-reactivity of its various derivatives make its quantification difficult. Different methods based on colorimetric, fluorometric, and chemiluminescent detection of NO and its derivatives are available, but all of them have significant limitations. Here we describe a method for the chemiluminescence-based quantification of NO using ozone-chemiluminescence technology in plants. This method provides a sensitive, robust, and flexible approach for determining the levels of NO and its signaling products, protein S-nitrosothiols.
Tey, Wei Keat; Kuang, Ye Chow; Ooi, Melanie Po-Leen; Khoo, Joon Joon
2018-03-01
Interstitial fibrosis in renal biopsy samples is a scarring tissue structure that may be visually quantified by pathologists as an indicator of the presence and extent of chronic kidney disease. The standard method of quantification by visual evaluation presents reproducibility issues in the diagnoses due to the uncertainties in human judgement. This study presents an automated quantification system for accurately measuring the amount of interstitial fibrosis in renal biopsy images as a consistent basis of comparison among pathologists. The system identifies the renal tissue structures through knowledge-based rules employing colour space transformations and structural feature extraction from the images. In particular, the renal glomerulus identification is based on a multiscale textural feature analysis and a support vector machine. The regions in the biopsy representing interstitial fibrosis are deduced through the elimination of non-interstitial fibrosis structures from the biopsy area and quantified as a percentage of the total area of the biopsy sample. The experiments conducted evaluate the system in terms of quantification accuracy, intra- and inter-observer variability in visual quantification by pathologists, and the effect introduced by the automated quantification system on the pathologists' diagnosis. A 40-image ground truth dataset has been manually prepared by consulting an experienced pathologist for the validation of the segmentation algorithms. The results from experiments involving experienced pathologists have demonstrated an average error of 9 percentage points in quantification result between the automated system and the pathologists' visual evaluation. Experiments investigating the variability in pathologists, involving samples from 70 kidney patients, also proved the automated quantification error rate to be on par with the average intra-observer variability in pathologists' quantification. The accuracy of the proposed quantification system has been validated with the ground truth dataset and compared against the pathologists' quantification results. It has been shown that the correlation between different pathologists' estimation of interstitial fibrosis area has significantly improved, demonstrating the effectiveness of the quantification system as a diagnostic aid. Copyright © 2017 Elsevier B.V. All rights reserved.
Improved LC-MS/MS method for the quantification of hepcidin-25 in clinical samples.
Abbas, Ioana M; Hoffmann, Holger; Montes-Bayón, María; Weller, Michael G
2018-06-01
Mass spectrometry-based methods play a crucial role in the quantification of the main iron metabolism regulator hepcidin by singling out the bioactive 25-residue peptide from the other naturally occurring N-truncated isoforms (hepcidin-20, -22, -24), which seem to be inactive in iron homeostasis. However, several difficulties arise in the MS analysis of hepcidin due to the "sticky" character of the peptide and the lack of suitable standards. Here, we propose the use of amino- and fluoro-silanized autosampler vials to reduce hepcidin interaction with laboratory glassware surfaces after testing several types of vials for the preparation of stock solutions and serum samples for isotope dilution liquid chromatography-tandem mass spectrometry (ID-LC-MS/MS). Furthermore, we have investigated two sample preparation strategies and two chromatographic separation conditions with the aim of developing an LC-MS/MS method for the sensitive and reliable quantification of hepcidin-25 in serum samples. A chromatographic separation based on usual acidic mobile phases was compared with a novel approach involving the separation of hepcidin-25 with solvents at high pH containing 0.1% of ammonia. Both methods were applied to clinical samples in an intra-laboratory comparison of two LC-MS/MS methods using the same hepcidin-25 calibrators with good correlation of the results. Finally, we recommend an LC-MS/MS-based quantification method with a dynamic range of 0.5-40 μg/L for the assessment of hepcidin-25 in human serum that uses TFA-based mobile phases and silanized glass vials. Graphical abstract: structure of hepcidin-25 (Protein Data Bank, PDB ID 2KEF).
Esquinas, Pedro L; Uribe, Carlos F; Gonzalez, M; Rodríguez-Rodríguez, Cristina; Häfeli, Urs O; Celler, Anna
2017-07-20
The main applications of 188Re in radionuclide therapies include trans-arterial liver radioembolization and palliation of painful bone metastases. In order to optimize 188Re therapies, the accurate determination of radiation dose delivered to tumors and organs at risk is required. Single photon emission computed tomography (SPECT) can be used to perform such dosimetry calculations. However, the accuracy of dosimetry estimates strongly depends on the accuracy of activity quantification in 188Re images. In this study, we performed a series of phantom experiments aiming to investigate the accuracy of activity quantification for 188Re SPECT using high-energy and medium-energy collimators. Objects of different shapes and sizes were scanned in air, non-radioactive water (cold water) and water with activity (hot water). The ordered subset expectation maximization algorithm with clinically available corrections (CT-based attenuation, triple-energy window (TEW) scatter, and resolution recovery) was used. For high activities, dead-time corrections were applied. The accuracy of activity quantification was evaluated using the ratio of the reconstructed activity in each object to this object's true activity. Each object's activity was determined with three segmentation methods: a 1% fixed threshold (for cold background), a 40% fixed threshold and a CT-based segmentation. Additionally, the activity recovered in the entire phantom, as well as the average activity concentration of the phantom background, were compared to their true values. Finally, Monte-Carlo simulations of a commercial gamma camera were performed to investigate the accuracy of the TEW method. Good quantification accuracy (errors <10%) was achieved for the entire phantom, the hot-background activity concentration and for objects in cold background segmented with a 1% threshold. However, the accuracy of activity quantification for objects segmented with 40% threshold or CT-based methods decreased (errors >15%), mostly due to partial-volume effects. The Monte-Carlo simulations confirmed that TEW-scatter correction applied to 188Re, although practical, yields only approximate estimates of the true scatter.
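The triple-energy window scatter estimate mentioned above follows a standard trapezoidal formula; here is a sketch with hypothetical window widths and counts for a single projection pixel:

    # Triple-energy-window (TEW) scatter estimate for one projection pixel (hypothetical counts)
    counts_peak = 950.0        # counts in the photopeak window
    counts_lower = 140.0       # counts in the narrow window just below the photopeak
    counts_upper = 60.0        # counts in the narrow window just above it

    width_peak = 30.0          # photopeak window width (keV), hypothetical
    width_sub = 6.0            # width of each scatter sub-window (keV), hypothetical

    # Trapezoidal TEW estimate of scatter inside the photopeak window
    scatter = (counts_lower / width_sub + counts_upper / width_sub) * width_peak / 2.0
    primary = max(counts_peak - scatter, 0.0)
    print(f"estimated scatter: {scatter:.0f} counts, primary: {primary:.0f} counts")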
Piñeiro, Zulema; Cantos-Villar, Emma; Palma, Miguel; Puertas, Belen
2011-11-09
A validated HPLC method with fluorescence detection for the simultaneous quantification of hydroxytyrosol and tyrosol in red wines is described. Detection conditions for both compounds were optimized (excitation at 279 and 278 nm and emission at 631 and 598 nm for hydroxytyrosol and tyrosol, respectively). The validation of the analytical method was based on selectivity, linearity, robustness, detection and quantification limits, repeatability, and recovery. The detection and quantification limits in red wines were set at 0.023 and 0.076 mg L(-1) for hydroxytyrosol and at 0.007 and 0.024 mg L(-1) for tyrosol determination, respectively. Precision values, both within-day and between-day (n = 5), remained below 3% for both compounds. In addition, a fractional factorial experimental design was developed to analyze the influence of six different conditions on the analysis. The final optimized HPLC-fluorescence method allowed the analysis of 30 nonpretreated Spanish red wines to evaluate their hydroxytyrosol and tyrosol contents.
Uncertainty Quantification in Alchemical Free Energy Methods.
Bhati, Agastya P; Wan, Shunzhou; Hu, Yuan; Sherborne, Brad; Coveney, Peter V
2018-06-12
Alchemical free energy methods have gained much importance recently from several reports of improved ligand-protein binding affinity predictions based on their implementation using molecular dynamics simulations. A large number of variants of such methods implementing different accelerated sampling techniques and free energy estimators are available, each claimed to be better than the others in its own way. However, the key features of reproducibility and quantification of associated uncertainties in such methods have barely been discussed. Here, we apply a systematic protocol for uncertainty quantification to a number of popular alchemical free energy methods, covering both absolute and relative free energy predictions. We show that a reliable measure of error estimation is provided by ensemble simulation (an ensemble of independent MD simulations), which applies irrespective of the free energy method. The need to use ensemble methods is fundamental and holds regardless of the duration of the molecular dynamics simulations performed.
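A minimal sketch of ensemble-based error estimation (mean, standard error and a bootstrap over independent replica estimates, with invented free energy values) is:

    import numpy as np

    # Hypothetical relative binding free energies (kcal/mol) from an ensemble of
    # independent replica simulations of the same alchemical transformation
    replicas = np.array([-1.92, -2.35, -1.78, -2.10, -2.51, -1.99, -2.22, -2.05])

    mean_dg = replicas.mean()
    stderr = replicas.std(ddof=1) / np.sqrt(replicas.size)

    # Simple bootstrap over replicas as an alternative, distribution-free error estimate
    rng = np.random.default_rng(6)
    boot = np.array([rng.choice(replicas, size=replicas.size, replace=True).mean()
                     for _ in range(5000)])
    print(f"ΔΔG = {mean_dg:.2f} ± {stderr:.2f} kcal/mol (bootstrap SD {boot.std():.2f})")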
Deconinck, E; Crevits, S; Baten, P; Courselle, P; De Beer, J
2011-04-05
A fully validated UHPLC method for the identification and quantification of folic acid in pharmaceutical preparations was developed. The starting conditions for the development were calculated from the HPLC conditions of a validated method. These starting conditions were tested on four different UHPLC columns: Grace Vision HT™ C18-P, C18, C18-HL and C18-B (2 mm × 100 mm, 1.5 μm). After selection of the stationary phase, the method was further optimised by testing two aqueous and two organic phases and by switching to gradient elution. The resulting method was fully validated based on its measurement uncertainty (accuracy profile) and robustness tests. A UHPLC method was thus obtained for the identification and quantification of folic acid in pharmaceutical preparations, which will cut analysis times and solvent consumption. Copyright © 2010 Elsevier B.V. All rights reserved.
Clais, S; Boulet, G; Van Kerckhoven, M; Lanckacker, E; Delputte, P; Maes, L; Cos, P
2015-01-01
The viable plate count (VPC) is considered as the reference method for bacterial enumeration in periodontal microbiology but shows some important limitations for anaerobic bacteria. As anaerobes such as Porphyromonas gingivalis are difficult to culture, VPC becomes time-consuming and less sensitive. Hence, efficient normalization of experimental data to bacterial cell count requires alternative rapid and reliable quantification methods. This study compared the performance of VPC with that of turbidity measurement and real-time PCR (qPCR) in an experimental context using highly concentrated bacterial suspensions. Our TaqMan-based qPCR assay for P. gingivalis 16S rRNA proved to be sensitive and specific. Turbidity measurements offer a fast method to assess P. gingivalis growth, but suffer from high variability and a limited dynamic range. VPC was very time-consuming and less repeatable than qPCR. Our study concludes that qPCR provides the most rapid and precise approach for P. gingivalis quantification. Although our data were gathered in a specific research context, we believe that our conclusions on the inferior performance of VPC and turbidity measurements in comparison to qPCR can be extended to other research and clinical settings and even to other difficult-to-culture micro-organisms. Various clinical and research settings require fast and reliable quantification of bacterial suspensions. The viable plate count method (VPC) is generally seen as 'the gold standard' for bacterial enumeration. However, VPC-based quantification of anaerobes such as Porphyromonas gingivalis is time-consuming due to their stringent growth requirements and shows poor repeatability. Comparison of VPC, turbidity measurement and TaqMan-based qPCR demonstrated that qPCR possesses important advantages regarding speed, accuracy and repeatability. © 2014 The Society for Applied Microbiology.
Lavallée-Adam, Mathieu; Rauniyar, Navin; McClatchy, Daniel B; Yates, John R
2014-12-05
The majority of large-scale proteomics quantification methods yield long lists of quantified proteins that are often difficult to interpret and poorly reproduced. Computational approaches are required to analyze such intricate quantitative proteomics data sets. We propose a statistical approach to computationally identify protein sets (e.g., Gene Ontology (GO) terms) that are significantly enriched with abundant proteins with reproducible quantification measurements across a set of replicates. To this end, we developed PSEA-Quant, a protein set enrichment analysis algorithm for label-free and label-based protein quantification data sets. It offers an alternative approach to classic GO analyses, models protein annotation biases, and allows the analysis of samples originating from a single condition, unlike analogous approaches such as GSEA and PSEA. We demonstrate that PSEA-Quant produces results complementary to GO analyses. We also show that PSEA-Quant provides valuable information about the biological processes involved in cystic fibrosis using label-free protein quantification of a cell line expressing a CFTR mutant. Finally, PSEA-Quant highlights the differences in the mechanisms taking place in the human, rat, and mouse brain frontal cortices based on tandem mass tag quantification. Our approach, which is available online, will thus improve the analysis of proteomics quantification data sets by providing meaningful biological insights.
Oberbach, Andreas; Schlichting, Nadine; Neuhaus, Jochen; Kullnick, Yvonne; Lehmann, Stefanie; Heinrich, Marco; Dietrich, Arne; Mohr, Friedrich Wilhelm; von Bergen, Martin; Baumann, Sven
2014-12-05
Multiple reaction monitoring (MRM)-based mass spectrometric quantification of peptides and their corresponding proteins has been successfully applied for biomarker validation in serum. The option of multiplexing offers the chance to analyze various proteins in parallel, which is especially important in obesity research. Here, biomarkers that reflect multiple comorbidities and allow monitoring of therapy outcomes are required. Besides serum protein quantification, established MRM assays are also suitable for the analysis of tissues secreting the markers of interest. Surprisingly, studies comparing MRM data sets with established methods are rare, and therefore the biological and clinical value of most analytes remains questionable. An MRM method using nano-UPLC-MS/MS for the quantification of obesity-related surrogate markers for several comorbidities in serum, plasma, visceral and subcutaneous adipose tissue was established. Proteotypic peptides for complement C3, adiponectin, angiotensinogen, and plasma retinol binding protein (RBP4) were quantified using isotopic dilution analysis and compared to the standard ELISA method. MRM method variabilities were mainly below 10%. The comparison with other MS-based approaches showed a good correlation. However, large differences in absolute quantification for complement C3 and adiponectin were obtained compared to ELISA, while less marked differences were observed for angiotensinogen and RBP4. The verification of MRM in obesity was performed, first, to discriminate lean and obese phenotypes and, second, to monitor excessive weight loss after gastric bypass surgery over a seven-month follow-up. The presented MRM assay was able to discriminate the obese phenotype from the lean one and to monitor weight-loss-related changes in the surrogate markers. However, inclusion of additional biomarkers was necessary to interpret the MRM data on obesity phenotype properly. In summary, the development of disease-related MRMs should include a step of matching the MRM data with clinically approved standard methods and defining reference values in well-sized representative age, gender, and disease-matched cohorts.
Lowering the quantification limit of the Qubit™ RNA HS assay using RNA spike-in.
Li, Xin; Ben-Dov, Iddo Z; Mauro, Maurizio; Williams, Zev
2015-05-06
RNA quantification is a prerequisite for most RNA analyses such as RNA sequencing. However, the relatively low sensitivity and large sample consumption of traditional RNA quantification methods such as UV spectrophotometry, and even the much more sensitive fluorescence-based RNA quantification assays, such as the Qubit™ RNA HS Assay, are often inadequate for measuring minute levels of RNA isolated from limited cell and tissue samples and biofluids. Thus, there is a pressing need for a more sensitive method to reliably and robustly detect trace levels of RNA without interference from DNA. To improve the quantification limit of the Qubit™ RNA HS Assay, we spiked in a known quantity of RNA to achieve the minimum reading required by the assay. Samples containing trace amounts of RNA were then added to the spike-in and measured as a reading increase over the RNA spike-in baseline. We determined the accuracy and precision of reading increases between 1 and 20 pg/μL as well as RNA-specificity in this range, and compared to those of RiboGreen®, another sensitive fluorescence-based RNA quantification assay. We then applied the Qubit™ Assay with RNA spike-in to quantify plasma RNA samples. RNA spike-in improved the quantification limit of the Qubit™ RNA HS Assay 5-fold, from 25 pg/μL down to 5 pg/μL while maintaining high specificity to RNA. This enabled quantification of RNA with original concentration as low as 55.6 pg/μL compared to 250 pg/μL for the standard assay and decreased sample consumption from 5 to 1 ng. Plasma RNA samples that were not measurable by the Qubit™ RNA HS Assay were measurable by our modified method. The Qubit™ RNA HS Assay with RNA spike-in is able to quantify RNA with high specificity at 5-fold lower concentration and uses 5-fold less sample quantity than the standard Qubit™ Assay.
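The spike-in approach amounts to subtracting the spike-in baseline from the combined reading and scaling by the assay dilution. A sketch with hypothetical readings and volumes (not the study's actual values):

    # Hypothetical Qubit-style readings (pg/µL in the assay tube) with an RNA spike-in
    baseline_reading = 24.0       # spike-in RNA alone, above the assay's minimum reading
    reading_with_sample = 29.0    # spike-in plus the trace RNA sample
    sample_volume_ul = 20.0       # volume of sample added to the tube (assumed)
    assay_volume_ul = 200.0       # total assay volume (assumed)

    # Sample concentration inferred from the reading increase over the spike-in baseline
    increase = reading_with_sample - baseline_reading
    original_conc = increase * assay_volume_ul / sample_volume_ul
    print(f"increase {increase:.1f} pg/µL in-tube -> ≈ {original_conc:.0f} pg/µL in the original sample")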
Phylogenetic Quantification of Intra-tumour Heterogeneity
Schwarz, Roland F.; Trinh, Anne; Sipos, Botond; Brenton, James D.; Goldman, Nick; Markowetz, Florian
2014-01-01
Intra-tumour genetic heterogeneity is the result of ongoing evolutionary change within each cancer. The expansion of genetically distinct sub-clonal populations may explain the emergence of drug resistance, and if so, would have prognostic and predictive utility. However, methods for objectively quantifying tumour heterogeneity have been missing and are particularly difficult to establish in cancers where predominant copy number variation prevents accurate phylogenetic reconstruction owing to horizontal dependencies caused by long and cascading genomic rearrangements. To address these challenges, we present MEDICC, a method for phylogenetic reconstruction and heterogeneity quantification based on a Minimum Event Distance for Intra-tumour Copy-number Comparisons. Using a transducer-based pairwise comparison function, we determine optimal phasing of major and minor alleles, as well as evolutionary distances between samples, and are able to reconstruct ancestral genomes. Rigorous simulations and an extensive clinical study show the power of our method, which outperforms state-of-the-art competitors in reconstruction accuracy, and additionally allows unbiased numerical quantification of tumour heterogeneity. Accurate quantification and evolutionary inference are essential to understand the functional consequences of tumour heterogeneity. The MEDICC algorithms are independent of the experimental techniques used and are applicable to both next-generation sequencing and array CGH data. PMID:24743184
NASA Astrophysics Data System (ADS)
Zhao, Fengjun; Liu, Junting; Qu, Xiaochao; Xu, Xianhui; Chen, Xueli; Yang, Xiang; Cao, Feng; Liang, Jimin; Tian, Jie
2014-12-01
To solve the multicollinearity issue and the unequal contribution of vascular parameters to the quantification of angiogenesis, we developed a quantitative evaluation method of vascular parameters for angiogenesis based on in vivo micro-CT imaging of hindlimb ischemic model mice. Taking vascular volume as the ground-truth parameter, nine vascular parameters were first assembled into sparse principal components (PCs) to reduce the multicollinearity issue. Aggregated boosted trees (ABTs) were then employed to analyze the importance of vascular parameters for the quantification of angiogenesis via the loadings of the sparse PCs. The results demonstrated that vascular volume was mainly characterized by vascular area, vascular junction, connectivity density, segment number and vascular length, indicating that these were the key vascular parameters for the quantification of angiogenesis. The proposed quantitative evaluation method was compared both with ABTs applied directly to the nine vascular parameters and with Pearson correlation analysis, and the results were consistent. In contrast to applying ABTs directly to the vascular parameters, the proposed method can select all the key vascular parameters simultaneously, because all the key vascular parameters were assembled into the sparse PCs with the highest relative importance.
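A rough sketch of the two-stage idea above: assemble correlated vascular parameters into sparse principal components, then rank the components for predicting vascular volume with a boosted-tree ensemble. scikit-learn's SparsePCA and GradientBoostingRegressor are used here as stand-ins for the sparse PCs and aggregated boosted trees of the study; the data are synthetic.

    import numpy as np
    from sklearn.decomposition import SparsePCA
    from sklearn.ensemble import GradientBoostingRegressor

    rng = np.random.default_rng(0)
    n_samples, n_params = 40, 9                     # nine vascular parameters (synthetic)
    latent = rng.normal(size=(n_samples, 2))        # two hidden drivers of the parameters
    mix = np.zeros((2, n_params)); mix[0, :5] = 1.0; mix[1, 5:] = 1.0
    X = latent @ mix + 0.3 * rng.normal(size=(n_samples, n_params))
    y = latent[:, 0] + 0.1 * rng.normal(size=n_samples)   # "vascular volume"

    # Stage 1: compress the collinear parameters into sparse principal components.
    spca = SparsePCA(n_components=2, alpha=0.5, random_state=0)
    scores = spca.fit_transform(X)

    # Stage 2: rank the components for predicting the ground-truth parameter.
    gbt = GradientBoostingRegressor(random_state=0).fit(scores, y)
    for k, imp in enumerate(gbt.feature_importances_):
        members = np.nonzero(spca.components_[k])[0]   # parameters assembled in this PC
        print(f"PC{k + 1}: importance={imp:.2f}, parameters={members.tolist()}")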
Rapid quantification of soilborne pathogen communities in wheat-based long-term field experiments
USDA-ARS?s Scientific Manuscript database
Traditional isolation and quantification of inoculum density is difficult for most soilborne pathogens. Quantitative PCR methods have been developed to rapidly identify and quantify many of these pathogens using a single DNA extract from soil. Rainfed experiments operated continuously for up to 84 y...
In this study, a new analytical technique was developed for the identification and quantification of multi-functional compounds containing simultaneously at least one hydroxyl or one carboxylic group, or both. This technique is based on derivatizing first the carboxylic group(s) ...
Dobnik, David; Spilsberg, Bjørn; Bogožalec Košir, Alexandra; Holst-Jensen, Arne; Žel, Jana
2015-08-18
The presence of genetically modified organisms (GMOs) in food and feed products is regulated in many countries. The European Union (EU) has implemented a threshold for labeling of products containing more than 0.9% of authorized GMOs per ingredient. As the number of GMOs has increased over time, standard-curve based simplex quantitative polymerase chain reaction (qPCR) analyses are no longer sufficiently cost-effective, despite widespread use of initial PCR-based screenings. Newly developed GMO detection methods, including multiplex methods, are mostly focused on screening and detection but not quantification. On the basis of droplet digital PCR (ddPCR) technology, multiplex assays for quantification of all 12 EU-authorized GM maize lines (as of April 1, 2015) were developed. Because of the high sequence similarity of some of the 12 GM targets, two separate multiplex assays were needed. In both assays (4-plex and 10-plex), the transgenes were labeled with one fluorescence reporter and the endogene with another (GMO concentration = transgene/endogene ratio). It was shown that both multiplex assays produce specific results and that performance parameters such as limit of quantification, repeatability, and trueness comply with international recommendations for GMO quantification methods. Moreover, for samples containing GMOs, the throughput and cost-effectiveness are significantly improved compared to qPCR. Thus, it was concluded that the multiplex ddPCR assays could be applied for routine quantification of the 12 EU-authorized GM maize lines. In the case of new authorizations, the events can easily be added to the existing multiplex assays. The presented principle of quantitative multiplexing can be applied to any other domain.
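The transgene/endogene ratio underlying these assays can be illustrated with the standard ddPCR Poisson correction, in which the target concentration is recovered from the fraction of negative droplets. The droplet counts below are hypothetical, and the nominal droplet volume is an assumed value.

    import math

    def copies_per_ul(negative_droplets, total_droplets, droplet_volume_nl=0.85):
        """Poisson-corrected target concentration (copies/uL of reaction)."""
        lam = -math.log(negative_droplets / total_droplets)  # mean copies per droplet
        return lam / (droplet_volume_nl * 1e-3)              # nL -> uL

    # Hypothetical droplet counts for one transgene target and the endogene.
    transgene = copies_per_ul(negative_droplets=14500, total_droplets=15000)
    endogene  = copies_per_ul(negative_droplets=9000,  total_droplets=15000)

    gmo_percent = 100.0 * transgene / endogene   # GMO content as copy-number ratio
    print(f"transgene {transgene:.1f} cp/uL, endogene {endogene:.1f} cp/uL, "
          f"GMO = {gmo_percent:.2f} %")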
Uncertainty Quantification for Robust Control of Wind Turbines using Sliding Mode Observer
NASA Astrophysics Data System (ADS)
Schulte, Horst
2016-09-01
A new quantification method for uncertain models in robust wind turbine control using sliding-mode techniques is presented, with the objective of improving active load mitigation. The approach is based on the so-called equivalent output injection signal, which corresponds to the average behavior of the discontinuous switching term that establishes and maintains motion on a so-called sliding surface. The injection signal is evaluated directly to obtain estimates of the uncertainty bounds of external disturbances and parameter uncertainties. The applicability of the proposed method is illustrated by the quantification of a four-degree-of-freedom model of the NREL 5MW reference turbine containing uncertainties.
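A toy illustration of the equivalent-output-injection idea, assuming a first-order plant with an unknown bounded disturbance: low-pass filtering the discontinuous injection term recovers an estimate of the disturbance, whose extrema give the uncertainty bound. This is a simplified stand-in for the wind-turbine model, not the method's actual implementation.

    import numpy as np

    dt, T = 1e-4, 10.0
    t = np.arange(0.0, T, dt)
    a, L, tau = 2.0, 3.0, 0.01       # plant pole, switching gain, filter time constant

    d = 0.8 * np.sin(2.0 * t)        # unknown disturbance (to be reconstructed)
    x, xh, v_f = 0.0, 0.5, 0.0       # plant state, observer state, filtered injection
    v_eq = np.zeros_like(t)

    for k in range(len(t)):
        v = -L * np.sign(xh - x)                 # discontinuous injection term
        x  += dt * (-a * x + d[k])               # plant:    dx/dt  = -a x  + d
        xh += dt * (-a * xh + v)                 # observer: dxh/dt = -a xh + v
        v_f += dt * (v - v_f) / tau              # low-pass filter -> equivalent injection
        v_eq[k] = v_f

    # Once sliding is reached, v_eq tracks d; its extrema bound the disturbance.
    print("estimated disturbance bound:", np.abs(v_eq[len(t) // 2:]).max())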
Zong, Ying; Wang, Yu; Li, Hang; Li, Na; Zhang, Hui; Sun, Jiaming; Niu, Xiaohui; Gao, Xiaochen
2014-01-01
Background: Cervi Cornu Pantotrichum, the young horn of Cervus nippon Temminck (Hualurong: HLR), has long been a well-known traditional Chinese medicine. At present, the methods used for the quality control of Cervi Cornu Pantotrichum show low specificity. Objective: To describe a holistic method based on chemical characteristics and splenocyte-proliferating activities to evaluate the quality of HLR. Materials and Methods: The nucleosides and bases from HLR were identified by high-performance liquid chromatography electrospray ionization mass spectrometry (HPLC-ESI-MS), and six of them were chosen for simultaneous HPLC quantification according to the results of proliferation of mouse splenocytes in vitro. Results: In this study, eight nucleosides and bases were identified. Uracil, hypoxanthine, uridine, inosine, guanosine, and adenosine were chosen for simultaneous HPLC quantification. Simultaneous quantification of these six substances was performed on ten groups of HLR using a TIANHE Kromasil C18 column (5 μm, 4.6 mm × 250 mm i.d.) and a gradient elution of water and acetonitrile. Among the ten groups, the HLR displaying the highest total nucleoside content (TNC, sum of adenosine and uracil, 0.412 mg/g) also showed the strongest splenocyte-proliferating activities. Conclusion: These results suggest that the TNC of HLR (particularly the highly abundant adenosine and uracil) correlates with splenocyte-proliferating activity and may be used for the quality control of HLR. This comprehensive method could be applied to other traditional Chinese medicines to improve their quality control. PMID:25422536
Schmidt, Carla; Grønborg, Mads; Deckert, Jochen; Bessonov, Sergey; Conrad, Thomas; Lührmann, Reinhard; Urlaub, Henning
2014-01-01
The spliceosome undergoes major changes in protein and RNA composition during pre-mRNA splicing. Knowing the proteins—and their respective quantities—at each spliceosomal assembly stage is critical for understanding the molecular mechanisms and regulation of splicing. Here, we applied three independent mass spectrometry (MS)–based approaches for quantification of these proteins: (1) metabolic labeling by SILAC, (2) chemical labeling by iTRAQ, and (3) label-free spectral count for quantification of the protein composition of the human spliceosomal precatalytic B and catalytic C complexes. In total we were able to quantify 157 proteins by at least two of the three approaches. Our quantification shows that only a very small subset of spliceosomal proteins (the U5 and U2 Sm proteins, a subset of U5 snRNP-specific proteins, and the U2 snRNP-specific proteins U2A′ and U2B′′) remains unaltered upon transition from the B to the C complex. The MS-based quantification approaches classify the majority of proteins as dynamically associated specifically with the B or the C complex. In terms of experimental procedure and the methodical aspect of this work, we show that metabolically labeled spliceosomes are functionally active in terms of their assembly and splicing kinetics and can be utilized for quantitative studies. Moreover, we obtain consistent quantification results from all three methods, including the relatively straightforward and inexpensive label-free spectral count technique. PMID:24448447
Cryar, Adam; Pritchard, Caroline; Burkitt, William; Walker, Michael; O'Connor, Gavin; Burns, Duncan Thorburn; Quaglia, Milena
2013-01-01
Current routine food allergen quantification methods, which are based on immunochemistry, offer high sensitivity but can suffer from issues of specificity and significant variability of results. MS approaches have been developed, but currently lack metrological traceability. A feasibility study on the application of metrologically traceable MS-based reference procedures was undertaken. A proof of concept involving proteolytic digestion and isotope dilution MS for quantification of protein allergens in a food matrix was carried out using lysozyme in wine as a model system. A concentration of lysozyme in wine of 0.95 ± 0.03 μg/g was calculated based on the concentrations of two peptides, confirming that this type of analysis is viable at allergenically meaningful concentrations. The challenges associated with this promising method were explored; these included peptide stability, chemical modification, enzymatic digestion, and sample cleanup. The method is suitable for the production of allergen-in-food certified reference materials, which, together with the achieved understanding of the effects of sample preparation and of the matrix on the final results, will assist in addressing the bias of the techniques routinely used and improve measurement confidence. Confirmation of the feasibility of MS methods for absolute quantification of an allergenic protein in a food matrix, with results traceable to the International System of Units, is a step towards meaningful comparison of results for allergen proteins among laboratories. This approach will also underpin risk assessment and risk management of allergens in the food industry, and regulatory compliance with the use of thresholds or action levels when adopted.
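A minimal sketch of isotope-dilution quantification as used above: the analyte content follows from the light/heavy peak-area ratio of a surrogate peptide and the known amount of labeled internal standard spiked into the sample. All numbers are illustrative only, chosen to land near the reported order of magnitude.

    def isotope_dilution_conc(area_light, area_heavy, spike_amount_ug, sample_mass_g):
        """Analyte content (ug/g) from the light/heavy area ratio of one peptide."""
        return (area_light / area_heavy) * spike_amount_ug / sample_mass_g

    # Two proteotypic peptides quantified independently; their mean is reported.
    p1 = isotope_dilution_conc(area_light=5.1e5, area_heavy=5.3e5,
                               spike_amount_ug=1.0, sample_mass_g=1.0)
    p2 = isotope_dilution_conc(area_light=4.8e5, area_heavy=5.1e5,
                               spike_amount_ug=1.0, sample_mass_g=1.0)
    print(f"lysozyme estimate: {(p1 + p2) / 2:.3f} ug/g")   # hypothetical example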
Provost, Karine; Leblond, Antoine; Gauthier-Lemire, Annie; Filion, Édith; Bahig, Houda; Lord, Martin
2017-09-01
Planar perfusion scintigraphy with 99mTc-labeled macroaggregated albumin is often used for pretherapy quantification of regional lung perfusion in lung cancer patients, particularly those with poor respiratory function. However, subdividing lung parenchyma into rectangular regions of interest, as done on planar images, is a poor reflection of true lobar anatomy. New tridimensional methods using SPECT and SPECT/CT have been introduced, including semiautomatic lung segmentation software. The present study evaluated inter- and intraobserver agreement on quantification using SPECT/CT software and compared the results for regional lung contribution obtained with SPECT/CT and planar scintigraphy. Methods: Thirty lung cancer patients underwent ventilation-perfusion scintigraphy with 99mTc-macroaggregated albumin and 99mTc-Technegas. The regional lung contribution to perfusion and ventilation was measured on both planar scintigraphy and SPECT/CT using semiautomatic lung segmentation software by 2 observers. Interobserver and intraobserver agreement for the SPECT/CT software was assessed using the intraclass correlation coefficient, Bland-Altman plots, and absolute differences in measurements. Measurements from planar and tridimensional methods were compared using the paired-sample t test and mean absolute differences. Results: Intraclass correlation coefficients were in the excellent range (above 0.9) for both interobserver and intraobserver agreement using the SPECT/CT software. Bland-Altman analyses showed very narrow limits of agreement. Absolute differences were below 2.0% in 96% of both interobserver and intraobserver measurements. There was a statistically significant difference between planar and SPECT/CT methods (P < 0.001) for quantification of perfusion and ventilation for all right lung lobes, with a maximal mean absolute difference of 20.7% for the right middle lobe. There was no statistically significant difference in quantification of perfusion and ventilation for the left lung lobes using either method; however, absolute differences reached 12.0%. The total right and left lung contributions were similar for the two methods, with a mean difference of 1.2% for perfusion and 2.0% for ventilation. Conclusion: Quantification of regional lung perfusion and ventilation using SPECT/CT-based lung segmentation software is highly reproducible. This tridimensional method yields statistically significant differences in measurements for right lung lobes when compared with planar scintigraphy. We recommend that SPECT/CT-based quantification be used for all lung cancer patients undergoing pretherapy evaluation of regional lung function. © 2017 by the Society of Nuclear Medicine and Molecular Imaging.
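The agreement statistics mentioned above (Bland-Altman limits and absolute differences) are straightforward to compute; a small sketch with made-up paired lobar percentages follows. The intraclass correlation coefficient is omitted for brevity.

    import numpy as np

    # Hypothetical regional contributions (%) measured twice by the same observer.
    obs1 = np.array([18.2, 9.5, 24.1, 21.7, 26.5])
    obs2 = np.array([18.6, 9.1, 23.8, 22.3, 26.2])

    diff = obs1 - obs2
    bias = diff.mean()
    loa = 1.96 * diff.std(ddof=1)            # Bland-Altman 95% limits of agreement
    print(f"bias = {bias:.2f} %, limits of agreement = "
          f"[{bias - loa:.2f}, {bias + loa:.2f}] %")
    print("max absolute difference:", np.abs(diff).max(), "%")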
Turner, Clare E; Russell, Bruce R; Gant, Nicholas
2015-11-01
Magnetic resonance spectroscopy (MRS) is an analytical procedure that can be used to non-invasively measure the concentration of a range of neural metabolites. Creatine is an important neurometabolite, with dietary supplementation offering therapeutic potential for neurological disorders with dysfunctional energetic processes. Neural creatine concentrations can be probed using proton MRS and quantified using a range of software packages based on different analytical methods. This experiment examines the differences in quantification performance of two commonly used analysis packages following a creatine supplementation strategy with potential therapeutic application. Human participants followed a seven-day dietary supplementation regime in a placebo-controlled, cross-over design interspersed with a five-week wash-out period. Spectroscopy data were acquired the day immediately following supplementation and analyzed with two commonly used software packages which employ vastly different quantification methods. Results demonstrate that neural creatine concentration was augmented following creatine supplementation when analyzed using the peak-fitting method of quantification (105.9% ± 10.1). In contrast, no change in neural creatine levels was detected with supplementation when analysis was conducted using the basis-spectrum method of quantification (102.6% ± 8.6). Results suggest that software packages that employ the peak-fitting procedure for spectral quantification are possibly more sensitive to subtle changes in neural creatine concentrations. The relative simplicity of the spectroscopy sequence and the data analysis procedure suggests that peak-fitting procedures may be the most effective means of metabolite quantification when detection of subtle alterations in neural metabolites is necessary. The straightforward technique can be used on a clinical magnetic resonance imaging system. Copyright © 2015 Elsevier Inc. All rights reserved.
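A toy example of the peak-fitting approach to spectral quantification: fit a Gaussian to a simulated creatine resonance and take its area as the concentration estimate. Real MRS packages fit multiple lineshapes with baselines and phase terms; this sketch only conveys the principle, and all values are synthetic.

    import numpy as np
    from scipy.optimize import curve_fit

    def gaussian(ppm, amp, center, width):
        return amp * np.exp(-0.5 * ((ppm - center) / width) ** 2)

    ppm = np.linspace(2.8, 3.2, 400)
    rng = np.random.default_rng(1)
    spectrum = gaussian(ppm, amp=1.0, center=3.03, width=0.015) \
               + 0.02 * rng.normal(size=ppm.size)        # creatine CH3 peak + noise

    popt, _ = curve_fit(gaussian, ppm, spectrum, p0=[0.8, 3.0, 0.02])
    amp, center, width = popt
    area = amp * width * np.sqrt(2.0 * np.pi)             # analytic Gaussian area
    print(f"fitted peak at {center:.3f} ppm, area = {area:.4f} (arbitrary units)")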
Worbs, Sylvia; Fiebig, Uwe; Zeleny, Reinhard; Schimmel, Heinz; Rummel, Andreas; Luginbühl, Werner; Dorner, Brigitte G.
2015-01-01
In the framework of the EU project EQuATox, a first international proficiency test (PT) on the detection and quantification of botulinum neurotoxins (BoNT) was conducted. Sample materials included BoNT serotypes A, B and E spiked into buffer, milk, meat extract and serum. Different methods were applied by the participants combining different principles of detection, identification and quantification. Based on qualitative assays, 95% of all results reported were correct. Successful strategies for BoNT detection were based on a combination of complementary immunological, MS-based and functional methods or on suitable functional in vivo/in vitro approaches (mouse bioassay, hemidiaphragm assay and Endopep-MS assay). Quantification of BoNT/A, BoNT/B and BoNT/E was performed by 48% of participating laboratories. It turned out that precise quantification of BoNT was difficult, resulting in a substantial scatter of quantitative data. This was especially true for results obtained by the mouse bioassay which is currently considered as “gold standard” for BoNT detection. The results clearly demonstrate the urgent need for certified BoNT reference materials and the development of methods replacing animal testing. In this context, the BoNT PT provided the valuable information that both the Endopep-MS assay and the hemidiaphragm assay delivered quantitative results superior to the mouse bioassay. PMID:26703724
Metering error quantification under voltage and current waveform distortion
NASA Astrophysics Data System (ADS)
Wang, Tao; Wang, Jia; Xie, Zhi; Zhang, Ran
2017-09-01
With the integration of more and more renewable energy sources and distorting loads into the power grid, voltage and current waveform distortion causes metering errors in smart meters. Because of the negative effects on metering accuracy and fairness, studying the combined energy metering error is an important subject. In this paper, after comparing theoretical metering values with recorded values under different meter modes for linear and nonlinear loads, a quantification method for the metering mode error under waveform distortion is proposed. Based on the metering and time-division multiplier principles, a quantification method for the metering accuracy error is also proposed. By analyzing the mode error and the accuracy error, a comprehensive error analysis method is presented that is suitable for new-energy and nonlinear loads. The proposed method is validated by simulation.
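A simplified numeric illustration of the kind of comparison described above: active power computed by time-division multiplication of the sampled waveforms (all harmonics) versus a fundamental-only metering mode, for a distorted nonlinear load. The waveforms and the two meter modes are invented for illustration and are not the paper's models.

    import numpy as np

    f0, fs, T = 50.0, 10_000.0, 0.2
    t = np.arange(0.0, T, 1.0 / fs)

    # Distorted supply and load: fundamental plus 3rd-harmonic components.
    v = 230.0 * np.sqrt(2) * np.sin(2 * np.pi * f0 * t) \
        + 8.0 * np.sqrt(2) * np.sin(2 * np.pi * 3 * f0 * t)
    i = 10.0 * np.sqrt(2) * np.sin(2 * np.pi * f0 * t - 0.2) \
        + 3.0 * np.sqrt(2) * np.sin(2 * np.pi * 3 * f0 * t)

    p_true = np.mean(v * i)                     # time-division multiplication metering

    # Fundamental-only mode: project v and i onto the 50 Hz sine/cosine pair.
    def fundamental(x):
        c = 2 * np.mean(x * np.cos(2 * np.pi * f0 * t))
        s = 2 * np.mean(x * np.sin(2 * np.pi * f0 * t))
        return c, s

    vc, vs = fundamental(v)
    ic, is_ = fundamental(i)
    p_fund = 0.5 * (vc * ic + vs * is_)

    print(f"true P = {p_true:.1f} W, fundamental-only P = {p_fund:.1f} W, "
          f"mode error = {100 * (p_fund - p_true) / p_true:.2f} %")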
Tsukahara, Keita; Takabatake, Reona; Masubuchi, Tomoko; Futo, Satoshi; Minegishi, Yasutaka; Noguchi, Akio; Kondo, Kazunari; Nishimaki-Mogami, Tomoko; Kurashima, Takeyo; Mano, Junichi; Kitta, Kazumi
2016-01-01
A real-time PCR-based analytical method was developed for the event-specific quantification of a genetically modified (GM) soybean event, MON87701. First, a standard plasmid for MON87701 quantification was constructed. The conversion factor (Cf) required to calculate the amount of genetically modified organism (GMO) was experimentally determined for a real-time PCR instrument. The determined Cf for the real-time PCR instrument was 1.24. For the evaluation of the developed method, a blind test was carried out in an inter-laboratory trial. The trueness and precision were evaluated as the bias and reproducibility of relative standard deviation (RSDr), respectively. The determined biases and the RSDr values were less than 30 and 13%, respectively, at all evaluated concentrations. The limit of quantitation of the method was 0.5%, and the developed method would thus be applicable for practical analyses for the detection and quantification of MON87701.
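The role of the conversion factor can be shown in a few lines: the copy-number ratio measured by real-time PCR is divided by the experimentally determined Cf to obtain a weight-based GM content, following the common definition of Cf as the transgene/endogene copy ratio of pure GM material (an assumption stated here, not spelled out in the abstract). The copy numbers are invented.

    def gmo_weight_percent(transgene_copies, endogene_copies, cf):
        """Convert a measured copy-number ratio into a weight/weight GM percentage."""
        return 100.0 * (transgene_copies / endogene_copies) / cf

    # Cf = 1.24 as determined for this instrument; copy numbers are hypothetical.
    print(f"{gmo_weight_percent(62, 10_000, cf=1.24):.2f} % (w/w)")   # -> 0.50 % (w/w)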
Takabatake, Reona; Onishi, Mari; Koiwa, Tomohiro; Futo, Satoshi; Minegishi, Yasutaka; Akiyama, Hiroshi; Teshima, Reiko; Kurashima, Takeyo; Mano, Junichi; Furui, Satoshi; Kitta, Kazumi
2013-01-01
A novel real-time polymerase chain reaction (PCR)-based quantitative screening method was developed for three genetically modified soybeans: RRS, A2704-12, and MON89788. The 35S promoter (P35S) of cauliflower mosaic virus is introduced into RRS and A2704-12 but not into MON89788. We therefore designed a screening method combining the quantification of P35S with the event-specific quantification of MON89788. The conversion factor (Cf) required to convert the amount of a genetically modified organism (GMO) from a copy number ratio to a weight ratio was determined experimentally. The trueness and precision were evaluated as the bias and reproducibility of relative standard deviation (RSDR), respectively. The determined RSDR values for the method were less than 25% for both targets. We consider that the developed method would be suitable for the simple detection and approximate quantification of GMOs.
Nicolás, Paula; Lassalle, Verónica L; Ferreira, María L
2017-02-01
The aim of this manuscript was to study the application of a new method of protein quantification to Candida antarctica lipase B commercial solutions. Error sources associated with the traditional Bradford technique were demonstrated. Eight biocatalysts based on C. antarctica lipase B (CALB) immobilized onto magnetite nanoparticles were used. Magnetite nanoparticles were coated with chitosan (CHIT) and modified with glutaraldehyde (GLUT) and aminopropyltriethoxysilane (APTS). CALB was then adsorbed on the modified support. The proposed protein quantification method included the determination of sulfur (from the protein in the CALB solution) by atomic emission inductively coupled plasma spectroscopy (AE-ICP). Four different protocols were applied combining AE-ICP and classical Bradford assays, besides carbon, hydrogen and nitrogen (CHN) analysis. The calculated error in protein content using the "classic" Bradford method with bovine serum albumin as standard ranged from 400 to 1200% when protein in the CALB solution was quantified. These errors were calculated taking the amount of immobilized protein obtained with the improved method as the "true" protein content. The optimum quantification procedure involved the combination of the Bradford method, ICP and CHN analysis. Copyright © 2016 Elsevier Inc. All rights reserved.
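The sulfur route to protein quantification can be sketched as a mass-balance calculation: sulfur measured by AE-ICP is attributed to the protein's cysteine and methionine residues, whose count fixes the protein's sulfur mass fraction. The residue count, molecular weight and measured sulfur below are assumptions made only for illustration, not values from the study, and the calculation presumes all measured sulfur is protein-bound.

    M_S = 32.06          # g/mol, sulfur

    def protein_from_sulfur(sulfur_ug, n_sulfur_residues, protein_mw_da):
        """Protein mass (ug) inferred from measured sulfur (assumed all protein-bound)."""
        sulfur_fraction = n_sulfur_residues * M_S / protein_mw_da
        return sulfur_ug / sulfur_fraction

    # Hypothetical: 0.12 ug S measured; a ~33 kDa lipase with 8 S-containing residues.
    print(f"{protein_from_sulfur(0.12, n_sulfur_residues=8, protein_mw_da=33_000):.1f} ug protein")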
NASA Astrophysics Data System (ADS)
Lao, Zhiqiang; Zheng, Xin
2011-03-01
This paper proposes a multiscale method to quantify tissue spiculation and distortion in mammography CAD systems, aiming to improve the sensitivity of detecting architectural distortion and spiculated masses. The approach addresses the difficulty of predetermining the neighborhood size for feature extraction when characterizing lesions with spiculated mass/architectural distortion that may appear at different sizes. The quantification is based on the recognition of tissue spiculation and distortion patterns using a multiscale first-order phase portrait model in the texture orientation field generated by a Gabor filter bank. A feature map is generated from the multiscale quantification for each mammogram, and two features are then extracted from the feature map. These two features are combined with other mass features to provide enhanced discriminative ability in detecting lesions demonstrating spiculated mass and architectural distortion. The efficiency and efficacy of the proposed method are demonstrated with results obtained by applying the method to over 500 cancer cases and over 1000 normal cases.
USDA-ARS?s Scientific Manuscript database
A general method was developed for the quantification of hydroxycinnamic acid derivatives and flavones, flavonols, and their glycosides based on the UV molar relative response factors (MRRF) of the standards. Each of these phenolic compounds contains a cinnamoyl structure and has a maximum absorban...
Protein quantification using a cleavable reporter peptide.
Duriez, Elodie; Trevisiol, Stephane; Domon, Bruno
2015-02-06
Peptide and protein quantification based on isotope dilution and mass spectrometry analysis are widely employed for the measurement of biomarkers and in system biology applications. The accuracy and reliability of such quantitative assays depend on the quality of the stable-isotope labeled standards. Although the quantification using stable-isotope labeled peptides is precise, the accuracy of the results can be severely biased by the purity of the internal standards, their stability and formulation, and the determination of their concentration. Here we describe a rapid and cost-efficient method to recalibrate stable isotope labeled peptides in a single LC-MS analysis. The method is based on the equimolar release of a protein reference peptide (used as surrogate for the protein of interest) and a universal reporter peptide during the trypsinization of a concatenated polypeptide standard. The quality and accuracy of data generated with such concatenated polypeptide standards are highlighted by the quantification of two clinically important proteins in urine samples and compared with results obtained with conventional stable isotope labeled reference peptides. Furthermore, the application of the UCRP standards in complex samples is described.
Sandra, Koen; Mortier, Kjell; Jorge, Lucie; Perez, Luis C; Sandra, Pat; Priem, Sofie; Poelmans, Sofie; Bouche, Marie-Paule
2014-05-01
Nanobodies® are therapeutic proteins derived from the smallest functional fragments of heavy chain-only antibodies. The development and validation of an LC-MS/MS-based method for the quantification of an IgE-binding Nanobody in cynomolgus monkey plasma is presented. Nanobody quantification was performed making use of a proteotypic tryptic peptide chromatographically enriched prior to LC-MS/MS analysis. The validated LLOQ at 36 ng/mL was measured with an intra- and inter-assay precision and accuracy <20%. The required sensitivity could be obtained based on the selectivity of 2D LC combined with MS/MS. No analyte-specific tools for affinity purification were used. Plasma samples originating from a PK/PD study were analyzed and compared with the results obtained with a traditional ligand-binding assay. Excellent correlations between the two techniques were obtained, and similar PK parameters were estimated. A 2D LC-MS/MS method was successfully developed and validated for the quantification of a next-generation biotherapeutic.
A multi-center study benchmarks software tools for label-free proteome quantification
Gillet, Ludovic C; Bernhardt, Oliver M.; MacLean, Brendan; Röst, Hannes L.; Tate, Stephen A.; Tsou, Chih-Chiang; Reiter, Lukas; Distler, Ute; Rosenberger, George; Perez-Riverol, Yasset; Nesvizhskii, Alexey I.; Aebersold, Ruedi; Tenzer, Stefan
2016-01-01
The consistent and accurate quantification of proteins by mass spectrometry (MS)-based proteomics depends on the performance of instruments, acquisition methods and data analysis software. In collaboration with the software developers, we evaluated OpenSWATH, SWATH2.0, Skyline, Spectronaut and DIA-Umpire, five of the most widely used software methods for processing data from SWATH-MS (sequential window acquisition of all theoretical fragment ion spectra), a method that uses data-independent acquisition (DIA) for label-free protein quantification. We analyzed high-complexity test datasets from hybrid proteome samples of defined quantitative composition acquired on two different MS instruments using different SWATH isolation windows setups. For consistent evaluation we developed LFQbench, an R-package to calculate metrics of precision and accuracy in label-free quantitative MS, and report the identification performance, robustness and specificity of each software tool. Our reference datasets enabled developers to improve their software tools. After optimization, all tools provided highly convergent identification and reliable quantification performance, underscoring their robustness for label-free quantitative proteomics. PMID:27701404
Izanloo, Maryam; Esrafili, Ali; Behbahani, Mohammad; Ghambarian, Mahnaz; Reza Sobhi, Hamid
2018-02-01
Herein, a new dispersive solid-phase extraction method using a nano magnetic titanium dioxide graphene-based sorbent in conjunction with high-performance liquid chromatography and ultraviolet detection was successfully developed. The method was proved to be simple, sensitive, and highly efficient for the trace quantification of sulfacetamide, sulfathiazole, sulfamethoxazole, and sulfadiazine in relatively large volume of aqueous media. Initially, the nano magnetic titanium dioxide graphene-based sorbent was successfully synthesized and subsequently characterized by scanning electron microscopy and X-ray diffraction. Then, the sorbent was used for the sorption and extraction of the selected sulfonamides mainly through π-π stacking hydrophobic interactions. Under the established conditions, the calibration curves were linear over the concentration range of 1-200 μg/L. The limit of quantification (precision of 20%, and accuracy of 80-120%) for the detection of each sulfonamide by the proposed method was 1.0 μg/L. To test the extraction efficiency, the method was applied to various fortified real water samples. The average relative recoveries obtained from the fortified samples varied between 90 and 108% with the relative standard deviations of 5.3-10.7%. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
High-Throughput Thermodynamic Modeling and Uncertainty Quantification for ICME
NASA Astrophysics Data System (ADS)
Otis, Richard A.; Liu, Zi-Kui
2017-05-01
One foundational component of the integrated computational materials engineering (ICME) and Materials Genome Initiative is the computational thermodynamics based on the calculation of phase diagrams (CALPHAD) method. The CALPHAD method pioneered by Kaufman has enabled the development of thermodynamic, atomic mobility, and molar volume databases of individual phases in the full space of temperature, composition, and sometimes pressure for technologically important multicomponent engineering materials, along with sophisticated computational tools for using the databases. In this article, our recent efforts will be presented in terms of developing new computational tools for high-throughput modeling and uncertainty quantification based on high-throughput, first-principles calculations and the CALPHAD method along with their potential propagations to downstream ICME modeling and simulations.
Targeted Quantification of Isoforms of a Thylakoid-Bound Protein: MRM Method Development.
Bru-Martínez, Roque; Martínez-Márquez, Ascensión; Morante-Carriel, Jaime; Sellés-Marchart, Susana; Martínez-Esteso, María José; Pineda-Lucas, José Luis; Luque, Ignacio
2018-01-01
Targeted mass spectrometric methods such as selected/multiple reaction monitoring (SRM/MRM) have found intense application in protein detection and quantification which competes with classical immunoaffinity techniques. It provides a universal procedure to develop a fast, highly specific, sensitive, accurate, and cheap methodology for targeted detection and quantification of proteins based on the direct analysis of their surrogate peptides typically generated by tryptic digestion. This methodology can be advantageously applied in the field of plant proteomics and particularly for non-model species since immunoreagents are scarcely available. Here, we describe the issues to take into consideration in order to develop a MRM method to detect and quantify isoforms of the thylakoid-bound protein polyphenol oxidase from the non-model and database underrepresented species Eriobotrya japonica Lindl.
Hagiwara, Akifumi; Warntjes, Marcel; Hori, Masaaki; Andica, Christina; Nakazawa, Misaki; Kumamaru, Kanako Kunishima; Abe, Osamu; Aoki, Shigeki
2017-01-01
Abstract Conventional magnetic resonance images are usually evaluated using the image signal contrast between tissues and not based on their absolute signal intensities. Quantification of tissue parameters, such as relaxation rates and proton density, would provide an absolute scale; however, these methods have mainly been performed in a research setting. The development of rapid quantification, with scan times in the order of 6 minutes for full head coverage, has provided the prerequisites for clinical use. The aim of this review article was to introduce a specific quantification method and synthesis of contrast-weighted images based on the acquired absolute values, and to present automatic segmentation of brain tissues and measurement of myelin based on the quantitative values, along with application of these techniques to various brain diseases. The entire technique is referred to as “SyMRI” in this review. SyMRI has shown promising results in previous studies when used for multiple sclerosis, brain metastases, Sturge-Weber syndrome, idiopathic normal pressure hydrocephalus, meningitis, and postmortem imaging. PMID:28257339
Computer-aided Assessment of Regional Abdominal Fat with Food Residue Removal in CT
Makrogiannis, Sokratis; Caturegli, Giorgio; Davatzikos, Christos; Ferrucci, Luigi
2014-01-01
Rationale and Objectives: Separate quantification of abdominal subcutaneous and visceral fat regions is essential to understand the role of regional adiposity as a risk factor in epidemiological studies. Fat quantification is often based on computed tomography (CT) because fat density is distinct from other tissue densities in the abdomen. However, the presence of intestinal food residues with densities similar to fat may reduce fat quantification accuracy. We introduce an abdominal fat quantification method in CT with interest in food residue removal. Materials and Methods: Total fat was identified in the feature space of Hounsfield units and divided into subcutaneous and visceral components using model-based segmentation. Regions of food residues were identified and removed from visceral fat using a machine learning method integrating intensity, texture, and spatial information. Cost-weighting and bagging techniques were investigated to address class imbalance. Results: We validated our automated food residue removal technique against semimanual quantifications. Our feature selection experiments indicated that joint intensity and texture features produce the highest classification accuracy at 95%. We explored generalization capability using k-fold cross-validation and receiver operating characteristic (ROC) analysis with variable k. Losses in accuracy and area under the ROC curve between maximum and minimum k were limited to 0.1% and 0.3%. We validated tissue segmentation against reference semimanual delineations. The Dice similarity scores were as high as 93.1 for subcutaneous fat and 85.6 for visceral fat. Conclusions: Computer-aided regional abdominal fat quantification is a reliable computational tool for large-scale epidemiological studies. Our proposed intestinal food residue reduction scheme is an original contribution of this work. Validation experiments indicate very good accuracy and generalization capability. PMID:24119354
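A compact sketch of the two ingredients highlighted above: Hounsfield-unit thresholding to pick out candidate fat voxels, and a cost-weighted, bagged classifier to reject food-residue voxels from intensity and simple texture features. The HU window, features, labels and class weights are illustrative stand-ins, not the study's actual pipeline.

    import numpy as np
    from sklearn.ensemble import BaggingClassifier
    from sklearn.tree import DecisionTreeClassifier

    def fat_mask(ct_hu, lo=-190, hi=-30):
        """Candidate fat voxels from a commonly used HU window (assumed values)."""
        return (ct_hu >= lo) & (ct_hu <= hi)

    # Synthetic voxel features [mean HU, local HU std]; imbalanced classes
    # (1 = food residue, 0 = true fat).
    rng = np.random.default_rng(0)
    fat  = np.c_[rng.normal(-100, 20, 950), rng.normal(10, 3, 950)]
    food = np.c_[rng.normal(-60, 25, 50),  rng.normal(35, 8, 50)]
    X = np.vstack([fat, food])
    y = np.r_[np.zeros(950), np.ones(50)]

    # Cost-weighting (class_weight) plus bagging to cope with the class imbalance.
    clf = BaggingClassifier(
        estimator=DecisionTreeClassifier(class_weight={0: 1.0, 1: 19.0}),
        n_estimators=50, random_state=0).fit(X, y)
    print("training accuracy:", clf.score(X, y))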
Costa, Sofia R; Kerry, Brian R; Bardgett, Richard D; Davies, Keith G
2006-12-01
The Pasteuria group of endospore-forming bacteria has been studied as a biocontrol agent of plant-parasitic nematodes. Techniques have been developed for its detection and quantification in soil samples, and these mainly focus on observations of endospore attachment to nematodes. Characterization of Pasteuria populations has recently been performed with DNA-based techniques, which usually require the extraction of large numbers of spores. We describe a simple immunological method for the quantification and characterization of Pasteuria populations. Bayesian statistics were used to determine an extraction efficiency of 43% and a threshold of detection of 210 endospores g⁻¹ sand. This provided a robust means of estimating numbers of endospores in small-volume samples from a natural system. Based on visual assessment of endospore fluorescence, a quantitative method was developed to characterize endospore populations, which were shown to vary according to their host.
Jiang, Tingting; Dai, Yongmei; Miao, Miao; Zhang, Yue; Song, Chenglin; Wang, Zhixu
2015-07-01
To evaluate the usefulness and efficiency of a novel dietary assessment method among urban pregnant women, sixty-one pregnant women were recruited from the ward and provided with a meal accurately weighed before cooking. The meal was photographed from three different angles before and after eating. The subjects were also interviewed by the investigators for a 24 h dietary recall. Food weighing, image quantification and the 24 h dietary recall were conducted by investigators from three different groups, and the information was kept isolated between groups. Food consumption was analyzed on the basis of classification and total summation, and nutrient intake from the meal was calculated for each subject. The data obtained from the dietary recall and from the image quantification were compared with the actual weighed values, and correlation and regression analyses were carried out between the weighing method and image quantification as well as the dietary recall. In total, twenty-three kinds of food, including rice, vegetables, fish, meats and soy bean curd, were included in the experimental meal. Compared with the data from the 24 h dietary recall (r = 0.413, P < 0.05), food weights estimated by image quantification (r = 0.778, P < 0.05, n = 308) were more strongly correlated with the weighed data and showed a more concentrated linear distribution. The mean absolute difference between image quantification and the weighing method across all foods was 77.23 ± 56.02 (P < 0.05, n = 61), much smaller than the difference between the 24 h recall and the weighing method (172.77 ± 115.18). Values of almost all nutrients, including energy, protein, fat, carbohydrate, vitamin A, vitamin C, calcium, iron and zinc, calculated from the food weights obtained by image quantification were closer to the weighed data than those from the 24 h dietary recall (P < 0.01). Bland-Altman analysis showed that the majority of the measurements of nutrient intake were scattered along the mean difference line and close to the line of equality (difference = 0), indicating fairly good agreement between estimated and actual food consumption. The differences (including the outliers) were random, did not exhibit any systematic bias, and were consistent over different levels of mean food amount. In addition, the questionnaire showed that fifty-six of the pregnant women considered the image quantification less time-consuming and burdensome than the 24 h recall, and fifty-eight of them would like to use image quantification to monitor their dietary status. The novel instant-photography (image quantification) method for dietary assessment is thus more effective than the conventional 24 h dietary recall and can yield food intake values close to weighed data.
NASA Astrophysics Data System (ADS)
Hu, Yong; Wu, Hai-Long; Yin, Xiao-Li; Gu, Hui-Wen; Xiao, Rong; Wang, Li; Fang, Huan; Yu, Ru-Qin
2017-03-01
A rapid, interference-free spectrofluorometric method, combining excitation-emission matrix fluorescence with second-order calibration based on the alternating penalty trilinear decomposition (APTLD) and self-weighted alternating trilinear decomposition (SWATLD) algorithms, was proposed for the simultaneous determination of nephrotoxic aristolochic acid I (AA-I) and aristololactam I (AL-I) in five Chinese herbal medicines. The method relies on a chemical derivatization that converts non-fluorescent AA-I into highly fluorescent AL-I, achieving sensitive, simultaneous quantification of the analytes. The variables of the derivatization reaction, which was conducted with zinc powder in an acetous methanol aqueous solution, were studied and optimized for the best quantification of AA-I and AL-I. Satisfactory results for AA-I and AL-I in the spiked recovery assay were achieved, with average recoveries in the range of 100.4-103.8% and RMSEPs < 0.78 ng mL⁻¹, validating the accuracy and reliability of the proposed method. The contents of AA-I and AL-I in the five herbal medicines obtained with the proposed method were also in good agreement with those of a validated LC-MS/MS method. Owing to the highly sensitive fluorescence detection, the limits of detection (LODs) of AA-I and AL-I compare favorably with those of the LC-MS/MS method, being below 0.35 and 0.29 ng mL⁻¹, respectively. The proposed strategy based on the APTLD and SWATLD algorithms, by virtue of the "second-order advantage", can be considered an attractive and green alternative for the quantification of AA-I and AL-I in complex herbal medicine matrices without any prior separation or clean-up.
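Second-order calibration of excitation-emission data rests on a trilinear decomposition of the (sample x excitation x emission) data cube. APTLD and SWATLD are not available in common Python libraries, so the sketch below uses TensorLy's standard PARAFAC as a stand-in to show how the sample-mode scores track analyte concentration; the tensor is synthetic.

    import numpy as np
    import tensorly as tl
    from tensorly.decomposition import parafac

    rng = np.random.default_rng(0)
    conc = np.array([0.2, 0.5, 1.0, 2.0, 4.0])               # analyte level in 5 samples
    ex = np.exp(-0.5 * ((np.arange(30) - 12) / 4.0) ** 2)     # excitation profile
    em = np.exp(-0.5 * ((np.arange(40) - 20) / 6.0) ** 2)     # emission profile

    # Trilinear EEM cube: sample x excitation x emission, plus noise.
    X = np.einsum('i,j,k->ijk', conc, ex, em) + 0.01 * rng.normal(size=(5, 30, 40))

    weights, factors = parafac(tl.tensor(X), rank=1, normalize_factors=True)
    scores = factors[0][:, 0] * weights[0]                    # sample-mode scores
    print(np.round(scores / scores[-1], 3))                   # ~ conc / conc[-1]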
Jiang, Lingxi; Yang, Litao; Rao, Jun; Guo, Jinchao; Wang, Shu; Liu, Jia; Lee, Seonghun; Zhang, Dabing
2010-02-01
To implement genetically modified organism (GMO) labeling regulations, event-specific analysis based on the junction sequence between the exogenous integration and the host genomic DNA has become the preferred approach for GMO identification and quantification. In this study, specific primers and TaqMan probes based on the revealed 5'-end junction sequence of GM cotton MON15985 were designed, and qualitative and quantitative polymerase chain reaction (PCR) assays were established employing the designed primers and probes. In the qualitative PCR assay, the limit of detection (LOD) was 0.5 g kg⁻¹ in 100 ng of total cotton genomic DNA, corresponding to about 17 copies of haploid cotton genomic DNA, and the LOD and limit of quantification (LOQ) for the quantitative PCR assay were 10 and 17 copies of haploid cotton genomic DNA, respectively. Furthermore, the developed quantitative PCR assays were validated in-house by five different researchers. Five practical samples with known GM contents were also quantified using the developed PCR assay in the in-house validation, and the bias between the true and quantified values ranged from 2.06% to 12.59%. This study shows that the developed qualitative and quantitative PCR methods are applicable for the identification and quantification of GM cotton MON15985 and its derivatives.
Elsohaby, Ibrahim; McClure, J Trenton; Riley, Christopher B; Bryanton, Janet; Bigsby, Kathryn; Shaw, R Anthony
2018-02-20
Attenuated total reflectance infrared (ATR-IR) spectroscopy is a simple, rapid and cost-effective method for the analysis of serum. However, the complex nature of serum remains a limiting factor for the reliability of this method. We investigated the benefits of coupling centrifugal ultrafiltration with ATR-IR spectroscopy for quantification of human serum IgA concentration. Human serum samples (n = 196) were analyzed for IgA using an immunoturbidimetric assay. ATR-IR spectra were acquired for whole serum samples and for the retentate (residue) reconstituted with saline following 300 kDa centrifugal ultrafiltration. IR-based analytical methods were developed for each of the two spectroscopic datasets, and the accuracy of the two methods was compared. The analytical methods were based upon partial least squares regression (PLSR) calibration models, one with 5 PLS factors (for whole serum) and the second with 9 PLS factors (for the reconstituted retentate). Comparison of the two sets of IR-based analytical results to reference IgA values revealed improvements in the Pearson correlation coefficient (from 0.66 to 0.76) and in the root mean squared error of prediction of IR-based IgA concentrations (from 102 to 79 mg/dL) for the ultrafiltration retentate-based method as compared to the method built upon whole serum spectra. Depleting human serum of low molecular weight proteins using a 300 kDa centrifugal filter thus enhances the accuracy of IgA quantification by ATR-IR spectroscopy. Further evaluation and optimization of this general approach may ultimately lead to routine analysis of a range of high molecular-weight analytical targets that are otherwise unsuitable for IR-based analysis. Copyright © 2017 Elsevier B.V. All rights reserved.
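The calibration workflow above maps directly onto a standard PLS regression; the sketch below builds a model with a chosen number of latent factors and reports the Pearson correlation and root mean squared error of prediction on held-out spectra. The data are synthetic and the factor count is just the example value from the abstract.

    import numpy as np
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.model_selection import train_test_split
    from scipy.stats import pearsonr

    rng = np.random.default_rng(0)
    n, p = 196, 300                            # samples x spectral points (synthetic)
    X = rng.normal(size=(n, p))
    y = X[:, :5] @ np.array([3.0, -2.0, 1.5, 0.5, 1.0]) + 0.5 * rng.normal(size=n)

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
    pls = PLSRegression(n_components=9).fit(X_tr, y_tr)   # e.g. 9 latent factors
    y_hat = pls.predict(X_te).ravel()

    r, _ = pearsonr(y_te, y_hat)
    rmsep = np.sqrt(np.mean((y_te - y_hat) ** 2))
    print(f"Pearson r = {r:.2f}, RMSEP = {rmsep:.2f} (units of the reference assay)")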
Allevi, Pietro; Femia, Eti Alessandra; Costa, Maria Letizia; Cazzola, Roberta; Anastasia, Mario
2008-11-28
The present report describes a method for the quantification of N-acetyl- and N-glycolylneuraminic acids without any derivatization, using their ¹³C₃-isotopologues as internal standards and a C18 reversed-phase column modified by decylboronic acid, which allows for the first time a complete chromatographic separation between the two analytes. The method is based on high-performance liquid chromatography coupled with electrospray ion-trap mass spectrometry. The limit of quantification of the method is 0.1 mg/L (2.0 ng on column) for both analytes. The calibration curves are linear for both sialic acids over the range of 0.1-80 mg/L (2.0-1600 ng on column) with a correlation coefficient greater than 0.997. The proposed method was applied to the quantitative determination of sialic acids released from fetuin as a model of glycoproteins.
Good quantification practices of flavours and fragrances by mass spectrometry.
Begnaud, Frédéric; Chaintreau, Alain
2016-10-28
Over the past 15 years, chromatographic techniques with mass spectrometric detection have been increasingly used to monitor the rapidly expanded list of regulated flavour and fragrance ingredients. This trend entails a need for good quantification practices suitable for complex media, especially for multi-analytes. In this article, we present experimental precautions needed to perform the analyses and ways to process the data according to the most recent approaches. This notably includes the identification of analytes during their quantification and method validation, when applied to real matrices, based on accuracy profiles. A brief survey of application studies based on such practices is given. This article is part of the themed issue 'Quantitative mass spectrometry'. © 2016 The Authors.
Reiter, Rolf; Wetzel, Martin; Hamesch, Karim; Strnad, Pavel; Asbach, Patrick; Haas, Matthias; Siegmund, Britta; Trautwein, Christian; Hamm, Bernd; Klatt, Dieter; Braun, Jürgen; Sack, Ingolf; Tzschätzsch, Heiko
2018-01-01
Although it has been known for decades that patients with alpha1-antitrypsin deficiency (AATD) have an increased risk of cirrhosis and hepatocellular carcinoma, limited data exist on non-invasive imaging-based methods for assessing liver fibrosis such as magnetic resonance elastography (MRE) and acoustic radiation force impulse (ARFI) quantification, and no data exist on 2D-shear wave elastography (2D-SWE). Therefore, the purpose of this study is to evaluate and compare the applicability of different elastography methods for the assessment of AATD-related liver fibrosis. Fifteen clinically asymptomatic AATD patients (11 homozygous PiZZ, 4 heterozygous PiMZ) and 16 matched healthy volunteers were examined using MRE and ARFI quantification. Additionally, patients were examined with 2D-SWE. A high correlation is evident for the shear wave speed (SWS) determined with different elastography methods in AATD patients: 2D-SWE/MRE, ARFI quantification/2D-SWE, and ARFI quantification/MRE (R = 0.8587, 0.7425, and 0.6914, respectively; P≤0.0089). Four AATD patients with pathologically increased SWS were consistently identified with all three methods-MRE, ARFI quantification, and 2D-SWE. The high correlation and consistent identification of patients with pathologically increased SWS using MRE, ARFI quantification, and 2D-SWE suggest that elastography has the potential to become a suitable imaging tool for the assessment of AATD-related liver fibrosis. These promising results provide motivation for further investigation of non-invasive assessment of AATD-related liver fibrosis using elastography.
Asara, John M; Zhang, Xiang; Zheng, Bin; Christofk, Heather H; Wu, Ning; Cantley, Lewis C
2006-01-01
Most proteomics approaches for relative quantification of protein expression use a combination of stable-isotope labeling and mass spectrometry. Traditionally, researchers have used difference gel electrophoresis (DIGE) from stained 1D and 2D gels for relative quantification. While differences in protein staining intensity can often be visualized, abundant proteins can obscure less abundant proteins, and quantification of post-translational modifications is difficult. A method is presented for quantifying changes in the abundance of a specific protein or changes in specific modifications of a protein using In-gel Stable-Isotope Labeling (ISIL). Proteins extracted from any source (tissue, cell line, immunoprecipitate, etc.), treated under two experimental conditions, are resolved in separate lanes by gel electrophoresis. The regions of interest (visualized by staining) are reacted separately with light versus heavy isotope-labeled reagents, and the gel slices are then mixed and digested with proteases. The resulting peptides are then analyzed by LC-MS to determine relative abundance of light/heavy isotope pairs and analyzed by LC-MS/MS for identification of sequence and modifications. The strategy compares well with other relative quantification strategies, and in silico calculations reveal its effectiveness as a global relative quantification strategy. An advantage of ISIL is that visualization of gel differences can be used as a first quantification step followed by accurate and sensitive protein level stable-isotope labeling and mass spectrometry-based relative quantification.
Traditional environmental mold analysis is based on microscopic observation and counting of mold structures collected from the air on a sticky surface, or on culturing of molds on growth media for identification and quantification. A DNA-based method of mold analysis called mol...
Simple and accurate quantification of BTEX in ambient air by SPME and GC-MS.
Baimatova, Nassiba; Kenessov, Bulat; Koziel, Jacek A; Carlsen, Lars; Bektassov, Marat; Demyanenko, Olga P
2016-07-01
Benzene, toluene, ethylbenzene and xylenes (BTEX) comprise one of the most ubiquitous and hazardous groups of ambient air pollutants of concern. Application of standard analytical methods for quantification of BTEX is limited by the complexity of sampling and sample preparation equipment, and by budget requirements. Methods based on SPME represent a simpler alternative, but still require complex calibration procedures. The objective of this research was to develop a simpler, low-budget, and accurate method for quantification of BTEX in ambient air based on SPME and GC-MS. Standard 20-mL headspace vials were used for field air sampling and calibration. To avoid the challenges of obtaining and working with 'zero' air, slope factors of the external standard calibration were determined using standard addition and inherently polluted lab air. For the polydimethylsiloxane (PDMS) fiber, differences between the slope factors of calibration plots obtained using lab and outdoor air were below 14%. The PDMS fiber provided higher precision during calibration, while the Carboxen/PDMS fiber resulted in lower detection limits for benzene and toluene. To provide sufficient accuracy, the use of 20-mL vials requires triplicate sampling and analysis. The method was successfully applied to the analysis of 108 ambient air samples from Almaty, Kazakhstan. Average concentrations of benzene, toluene, ethylbenzene and o-xylene were 53, 57, 11 and 14 µg m⁻³, respectively. The developed method can be modified for quantification of a wider range of volatile organic compounds in air. In addition, the new method is amenable to automation. Copyright © 2016 Elsevier B.V. All rights reserved.
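The calibration-by-standard-addition step described above can be sketched as a linear regression of instrument response against added analyte; the intercept divided by the slope gives the native concentration already present in the polluted lab air used for calibration. The values below are invented solely to show the mechanics.

    import numpy as np

    # Added benzene (ug/m3) and GC-MS peak areas for inherently polluted lab air.
    added    = np.array([0.0, 10.0, 20.0, 40.0, 80.0])
    response = np.array([1.05e5, 1.52e5, 2.01e5, 2.98e5, 4.95e5])

    slope, intercept = np.polyfit(added, response, 1)
    native = intercept / slope                     # concentration in the unspiked air
    print(f"slope factor = {slope:.3e} area per ug/m3, native level ~ {native:.1f} ug/m3")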
Shinozuka, Hiroshi; Forster, John W
2016-01-01
Background. Multiplexed sequencing is commonly performed on massively parallel short-read sequencing platforms such as Illumina, and the efficiency of library normalisation can affect the quality of the output dataset. Although several library normalisation approaches have been established, none are ideal for highly multiplexed sequencing due to issues of cost and/or processing time. Methods. An inexpensive and high-throughput library quantification method has been developed, based on an adaptation of the melting curve assay. Sequencing libraries were subjected to the assay using the Bio-Rad Laboratories CFX Connect™ Real-Time PCR Detection System. The library quantity was calculated through summation of the reduction of relative fluorescence units between 86 and 95 °C. Results. PCR-enriched sequencing libraries are suitable for this quantification without pre-purification of DNA. Short DNA molecules, which ideally should be eliminated from the library for subsequent processing, were differentiated from the target DNA in a mixture on the basis of differences in melting temperature. Quantification results for long sequences targeted using the melting curve assay were correlated with those from existing methods (R² > 0.77), and with those observed from MiSeq sequencing (R² = 0.82). Discussion. The results of multiplexed sequencing suggested that the normalisation performance of the described method is equivalent to that of another recently reported high-throughput bead-based method, BeNUS. However, costs for the melting curve assay are considerably lower and processing times shorter than those of other existing methods, suggesting greater suitability for highly multiplexed sequencing applications.
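The quantity metric used above, the summed drop in relative fluorescence units between 86 and 95 °C, is a one-liner once the melt-curve data are in hand; a sketch with a synthetic melt curve follows.

    import numpy as np

    temps = np.arange(70.0, 96.0, 0.5)                        # melt-curve temperatures (degC)
    rfu = 1000.0 / (1.0 + np.exp((temps - 89.0) / 1.2)) + 50  # synthetic melt curve

    in_window = (temps >= 86.0) & (temps <= 95.0)
    drops = -np.diff(rfu[in_window])                          # per-step RFU decreases
    library_quantity = drops[drops > 0].sum()                 # summed reduction, 86-95 degC
    print(f"summed RFU reduction = {library_quantity:.1f} (relative library quantity)")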
NASA Astrophysics Data System (ADS)
Ahn, Sung Hee; Hyeon, Taeghwan; Kim, Myung Soo; Moon, Jeong Hee
2017-09-01
In matrix-assisted laser desorption ionization time-of-flight mass spectrometry (MALDI-TOF), matrix-derived ions are routinely deflected away to avoid problems with ion detection. This, however, limits the use of a quantification method that utilizes the analyte-to-matrix ion abundance ratio. In this work, we will show that it is possible to measure this ratio by a minor instrumental modification of a simple form of MALDI-TOF. This involves detector gain switching.
HPLC-MRM relative quantification analysis of fatty acids based on a novel derivatization strategy.
Cai, Tie; Ting, Hu; Xin-Xiang, Zhang; Jiang, Zhou; Jin-Lan, Zhang
2014-12-07
Fatty acids (FAs) are associated with a series of diseases including tumors, diabetes, and heart diseases. As potential biomarkers, FAs have attracted increasing attention from both biological researchers and the pharmaceutical industry. However, poor ionization efficiency, extreme diversity, strict dependence on internal standards and complicated multiple reaction monitoring (MRM) optimization protocols have challenged efforts to quantify FAs. In this work, a novel derivatization strategy based on 2,4-bis(diethylamino)-6-hydrazino-1,3,5-triazine was developed to enable quantification of FAs. The sensitivity of FA detection was significantly enhanced as a result of the derivatization procedure. FA quantities as low as 10 fg could be detected by high-performance liquid chromatography coupled with triple-quadrupole mass spectrometry. General MRM conditions were developed for any FA, which facilitated the quantification and extended the application of the method. The FA quantification strategy based on HPLC-MRM was carried out using deuterated derivatization reagents. "Heavy" derivatization reagents were used as internal standards (ISs) to minimize matrix effects. Prior to statistical analysis, amounts of each FA species were normalized by their corresponding IS, which guaranteed the accuracy and reliability of the method. FA changes in plasma induced by ageing were studied using this strategy. Several FA species were identified as potential ageing biomarkers. The sensitivity, accuracy, reliability, and full coverage of the method ensure that this strategy has strong potential for both biomarker discovery and lipidomic research.
NASA Astrophysics Data System (ADS)
Arnst, M.; Abello Álvarez, B.; Ponthot, J.-P.; Boman, R.
2017-11-01
This paper is concerned with the characterization and the propagation of errors associated with data limitations in polynomial-chaos-based stochastic methods for uncertainty quantification. Such an issue can arise in uncertainty quantification when only a limited amount of data is available. When the available information does not suffice to accurately determine the probability distributions that must be assigned to the uncertain variables, the Bayesian method for assigning these probability distributions becomes attractive because it allows the stochastic model to account explicitly for insufficiency of the available information. In previous work, such applications of the Bayesian method had already been implemented by using the Metropolis-Hastings and Gibbs Markov Chain Monte Carlo (MCMC) methods. In this paper, we present an alternative implementation, which uses an alternative MCMC method built around an Itô stochastic differential equation (SDE) that is ergodic for the Bayesian posterior. We draw together from the mathematics literature a number of formal properties of this Itô SDE that lend support to its use in the implementation of the Bayesian method, and we describe its discretization, including the choice of the free parameters, by using the implicit Euler method. We demonstrate the proposed methodology on a problem of uncertainty quantification in a complex nonlinear engineering application relevant to metal forming.
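To make the SDE-based sampler concrete, here is a minimal sketch for a one-dimensional Gaussian posterior, for which the implicit Euler update of the overdamped Langevin equation can be solved in closed form. The general nonlinear case requires an iterative solve at each step, and the step size controls a small discretization bias; the posterior and step size here are illustrative only and are not taken from the paper.

    import numpy as np

    mu, sigma = 1.5, 0.7          # illustrative Gaussian posterior N(mu, sigma^2)
    h, n_steps, burn = 0.01, 200_000, 10_000
    rng = np.random.default_rng(0)

    x = 0.0
    samples = np.empty(n_steps)
    for k in range(n_steps):
        xi = rng.standard_normal()
        # Implicit Euler step of dX = grad log pi(X) dt + sqrt(2) dW,
        # solvable exactly because grad log pi is linear for a Gaussian target.
        x = (x + h * mu / sigma**2 + np.sqrt(2.0 * h) * xi) / (1.0 + h / sigma**2)
        samples[k] = x

    post = samples[burn:]
    print(f"sample mean = {post.mean():.3f} (target {mu}), "
          f"sample std = {post.std():.3f} (target {sigma})")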
NASA Astrophysics Data System (ADS)
He, Jingjing; Wang, Dengjiang; Zhang, Weifang
2015-03-01
This study presents an experimental and modeling study for damage detection and quantification in riveted lap joints. Embedded lead zirconate titanate piezoelectric (PZT) ceramic wafer-type sensors are employed to perform in-situ non-destructive testing during fatigue cyclical loading. A multi-feature integration method is developed to quantify the crack size using signal features of correlation coefficient, amplitude change, and phase change. In addition, probability of detection (POD) model is constructed to quantify the reliability of the developed sizing method. Using the developed crack size quantification method and the resulting POD curve, probabilistic fatigue life prediction can be performed to provide comprehensive information for decision-making. The effectiveness of the overall methodology is demonstrated and validated using several aircraft lap joint specimens from different manufactures and under different loading conditions.
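One common way to build a probability-of-detection model like the one referenced above is a hit/miss logistic regression of detection outcomes against crack size; the fitted curve gives POD(a) and, for example, the crack size detected with 90% probability. The data below are fabricated solely to show the mechanics, not the study's results.

    import numpy as np
    from sklearn.linear_model import LogisticRegression

    # Hypothetical hit/miss detection outcomes versus crack length (mm).
    a = np.array([0.5, 0.8, 1.0, 1.2, 1.5, 1.8, 2.0, 2.5, 3.0, 3.5, 4.0, 5.0])
    hit = np.array([0,   0,   0,   1,   0,   1,   1,   1,   1,   1,   1,   1])

    pod = LogisticRegression().fit(a.reshape(-1, 1), hit)

    grid = np.linspace(0.5, 5.0, 200).reshape(-1, 1)
    p = pod.predict_proba(grid)[:, 1]
    a90 = grid[np.argmax(p >= 0.90), 0]          # smallest size with POD >= 90 %
    print(f"POD(2 mm) = {pod.predict_proba([[2.0]])[0, 1]:.2f}, a90 ~ {a90:.2f} mm")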
Chen, Yao; Zane, Nicole R; Thakker, Dhiren R; Wang, Michael Zhuo
2016-07-01
Flavin-containing monooxygenases (FMOs) have a significant role in the metabolism of small molecule pharmaceuticals. Among the five human FMOs, FMO1, FMO3, and FMO5 are the most relevant to hepatic drug metabolism. Although age-dependent hepatic protein expression, based on immunoquantification, has been reported previously for FMO1 and FMO3, there is very little information on hepatic FMO5 protein expression. To overcome the limitations of immunoquantification, an ultra-performance liquid chromatography (UPLC)-multiple reaction monitoring (MRM)-based targeted quantitative proteomic method was developed and optimized for the quantification of FMO1, FMO3, and FMO5 in human liver microsomes (HLM). A post-in silico product ion screening process was incorporated to verify LC-MRM detection of potential signature peptides before their synthesis. The developed method was validated by correlating marker substrate activity and protein expression in a panel of adult individual donor HLM (age 39-67 years). The mean (range) protein expression of FMO3 and FMO5 was 46 (26-65) pmol/mg HLM protein and 27 (11.5-49) pmol/mg HLM protein, respectively. To demonstrate quantification of FMO1, a panel of fetal individual donor HLM (gestational age 14-20 weeks) was analyzed. The mean (range) FMO1 protein expression was 7.0 (4.9-9.7) pmol/mg HLM protein. Furthermore, the ontogenetic protein expression of FMO5 was evaluated in fetal, pediatric, and adult HLM. The quantification of FMO proteins also was compared using two different calibration standards, recombinant proteins versus synthetic signature peptides, to assess the ratio between holoprotein versus total protein. In conclusion, a UPLC-MRM-based targeted quantitative proteomic method has been developed for the quantification of FMO enzymes in HLM. Copyright © 2016 by The American Society for Pharmacology and Experimental Therapeutics.
Chen, Yao; Zane, Nicole R.; Thakker, Dhiren R.
2016-01-01
Flavin-containing monooxygenases (FMOs) have a significant role in the metabolism of small molecule pharmaceuticals. Among the five human FMOs, FMO1, FMO3, and FMO5 are the most relevant to hepatic drug metabolism. Although age-dependent hepatic protein expression, based on immunoquantification, has been reported previously for FMO1 and FMO3, there is very little information on hepatic FMO5 protein expression. To overcome the limitations of immunoquantification, an ultra-performance liquid chromatography (UPLC)-multiple reaction monitoring (MRM)-based targeted quantitative proteomic method was developed and optimized for the quantification of FMO1, FMO3, and FMO5 in human liver microsomes (HLM). A post-in silico product ion screening process was incorporated to verify LC-MRM detection of potential signature peptides before their synthesis. The developed method was validated by correlating marker substrate activity and protein expression in a panel of adult individual donor HLM (age 39–67 years). The mean (range) protein expression of FMO3 and FMO5 was 46 (26–65) pmol/mg HLM protein and 27 (11.5–49) pmol/mg HLM protein, respectively. To demonstrate quantification of FMO1, a panel of fetal individual donor HLM (gestational age 14–20 weeks) was analyzed. The mean (range) FMO1 protein expression was 7.0 (4.9–9.7) pmol/mg HLM protein. Furthermore, the ontogenetic protein expression of FMO5 was evaluated in fetal, pediatric, and adult HLM. The quantification of FMO proteins also was compared using two different calibration standards, recombinant proteins versus synthetic signature peptides, to assess the ratio between holoprotein versus total protein. In conclusion, a UPLC-MRM-based targeted quantitative proteomic method has been developed for the quantification of FMO enzymes in HLM. PMID:26839369
Multi-fidelity methods for uncertainty quantification in transport problems
NASA Astrophysics Data System (ADS)
Tartakovsky, G.; Yang, X.; Tartakovsky, A. M.; Barajas-Solano, D. A.; Scheibe, T. D.; Dai, H.; Chen, X.
2016-12-01
We compare several multi-fidelity approaches for uncertainty quantification in flow and transport simulations that have a lower computational cost than the standard Monte Carlo method. The cost reduction is achieved by combining a small number of high-resolution (high-fidelity) simulations with a large number of low-resolution (low-fidelity) simulations. We propose a new method, a re-scaled Multi Level Monte Carlo (rMLMC) method. The rMLMC method is based on the idea that the statistics of quantities of interest depend on scale/resolution. We compare rMLMC with existing multi-fidelity methods such as Multi Level Monte Carlo (MLMC) and reduced basis methods, and discuss the advantages of each approach.
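As a point of reference for the multi-fidelity idea discussed above, the sketch below shows the basic two-level estimator shared by MLMC-type methods: many cheap low-fidelity samples plus a few paired high/low samples that correct the low-fidelity bias. The toy models q_high and q_low are placeholders, and the rescaling step that distinguishes rMLMC is not reproduced here.

```python
import numpy as np

rng = np.random.default_rng(0)

def q_high(xi):
    """Expensive high-fidelity quantity of interest (toy stand-in)."""
    return np.sin(xi) + 0.05 * xi**2

def q_low(xi):
    """Cheap low-fidelity approximation of the same quantity."""
    return np.sin(xi)

# Many cheap samples estimate E[q_low]; a few paired samples estimate
# the correction E[q_high - q_low]. Their sum is the two-level estimator.
xi_low  = rng.normal(size=100_000)
xi_pair = rng.normal(size=200)

estimate = q_low(xi_low).mean() + (q_high(xi_pair) - q_low(xi_pair)).mean()
reference = q_high(rng.normal(size=100_000)).mean()  # brute-force check
print("two-level estimate:", estimate, " brute-force reference:", reference)
```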
Determination of free polysaccharide in Vi glycoconjugate vaccine against typhoid fever.
Giannelli, C; Cappelletti, E; Di Benedetto, R; Pippi, F; Arcuri, M; Di Cioccio, V; Martin, L B; Saul, A; Micoli, F
2017-05-30
Glycoconjugate vaccines based on the Vi capsular polysaccharide directed against Salmonella enterica serovar Typhi are licensed or in development against typhoid fever, an important cause of morbidity and mortality in developing countries. Quantification of free polysaccharide in conjugate vaccines is an important quality control for release, to monitor vaccine stability and to ensure appropriate immune response. However, we found that existing separation methods based on size are not appropriate as free Vi non-specifically binds to unconjugated and conjugated protein. We developed a method based on free Vi separation by Capto Adhere resin and quantification by HPAEC-PAD. The method has been tested for conjugates of Vi derived from Citrobacter freundii with different carrier proteins such as CRM197, tetanus toxoid and diphtheria toxoid. Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.
2017-02-02
Accurate virus quantification is sought, but a perfect method still eludes the scientific community. Electron microscopy (EM) provides morphology data and counts all viral particles, including partial or noninfectious particles; however, EM methods ... a consistent, reproducible virus quantification method called Scanning Transmission Electron Microscopy - Virus Quantification (STEM-VQ), which simplifies ...
Louwagie, Mathilde; Kieffer-Jaquinod, Sylvie; Dupierris, Véronique; Couté, Yohann; Bruley, Christophe; Garin, Jérôme; Dupuis, Alain; Jaquinod, Michel; Brun, Virginie
2012-07-06
Accurate quantification of pure peptides and proteins is essential for biotechnology, clinical chemistry, proteomics, and systems biology. The reference method to quantify peptides and proteins is amino acid analysis (AAA). This consists of an acidic hydrolysis followed by chromatographic separation and spectrophotometric detection of amino acids. Although widely used, this method displays some limitations, in particular the need for large amounts of starting material. Driven by the need to quantify isotope-dilution standards used for absolute quantitative proteomics, particularly stable isotope-labeled (SIL) peptides and PSAQ proteins, we developed a new AAA assay (AAA-MS). This method requires neither derivatization nor chromatographic separation of amino acids. It is based on rapid microwave-assisted acidic hydrolysis followed by high-resolution mass spectrometry analysis of amino acids. Quantification is performed by comparing MS signals from labeled amino acids (SIL peptide- and PSAQ-derived) with those of unlabeled amino acids originating from co-hydrolyzed NIST standard reference materials. For both SIL peptides and PSAQ standards, AAA-MS quantification results were consistent with classical AAA measurements. Compared to AAA assay, AAA-MS was much faster and was 100-fold more sensitive for peptide and protein quantification. Finally, thanks to the development of a labeled protein standard, we also extended AAA-MS analysis to the quantification of unlabeled proteins.
Addressing Phase Errors in Fat-Water Imaging Using a Mixed Magnitude/Complex Fitting Method
Hernando, D.; Hines, C. D. G.; Yu, H.; Reeder, S.B.
2012-01-01
Accurate, noninvasive measurements of liver fat content are needed for the early diagnosis and quantitative staging of nonalcoholic fatty liver disease. Chemical shift-based fat quantification methods acquire images at multiple echo times using a multiecho spoiled gradient echo sequence, and provide fat fraction measurements through postprocessing. However, phase errors, such as those caused by eddy currents, can adversely affect fat quantification. These phase errors are typically most significant at the first echo of the echo train, and introduce bias in complex-based fat quantification techniques. These errors can be overcome using a magnitude-based technique (where the phase of all echoes is discarded), but at the cost of significantly degraded signal-to-noise ratio, particularly for certain choices of echo time combinations. In this work, we develop a reconstruction method that overcomes these phase errors without the signal-to-noise ratio penalty incurred by magnitude fitting. This method discards the phase of the first echo (which is often corrupted) while maintaining the phase of the remaining echoes (where phase is unaltered). We test the proposed method on 104 patient liver datasets (from 52 patients, each scanned twice), where the fat fraction measurements are compared to coregistered spectroscopy measurements. We demonstrate that mixed fitting is able to provide accurate fat fraction measurements with high signal-to-noise ratio and low bias over a wide choice of echo combinations. PMID:21713978
NASA Astrophysics Data System (ADS)
Keller, Brad M.; Reeves, Anthony P.; Yankelevitz, David F.; Henschke, Claudia I.; Barr, R. Graham
2009-02-01
Emphysema is a disease of the lungs that destroys the alveolar air sacs and induces long-term respiratory dysfunction. CT scans allow for the imaging of the anatomical basis of emphysema and quantification of the underlying disease state. Several measures have been introduced for the quantification of emphysema directly from CT data; most, however, are based on the analysis of density information provided by the CT scans, which varies by scanner and can be hard to standardize across sites and time. Given that one of the anatomical variations associated with the progression of emphysema is the flattening of the diaphragm due to the loss of elasticity in the lung parenchyma, curvature analysis of the diaphragm would provide information about emphysema from CT. Therefore, we propose a new, non-density-based measure of the curvature of the diaphragm that allows for robust quantification. To evaluate the new method, 24 whole-lung scans were analyzed using the ratios of the lung height and diaphragm width to diaphragm height as curvature estimates, as well as using the emphysema index as comparison. Pearson correlation coefficients showed a strong trend of several of the proposed diaphragm curvature measures to have higher correlations, of up to r=0.57, with DLCO% and VA than did the emphysema index. Furthermore, we found the emphysema index to have only a 0.27 correlation to the proposed measures, indicating that the proposed measures evaluate different aspects of the disease.
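A sketch of how the two ratio-based curvature estimates named above could be computed and correlated with a pulmonary function measure. All numerical values are illustrative, and the CT measurements themselves (lung height, diaphragm width and height) are assumed to be extracted upstream.

```python
import numpy as np
from scipy.stats import pearsonr

# Hypothetical per-scan measurements (mm, except DLCO%); illustrative only.
lung_height      = np.array([220., 240., 205., 260., 230.])
diaphragm_width  = np.array([150., 160., 145., 170., 155.])
diaphragm_height = np.array([ 35.,  20.,  40.,  15.,  25.])
dlco_percent     = np.array([ 85.,  55.,  95.,  45.,  65.])

# Flatter diaphragms (smaller diaphragm height) give larger ratio values.
curvature_1 = lung_height / diaphragm_height
curvature_2 = diaphragm_width / diaphragm_height

for name, measure in [("lung height / diaphragm height", curvature_1),
                      ("diaphragm width / diaphragm height", curvature_2)]:
    r, p = pearsonr(measure, dlco_percent)
    print(f"{name}: r = {r:.2f}, p = {p:.3f}")
```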
Takabatake, Reona; Masubuchi, Tomoko; Futo, Satoshi; Minegishi, Yasutaka; Noguchi, Akio; Kondo, Kazunari; Teshima, Reiko; Kurashima, Takeyo; Mano, Junichi; Kitta, Kazumi
2014-01-01
A novel real-time PCR-based analytical method was developed for the event-specific quantification of a genetically modified (GM) maize event, MIR162. We first prepared a standard plasmid for MIR162 quantification. The conversion factor (Cf) required to calculate the genetically modified organism (GMO) amount was empirically determined for two real-time PCR instruments, the Applied Biosystems 7900HT (ABI7900) and the Applied Biosystems 7500 (ABI7500), for which the determined Cf values were 0.697 and 0.635, respectively. To validate the developed method, a blind test was carried out in an interlaboratory study. The trueness and precision were evaluated as the bias and the reproducibility relative standard deviation (RSDr), respectively. The determined biases were less than 25% and the RSDr values were less than 20% at all evaluated concentrations. These results suggested that the limit of quantitation of the method was 0.5%, and that the developed method would thus be suitable for practical analyses for the detection and quantification of MIR162.
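The abstract does not spell out the calculation, but in event-specific real-time PCR methods of this kind the GM content is typically obtained from the ratio of event-specific to taxon-specific copy numbers divided by the conversion factor. A hedged sketch of that standard calculation (formula assumed, not quoted from the paper), using the Cf value reported for the ABI7900:

```python
def gmo_percent(event_copies: float, taxon_copies: float, cf: float) -> float:
    """Estimate GM content (%) from event-specific and taxon-specific copy
    numbers using a conversion factor Cf, as commonly done in event-specific
    real-time PCR methods (calculation assumed, not quoted from the paper)."""
    return (event_copies / taxon_copies) / cf * 100.0

# Illustrative copy numbers interpolated from plasmid standard curves.
print(gmo_percent(event_copies=180.0, taxon_copies=52_000.0, cf=0.697))
```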
Interferences in the direct quantification of bisphenol S in paper by means of thermochemolysis.
Becerra, Valentina; Odermatt, Jürgen
2013-02-01
This article analyses the interferences in the quantification of traces of bisphenol S in paper by applying the direct analytical method "analytical pyrolysis gas chromatography mass spectrometry" (Py-GC/MS) in conjunction with on-line derivatisation with tetramethylammonium hydroxide (TMAH). As the analytes are simultaneously analysed with the matrix, the interferences derive from the matrix. The investigated interferences are found in the analysis of paper samples, which include bisphenol S derivative compounds. As the free bisphenol S is the hydrolysis product of the bisphenol S derivative compounds, the detected amount of bisphenol S in the sample may be overestimated. It is found that the formation of free bisphenol S from the bisphenol S derivative compounds is enhanced in the presence of tetramethylammonium hydroxide (TMAH) under pyrolytic conditions. In order to avoid this formation of free bisphenol S, trimethylsulphonium hydroxide (TMSH) is introduced. Different parameters are optimised in the development of the quantification method with TMSH. The quantification method based on TMSH thermochemolysis has been validated in terms of reproducibility and accuracy. Copyright © 2012 Elsevier B.V. All rights reserved.
Multivariate Analysis for Quantification of Plutonium(IV) in Nitric Acid Based on Absorption Spectra
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lines, Amanda M.; Adami, Susan R.; Sinkov, Sergey I.
Development of more effective, reliable, and fast methods for monitoring process streams is a growing opportunity for analytical applications. Many fields can benefit from on-line monitoring, including the nuclear fuel cycle where improved methods for monitoring radioactive materials will facilitate maintenance of proper safeguards and ensure safe and efficient processing of materials. On-line process monitoring with a focus on optical spectroscopy can provide a fast, non-destructive method for monitoring chemical species. However, identification and quantification of species can be hindered by the complexity of the solutions if bands overlap or show condition-dependent spectral features. Plutonium (IV) is one example of a species which displays significant spectral variation with changing nitric acid concentration. Single variate analysis (i.e. Beer’s Law) is difficult to apply to the quantification of Pu(IV) unless the nitric acid concentration is known and separate calibration curves have been made for all possible acid strengths. Multivariate, or chemometric, analysis is an approach that allows for the accurate quantification of Pu(IV) without a priori knowledge of nitric acid concentration.
Quantification of triglyceride content in oleaginous materials using thermo-gravimetry
Maddi, Balakrishna; Vadlamani, Agasteswar; Viamajala, Sridhar; ...
2017-10-16
Laboratory analytical methods for quantification of triglyceride content in oleaginous biomass samples, especially microalgae, require toxic chemicals and/or organic solvents and involve multiple steps. We describe a simple triglyceride quantification method that uses thermo-gravimetry. This method is based on the observation that triglycerides undergo near-complete volatilization/degradation over a narrow temperature interval with a derivative weight loss peak at 420 °C when heated in an inert atmosphere. Degradation of the other constituents of oleaginous biomass (protein and carbohydrates) is largely complete after prolonged exposure of samples at 320 °C. Based on these observations, the triglyceride content of oleaginous biomass was estimated by using the following two-step process. In Step 1, samples were heated to 320 °C and kept isothermal at this temperature for 15 min. In Step 2, samples were heated from 320 °C to 420 °C and then kept isothermal at 420 °C for 15 min. The results show that mass loss in step 2 correlated well with triglyceride content estimates obtained from conventional techniques for diverse microalgae and oilseed samples.
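A minimal sketch of the two-step mass-loss calculation implied above: the weight lost between the 320 °C hold (step 1) and the 420 °C hold (step 2) is attributed to triglycerides. The choice of the initial sample mass as denominator is an assumption for illustration, as are the numbers.

```python
def triglyceride_content(mass_initial_mg: float,
                         mass_after_320C_mg: float,
                         mass_after_420C_mg: float) -> float:
    """Estimate triglyceride content (wt%) from the two-step thermo-gravimetric
    protocol: the mass lost between the end of the 320 C hold and the end of
    the 420 C hold is attributed to triglycerides (denominator assumed to be
    the initial sample mass)."""
    tag_loss = mass_after_320C_mg - mass_after_420C_mg
    return 100.0 * tag_loss / mass_initial_mg

# Illustrative masses (mg) read from a thermo-gravimetric curve.
print(triglyceride_content(10.00, 6.80, 4.30))  # ~25 wt% triglyceride
```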
Molecular nonlinear dynamics and protein thermal uncertainty quantification
Xia, Kelin; Wei, Guo-Wei
2014-01-01
This work introduces molecular nonlinear dynamics (MND) as a new approach for describing protein folding and aggregation. By using a mode system, we show that the MND of disordered proteins is chaotic while that of folded proteins exhibits intrinsically low dimensional manifolds (ILDMs). The stability of ILDMs is found to strongly correlate with protein energies. We propose a novel method for protein thermal uncertainty quantification based on persistently invariant ILDMs. Extensive comparison with experimental data and the state-of-the-art methods in the field validate the proposed new method for protein B-factor prediction. PMID:24697365
Kamesh Iyer, Srikant; Tasdizen, Tolga; Burgon, Nathan; Kholmovski, Eugene; Marrouche, Nassir; Adluru, Ganesh; DiBella, Edward
2016-09-01
Current late gadolinium enhancement (LGE) imaging of left atrial (LA) scar or fibrosis is relatively slow and requires 5-15 min to acquire an undersampled (R=1.7) 3D navigated dataset. The GeneRalized Autocalibrating Partially Parallel Acquisitions (GRAPPA)-based parallel imaging method is the current clinical standard for accelerating 3D LGE imaging of the LA and permits an acceleration factor ~R=1.7. Two compressed sensing (CS) methods have been developed to achieve higher acceleration factors: a patch-based collaborative filtering technique tested with acceleration factor R~3, and a technique that uses a 3D radial stack-of-stars acquisition pattern (R~1.8) with a 3D total variation constraint. The long reconstruction time of these CS methods makes them unwieldy to use, especially the patch-based collaborative filtering technique. In addition, the effect of CS techniques on the quantification of percentage of scar/fibrosis is not known. We sought to develop a practical compressed sensing method for imaging the LA at high acceleration factors. In order to develop a clinically viable method with short reconstruction time, a Split Bregman (SB) reconstruction method with 3D total variation (TV) constraints was developed and implemented. The method was tested on 8 atrial fibrillation patients (4 pre-ablation and 4 post-ablation datasets). Blur metric, normalized mean squared error and peak signal-to-noise ratio were used as metrics to analyze the quality of the reconstructed images. Quantification of the extent of LGE was performed on the undersampled images and compared with the fully sampled images. Quantification of scar from post-ablation datasets and quantification of fibrosis from pre-ablation datasets showed that acceleration factors up to R~3.5 gave good 3D LGE images of the LA wall, using a 3D TV constraint and constrained SB methods. This corresponds to reducing the scan time by half, compared to currently used GRAPPA methods. Reconstruction of 3D LGE images using the SB method was over 20 times faster than standard gradient descent methods. Copyright © 2016 Elsevier Inc. All rights reserved.
Lombardero, Martin; Henquin, Ruth; Perea, Gabriel; Corneli, Mariana; Izurieta, Carlos
2017-01-01
Quantification of mitral regurgitation (MR) by two-dimensional (2D) transthoracic echocardiography (TTE) is based on the analysis of the proximal flow convergence (PFC) and the "vena contracta" (VC). This method assumes geometries and can be misleading. In contrast, three-dimensional (3D) echocardiography directly measures flow volumes and does not assume geometries, which allows for more accurate MR evaluation. The aim was to report the feasibility of 3D transesophageal echocardiography (3DTEE) for MR quantification and to evaluate its concordance with 2D echo. Twenty-seven consecutive patients undergoing 2D and 3DTEE for presurgical MR evaluation were studied prospectively. MR quantification was performed by classical 2D methods based on PFC. Diameters of the VC in orthogonal planes by 3DTEE were estimated, establishing the VC sphericity index as well as VC area (VCA) by direct planimetry. In case of multiple jets, we calculated the sum of the VCA. MR assessment by 3DTEE was feasible. An adequate concordance between VC measurements by 2D methods (TTE and TEE) was observed; however, there was a poor correlation when compared with 3DTEE. The sphericity index of the VC was 2.08 (±0.72), reflecting a noncircular VC. 3DTEE is a feasible method for the assessment of the true MR morphology, allowing a better quantification of MR without assuming any geometry. This method revealed the presence of multiple jets, potentially improving MR evaluation and leading to changes in medical decision-making when compared to 2D echo assessment. © 2016, Wiley Periodicals, Inc.
Heckman, Katherine M; Otemuyiwa, Bamidele; Chenevert, Thomas L; Malyarenko, Dariya; Derstine, Brian A; Wang, Stewart C; Davenport, Matthew S
2018-06-27
The purpose of the study is to determine whether a novel semi-automated DIXON-based fat quantification algorithm can reliably quantify visceral fat using a CT-based reference standard. This was an IRB-approved retrospective cohort study of 27 subjects who underwent abdominopelvic CT within 7 days of proton density fat fraction (PDFF) mapping on a 1.5T MRI. Cross-sectional visceral fat area per slice (cm²) was measured in blinded fashion in each modality at intervertebral disc levels from T12 to L4. CT estimates were obtained using a previously published semi-automated computational image processing system that sums pixels with attenuation of -205 to -51 HU. MR estimates were obtained using two novel semi-automated DIXON-based fat quantification algorithms that measure visceral fat area by spatially regularizing non-uniform fat-only signal intensity or de-speckling PDFF 2D images and summing pixels with PDFF ≥ 50%. Pearson's correlations and Bland-Altman analyses were performed. Visceral fat area per slice ranged from 9.2 to 429.8 cm² for MR and from 1.6 to 405.5 cm² for CT. There was a strong correlation between CT and MR methods in measured visceral fat area across all studied vertebral body levels (r = 0.97; n = 101 observations); the lowest correlation (r = 0.93) was at T12. Bland-Altman analysis revealed a bias of 31.7 cm² (95% CI: -27.1 to 90.4 cm²), indicating modestly higher visceral fat assessed by MR. MR- and CT-based visceral fat quantification are highly correlated and have good cross-modality reliability, indicating that visceral fat quantification by either method can yield a stable and reliable biomarker.
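A schematic sketch of the two computational pieces described above: summing PDFF pixels at or above 50% to obtain a visceral fat area, and a Bland-Altman comparison of paired MR and CT areas. The masking, spatial regularization and de-speckling steps of the actual algorithms are omitted, and all numbers are illustrative.

```python
import numpy as np

def visceral_fat_area_cm2(pdff_slice: np.ndarray, pixel_area_cm2: float,
                          threshold: float = 0.5) -> float:
    """Sum pixels with PDFF >= 50% in an (already masked) visceral compartment
    and convert the pixel count to an area."""
    return float((pdff_slice >= threshold).sum()) * pixel_area_cm2

def bland_altman(mr: np.ndarray, ct: np.ndarray):
    """Bias and 95% limits of agreement between paired MR and CT areas."""
    diff = mr - ct
    bias = diff.mean()
    loa = 1.96 * diff.std(ddof=1)
    return bias, bias - loa, bias + loa

# Illustrative paired per-slice areas (cm^2) at a few vertebral levels.
mr_area = np.array([120.4, 250.1, 310.7, 95.2])
ct_area = np.array([ 98.0, 221.5, 280.3, 80.1])
print("bias, lower LoA, upper LoA:", bland_altman(mr_area, ct_area))
```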
Barco, Sebastiano; Castagnola, Elio; Moscatelli, Andrea; Rudge, James; Tripodi, Gino; Cangemi, Giuliana
2017-10-25
In this paper we describe the development and validation of a volumetric absorptive microsampling (VAMS™)-LC-MS/MS method for the simultaneous quantification of four antibiotics: piperacillin-tazobactam, meropenem, linezolid and ceftazidime in 10 μL human blood. The novel VAMS-LC-MS/MS method has been compared with a dried blood spot (DBS)-based method in terms of impact of hematocrit (HCT) on accuracy, reproducibility, recovery and matrix effect. Antibiotics were extracted from VAMS and DBS by protein precipitation with methanol after a re-hydration step at 37°C for 10 min. LC-MS/MS was carried out on a Thermo Scientific™ TSQ Quantum™ Access MAX triple quadrupole coupled to an Accela™ UHPLC system. The VAMS-LC-MS/MS method is selective, precise and reproducible. In contrast to DBS, it allows an accurate quantification without any HCT influence. It has been applied to samples derived from pediatric patients under therapy. VAMS is a valid alternative sampling strategy for the quantification of antibiotics and is valuable in support of clinical PK/PD studies and consequently therapeutic drug monitoring (TDM) in pediatrics. Copyright © 2017 Elsevier B.V. All rights reserved.
Multiple products monitoring as a robust approach for peptide quantification.
Baek, Je-Hyun; Kim, Hokeun; Shin, Byunghee; Yu, Myeong-Hee
2009-07-01
Quantification of target peptides and proteins is crucial for biomarker discovery. Approaches such as selected reaction monitoring (SRM) and multiple reaction monitoring (MRM) rely on liquid chromatography and mass spectrometric analysis of defined peptide product ions. These methods are not very widespread because the determination of quantifiable product ions using either SRM or MRM is a very time-consuming process. We developed a novel approach for quantifying target peptides without such an arduous process of ion selection. This method is based on monitoring multiple product ions (multiple products monitoring: MpM) from full-range MS2 spectra of a target precursor. The MpM method uses a scoring system that considers both the absolute intensities of product ions and the similarity between the query MS2 spectrum and the reference MS2 spectrum of the target peptide. Compared with conventional approaches, MpM greatly improves the sensitivity and selectivity of peptide quantification using an ion-trap mass spectrometer.
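A toy sketch of an MpM-style score that combines absolute product-ion intensity with query-to-reference MS2 similarity, here using a normalized dot product and an assumed weighting; the paper's actual scoring function is not reproduced.

```python
import numpy as np

def spectral_similarity(query: np.ndarray, reference: np.ndarray) -> float:
    """Normalized dot product between intensity vectors aligned on the same
    product-ion m/z grid (a common spectral similarity measure)."""
    q = query / (np.linalg.norm(query) + 1e-12)
    r = reference / (np.linalg.norm(reference) + 1e-12)
    return float(q @ r)

def mpm_score(query: np.ndarray, reference: np.ndarray, alpha: float = 0.5) -> float:
    """Toy MpM-style score: weighted combination of total product-ion intensity
    (log-scaled) and spectral similarity. The weighting scheme is assumed."""
    intensity_term = np.log10(query.sum() + 1.0)
    return alpha * intensity_term + (1.0 - alpha) * spectral_similarity(query, reference)

reference_ms2 = np.array([1000., 800., 150., 60., 20.])
query_ms2     = np.array([ 950., 760., 140., 70., 10.])
print(mpm_score(query_ms2, reference_ms2))
```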
Trimboli, Francesca; Morittu, Valeria Maria; Cicino, Caterina; Palmieri, Camillo; Britti, Domenico
2017-10-13
The substitution of ewe milk with more economical cow milk is a common fraud. Here we present a capillary electrophoresis method for the quantification of ewe milk in ovine/bovine milk mixtures, which allows for the rapid and inexpensive recognition of ewe milk adulteration with cow milk. We utilized a routine CE method for human blood and urine protein analysis, which achieved separation of skimmed milk proteins in alkaline buffer. Under this condition, ovine and bovine milk exhibited recognizable and distinct CE protein profiles, with a specific ewe peak showing a reproducible migration zone in ovine/bovine mixtures. Based on the ewe-specific CE peak, we developed a method for ewe milk quantification in ovine/bovine skimmed milk mixtures, which showed good linearity, precision and accuracy, and a minimum amount of detectable fraudulent cow milk equal to 5%. Copyright © 2017 Elsevier B.V. All rights reserved.
Gibby, Jacob T; Njeru, Dennis K; Cvetko, Steve T; Heiny, Eric L; Creer, Andrew R; Gibby, Wendell A
We correlate and evaluate the accuracy of accepted anthropometric methods of percent body fat (%BF) quantification, namely, hydrostatic weighing (HW) and air displacement plethysmography (ADP), to 2 automatic adipose tissue quantification methods using computed tomography (CT). Twenty volunteer subjects (14 men, 6 women) received head-to-toe CT scans. Hydrostatic weighing and ADP were obtained from 17 and 12 subjects, respectively. The CT data underwent conversion using 2 separate algorithms, namely, the Schneider method and the Beam method, to convert Hounsfield units to their respective tissue densities. The overall mass and %BF of both methods were compared with HW and ADP. When comparing ADP to CT data using the Schneider method and Beam method, correlations were r = 0.9806 and 0.9804, respectively. Paired t tests indicated there were no statistically significant biases. Additionally, observed average differences in %BF between ADP and the Schneider method and the Beam method were 0.38% and 0.77%, respectively. The %BF measured from ADP, the Schneider method, and the Beam method all had significantly higher mean differences when compared with HW (3.05%, 2.32%, and 1.94%, respectively). We have shown that total body mass correlates remarkably well with both the Schneider method and Beam method of mass quantification. Furthermore, %BF calculated with the Schneider method and Beam method CT algorithms correlates remarkably well with ADP. The application of these CT algorithms has utility in further research to accurately stratify risk factors with periorgan, visceral, and subcutaneous types of adipose tissue, and has the potential for significant clinical application.
Zhang, Chi; Fang, Xin; Qiu, Haopu; Li, Ning
2015-01-01
Real-time PCR amplification of mitochondrial genes cannot be used for DNA quantification, and amplification of single-copy DNA does not provide ideal sensitivity. Moreover, cross-reactions among similar species have commonly been observed in published methods that amplify repetitive sequences, which has hindered their further application. The purpose of this study was to establish a short interspersed nuclear element (SINE)-based real-time PCR approach with high specificity for species detection that could be used for DNA quantification. After massive screening of candidate Sus scrofa SINEs, one optimal combination of primers and probe was selected, which showed no cross-reaction with other common meat species. The LOD of the method was 44 fg DNA/reaction. Further, quantification tests showed this approach was practical for DNA estimation without tissue variance. Thus, this study provides a new tool for the qualitative detection of porcine components, which could be promising in the QC of meat products.
Pollet, Jeroen; Versteeg, Leroy; Rezende, Wanderson; Strych, Ulrich; Gusovsky, Fabian; Hotez, Peter J; Bottazzi, Maria Elena
2017-03-07
Despite the generally accepted immunostimulatory effect of Toll-Like Receptor 4 (TLR4) agonists and their value as vaccine adjuvants, there remains a demand for fast and easy quantification assays for these TLR4 agonists in order to accelerate and improve vaccine formulation studies. A new medium-throughput method was developed for the quantification of the TLR4 agonist, E6020, independent of the formulation composition. The assay uses a fluorescent hydrazide (DCCH) to label the synthetic lipopolysaccharide (LPS) analog E6020 through its diketone groups. This novel, low-cost, and fluorescence based assay may obviate the need for traditional approaches that primarily rely on Fourier transform infrared spectroscopy (FTIR) or mass spectrometry. The experiments were performed in a wide diversity of vaccine formulations containing E6020 to assess method robustness and accuracy. The assay was also expanded to evaluate the loading efficiency of E6020 in poly(lactic-co-glycolic acid) (PLGA) micro-particles. Copyright © 2017 Elsevier Ltd. All rights reserved.
Xia, Jun; Danielli, Amos; Liu, Yan; Wang, Lidai; Maslov, Konstantin; Wang, Lihong V.
2014-01-01
Photoacoustic tomography (PAT) is a hybrid imaging technique that has broad preclinical and clinical applications. Based on the photoacoustic effect, PAT directly measures specific optical absorption, which is the product of the tissue-intrinsic optical absorption coefficient and the local optical fluence. Therefore, quantitative PAT, such as absolute oxygen saturation (sO2) quantification, requires knowledge of the local optical fluence, which can be estimated only through invasive measurements or sophisticated modeling of light transportation. In this work, we circumvent this requirement by taking advantage of the dynamics in sO2. The new method works when the sO2 transition can be simultaneously monitored with multiple wavelengths. For each wavelength, the ratio of photoacoustic amplitudes measured at different sO2 states is utilized. Using the ratio cancels the contribution from optical fluence and allows calibration-free quantification of absolute sO2. The new method was validated through both phantom and in vivo experiments. PMID:23903146
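A schematic illustration of the fluence-cancelling ratio idea: if the amplitude ratio between two sO2 states is measured at two or more wavelengths, the two states can be recovered by least squares without knowing the local fluence. The extinction coefficients below are placeholders rather than tabulated values, constant total hemoglobin between states is an added simplifying assumption, and the paper's actual processing is not reproduced.

```python
import numpy as np
from scipy.optimize import least_squares

# Illustrative (not tabulated) molar extinction coefficients for HbO2 and Hb
# at two wavelengths; real values would come from the literature.
eps_oxy = np.array([290.0, 1200.0])
eps_deo = np.array([700.0,  800.0])

def amplitude_ratio(s1, s2):
    """Ratio of photoacoustic amplitudes between two sO2 states at each
    wavelength; the shared optical fluence cancels out of the ratio
    (total hemoglobin assumed unchanged between states)."""
    a1 = eps_oxy * s1 + eps_deo * (1.0 - s1)
    a2 = eps_oxy * s2 + eps_deo * (1.0 - s2)
    return a1 / a2

# Synthetic "measured" ratios generated from known states (0.95 -> 0.60).
measured = amplitude_ratio(0.95, 0.60)

def residual(params):
    s1, s2 = params
    return amplitude_ratio(s1, s2) - measured

fit = least_squares(residual, x0=[0.8, 0.5], bounds=([0.0, 0.0], [1.0, 1.0]))
print("recovered sO2 states:", fit.x)
```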
Effect of food processing on plant DNA degradation and PCR-based GMO analysis: a review.
Gryson, Nicolas
2010-03-01
The applicability of a DNA-based method for GMO detection and quantification depends on the quality and quantity of the DNA. Important food-processing conditions, for example temperature and pH, may lead to degradation of the DNA, rendering PCR analysis impossible or GMO quantification unreliable. This review discusses the effect of several food processes on DNA degradation and subsequent GMO detection and quantification. The data show that, although many of these processes do indeed lead to the fragmentation of DNA, amplification of the DNA may still be possible. Length and composition of the amplicon may, however, affect the result, as may the extraction method used. Also, many techniques are used to describe the behaviour of DNA in food processing, which occasionally makes it difficult to compare research results. Further research should be aimed at defining ingredients in terms of their DNA quality and PCR amplification ability, and at the elaboration of matrix-specific certified reference materials.
Monakhova, Yulia B; Kohl-Himmelseher, Matthias; Kuballa, Thomas; Lachenmeier, Dirk W
2014-11-01
A fast and reliable nuclear magnetic resonance spectroscopic method for quantitative determination (qNMR) of targeted molecules in reference materials has been established using the ERETIC2 methodology (electronic reference to access in vivo concentrations) based on the PULCON principle (pulse length based concentration determination). The developed approach was validated for the analysis of pharmaceutical samples in the context of official medicines control, including ibandronic acid, amantadine, ambroxol and lercanidipine. The PULCON recoveries were above 94.3% and coefficients of variation (CVs) obtained by quantification of different targeted resonances ranged between 0.7% and 2.8%, demonstrating that the qNMR method is a precise tool for rapid quantification (approximately 15 min) of reference materials and medicinal products. Generally, the values were within the specifications (certified values) provided by the manufacturers. The results were in agreement with NMR quantification using an internal standard and validated reference HPLC analysis. The PULCON method was found to be a practical alternative with competitive precision and accuracy to the classical internal reference method and it proved to be applicable to different solvent conditions. The method can be recommended for routine use in medicines control laboratories, especially when the availability and costs of reference compounds are problematic. Copyright © 2014 Elsevier B.V. All rights reserved.
Adrait, Annie; Lebert, Dorothée; Trauchessec, Mathieu; Dupuis, Alain; Louwagie, Mathilde; Masselon, Christophe; Jaquinod, Michel; Chevalier, Benoît; Vandenesch, François; Garin, Jérôme; Bruley, Christophe; Brun, Virginie
2012-06-06
Enterotoxin A (SEA) is a staphylococcal virulence factor which is suspected to worsen septic shock prognosis. However, the presence of SEA in the blood of sepsis patients has never been demonstrated. We have developed a mass spectrometry-based assay for the targeted and absolute quantification of SEA in serum. To enhance sensitivity and specificity, we combined an immunoaffinity-based sample preparation with mass spectrometry analysis in the selected reaction monitoring (SRM) mode. Absolute quantification of SEA was performed using the PSAQ™ method (Protein Standard Absolute Quantification), which uses a full-length isotope-labeled SEA as internal standard. The lower limit of detection (LLOD) and lower limit of quantification (LLOQ) were estimated at 352 pg/mL and 1057 pg/mL, respectively. SEA recovery after immunocapture was determined to be 7.8±1.4%. Therefore, we assumed that less than 1 femtomole of each SEA proteotypic peptide was injected on the liquid chromatography column before SRM analysis. From a 6-point titration experiment, quantification accuracy was determined to be 77% and precision at LLOQ was lower than 5%. With this sensitive PSAQ-SRM assay, we expect to contribute to deciphering the pathophysiological role of SEA in severe sepsis. This article is part of a Special Issue entitled: Proteomics: The clinical link. Copyright © 2011 Elsevier B.V. All rights reserved.
A multicenter study benchmarks software tools for label-free proteome quantification.
Navarro, Pedro; Kuharev, Jörg; Gillet, Ludovic C; Bernhardt, Oliver M; MacLean, Brendan; Röst, Hannes L; Tate, Stephen A; Tsou, Chih-Chiang; Reiter, Lukas; Distler, Ute; Rosenberger, George; Perez-Riverol, Yasset; Nesvizhskii, Alexey I; Aebersold, Ruedi; Tenzer, Stefan
2016-11-01
Consistent and accurate quantification of proteins by mass spectrometry (MS)-based proteomics depends on the performance of instruments, acquisition methods and data analysis software. In collaboration with the software developers, we evaluated OpenSWATH, SWATH 2.0, Skyline, Spectronaut and DIA-Umpire, five of the most widely used software methods for processing data from sequential window acquisition of all theoretical fragment-ion spectra (SWATH)-MS, which uses data-independent acquisition (DIA) for label-free protein quantification. We analyzed high-complexity test data sets from hybrid proteome samples of defined quantitative composition acquired on two different MS instruments using different SWATH isolation-window setups. For consistent evaluation, we developed LFQbench, an R package, to calculate metrics of precision and accuracy in label-free quantitative MS and report the identification performance, robustness and specificity of each software tool. Our reference data sets enabled developers to improve their software tools. After optimization, all tools provided highly convergent identification and reliable quantification performance, underscoring their robustness for label-free quantitative proteomics.
Leclercq, L; Laurent, C; De Pauw, E
1997-05-15
A method was developed for the analysis of 7-(2-hydroxyethyl)guanine (7HEG), the major DNA adduct formed after exposure to ethylene oxide (EO). The method is based on DNA neutral thermal hydrolysis, adduct micro-concentration, and final characterization and quantification by HPLC coupled to single-ion monitoring electrospray mass spectrometry (HPLC/SIR-ESMS). The method was found to be selective, sensitive, and easy to handle, with no need for enzymatic digestion or previous sample derivatization. The detection limit was found to be close to 1 fmol of adduct injected (10^-10 M), thus allowing the detection of approximately three modified bases per 10^8 intact nucleotides in blood sample analysis. Quantification results are shown for 7HEG after calf thymus DNA and blood exposure to various doses of EO, in both cases obtaining clear dose-response relationships.
Karain, Wael I
2017-11-28
Proteins undergo conformational transitions over different time scales. These transitions are closely intertwined with the protein's function. Numerous standard techniques such as principal component analysis are used to detect these transitions in molecular dynamics simulations. In this work, we add a new method that detects transitions in dynamics based on the recurrences in the dynamical system. It combines bootstrapping and recurrence quantification analysis. We start from the assumption that a protein has a "baseline" recurrence structure over a given period of time. Any statistically significant deviation from this recurrence structure, as inferred from complexity measures provided by recurrence quantification analysis, is considered a transition in the dynamics of the protein. We apply this technique to a 132 ns long molecular dynamics simulation of the β-Lactamase Inhibitory Protein BLIP. We are able to detect conformational transitions in the nanosecond range in the recurrence dynamics of the BLIP protein during the simulation. The results compare favorably to those extracted using the principal component analysis technique. The recurrence quantification analysis-based bootstrap technique is able to detect transitions between different dynamical states of a protein over different time scales. It is not limited to linear dynamics regimes, and can be generalized to any time scale. It also has the potential to be used to cluster frames in molecular dynamics trajectories according to the nature of their recurrence dynamics. One shortcoming of this method is the need for large enough time windows to ensure good statistical quality of the recurrence complexity measures needed to detect the transitions.
Novel method to detect microRNAs using chip-based QuantStudio 3D digital PCR.
Conte, Davide; Verri, Carla; Borzi, Cristina; Suatoni, Paola; Pastorino, Ugo; Sozzi, Gabriella; Fortunato, Orazio
2015-10-23
Research efforts for the management of cancer, in particular lung cancer, are directed toward identifying new strategies for its early detection. MicroRNAs (miRNAs) are a promising new class of circulating biomarkers for cancer detection, but lack of consensus on data normalization methods has affected the diagnostic potential of circulating miRNAs. There is a growing interest in techniques that allow an absolute quantification of miRNAs, which could be useful for early diagnosis. Recently, digital PCR, mainly based on droplet generation, emerged as an affordable technology for precise and absolute quantification of nucleic acids. In this work, we describe a new approach for profiling circulating miRNAs in plasma samples using a chip-based platform, the QuantStudio 3D digital PCR. The proposed method was validated using synthetic oligonucleotides at serial dilutions in plasma samples of lung cancer patients and in lung tissues and cell lines. Given its reproducibility and reliability, our approach could potentially be applied for the identification and quantification of miRNAs in other biological samples such as circulating exosomes or protein complexes. As chip-based digital PCR becomes more established, it could become a robust tool for quantitative assessment of miRNA copy number for diagnosis of lung cancer and other diseases.
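Although the abstract does not give the arithmetic, absolute quantification in partition-based digital PCR conventionally relies on a Poisson correction of the fraction of positive partitions. A hedged sketch of that standard calculation; the partition volume and counts are illustrative, and QuantStudio 3D-specific dead-volume corrections are not modeled.

```python
import math

def dpcr_copies_per_ul(positive: int, total: int, partition_volume_nl: float) -> float:
    """Absolute target concentration (copies per uL of reaction) from digital PCR
    partition counts using the standard Poisson correction. Chip dead volume and
    instrument-specific corrections are not modeled here."""
    p = positive / total
    lam = -math.log(1.0 - p)                     # mean target copies per partition
    return lam / (partition_volume_nl * 1e-3)    # nL -> uL

# Illustrative counts and an assumed partition volume.
print(dpcr_copies_per_ul(positive=3100, total=20000, partition_volume_nl=0.8))
```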
Babar, Muhammad Imran; Ghazali, Masitah; Jawawi, Dayang N A; Bin Zaheer, Kashif
2015-01-01
Value-based requirements engineering plays a vital role in the development of value-based software (VBS). Stakeholders are the key players in the requirements engineering process, and the selection of critical stakeholders for the VBS systems is highly desirable. Based on the stakeholder requirements, the innovative or value-based idea is realized. The quality of the VBS system is associated with the concrete set of valuable requirements, and the valuable requirements can only be obtained if all the relevant valuable stakeholders participate in the requirements elicitation phase. The existing value-based approaches focus on the design of the VBS systems. However, the focus on the valuable stakeholders and requirements is inadequate. The current stakeholder identification and quantification (SIQ) approaches are neither state-of-the-art nor systematic for the VBS systems. The existing approaches are time-consuming, complex and inconsistent which makes the initiation process difficult. Moreover, the main motivation of this research is that the existing SIQ approaches do not provide the low level implementation details for SIQ initiation and stakeholder metrics for quantification. Hence, keeping in view the existing SIQ problems, this research contributes in the form of a new SIQ framework called 'StakeMeter'. The StakeMeter framework is verified and validated through case studies. The proposed framework provides low-level implementation guidelines, attributes, metrics, quantification criteria and application procedure as compared to the other methods. The proposed framework solves the issues of stakeholder quantification or prioritization, higher time consumption, complexity, and process initiation. The framework helps in the selection of highly critical stakeholders for the VBS systems with less judgmental error.
Bansal, Sunil; Durrett, Timothy P
2016-09-01
Acetyl-triacylglycerols (acetyl-TAG) possess an sn-3 acetate group, which confers useful chemical and physical properties to these unusual triacylglycerols (TAG). Current methods for quantification of acetyl-TAG are time-consuming and do not provide any information on the molecular species profile. Electrospray ionization mass spectrometry (ESI-MS)-based methods can overcome these drawbacks. However, the ESI-MS signal intensity for TAG depends on the aliphatic chain length and unsaturation index of the molecule. Therefore, response factors for different molecular species need to be determined before any quantification. The effects of the chain length and the number of double bonds of the sn-1/2 acyl groups on the signal intensity for the neutral loss of short chain length sn-3 groups were quantified using a series of synthesized sn-3 specific structured TAG. The signal intensity for the neutral loss of the sn-3 acyl group was found to be negatively correlated with the aliphatic chain length and unsaturation index of the sn-1/2 acyl groups. The signal intensity of the neutral loss of the sn-3 acyl group was also negatively correlated with the size of that chain. Further, the position of the group undergoing neutral loss was also important, with the signal from an sn-2 acyl group much lower than that from one located at sn-3. Response factors obtained from these analyses were used to develop a method for the absolute quantification of acetyl-TAG. The increased sensitivity of this ESI-MS-based approach allowed successful quantification of acetyl-TAG in various biological settings, including the products of in vitro enzyme activity assays.
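A minimal sketch of how species-specific response factors of the kind described above could be applied: the neutral-loss signal is first normalized to an internal standard and then divided by the response factor determined from the synthesized sn-3 specific standards. The function, argument names and numbers are illustrative, not the paper's code.

```python
def corrected_amount(raw_intensity: float, internal_std_intensity: float,
                     internal_std_amount_nmol: float, response_factor: float) -> float:
    """Convert a neutral-loss signal for one acetyl-TAG molecular species into an
    absolute amount by (i) normalizing to a co-analyzed internal standard and
    (ii) dividing by a species-specific response factor that accounts for the
    chain-length and unsaturation dependence described above (workflow assumed)."""
    normalized = raw_intensity / internal_std_intensity
    return normalized * internal_std_amount_nmol / response_factor

# Illustrative numbers only.
print(corrected_amount(raw_intensity=4.2e5, internal_std_intensity=9.0e5,
                       internal_std_amount_nmol=1.0, response_factor=0.65))
```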
NASA Astrophysics Data System (ADS)
Schwabe, O.; Shehab, E.; Erkoyuncu, J.
2015-08-01
The lack of defensible methods for quantifying cost estimate uncertainty over the whole product life cycle of aerospace innovations such as propulsion systems or airframes poses a significant challenge to the creation of accurate and defensible cost estimates. Based on the axiomatic definition of uncertainty as the actual prediction error of the cost estimate, this paper provides a comprehensive overview of metrics used for the uncertainty quantification of cost estimates based on a literature review, an evaluation of publicly funded projects such as those in the CORDIS or Horizon 2020 programs, and an analysis of established approaches used by organizations such as NASA, the U.S. Department of Defense, the ESA, and various commercial companies. The metrics are categorized based on their foundational character (foundations), their use in practice (state-of-practice), their availability for practice (state-of-art) and those suggested for future exploration (state-of-future). Insights gained were that a variety of uncertainty quantification metrics exist whose suitability depends on the volatility of available relevant information, as defined by technical and cost readiness level, and the number of whole product life cycle phases the estimate is intended to be valid for. Information volatility and number of whole product life cycle phases can hereby be considered as defining multi-dimensional probability fields admitting various uncertainty quantification metric families with identifiable thresholds for transitioning between them. The key research gaps identified were the lack of theoretically grounded guidance for selecting uncertainty quantification metrics and the lack of practical alternatives to metrics based on the Central Limit Theorem. An innovative uncertainty quantification framework consisting of a set-theory-based typology, a data library, a classification system, and a corresponding input-output model is put forward to address this research gap as the basis for future work in this field.
MPQ-cytometry: a magnetism-based method for quantification of nanoparticle-cell interactions
NASA Astrophysics Data System (ADS)
Shipunova, V. O.; Nikitin, M. P.; Nikitin, P. I.; Deyev, S. M.
2016-06-01
Precise quantification of interactions between nanoparticles and living cells is among the imperative tasks for research in nanobiotechnology, nanotoxicology and biomedicine. To meet the challenge, a rapid method called MPQ-cytometry is developed, which measures the integral non-linear response produced by magnetically labeled nanoparticles in a cell sample with an original magnetic particle quantification (MPQ) technique. MPQ-cytometry provides a sensitivity limit of 0.33 ng of nanoparticles and is devoid of the background signal present in many label-based assays. Each measurement takes only a few seconds, and no complicated sample preparation or data processing is required. The capabilities of the method have been demonstrated by quantification of interactions of iron oxide nanoparticles with eukaryotic cells. The total amount of targeted nanoparticles that specifically recognized the HER2/neu oncomarker on the human cancer cell surface was successfully measured, the specificity of interaction permitting the detection of HER2/neu positive cells in a cell mixture. Moreover, it has been shown that MPQ-cytometry analysis of a HER2/neu-specific iron oxide nanoparticle interaction with six cell lines of different tissue origins quantitatively reflects the HER2/neu status of the cells. High correlation of MPQ-cytometry data with those obtained by three other methods commonly used in molecular and cell biology supports consideration of this method as a prospective alternative for both quantifying cell-bound nanoparticles and estimating the expression level of cell surface antigens. The proposed method does not require expensive sophisticated equipment or highly skilled personnel and it can be easily applied for rapid diagnostics, especially under field conditions.
Le Guillou-Guillemette, Helene; Lunel-Fabiani, Francoise
2009-01-01
The treatment schedule (combination of compounds, doses, and duration) and the virological follow-up for management of antiviral treatment in patients chronically infected by HCV is now well standardized, but to ensure good monitoring of the treated patients, physicians need rapid, reproducible, and sensitive molecular virological tools with a wide range of detection and quantification of HCV RNA in blood samples. Several assays for detection and/or quantification of HCV RNA are currently commercially available. Here, all these assays are detailed, and a brief description of each step of the assay is provided. They are divided into two categories by method: those based on signal amplification and those based on target amplification. These two categories are then divided into qualitative, quantitative, and quantitative detection assays. The real-time reverse-transcription polymerase chain reaction (RT-PCR)-based assays are the most promising strategy in the HCV virological area.
Computer-aided assessment of regional abdominal fat with food residue removal in CT.
Makrogiannis, Sokratis; Caturegli, Giorgio; Davatzikos, Christos; Ferrucci, Luigi
2013-11-01
Separate quantification of abdominal subcutaneous and visceral fat regions is essential to understand the role of regional adiposity as risk factor in epidemiological studies. Fat quantification is often based on computed tomography (CT) because fat density is distinct from other tissue densities in the abdomen. However, the presence of intestinal food residues with densities similar to fat may reduce fat quantification accuracy. We introduce an abdominal fat quantification method in CT with interest in food residue removal. Total fat was identified in the feature space of Hounsfield units and divided into subcutaneous and visceral components using model-based segmentation. Regions of food residues were identified and removed from visceral fat using a machine learning method integrating intensity, texture, and spatial information. Cost-weighting and bagging techniques were investigated to address class imbalance. We validated our automated food residue removal technique against semimanual quantifications. Our feature selection experiments indicated that joint intensity and texture features produce the highest classification accuracy at 95%. We explored generalization capability using k-fold cross-validation and receiver operating characteristic (ROC) analysis with variable k. Losses in accuracy and area under ROC curve between maximum and minimum k were limited to 0.1% and 0.3%. We validated tissue segmentation against reference semimanual delineations. The Dice similarity scores were as high as 93.1 for subcutaneous fat and 85.6 for visceral fat. Computer-aided regional abdominal fat quantification is a reliable computational tool for large-scale epidemiological studies. Our proposed intestinal food residue reduction scheme is an original contribution of this work. Validation experiments indicate very good accuracy and generalization capability. Published by Elsevier Inc.
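A rough sketch of the two stages described above: an initial fat candidate mask from a Hounsfield-unit window, followed by a classifier over intensity/texture/spatial features to reject food-residue voxels. The HU window, the random forest with balanced class weights, and all data are stand-ins; the paper's model-based subcutaneous/visceral separation and its cost-weighting/bagging scheme are not reproduced.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier

FAT_HU_RANGE = (-190, -30)   # illustrative HU window for adipose tissue in CT

def fat_mask(ct_slice_hu: np.ndarray) -> np.ndarray:
    """Initial fat candidate mask from Hounsfield-unit thresholding."""
    lo, hi = FAT_HU_RANGE
    return (ct_slice_hu >= lo) & (ct_slice_hu <= hi)

def remove_food_residue(features: np.ndarray, labels: np.ndarray,
                        candidate_features: np.ndarray) -> np.ndarray:
    """Classify candidate voxels using intensity, texture and spatial features;
    a generic random forest with balanced class weights stands in for the
    paper's cost-weighted/bagged learner. Returns True for voxels kept as fat."""
    clf = RandomForestClassifier(n_estimators=200, class_weight="balanced",
                                 random_state=0).fit(features, labels)
    return clf.predict(candidate_features) == 0

# Tiny demo with synthetic data: rows = voxels, columns = [HU, texture, |x|, |y|].
demo_slice = np.array([[-120.0, 40.0], [-60.0, -400.0]])
print(fat_mask(demo_slice))
rng = np.random.default_rng(1)
X_train = rng.normal(size=(500, 4))
y_train = rng.integers(0, 2, size=500)            # 0 = fat, 1 = food residue
X_candidates = rng.normal(size=(50, 4))
print(remove_food_residue(X_train, y_train, X_candidates).sum(), "voxels kept as fat")
```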
Quantitative interaction proteomics using mass spectrometry.
Wepf, Alexander; Glatter, Timo; Schmidt, Alexander; Aebersold, Ruedi; Gstaiger, Matthias
2009-03-01
We present a mass spectrometry-based strategy for the absolute quantification of protein complex components isolated through affinity purification. We quantified bait proteins via isotope-labeled reference peptides corresponding to an affinity tag sequence and prey proteins by label-free correlational quantification using the precursor ion signal intensities of proteotypic peptides generated in reciprocal purifications. We used this method to quantitatively analyze interaction stoichiometries in the human protein phosphatase 2A network.
Dapic, Irena; Kobetic, Renata; Brkljacic, Lidija; Kezic, Sanja; Jakasa, Ivone
2018-02-01
The free fatty acids (FFAs) are one of the major components of the lipids in the stratum corneum (SC), the uppermost layer of the skin. Relative composition of FFAs has been proposed as a biomarker of the skin barrier status in patients with atopic dermatitis (AD). Here, we developed an LC-ESI-MS/MS method for simultaneous quantification of a range of FFAs with long and very long chain length in the SC collected by adhesive tape (D-Squame). The method, based on derivatization with 2-bromo-1-methylpyridinium iodide and 3-carbinol-1-methylpyridinium iodide, allowed highly sensitive detection and quantification of FFAs using multiple reaction monitoring. For the quantification, we applied a surrogate analyte approach and internal standardization using isotope labeled derivatives of FFAs. Adhesive tapes showed the presence of several FFAs, which are also present in the SC, a problem encountered in previous studies. Therefore, the levels of FFAs in the SC were corrected using C12:0, which was present on the adhesive tape, but not detected in the SC. The method was applied to SC samples from patients with atopic dermatitis and healthy subjects. Quantification using multiple reaction monitoring allowed sufficient sensitivity to analyze FFAs of chain lengths C16-C28 in the SC collected on only one tape strip. Copyright © 2017 John Wiley & Sons, Ltd.
Bilbao, Aivett; Zhang, Ying; Varesio, Emmanuel; Luban, Jeremy; Strambio-De-Castillia, Caterina; Lisacek, Frédérique; Hopfgartner, Gérard
2016-01-01
Data-independent acquisition LC-MS/MS techniques complement supervised methods for peptide quantification. However, due to the wide precursor isolation windows, these techniques are prone to interference at the fragment ion level, which in turn is detrimental for accurate quantification. The “non-outlier fragment ion” (NOFI) ranking algorithm has been developed to assign low priority to fragment ions affected by interference. By using the optimal subset of high priority fragment ions these interfered fragment ions are effectively excluded from quantification. NOFI represents each fragment ion as a vector of four dimensions related to chromatographic and MS fragmentation attributes and applies multivariate outlier detection techniques. Benchmarking conducted on a well-defined quantitative dataset (i.e. the SWATH Gold Standard), indicates that NOFI on average is able to accurately quantify 11-25% more peptides than the commonly used Top-N library intensity ranking method. The sum of the area of the Top3-5 NOFIs produces similar coefficients of variation as compared to the library intensity method but with more accurate quantification results. On a biologically relevant human dendritic cell digest dataset, NOFI properly assigns low priority ranks to 85% of annotated interferences, resulting in sensitivity values between 0.92 and 0.80 against 0.76 for the Spectronaut interference detection algorithm. PMID:26412574
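NOFI itself represents each fragment ion by four chromatographic and fragmentation attributes and applies specific multivariate outlier detection techniques, the details of which are in the paper. The sketch below illustrates only the general idea, ranking the fragment ions of one peptide by Mahalanobis distance over hypothetical attribute vectors so that likely interfered fragments receive low priority:

```python
import numpy as np

def rank_fragments_by_outlyingness(features):
    """Rank fragment ions by Mahalanobis distance from the bulk of fragments.

    features: (n_fragments, n_attributes) array, e.g. retention-time shift,
    peak-shape correlation, intensity-rank deviation, m/z error.
    Fragments listed first (smallest distance) get the highest priority.
    """
    X = np.asarray(features, dtype=float)
    mu = X.mean(axis=0)
    cov_inv = np.linalg.pinv(np.cov(X, rowvar=False))
    diffs = X - mu
    d2 = np.einsum('ij,jk,ik->i', diffs, cov_inv, diffs)
    return np.argsort(d2)  # indices from least to most outlying

# hypothetical attribute vectors for six fragment ions of one peptide
frags = np.array([
    [0.02, 0.99, 0.1, 1.5],
    [0.03, 0.98, 0.2, 2.0],
    [0.01, 0.99, 0.0, 1.0],
    [0.30, 0.70, 3.0, 8.0],   # deliberately deviant, i.e. likely interfered
    [0.02, 0.97, 0.3, 1.8],
    [0.04, 0.96, 0.2, 2.2],
])
print(rank_fragments_by_outlyingness(frags))
```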
Couillerot, O; Poirier, M-A; Prigent-Combaret, C; Mavingui, P; Caballero-Mellado, J; Moënne-Loccoz, Y
2010-08-01
To assess the applicability of sequence characterized amplified region (SCAR) markers obtained from BOX, ERIC and RAPD fragments to design primers for real-time PCR quantification of the phytostimulatory maize inoculants Azospirillum brasilense UAP-154 and CFN-535 in the rhizosphere. Primers were designed based on strain-specific SCAR markers and were screened for successful amplification of target strain and absence of cross-reaction with other Azospirillum strains. The specificity of primers thus selected was verified under real-time PCR conditions using genomic DNA from strain collection and DNA from rhizosphere samples. The detection limit was 60 fg DNA with pure cultures and 4 x 10(3) (for UAP-154) and 4 x 10(4) CFU g(-1) (for CFN-535) in the maize rhizosphere. Inoculant quantification was effective from 10(4) to 10(8) CFU g(-1) soil. BOX-based SCAR markers were useful to find primers for strain-specific real-time PCR quantification of each A. brasilense inoculant in the maize rhizosphere. Effective root colonization is a prerequisite for successful Azospirillum phytostimulation, but cultivation-independent monitoring methods were lacking. The real-time PCR methods developed here will help understand the effect of environmental conditions on root colonization and phytostimulation by A. brasilense UAP-154 and CFN-535.
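Real-time PCR quantification of an inoculant from rhizosphere DNA ultimately rests on a standard curve relating the quantification cycle (Cq) to known target amounts. The following sketch uses hypothetical calibrator values rather than the authors' data, and shows the log-linear fit, the derived amplification efficiency, and back-calculation of an unknown:

```python
import numpy as np

# hypothetical standard curve: known copy numbers and measured Cq values
copies = np.array([1e4, 1e5, 1e6, 1e7, 1e8])
cq = np.array([30.1, 26.8, 23.4, 20.1, 16.8])

# linear fit of Cq against log10(copies): Cq = slope*log10(N) + intercept
slope, intercept = np.polyfit(np.log10(copies), cq, 1)
efficiency = 10 ** (-1.0 / slope) - 1.0   # amplification efficiency

def copies_from_cq(cq_sample):
    """Back-calculate starting copy number from a sample Cq."""
    return 10 ** ((cq_sample - intercept) / slope)

print(f"slope={slope:.2f}, efficiency={efficiency:.1%}")
print(f"sample at Cq=25.0 -> {copies_from_cq(25.0):.2e} copies")
```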
Salvi, Sergio; D'Orso, Fabio; Morelli, Giorgio
2008-06-25
Many countries have introduced mandatory labeling requirements on foods derived from genetically modified organisms (GMOs). Real-time quantitative polymerase chain reaction (PCR) based upon the TaqMan probe chemistry has become the method mostly used to support these regulations; moreover, event-specific PCR is the preferred method in GMO detection because of its high specificity based on the flanking sequence of the exogenous integrant. The aim of this study was to evaluate the use of very short (eight-nucleotide long), locked nucleic acid (LNA) TaqMan probes in 5'-nuclease PCR assays for the detection and quantification of GMOs. Classic TaqMan and LNA TaqMan probes were compared for the analysis of the maize MON810 transgene. The performance of the two types of probes was tested on the maize endogenous reference gene hmga, the CaMV 35S promoter, and the hsp70/cryIA(b) construct as well as for the event-specific 5'-integration junction of MON810, using plasmids as standard reference molecules. The results of our study demonstrate that the LNA 5'-nuclease PCR assays represent a valid and reliable analytical system for the detection and quantification of transgenes. Application of very short LNA TaqMan probes to GMO quantification can simplify the design of 5'-nuclease assays.
Li, Xiang; Wang, Xiuxiu; Yang, Jielin; Liu, Yueming; He, Yuping; Pan, Liangwen
2014-05-16
To date, over 150 genetically modified (GM) crops are widely cultivated. To comply with regulations developed for genetically modified organisms (GMOs), including labeling policies, many detection methods for GMO identification and quantification have been developed. To detect the entrance and exit of unauthorized GM crop events in China, we developed a novel quadruplex real-time PCR method for simultaneous detection and quantification of GM cotton events GHB119 and T304-40 in cotton-derived products (based on the 5'-flanking sequence) and the insect-resistance gene Cry2Ae. The limit of detection was 10 copies for GHB119 and Cry2Ae and 25 copies for T304-40. The limit of quantification was 25 copies for GHB119 and Cry2Ae and 50 copies for T304-40. Moreover, low bias and acceptable standard deviation and relative standard deviation values were obtained in quantification analysis of six blind samples containing different GHB119 and T304-40 ingredients. The developed quadruplex quantitative method could be used for quantitative detection of two GM cotton events (GHB119 and T304-40) and Cry2Ae gene ingredient in cotton derived products.
Cheng, Dongwan; Zheng, Li; Hou, Junjie; Wang, Jifeng; Xue, Peng; Yang, Fuquan; Xu, Tao
2015-01-01
The absolute quantification of target proteins in proteomics involves stable isotope dilution coupled with multiple reaction monitoring mass spectrometry (SID-MRM-MS). The successful preparation of stable isotope-labeled internal standard peptides is an important prerequisite for SID-MRM absolute quantification methods. Dimethyl labeling has been widely used in relative quantitative proteomics; it is fast, simple, reliable, cost-effective, and applicable to any protein sample, making it an ideal candidate method for the preparation of stable isotope-labeled internal standards. MRM mass spectrometry offers high sensitivity, specificity, and throughput, and can quantify multiple proteins simultaneously, including low-abundance proteins in precious samples such as pancreatic islets. In this study, a new method for the absolute quantification of three proteases involved in insulin maturation, namely PC1/3, PC2 and CPE, was developed by coupling a stable isotope dimethyl labeling strategy for internal standard peptide preparation with SID-MRM-MS quantitative technology. This method offers a new and effective approach for deep understanding of the functional status of pancreatic β cells and pathogenesis in diabetes.
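At its core, SID-MRM absolute quantification converts the measured peak-area ratio between the endogenous (light) peptide and the spiked isotope-labeled (heavy) standard into an absolute amount. A minimal sketch with hypothetical peak areas and spike amount:

```python
def absolute_amount(light_area, heavy_area, heavy_spike_fmol):
    """Absolute amount of the endogenous (light) peptide from the light/heavy
    MRM peak-area ratio and the known amount of spiked heavy standard."""
    return (light_area / heavy_area) * heavy_spike_fmol

# hypothetical transition areas for one signature peptide
light, heavy = 8.4e5, 2.1e6
print(f"{absolute_amount(light, heavy, heavy_spike_fmol=50.0):.1f} fmol")
```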
Flouda, Konstantina; Dersch, Julie Maria; Gabel-Jensen, Charlotte; Stürup, Stefan; Misra, Sougat; Björnstedt, Mikael; Gammelgaard, Bente
2016-03-01
The paper presents an analytical method for quantification of low molecular weight (LMW) selenium compounds in human plasma based on liquid chromatography inductively coupled plasma mass spectrometry (LC-ICP-MS) and post column isotope dilution-based quantification. Prior to analysis, samples were ultrafiltrated using a cut-off value of 3000 Da. The method was validated in aqueous solution as well as plasma using standards of selenomethionine (SeMet), Se-methylselenocysteine (MeSeCys), selenite, and the selenosugar Se-methylseleno-N-acetylgalactosamine (SeGal) for linearity, precision, recoveries, and limits of detection and quantitation with satisfactory results. The method was applied for analysis of a set of plasma samples from cancer patients receiving selenite treatment in a clinical trial. Three LMW selenium compounds were observed. The main compounds, SeGal and selenite were tentatively identified by retention time matching with standards in different chromatographic systems, while the third minor compound was not identified. The identity of the selenosugar was verified by ESI-MS-MS product ion scanning, while selenite was identified indirectly as the glutathione (GSH) reaction product, GS-Se-SG.
Savazzini, Federica; Longa, Claudia Maria Oliveira; Pertot, Ilaria; Gessler, Cesare
2008-05-01
Trichoderma (Hypocreales, Ascomycota) is a widespread genus in nature and several Trichoderma species are used in industrial processes and as biocontrol agents against crop diseases. It is very important that the persistence and spread of microorganisms released on purpose into the environment are accurately monitored. Real-time PCR methods for genus/species/strain identification of microorganisms are currently being developed to overcome the difficulties of classical microbiological and enzymatic methods for monitoring these populations. The aim of the present study was to develop and validate a specific real-time PCR-based method for detecting Trichoderma atroviride SC1 in soil. We developed a primer and TaqMan probe set constructed on base mutations in an endochitinase gene. This tool is highly specific for the detection and quantification of the SC1 strain. The limits of detection and quantification calculated from the relative standard deviation were 6000 and 20,000 haploid genome copies per gram of soil. Together with the short turnaround time of this procedure, which allows many soil samples to be evaluated within a short time period, these results suggest that this method could be successfully used to trace the fate of T. atroviride SC1 applied as an open-field biocontrol agent.
Quantification of Cannabinoid Content in Cannabis
NASA Astrophysics Data System (ADS)
Tian, Y.; Zhang, F.; Jia, K.; Wen, M.; Yuan, Ch.
2015-09-01
Cannabis is an economically important plant that is used in many fields, in addition to being the most commonly consumed illicit drug worldwide. Monitoring the spatial distribution of cannabis cultivation and judging whether it is drug- or fiber-type cannabis is critical for governments and international communities to understand the scale of the illegal drug trade. The aim of this study was to investigate whether the cannabinoids content in cannabis could be spectrally quantified using a spectrometer and to identify the optimal wavebands for quantifying the cannabinoid content. Spectral reflectance data of dried cannabis leaf samples and the cannabis canopy were measured in the laboratory and in the field, respectively. Correlation analysis and the stepwise multivariate regression method were used to select the optimal wavebands for cannabinoid content quantification based on the laboratory-measured spectral data. The results indicated that the delta-9-tetrahydrocannabinol (THC) content in cannabis leaves could be quantified using laboratory-measured spectral reflectance data and that the 695 nm band is the optimal band for THC content quantification. This study provides prerequisite information for designing spectral equipment to enable immediate quantification of THC content in cannabis and to discriminate drug- from fiber-type cannabis based on THC content quantification in the field.
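Selection of an optimal waveband by correlation analysis, as described above, amounts to ranking wavelengths by the strength of their linear association with the measured constituent. A minimal sketch with hypothetical reflectance spectra and THC values (three bands only, for brevity):

```python
import numpy as np

def best_band_by_correlation(reflectance, thc_content, wavelengths):
    """Return the wavelength whose reflectance correlates most strongly
    (in absolute value) with the measured THC content across samples.

    reflectance: (n_samples, n_bands) array.
    """
    R = np.asarray(reflectance, dtype=float)
    y = np.asarray(thc_content, dtype=float)
    Rc = R - R.mean(axis=0)
    yc = y - y.mean()
    corr = (Rc * yc[:, None]).sum(axis=0) / np.sqrt(
        (Rc ** 2).sum(axis=0) * (yc ** 2).sum())
    return wavelengths[np.argmax(np.abs(corr))], corr

# toy data: 5 samples x 3 bands (e.g. 550, 695, 800 nm), hypothetical values
refl = np.array([[0.30, 0.12, 0.55],
                 [0.31, 0.18, 0.54],
                 [0.29, 0.25, 0.56],
                 [0.30, 0.33, 0.55],
                 [0.32, 0.40, 0.57]])
thc = np.array([0.2, 0.5, 0.9, 1.4, 1.8])
band, _ = best_band_by_correlation(refl, thc, np.array([550, 695, 800]))
print(band)
```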
Comparative quantification of human intestinal bacteria based on cPCR and LDR/LCR
Tang, Zhou-Rui; Li, Kai; Zhou, Yu-Xun; Xiao, Zhen-Xian; Xiao, Jun-Hua; Huang, Rui; Gu, Guo-Hao
2012-01-01
AIM: To establish a multiple detection method based on comparative polymerase chain reaction (cPCR) and ligase detection reaction (LDR)/ligase chain reaction (LCR) to quantify the intestinal bacterial components. METHODS: Comparative quantification of 16S rDNAs from different intestinal bacterial components was used to quantify multiple intestinal bacteria. The 16S rDNAs of different bacteria were amplified simultaneously by cPCR. LDR/LCR was then evaluated to carry out the genotyping and quantification. Two beneficial (Bifidobacterium, Lactobacillus) and three conditionally pathogenic bacteria (Enterococcus, Enterobacterium and Eubacterium) were used in this detection. With cloned standard bacterial 16S rDNAs, standard curves were prepared to validate the quantitative relations between the ratio of original concentrations of two templates and the ratio of the fluorescence signals of their final ligation products. The internal controls were added to monitor the whole detection flow. The quantity ratio between two bacteria was tested. RESULTS: cPCR and LDR revealed obvious linear correlations with standard DNAs, but cPCR and LCR did not. In the sample test, the distributions of the quantity ratio between each pair of bacterial species were obtained. There were significant differences among these distributions in the total samples. However, these distributions of the quantity ratio for each pair of bacteria remained stable among groups divided by age or sex. CONCLUSION: The detection method in this study can be used to conduct multiple intestinal bacteria genotyping and quantification, and to monitor the human intestinal health status as well. PMID:22294830
Ott, Stephan J; Musfeldt, Meike; Ullmann, Uwe; Hampe, Jochen; Schreiber, Stefan
2004-06-01
The composition of the human intestinal flora is important for the health status of the host. The global composition and the presence of specific pathogens are relevant to the effects of the flora. Therefore, accurate quantification of all major bacterial populations of the enteric flora is needed. A TaqMan real-time PCR-based method for the quantification of 20 dominant bacterial species and groups of the intestinal flora has been established on the basis of 16S ribosomal DNA taxonomy. A PCR with conserved primers was used for all reactions. In each real-time PCR, a universal probe for quantification of total bacteria and a specific probe for the species in question were included. PCR with conserved primers and the universal probe for total bacteria allowed relative and absolute quantification. Minor groove binder probes increased the sensitivity of the assays 10- to 100-fold. The method was evaluated by cross-reaction experiments and quantification of bacteria in complex clinical samples from healthy patients. A sensitivity of 10(1) to 10(3) bacterial cells per sample was achieved. No significant cross-reaction was observed. The real-time PCR assays presented may facilitate understanding of the intestinal bacterial flora through a normalized global estimation of the major contributing species.
van Erven, Gijs; de Visser, Ries; Merkx, Donny W H; Strolenberg, Willem; de Gijsel, Peter; Gruppen, Harry; Kabel, Mirjam A
2017-10-17
Understanding the mechanisms underlying plant biomass recalcitrance at the molecular level can only be achieved by accurate analyses of both the content and structural features of the molecules involved. Current quantification of lignin is, however, majorly based on unspecific gravimetric analysis after sulfuric acid hydrolysis. Hence, our research aimed at specific lignin quantification with concurrent characterization of its structural features. Hereto, for the first time, a polymeric 13C lignin was used as internal standard (IS) for lignin quantification via analytical pyrolysis coupled to gas chromatography with mass-spectrometric detection in selected ion monitoring mode (py-GC-SIM-MS). In addition, relative response factors (RRFs) for the various pyrolysis products obtained were determined and applied. First, 12C and 13C lignin were isolated from nonlabeled and uniformly 13C labeled wheat straw, respectively, and characterized by heteronuclear single quantum coherence (HSQC) nuclear magnetic resonance (NMR) and py-GC/MS. The two lignin isolates were found to have identical structures. Second, 13C-IS based lignin quantification by py-GC-SIM-MS was validated in reconstituted biomass model systems with known contents of the 12C lignin analogue and was shown to be extremely accurate (>99.9%, R2 > 0.999) and precise (RSD < 1.5%). Third, 13C-IS based lignin quantification was applied to four common poaceous biomass sources (wheat straw, barley straw, corn stover, and sugar cane bagasse), and lignin contents were in good agreement with the total gravimetrically determined lignin contents. Our robust method proves to be a promising alternative for the high-throughput quantification of lignin in milled biomass samples directly and simultaneously provides a direct insight into the structural features of lignin.
Sonnante, Gabriella; Montemurro, Cinzia; Morgese, Anita; Sabetta, Wilma; Blanco, Antonio; Pasqualone, Antonella
2009-11-11
Italian industrial pasta and durum wheat typical breads must be prepared using exclusively durum wheat semolina. Previously, a microsatellite sequence specific of the wheat D-genome had been chosen for traceability of soft wheat in semolina and bread samples, using qualitative and quantitative Sybr green-based real-time experiments. In this work, we describe an improved method based on the same soft wheat genomic region by means of a quantitative real-time PCR using a dual-labeled probe. Standard curves based on dilutions of 100% soft wheat flour, pasta, or bread were constructed. Durum wheat semolina, pasta, and bread samples were prepared with increasing amounts of soft wheat to verify the accuracy of the method. Results show that reliable quantifications were obtained especially for the samples containing a lower amount of soft wheat DNA, fulfilling the need to verify labeling of pasta and typical durum wheat breads.
NASA Astrophysics Data System (ADS)
Kim, Chang-Beom; Kim, Kwan-Soo; Song, Ki-Bong
2013-05-01
The importance of early Alzheimer's disease (AD) detection has been recognized to diagnose people at high risk of AD. The existence of intra/extracellular beta-amyloid (Aβ) of brain neurons has been regarded as the most archetypal hallmark of AD. The existing computed-image-based neuroimaging tools have limitations on accurate quantification of nanoscale Aβ peptides due to optical diffraction during imaging processes. Therefore, we propose a new method that is capable of evaluating a small amount of Aβ peptides by using photo-sensitive field-effect transistor (p-FET) integrated with magnetic force-based microbead collecting platform and selenium(Se) layer (thickness ~700 nm) as an optical filter. This method demonstrates a facile approach for the analysis of Aβ quantification using magnetic force and magnetic silica microparticles (diameter 0.2~0.3 μm). The microbead collecting platform mainly consists of the p-FET sensing array and the magnet (diameter ~1 mm) which are placed beneath each sensing region of the p-FET, which enables the assembly of the Aβ antibody conjugated microbeads, captures the Aβ peptides from samples, measures the photocurrents generated by the Q-dot tagged with Aβ peptides, and consequently results in the effective Aβ quantification.
Theoretical limitations of quantification for noncompetitive sandwich immunoassays.
Woolley, Christine F; Hayes, Mark A; Mahanti, Prasun; Douglass Gilman, S; Taylor, Tom
2015-11-01
Immunoassays exploit the highly selective interaction between antibodies and antigens to provide a vital method for biomolecule detection at low concentrations. Developers and practitioners of immunoassays have long known that non-specific binding often restricts immunoassay limits of quantification (LOQs). Aside from non-specific binding, most efforts by analytical chemists to reduce the LOQ for these techniques have focused on improving the signal amplification methods and minimizing the limitations of the detection system. However, with detection technology now capable of sensing single-fluorescence molecules, this approach is unlikely to lead to dramatic improvements in the future. Here, fundamental interactions based on the law of mass action are analytically connected to signal generation, replacing the four- and five-parameter fittings commercially used to approximate sigmoidal immunoassay curves and allowing quantitative consideration of non-specific binding and statistical limitations in order to understand the ultimate detection capabilities of immunoassays. The restrictions imposed on limits of quantification by instrumental noise, non-specific binding, and counting statistics are discussed based on equilibrium relations for a sandwich immunoassay. Understanding the maximal capabilities of immunoassays for each of these regimes can greatly assist in the development and evaluation of immunoassay platforms. While many studies suggest that single molecule detection is possible through immunoassay techniques, here, it is demonstrated that the fundamental limit of quantification (precision of 10 % or better) for an immunoassay is approximately 131 molecules and this limit is based on fundamental and unavoidable statistical limitations.
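The counting-statistics part of this argument follows directly from Poisson statistics: if only N analyte molecules are captured and counted, the best achievable relative precision is 1/sqrt(N). A back-of-the-envelope bound is given below; the paper's figure of approximately 131 molecules additionally accounts for the other noise terms discussed above.

\[
\frac{\sigma_N}{N} \;=\; \frac{\sqrt{N}}{N} \;=\; \frac{1}{\sqrt{N}} \;\le\; 0.10
\quad\Longrightarrow\quad N \;\ge\; 100 .
\]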
Parsons, Teresa L.; Emory, Joshua F.; Seserko, Lauren A.; Aung, Wutyi S.; Marzinke, Mark A.
2014-01-01
Background Topical microbicidal agents are being actively pursued as a modality to prevent HIV viral transmission during sexual intercourse. Quantification of antiretroviral agents in specimen sources where antiviral activity is elicited is critical, and drug measurements in cervicovaginal fluid can provide key information on local drug concentrations. Two antiretroviral drugs, dapivirine and maraviroc, have gained interest as vaginal microbicidal agents, and rugged methods are required for their quantification in cervicovaginal secretions. Methods Cervicovaginal fluid spiked with dapivirine and maraviroc was applied to ophthalmic tear strips or polyester-based swabs to mimic collection procedures used in clinical studies. Following sample extraction and the addition of isotopically-labeled internal standards, samples were subjected to liquid chromatographic-tandem mass spectrometric (LC-MS/MS) analysis using a Waters BEH C8, 50 × 2.1 mm, 1.7 µm particle size column, on an API 4000 mass analyzer operated in selective reaction monitoring mode. The method was validated according to FDA Bioanalytical Method Validation guidelines. Results Due to the disparate saturation capacity of the tested collection devices, the analytical measuring ranges for dapivirine and maraviroc in cervicovaginal fluid on the ophthalmic tear strip were 0.05 to 25 ng/tear strip, and 0.025 to 25 ng/tear strip, respectively. As for the polyester-based swab, the analytical measuring ranges were 0.25 to 125 ng/swab for dapivirine and 0.125 to 125 ng/swab for maraviroc. Dilutional studies were performed for both analytes to extended ranges of 25,000 ng/tear strip and 11,250 ng/swab. Standard curves were generated via weighted (1/x²) linear or quadratic regression of calibrators. Precision, accuracy, stability and matrix effects studies were all performed and deemed acceptable according to the recommendations of the FDA Bioanalytical Method Validation guidelines. Conclusions A rugged LC-MS/MS method for the dual quantification of dapivirine and maraviroc in cervicovaginal fluid using two unique collection devices has been developed and validated. The described method meets the criteria to support large research trials. PMID:25005891
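A weighted (1/x²) linear calibration of the kind used for these standard curves can be written as an ordinary weighted least-squares problem. The sketch below uses hypothetical calibrator concentrations and analyte/internal-standard area ratios, not the validated assay's data:

```python
import numpy as np

def weighted_linear_fit(conc, response, weights):
    """Weighted least-squares line fit: minimizes sum(w * (y - a*x - b)^2)."""
    x = np.asarray(conc, float)
    y = np.asarray(response, float)
    w = np.asarray(weights, float)
    A = np.vstack([x, np.ones_like(x)]).T * np.sqrt(w)[:, None]
    b = y * np.sqrt(w)
    (slope, intercept), *_ = np.linalg.lstsq(A, b, rcond=None)
    return slope, intercept

# hypothetical dapivirine calibrators (ng per tear strip) and peak-area ratios
conc = np.array([0.05, 0.1, 0.5, 1.0, 5.0, 10.0, 25.0])
resp = np.array([0.011, 0.021, 0.10, 0.21, 1.02, 2.05, 5.2])
slope, intercept = weighted_linear_fit(conc, resp, weights=1.0 / conc ** 2)
print(f"back-calculated at ratio 0.5: {(0.5 - intercept) / slope:.2f} ng")
```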
Singh, Jaswant; Parkash, Jyoti; Kaur, Varinder; Singh, Raghubir
2017-10-05
A new method is reported for the quantification of some metallic components of healthcare products utilizing a Schiff base chelator derived from 2-hydroxyacetophenone and ethanolamine. The Schiff base chelator recognizes some metallic species such as iron, copper and zinc (important components of some healthcare products), and cadmium (a common contaminant in healthcare products), giving a colorimetric/fluorimetric response. It coordinates with Fe2+/Fe3+ and Cu2+ ions via an ONO donor set and switches the colour to bright red, green and orange, respectively. Similarly, it switches 'ON' a fluorometric response when it coordinates with Zn2+ and Cd2+ ions. In the present approach, the colorimetric and fluorimetric response of the ONO Schiff base is investigated in detail. The Job plot for the complexation of the ONO switch with various metal ions suggested formation of 1:1 (metal-chelator) complexes with Fe2+, Fe3+, and Cu2+ and 1:2 (metal-chelator) complexes with Zn2+ and Cd2+ ions. The limits of detection and quantification are 6.73, 18.0, 25.0, 0.65 and 1.10 μM and 27.0, 72.0, 100.0, 2.60 and 4.40 μM for Fe2+, Fe3+, Cu2+, Zn2+ and Cd2+ ions, respectively. Under the optimized conditions, the chelator was used for the quantification of important metals present in healthcare products via direct dissolution and furnace treatment during sample preparation. The results were found to be precise and accurate for both sample preparation techniques using the developed method. Copyright © 2017 Elsevier B.V. All rights reserved.
Babar, Muhammad Imran; Ghazali, Masitah; Jawawi, Dayang N. A.; Zaheer, Kashif Bin
2015-01-01
Value-based requirements engineering plays a vital role in the development of value-based software (VBS). Stakeholders are the key players in the requirements engineering process, and the selection of critical stakeholders for the VBS systems is highly desirable. Based on the stakeholder requirements, the innovative or value-based idea is realized. The quality of the VBS system is associated with the concrete set of valuable requirements, and the valuable requirements can only be obtained if all the relevant valuable stakeholders participate in the requirements elicitation phase. The existing value-based approaches focus on the design of the VBS systems. However, the focus on the valuable stakeholders and requirements is inadequate. The current stakeholder identification and quantification (SIQ) approaches are neither state-of-the-art nor systematic for the VBS systems. The existing approaches are time-consuming, complex and inconsistent which makes the initiation process difficult. Moreover, the main motivation of this research is that the existing SIQ approaches do not provide the low level implementation details for SIQ initiation and stakeholder metrics for quantification. Hence, keeping in view the existing SIQ problems, this research contributes in the form of a new SIQ framework called ‘StakeMeter’. The StakeMeter framework is verified and validated through case studies. The proposed framework provides low-level implementation guidelines, attributes, metrics, quantification criteria and application procedure as compared to the other methods. The proposed framework solves the issues of stakeholder quantification or prioritization, higher time consumption, complexity, and process initiation. The framework helps in the selection of highly critical stakeholders for the VBS systems with less judgmental error. PMID:25799490
Chen, Qi; Zhang, Jingshun; Ke, Xing; Lai, Shiyun; Li, Duo; Yang, Jinchuan; Mo, Weimin; Ren, Yiping
2016-09-01
In recent years, there has been an increasing need to measure the concentration of individual proteins in human milk, instead of total human milk proteins. Due to the lack of human milk protein standards, only a few quantification methods have been established. The objective of the present work was to develop a simple and rapid quantification method for simultaneous determination of α-lactalbumin and β-casein in human milk using signature peptides according to a modified quantitative proteomics strategy. The internal standards containing the signature peptide sequences were synthesized with isotope-labeled amino acids. The purity of the synthesized peptides used as standards was determined by an amino acid analysis method and an area normalization method. The contents of α-lactalbumin and β-casein in human milk were measured according to the equimolar relationship between the two proteins and their corresponding signature peptides. The method validation results showed satisfactory linearity (R(2)>0.99) and recoveries (97.2-102.5% for α-lactalbumin and 99.5-100.3% for β-casein). The limit of quantification for α-lactalbumin and β-casein was 8.0 mg/100 g and 1.2 mg/100 g, respectively. CVs for α-lactalbumin and β-casein in human milk were 5.2% and 3.0%. The contents of α-lactalbumin and β-casein in 147 human milk samples were successfully determined by the established method and their contents were 205.5-578.2 mg/100 g and 116.4-467.4 mg/100 g at different lactation stages. The developed method allows simultaneous determination of α-lactalbumin and β-casein in human milk. The quantitative strategy based on signature peptides should be applicable to other endogenous proteins in breast milk and other body fluids. Copyright © 2016 Elsevier B.V. All rights reserved.
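Because one mole of a tryptic signature peptide is released per mole of parent protein, converting the measured peptide amount into a protein content is a simple molar-to-mass conversion. A sketch with hypothetical numbers (the α-lactalbumin molar mass below is approximate and the peptide amount is invented):

```python
def protein_mg_per_100g(peptide_nmol_per_g, protein_molar_mass_g_per_mol):
    """Convert a measured signature-peptide amount (nmol per g of milk) into
    parent-protein content (mg per 100 g), assuming 1:1 molar release of the
    signature peptide during tryptic digestion."""
    nmol_per_100g = peptide_nmol_per_g * 100.0
    return nmol_per_100g * 1e-9 * protein_molar_mass_g_per_mol * 1e3

# hypothetical: 150 nmol/g of an alpha-lactalbumin signature peptide,
# alpha-lactalbumin molar mass ~14,000 g/mol (approximate)
print(f"{protein_mg_per_100g(150.0, 14000):.0f} mg/100 g")
```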
NASA Astrophysics Data System (ADS)
Hayashi, Tatsuro; Zhou, Xiangrong; Chen, Huayue; Hara, Takeshi; Miyamoto, Kei; Kobayashi, Tatsunori; Yokoyama, Ryujiro; Kanematsu, Masayuki; Hoshi, Hiroaki; Fujita, Hiroshi
2010-03-01
X-ray CT images have been widely used in clinical routine in recent years. CT images scanned by a modern CT scanner can show the details of various organs and tissues. This means various organs and tissues can be simultaneously interpreted on CT images. However, CT image interpretation requires a lot of time and energy. Therefore, support for interpreting CT images based on image-processing techniques is expected. The interpretation of the spinal curvature is important for clinicians because spinal curvature is associated with various spinal disorders. We propose a quantification scheme of the spinal curvature based on the center line of the spinal canal on CT images. The proposed scheme consists of four steps: (1) Automated extraction of the skeletal region based on CT number thresholding. (2) Automated extraction of the center line of the spinal canal. (3) Generation of the median plane image of the spine, which is reformatted based on the spinal canal. (4) Quantification of the spinal curvature. The proposed scheme was applied to 10 cases, and compared with the Cobb angle that is commonly used by clinicians. We found a high correlation (95% confidence interval for lumbar lordosis: 0.81-0.99) between values obtained by the proposed (vector) method and the Cobb angle. Also, the proposed method provides reproducible results (inter- and intra-observer variability: within 2°). These experimental results suggest that the proposed method is effective for quantifying the spinal curvature on CT images.
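Step (4), quantification of curvature from the spinal-canal center line, can be illustrated as the angle between direction (tangent) vectors of the center line at two vertebral levels; the exact vector definition used by the authors may differ. A minimal sketch with hypothetical tangent vectors:

```python
import numpy as np

def angle_between(v1, v2):
    """Angle in degrees between two direction vectors."""
    v1 = np.asarray(v1, float)
    v2 = np.asarray(v2, float)
    cos = np.dot(v1, v2) / (np.linalg.norm(v1) * np.linalg.norm(v2))
    return np.degrees(np.arccos(np.clip(cos, -1.0, 1.0)))

# hypothetical tangent vectors of the spinal-canal center line at two
# vertebral levels (x, y coordinates in the reformatted median plane)
t_upper = [0.15, 1.0]
t_lower = [-0.55, 1.0]
print(f"curvature angle: {angle_between(t_upper, t_lower):.1f} deg")
```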
NASA Astrophysics Data System (ADS)
Arsene, Cristian G.; Schulze, Dirk; Kratzsch, Jürgen; Henrion, André
2012-12-01
Amphiphilic peptide conjugation affords a significant increase in sensitivity with protein quantification by electrospray-ionization mass spectrometry. This has been demonstrated here for human growth hormone in serum using N-(3-iodopropyl)-N,N,N-dimethyloctylammonium iodide (IPDOA-iodide) as derivatizing reagent. The signal enhancement achieved in comparison to the method without derivatization enables extension of the applicable concentration range down to the very low concentrations as encountered with clinical glucose suppression tests for patients with acromegaly. The method has been validated using a set of serum samples spiked with known amounts of recombinant 22 kDa growth hormone in the range of 0.48 to 7.65 µg/L. The coefficient of variation (CV) calculated, based on the deviation of results from the expected concentrations, was 3.5% and the limit of quantification (LoQ) was determined as 0.4 µg/L. The potential of the method as a tool in clinical practice has been demonstrated with patient samples of about 1 µg/L.
Harder, Timm C.; Hufnagel, Markus; Zahn, Katrin; Beutel, Karin; Schmitt, Heinz-Josef; Ullmann, Uwe; Rautenberg, Peter
2001-01-01
Detection of parvovirus B19 DNA offers diagnostic advantages over serology, particularly in persistent infections of immunocompromised patients. A rapid, novel method of B19 DNA detection and quantification is introduced. This method, a quantitative PCR assay, is based on real-time glass capillary thermocycling (LightCycler [LC]) and fluorescence resonance energy transfer (FRET). The PCR assay allowed quantification over a dynamic range of over 7 logs and could quantify as little as 250 B19 genome equivalents (geq) per ml as calculated for plasmid DNA (i.e., theoretically ≥5 geq per assay). Interrater agreement analysis demonstrated equivalence of LC-FRET PCR and conventional nested PCR in the diagnosis of an active B19 infection (kappa coefficient = 0.83). The benefit of the new method was demonstrated in an immunocompromised child with a relapsing infection, who required an attenuation of the immunosuppressive therapy in addition to repeated doses of immunoglobulin to eliminate the virus. PMID:11724854
Cankar, Katarina; Štebih, Dejan; Dreo, Tanja; Žel, Jana; Gruden, Kristina
2006-01-01
Background Real-time PCR is the technique of choice for nucleic acid quantification. In the field of detection of genetically modified organisms (GMOs) quantification of biotech products may be required to fulfil legislative requirements. However, successful quantification depends crucially on the quality of the sample DNA analyzed. Methods for GMO detection are generally validated on certified reference materials that are in the form of powdered grain material, while detection in routine laboratories must be performed on a wide variety of sample matrixes. Due to food processing, the DNA in sample matrixes can be present in low amounts and also degraded. In addition, molecules of plant origin or from other sources that affect PCR amplification of samples will influence the reliability of the quantification. Further, the wide variety of sample matrixes presents a challenge for detection laboratories. The extraction method must ensure high yield and quality of the DNA obtained and must be carefully selected, since even components of DNA extraction solutions can influence PCR reactions. GMO quantification is based on a standard curve, therefore similarity of PCR efficiency for the sample and standard reference material is a prerequisite for exact quantification. Little information on the performance of real-time PCR on samples of different matrixes is available. Results Five commonly used DNA extraction techniques were compared and their suitability for quantitative analysis was assessed. The effect of sample matrix on nucleic acid quantification was assessed by comparing 4 maize and 4 soybean matrixes. In addition 205 maize and soybean samples from routine analysis were analyzed for PCR efficiency to assess variability of PCR performance within each sample matrix. Together with the amount of DNA needed for reliable quantification, PCR efficiency is the crucial parameter determining the reliability of quantitative results, therefore it was chosen as the primary criterion by which to evaluate the quality and performance on different matrixes and extraction techniques. The effect of PCR efficiency on the resulting GMO content is demonstrated. Conclusion The crucial influence of extraction technique and sample matrix properties on the results of GMO quantification is demonstrated. Appropriate extraction techniques for each matrix need to be determined to achieve accurate DNA quantification. Nevertheless, since it is shown that in the area of food and feed testing it is impossible to define a matrix with certain specificities, strict quality controls need to be introduced to monitor PCR. The results of our study are also applicable to other fields of quantitative testing by real-time PCR. PMID:16907967
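Why PCR efficiency is the crucial parameter can be seen from the exponential amplification model: if a sample amplifies at a different efficiency than the standards used to build the curve, the error compounds over every cycle. The sketch below illustrates that model with invented numbers; it is not a calculation from the paper.

```python
def fold_bias(e_standard, e_sample, cq):
    """Factor by which the true target amount exceeds the estimate when a
    standard curve generated at efficiency e_standard is applied to a sample
    amplifying at e_sample, both read out at the same quantification cycle cq
    (values > 1 mean the inhibited sample is underestimated)."""
    return ((1 + e_standard) / (1 + e_sample)) ** cq

# hypothetical: standards at 95% efficiency, inhibited sample at 85%, Cq = 28
print(f"{fold_bias(0.95, 0.85, 28):.1f}-fold underestimation of the sample")
```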
Chappell, Michael A; Woolrich, Mark W; Petersen, Esben T; Golay, Xavier; Payne, Stephen J
2013-05-01
Amongst the various implementations of arterial spin labeling MRI methods for quantifying cerebral perfusion, the QUASAR method is unique. By using a combination of labeling with and without flow suppression gradients, the QUASAR method offers the separation of macrovascular and tissue signals. This permits local arterial input functions to be defined and "model-free" analysis, using numerical deconvolution, to be used. However, it remains unclear whether arterial spin labeling data are best treated using model-free or model-based analysis. This work provides a critical comparison of these two approaches for QUASAR arterial spin labeling in the healthy brain. An existing two-component (arterial and tissue) model was extended to the mixed flow suppression scheme of QUASAR to provide an optimal model-based analysis. The model-based analysis was extended to incorporate dispersion of the labeled bolus, generally regarded as the major source of discrepancy between the two analysis approaches. Model-free and model-based analyses were compared for perfusion quantification including absolute measurements, uncertainty estimation, and spatial variation in cerebral blood flow estimates. Major sources of discrepancies between model-free and model-based analysis were attributed to the effects of dispersion and the degree to which the two methods can separate macrovascular and tissue signal. Copyright © 2012 Wiley Periodicals, Inc.
Quantification of Shape, Angularity, and Surface texture of Base Course Materials
DOT National Transportation Integrated Search
1998-01-01
A state-of-the-art review was conducted to determine existing test methods for characterizing the shape, angularity, and surface texture of coarse aggregates. The review found direct methods used by geologists to determine these characteristics. Thes...
Multiplex cDNA quantification method that facilitates the standardization of gene expression data
Gotoh, Osamu; Murakami, Yasufumi; Suyama, Akira
2011-01-01
Microarray-based gene expression measurement is one of the major methods for transcriptome analysis. However, current microarray data are substantially affected by microarray platforms and RNA references because the microarray method can provide merely the relative amounts of gene expression levels. Therefore, valid comparisons of the microarray data require standardized platforms, internal and/or external controls and complicated normalizations. These requirements impose limitations on the extensive comparison of gene expression data. Here, we report an effective approach to removing the unfavorable limitations by measuring the absolute amounts of gene expression levels on common DNA microarrays. We have developed a multiplex cDNA quantification method called GEP-DEAN (Gene expression profiling by DCN-encoding-based analysis). The method was validated by using chemically synthesized DNA strands of known quantities and cDNA samples prepared from mouse liver, demonstrating that the absolute amounts of cDNA strands were successfully measured with a sensitivity of 18 zmol in a highly multiplexed manner in 7 h. PMID:21415008
2017-01-01
Unique Molecular Identifiers (UMIs) are random oligonucleotide barcodes that are increasingly used in high-throughput sequencing experiments. Through a UMI, identical copies arising from distinct molecules can be distinguished from those arising through PCR amplification of the same molecule. However, bioinformatic methods to leverage the information from UMIs have yet to be formalized. In particular, sequencing errors in the UMI sequence are often ignored or else resolved in an ad hoc manner. We show that errors in the UMI sequence are common and introduce network-based methods to account for these errors when identifying PCR duplicates. Using these methods, we demonstrate improved quantification accuracy both under simulated conditions and real iCLIP and single-cell RNA-seq data sets. Reproducibility between iCLIP replicates and single-cell RNA-seq clustering are both improved using our proposed network-based method, demonstrating the value of properly accounting for errors in UMIs. These methods are implemented in the open source UMI-tools software package. PMID:28100584
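The directional network idea can be sketched as follows: UMIs within one mismatch of a more abundant UMI are folded into it when the count relationship suggests a PCR or sequencing error, and the number of remaining clusters is taken as the number of distinct molecules. The code below is a simplified illustration with hypothetical counts, not the UMI-tools implementation itself.

```python
def hamming(a, b):
    """Number of mismatched positions between two equal-length UMIs."""
    return sum(x != y for x, y in zip(a, b))

def directional_clusters(umi_counts):
    """Group UMIs that likely arose from the same molecule via PCR/sequencing
    errors: join A -> B when they differ by one base and
    count(A) >= 2 * count(B) - 1 (the 'directional' criterion)."""
    umis = sorted(umi_counts, key=lambda u: -umi_counts[u])
    parent = {u: u for u in umis}

    def find(u):
        while parent[u] != u:
            parent[u] = parent[parent[u]]
            u = parent[u]
        return u

    for i, a in enumerate(umis):           # a has the higher (or equal) count
        for b in umis[i + 1:]:
            if hamming(a, b) == 1 and umi_counts[a] >= 2 * umi_counts[b] - 1:
                parent[find(b)] = find(a)
    return len({find(u) for u in umis})

# hypothetical UMI counts observed at one genomic position
counts = {"ATTG": 120, "ATTA": 2, "CTTG": 1, "GGCC": 45}
print(directional_clusters(counts))   # -> 2 distinct molecules
```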
Washko, George R; Criner, Gerald J; Mohsenifar, Zab; Sciurba, Frank C; Sharafkhaneh, Amir; Make, Barry J; Hoffman, Eric A; Reilly, John J
2008-06-01
Computed tomographic based indices of emphysematous lung destruction may highlight differences in disease pathogenesis and further enable the classification of subjects with Chronic Obstructive Pulmonary Disease. While there are multiple techniques that can be utilized for such radiographic analysis, there is very little published information comparing the performance of these methods in a clinical case series. Our objective was to examine several quantitative and semi-quantitative methods for the assessment of the burden of emphysema apparent on computed tomographic scans and compare their ability to predict lung mechanics and function. Automated densitometric analysis was performed on 1094 computed tomographic scans collected upon enrollment into the National Emphysema Treatment Trial. Trained radiologists performed an additional visual grading of emphysema on high resolution CT scans. Full pulmonary function test results were available for correlation, with a subset of subjects having additional measurements of lung static recoil. There was a wide range of emphysematous lung destruction apparent on the CT scans and univariate correlations to measures of lung function were of modest strength. No single method of CT scan analysis clearly outperformed the rest of the group. Quantification of the burden of emphysematous lung destruction apparent on CT scan is a weak predictor of lung function and mechanics in severe COPD with no uniformly superior method found to perform this analysis. The CT based quantification of emphysema may augment pulmonary function testing in the characterization of COPD by providing complementary phenotypic information.
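One of the standard densitometric indices compared in such studies is the percentage of lung voxels below a fixed attenuation threshold (e.g. -950 HU); whether this matches the exact thresholds used in the trial is not stated here. A minimal sketch on hypothetical voxel values:

```python
import numpy as np

def low_attenuation_percentage(hu_values, threshold=-950):
    """Percentage of lung voxels below a Hounsfield-unit threshold,
    a common densitometric index of emphysematous destruction."""
    hu = np.asarray(hu_values)
    return 100.0 * np.mean(hu < threshold)

# hypothetical sample of lung-voxel HU values
rng = np.random.default_rng(0)
lung_hu = rng.normal(loc=-870, scale=60, size=100_000)
print(f"%LAA-950 = {low_attenuation_percentage(lung_hu):.1f}%")
```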
Yuan, Teng-Fei; Wang, Shao-Ting; Li, Yan
2017-09-15
Menadione, a crucial component of the vitamin K family, possesses significant nutritional and clinical value. However, favourable quantification strategies for it have been lacking to date. For improvement, a novel cysteamine derivatization based UPLC-MS/MS method is presented in this work. The derivatization reaction proved to be non-toxic, easy to handle and highly efficient, and enabled MS detection of menadione in positive mode. Benefitting from the excellent sensitivity of the derivatization product as well as the introduction of the stable isotope dilution technique, quantification could be achieved in the range of 0.05-50.0 ng/mL for plasma and urine matrixes with satisfactory accuracy and precision. After analysis of samples from healthy volunteers following oral administration of menadione sodium bisulfite tablets, urinary free menadione was quantified for the very first time. We believe the progress in this work could largely promote the exploration of the metabolic mechanism of vitamin K in vivo. Copyright © 2017 Elsevier B.V. All rights reserved.
Higashi, Tatsuya; Ogawa, Shoujiro
2016-09-01
Sensitive and specific methods for the detection, characterization and quantification of endogenous steroids in body fluids or tissues are necessary for the diagnosis, pathological analysis and treatment of many diseases. Recently, liquid chromatography/electrospray ionization-tandem mass spectrometry (LC/ESI-MS/MS) has been widely used for these purposes due to its specificity and versatility. However, the ESI efficiency and fragmentation behavior of some steroids are poor, which lead to a low sensitivity. Chemical derivatization is one of the most effective methods to improve the detection characteristics of steroids in ESI-MS/MS. Based on this background, this article reviews the recent advances in chemical derivatization for the trace quantification of steroids in biological samples by LC/ESI-MS/MS. The derivatization in ESI-MS/MS is based on tagging a proton-affinitive or permanently charged moiety on the target steroid. Introduction/formation of a fragmentable moiety suitable for the selected reaction monitoring by the derivatization also enhances the sensitivity. The stable isotope-coded derivatization procedures for the steroid analysis are also described. Copyright © 2015 Elsevier Ltd. All rights reserved.
Talarico, Sarah; Safaeian, Mahboobeh; Gonzalez, Paula; Hildesheim, Allan; Herrero, Rolando; Porras, Carolina; Cortes, Bernal; Larson, Ann; Fang, Ferric C; Salama, Nina R
2016-08-01
Epidemiologic studies of the carcinogenic stomach bacterium Helicobacter pylori have been limited by the lack of noninvasive detection and genotyping methods. We developed a new stool-based method for detection, quantification, and partial genotyping of H. pylori using droplet digital PCR (ddPCR), which allows for increased sensitivity and absolute quantification by PCR partitioning. Stool-based ddPCR assays for H. pylori 16S gene detection and cagA virulence gene typing were tested using a collection of 50 matched stool and serum samples from Costa Rican volunteers and 29 H. pylori stool antigen-tested stool samples collected at a US hospital. The stool-based H. pylori 16S ddPCR assay had a sensitivity of 84% and 100% and a specificity of 100% and 71% compared to serology and stool antigen tests, respectively. The stool-based cagA genotyping assay detected cagA in 22 (88%) of 25 stools from CagA antibody-positive individuals and four (16%) of 25 stools from CagA antibody-negative individuals from Costa Rica. All 26 of these samples had a Western-type cagA allele. Presence of serum CagA antibodies was correlated with a significantly higher load of H. pylori in the stool. The stool-based ddPCR assays are a sensitive, noninvasive method for detection, quantification, and partial genotyping of H. pylori. The quantitative nature of ddPCR-based H. pylori detection revealed significant variation in bacterial load among individuals that correlates with presence of the cagA virulence gene. These stool-based ddPCR assays will facilitate future population-based epidemiologic studies of this important human pathogen. © 2015 John Wiley & Sons Ltd.
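Absolute quantification by ddPCR follows from Poisson partitioning statistics: the fraction of negative droplets gives the mean number of target copies per droplet. A sketch with hypothetical droplet counts (the ~0.85 nL droplet volume is a typical nominal value, not taken from the paper):

```python
import math

def ddpcr_copies_per_ul(positive, total, droplet_volume_nl=0.85):
    """Estimate target concentration from droplet counts using Poisson
    statistics: lambda = -ln(fraction of negative droplets)."""
    negative_fraction = (total - positive) / total
    lam = -math.log(negative_fraction)          # mean copies per droplet
    return lam / (droplet_volume_nl * 1e-3)     # copies per microliter

# hypothetical H. pylori 16S assay: 3200 positive of 15000 accepted droplets
print(f"{ddpcr_copies_per_ul(3200, 15000):.0f} copies/uL")
```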
NASA Astrophysics Data System (ADS)
Bau, Haim; Liu, Changchun; Killawala, Chitvan; Sadik, Mohamed; Mauk, Michael
2014-11-01
Real-time amplification and quantification of specific nucleic acid sequences plays a major role in many medical and biotechnological applications. In the case of infectious diseases, quantification of the pathogen-load in patient specimens is critical to assessing disease progression, effectiveness of drug therapy, and emergence of drug-resistance. Typically, nucleic acid quantification requires sophisticated and expensive instruments, such as real-time PCR machines, which are not appropriate for on-site use and for low resource settings. We describe a simple, low-cost, reaction-diffusion based method for end-point quantification of target nucleic acids undergoing enzymatic amplification. The number of target molecules is inferred from the position of the reaction-diffusion front, analogous to reading temperature in a mercury thermometer. We model the process with the Fisher Kolmogoroff Petrovskii Piscounoff (FKPP) Equation and compare theoretical predictions with experimental observations. The proposed method is suitable for nucleic acid quantification at the point of care, compatible with multiplexing and high-throughput processing, and can function instrument-free. C.L. was supported by NIH/NIAID K25AI099160; M.S. was supported by the Pennsylvania Ben Franklin Technology Development Authority; C.K. and H.B. were funded, in part, by NIH/NIAID 1R41AI104418-01A1.
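For reference, the standard one-dimensional FKPP equation and its classical minimal front speed are given below; the authors' specific model, parameters, and boundary conditions may differ.

\[
\frac{\partial u}{\partial t} \;=\; D\,\frac{\partial^2 u}{\partial x^2} \;+\; r\,u\,(1-u),
\qquad c_{\min} \;=\; 2\sqrt{D\,r},
\]

where u is the normalized concentration of amplified product, D an effective diffusivity, and r the amplification rate. Since a larger number of starting target molecules shortens the time before the front is launched, the end-point front position carries information about the initial target quantity.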
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hoofnagle, Andrew N.; Whiteaker, Jeffrey R.; Carr, Steven A.
2015-12-30
The Clinical Proteomic Tumor Analysis Consortium (1) (CPTAC) of the National Cancer Institute (NCI) is a comprehensive and coordinated effort to accelerate the understanding of the molecular basis of cancer through the application of robust technologies and workflows for the quantitative measurements of proteins. The Assay Development Working Group of the CPTAC Program aims to foster broad uptake of targeted mass spectrometry-based assays employing isotopically labeled peptides for confident assignment and quantification, including multiple reaction monitoring (MRM; also referred to as Selected Reaction Monitoring), parallel reaction monitoring (PRM), and other targeted methods.
Srikanth, Jandhyam; Agalyadevi, Rathinasamy; Babu, Ponnusamy
2017-02-01
The site-specific quantitation of N- and O-glycosylation is vital to understanding the function(s) of different glycans expressed at a given site of a protein under physiological and disease conditions. The most commonly used precursor ion intensity-based quantification method is less accurate, and other labeled methods are expensive and require enrichment of glycopeptides. Here, we used glycopeptide product (y and Y0) ions and 18O-labeling of the C-terminal carboxyl group as a strategy to obtain quantitative information about fold-change and relative abundance of most of the glycoforms attached to the glycopeptides. As a proof of concept, the accuracy and robustness of this targeted, relative quantification LC-MS method was demonstrated using Rituximab. Furthermore, the N-glycopeptide quantification results were compared with a biosimilar of Rituximab and validated with quantitative data obtained from a 2-AB-UHPLC-FL method. We further demonstrated the intensity fold-change and relative abundance of 46 unique N- and O-glycopeptides and aglycopeptides from innovator and biosimilar samples of Etanercept using both the normal-MS and product ion based quantitation. The results showed a very similar site-specific expression of N- and O-glycopeptides between the samples but with subtle differences. Interestingly, we have also been able to quantify macro-heterogeneity of all N- and O-glycopeptides of Etanercept. In addition to applications in biotherapeutics, the developed method can also be used for site-specific quantitation of N- and O-glycopeptides and aglycopeptides of glycoproteins with known glycosylation patterns.
Hatanaka, Rafael Rodrigues; Sequinel, Rodrigo; Gualtieri, Carlos Eduardo; Tercini, Antônio Carlos Bergamaschi; Flumignan, Danilo Luiz; de Oliveira, José Eduardo
2013-05-15
Lubricating oils are crucial in the operation of automotive engines because they both reduce friction between moving parts and protect against corrosion. However, the performance of lubricant oil may be affected by contaminants, such as gasoline, diesel, ethanol, water and ethylene glycol. Although there are many standard methods and studies related to the quantification of contaminants in lubricant oil, such as gasoline and diesel oil, to the best of our knowledge, no methods have been reported for the quantification of ethanol in used Otto cycle engine lubrication oils. Therefore, this work aimed at the development and validation of a routine method based on partial least-squares multivariate analysis combined with attenuated total reflectance in the mid-infrared region to quantify ethanol content in used lubrication oil. The method was validated based on its figures of merit (using the net analyte signal) as follows: limit of detection (0.049%), limit of quantification (0.16%), accuracy (root mean square error of prediction=0.089% w/w), repeatability (0.05% w/w), fit (R(2)=0.9997), mean selectivity (0.047), sensitivity (0.011), inverse analytical sensitivity (0.016% w/w(-1)) and signal-to-noise ratio (max: 812.4 and min: 200.9). The results show that the proposed method can be routinely implemented for the quality control of lubricant oils. Copyright © 2013 Elsevier B.V. All rights reserved.
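Partial least-squares calibration of a constituent from mid-infrared spectra, as used above, can be sketched in a few lines with scikit-learn; the spectra, band position, and ethanol levels below are synthetic placeholders, not the published calibration set.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

# hypothetical training set: 20 ATR mid-IR spectra (200 wavenumber points)
# of lubricant oil spiked with known ethanol contents (% w/w)
rng = np.random.default_rng(1)
ethanol = rng.uniform(0.0, 5.0, size=20)
noise = rng.normal(0.0, 0.01, size=(20, 200))
band = np.exp(-0.5 * ((np.arange(200) - 80) / 5.0) ** 2)   # pseudo C-O band
spectra = noise + ethanol[:, None] * 0.02 * band

model = PLSRegression(n_components=3)
model.fit(spectra, ethanol)
print(model.predict(spectra[:3]).ravel().round(2), ethanol[:3].round(2))
```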
Kassler, Alexander; Pittenauer, Ernst; Doerr, Nicole; Allmaier, Guenter
2014-01-15
For the qualification and quantification of antioxidants (aromatic amines and sterically hindered phenols), most of them applied as lubricant additives, two ultrahigh-performance liquid chromatography (UHPLC) electrospray ionization mass spectrometric methods applying the positive and negative ion mode have been developed for lubricant design and engineering thus allowing e.g. the study of the degradation of lubricants. Based on the different chemical properties of the two groups of antioxidants, two methods offering a fast separation (10 min) without prior derivatization were developed. In order to reach these requirements, UHPLC was coupled with an LTQ Orbitrap hybrid tandem mass spectrometer with positive and negative ion electrospray ionization for simultaneous detection of spectra from UHPLC-high-resolution (HR)-MS (full scan mode) and UHPLC-low-resolution linear ion trap MS(2) (LITMS(2)), which we term UHPLC/HRMS-LITMS(2). All 20 analytes investigated could be qualified by an UHPLC/HRMS-LITMS(2) approach consisting of simultaneous UHPLC/HRMS (elemental composition) and UHPLC/LITMS(2) (diagnostic product ions) according to EC guidelines. Quantification was based on an UHPLC/LITMS(2) approach due to increased sensitivity and selectivity compared to UHPLC/HRMS. Absolute quantification was only feasible for seven analytes with well-specified purity of references whereas relative quantification was obtainable for another nine antioxidants. All of them showed good standard deviation and repeatability. The combined methods allow qualitative and quantitative determination of a wide variety of different antioxidants including aminic/phenolic compounds applied in lubricant engineering. These data show that the developed methods will be versatile tools for further research on identification and characterization of the thermo-oxidative degradation products of antioxidants in lubricants. Copyright © 2013 John Wiley & Sons, Ltd.
Hynstova, Veronika; Sterbova, Dagmar; Klejdus, Borivoj; Hedbavny, Josef; Huska, Dalibor; Adam, Vojtech
2018-01-30
In this study, 14 commercial products (dietary supplements) containing the alga Chlorella vulgaris and the cyanobacterium Spirulina platensis, originating from China and Japan, were analysed. A UV-vis spectrophotometric method was applied for rapid determination of chlorophylls, carotenoids and pheophytins, the latter being degradation products of chlorophylls. High Performance Thin-Layer Chromatography (HPTLC) was used for effective separation of these compounds, and Atomic Absorption Spectrometry for determination of heavy metals as an indicator of environmental pollution. Based on the results obtained from UV-vis spectrophotometric determination of photosynthetic pigments (chlorophylls and carotenoids), it was confirmed that Chlorella vulgaris contains more of all these pigments than the cyanobacterium Spirulina platensis. The fastest-mobility compound identified in Chlorella vulgaris and Spirulina platensis using the HPTLC method was β-carotene. Spectral analysis and the standard calibration curve method were used for identification and quantification of the separated substances on the Thin-Layer Chromatographic plate. Quantification of copper (Cu²⁺, at 324.7 nm) and zinc (Zn²⁺, at 213.9 nm) was performed using Flame Atomic Absorption Spectrometry with air-acetylene flame atomization. Quantification of cadmium (Cd²⁺, at 228.8 nm), nickel (Ni²⁺, at 232.0 nm) and lead (Pb²⁺, at 283.3 nm) was performed by Electrothermal Graphite Furnace Atomic Absorption Spectrometry, and quantification of mercury (Hg²⁺, at 254 nm) by Cold Vapour Atomic Absorption Spectrometry. Copyright © 2017 Elsevier B.V. All rights reserved.
Fabregat-Cabello, Neus; Sancho, Juan V; Vidal, Andreu; González, Florenci V; Roig-Navarro, Antoni Francesc
2014-02-07
We present here a new measurement method for the rapid extraction and accurate quantification of technical nonylphenol (NP) and 4-t-octylphenol (OP) in complex matrix water samples by UHPLC-ESI-MS/MS. The extraction of both compounds is achieved in 30 min by means of hollow fiber liquid phase microextraction (HF-LPME) using 1-octanol as acceptor phase, which provides an enrichment (preconcentration) factor of 800. In addition, we have developed a quantification method based on isotope dilution mass spectrometry (IDMS) and singly ¹³C₁-labeled compounds. To this end, the minimally labeled ¹³C₁-4-(3,6-dimethyl-3-heptyl)-phenol and ¹³C₁-t-octylphenol isomers were synthesized; these coelute with the natural compounds and allow compensation of the matrix effect. The quantification was carried out using isotope pattern deconvolution (IPD), which makes it possible to obtain the concentration of both compounds without building any calibration graph, reducing the total analysis time. The combination of these extraction and determination techniques has allowed, for the first time, validation of an HF-LPME methodology at the levels required by legislation, achieving limits of quantification of 0.1 ng mL⁻¹ and recoveries within 97-109%. Given the low cost and short total analysis time of HF-LPME, this methodology is ready for implementation in routine analytical laboratories. Copyright © 2013 Elsevier B.V. All rights reserved.
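The essence of isotope pattern deconvolution is that the measured isotopologue pattern of the spiked sample is modelled as a linear combination of the natural-abundance pattern and the labelled-spike pattern. A minimal numerical sketch follows; the three-isotopologue patterns and intensities are invented for illustration and are not the actual NP/OP data.

```python
# Minimal sketch of isotope pattern deconvolution (IPD): the measured
# isotopologue abundance pattern of a mixture is expressed as a linear
# combination of the natural-abundance pattern and the 13C1-labelled spike
# pattern, solved by least squares. All numbers are illustrative only.
import numpy as np

# Columns: theoretical relative abundances (M, M+1, M+2) of the natural
# analyte and of the single-13C labelled spike (each column sums to 1).
A = np.array([
    [0.85, 0.05],
    [0.13, 0.83],
    [0.02, 0.12],
])
measured = np.array([0.45, 0.48, 0.07])       # measured pattern of the spiked sample

x, *_ = np.linalg.lstsq(A, measured, rcond=None)
x_nat, x_spike = x
print(f"molar ratio natural/spike = {x_nat / x_spike:.3f}")
# With a known amount of spike added, this ratio gives the analyte
# concentration directly, without an external calibration graph.
```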
Fast dictionary generation and searching for magnetic resonance fingerprinting.
Jun Xie; Mengye Lyu; Jian Zhang; Hui, Edward S; Wu, Ed X; Ze Wang
2017-07-01
A super-fast dictionary generation and searching (DGS) algorithm was developed for MR parameter quantification using magnetic resonance fingerprinting (MRF). MRF is a new technique for simultaneously quantifying multiple MR parameters using one temporally resolved MR scan. However, it has multiplicative computational complexity, resulting in a large burden of dictionary generation, storage, and retrieval that can easily become intractable for state-of-the-art computers. Based on retrospective analysis of the dictionary-matching objective function, a multi-scale, ZOOM-like DGS algorithm, dubbed MRF-ZOOM, was proposed. MRF ZOOM is quasi-parameter-separable, so the multiplicative computational complexity is reduced to an additive one. Evaluations showed that MRF ZOOM was hundreds or thousands of times faster than the original MRF parameter quantification method, even without counting dictionary generation time. Using real data, it yielded nearly the same results as the original method. MRF ZOOM provides a super-fast solution for MR parameter quantification.
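The core idea of a coarse-to-fine ("zoom") dictionary search can be sketched generically: match against a coarse parameter grid, then refine around the best entry so only tens of dictionary entries are ever simulated. The sketch below uses a toy mono-exponential signal model in place of a real MRF sequence simulation and is not the MRF-ZOOM implementation itself.

```python
# Coarse-to-fine ("zoom") dictionary search, sketched with a toy
# inversion-recovery signal model instead of a real MRF sequence simulation.
import numpy as np

t = np.linspace(0.05, 3.0, 50)                 # acquisition time points (s)

def signal(T1):
    return 1.0 - 2.0 * np.exp(-t / T1)          # toy signal model

def best_match(measured, candidates):
    dico = np.array([signal(T1) for T1 in candidates])
    dico /= np.linalg.norm(dico, axis=1, keepdims=True)
    scores = dico @ (measured / np.linalg.norm(measured))
    return candidates[np.argmax(scores)]        # maximum normalised inner product

measured = signal(1.23) + np.random.default_rng(0).normal(0, 0.01, t.size)

# Zoom: start with a coarse T1 grid, then repeatedly refine around the best
# value, instead of evaluating one dense dictionary.
lo, hi = 0.1, 4.0
for _ in range(4):
    grid = np.linspace(lo, hi, 11)
    t1 = best_match(measured, grid)
    step = grid[1] - grid[0]
    lo, hi = max(t1 - step, 0.01), t1 + step
print(f"estimated T1 ~ {t1:.3f} s")
```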
Parsons, Teresa L; Emory, Joshua F; Seserko, Lauren A; Aung, Wutyi S; Marzinke, Mark A
2014-09-01
Topical microbicidal agents are being actively pursued as a modality to prevent HIV viral transmission during sexual intercourse. Quantification of antiretroviral agents in specimen sources where antiviral activity is elicited is critical, and drug measurements in cervicovaginal fluid can provide key information on local drug concentrations. Two antiretroviral drugs, dapivirine and maraviroc, have gained interest as vaginal microbicidal agents, and rugged methods are required for their quantification in cervicovaginal secretions. Cervicovaginal fluid spiked with dapivirine and maraviroc was applied to ophthalmic tear strips or polyester-based swabs to mimic collection procedures used in clinical studies. Following sample extraction and the addition of isotopically labeled internal standards, samples were subjected to liquid chromatographic-tandem mass spectrometric (LC-MS/MS) analysis using a Waters BEH C8, 50 mm × 2.1 mm, 1.7 μm particle size column, on an API 4000 mass analyzer operated in selective reaction monitoring mode. The method was validated according to FDA Bioanalytical Method Validation guidelines. Due to the disparate saturation capacity of the tested collection devices, the analytical measuring ranges for dapivirine and maraviroc in cervicovaginal fluid on the ophthalmic tear strip were 0.05-25 ng/tear strip and 0.025-25 ng/tear strip, respectively. For the polyester-based swab, the analytical measuring ranges were 0.25-125 ng/swab for dapivirine and 0.125-125 ng/swab for maraviroc. Dilutional studies were performed for both analytes to extended ranges of 25,000 ng/tear strip and 11,250 ng/swab. Standard curves were generated via weighted (1/x²) linear or quadratic regression of calibrators. Precision, accuracy, stability and matrix effects studies were all performed and deemed acceptable according to the recommendations of the FDA Bioanalytical Method Validation guidelines. A rugged LC-MS/MS method for the dual quantification of dapivirine and maraviroc in cervicovaginal fluid using two unique collection devices has been developed and validated. The described method meets the criteria to support large research trials. Copyright © 2014 Elsevier B.V. All rights reserved.
Weighardt, Florian; Barbati, Cristina; Paoletti, Claudia; Querci, Maddalena; Kay, Simon; De Beuckeleer, Marc; Van den Eede, Guy
2004-01-01
In Europe, a growing interest in reliable techniques for the quantification of genetically modified component(s) of food matrixes is arising from the need to comply with the European legislative framework on novel food products. Real-time polymerase chain reaction (PCR) is currently the most powerful technique for the quantification of specific nucleic acid sequences. Several real-time PCR methodologies based on different molecular principles have been developed for this purpose. The most frequently used approach in the field of genetically modified organism (GMO) quantification in food or feed samples is based on the 5'-3'-exonuclease activity of Taq DNA polymerase on specific degradation probes (TaqMan principle). A novel approach was developed for the establishment of a TaqMan quantification system assessing GMO contents around the 1% threshold stipulated under European Union (EU) legislation for the labeling of food products. The Zea mays T25 elite event was chosen as a model for the development of the novel GMO quantification approach. The most innovative aspect of the system is represented by the use of sequences cloned in plasmids as reference standards. In the field of GMO quantification, plasmids are an easy-to-use, cheap, and reliable alternative to Certified Reference Materials (CRMs), which are only available for a few of the GMOs authorized in Europe, have a relatively high production cost, and require further processing to be suitable for analysis. Strengths and weaknesses of the use of novel plasmid-based standards are addressed in detail. In addition, the quantification system was designed to avoid the use of a reference gene (e.g., a single copy, species-specific gene) as normalizer, i.e., to perform a GMO quantification based on an absolute instead of a relative measurement. In fact, experimental evidence shows that the use of reference genes adds variability to the measurement system because a second independent real-time PCR-based measurement must be performed. Moreover, for some reference genes insufficient information on copy number in and among genomes of different lines is available, making adequate quantification difficult. Once developed, the method was subsequently validated according to IUPAC and ISO 5725 guidelines. Thirteen laboratories from 8 EU countries participated in the trial. Eleven laboratories provided results complying with the predefined study requirements. Repeatability (RSDr) values ranged from 8.7 to 15.9%, with a mean value of 12%. Reproducibility (RSDR) values ranged from 16.3 to 25.5%, with a mean value of 21%. Following Codex Alimentarius Committee guidelines, both the limits of detection and quantitation were determined to be <0.1%.
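The arithmetic behind absolute quantification against plasmid standards is a standard curve of Ct versus log10(copy number): unknowns are read off the fitted line without a reference-gene normaliser. A minimal sketch follows; all Ct values are invented for illustration.

```python
# Sketch of absolute real-time PCR quantification against plasmid standards:
# fit Ct versus log10(copy number) for a dilution series, then read unknown
# samples off the curve. All values below are illustrative.
import numpy as np

std_copies = np.array([1e2, 1e3, 1e4, 1e5, 1e6])       # plasmid copies per reaction
std_ct     = np.array([33.1, 29.8, 26.4, 23.1, 19.7])  # measured Ct values

slope, intercept = np.polyfit(np.log10(std_copies), std_ct, 1)
efficiency = 10 ** (-1.0 / slope) - 1.0                 # PCR efficiency from the slope

def copies_from_ct(ct):
    return 10 ** ((ct - intercept) / slope)

sample_ct = 27.6
print(f"slope = {slope:.2f}, efficiency = {efficiency:.1%}")
print(f"estimated copies in sample = {copies_from_ct(sample_ct):.0f}")
```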
Sjödin, Marcus O D; Wetterhall, Magnus; Kultima, Kim; Artemenko, Konstantin
2013-06-01
The analytical performance of three different strategies for relative protein quantification using shotgun proteomics - iTRAQ (isobaric tag for relative and absolute quantification), dimethyl labeling (DML) and label-free (LF) - has been evaluated. The methods have been explored using samples containing (i) bovine proteins in known ratios and (ii) bovine proteins in known ratios spiked into Escherichia coli. The latter case mimics the actual conditions in a typical biological sample with a few differentially expressed proteins and a bulk of proteins with unchanged ratios. Additionally, the evaluation was performed on both QStar and LTQ-FTICR mass spectrometers. LF LTQ-FTICR was found to have the highest proteome coverage, while the highest accuracy based on the artificially regulated proteins was found for DML LTQ-FTICR (54%). A varying linearity (k: 0.55-1.16, r²: 0.61-0.96) was shown for all methods within selected dynamic ranges. All methods were found to consistently underestimate bovine protein ratios when matrix proteins were added. However, LF LTQ-FTICR was more tolerant toward a compression effect. A single peptide was demonstrated to be sufficient for a reliable quantification using iTRAQ. A ranking system utilizing several parameters important for quantitative proteomics demonstrated that the overall performance of the five different methods was: DML LTQ-FTICR > iTRAQ QStar > LF LTQ-FTICR > DML QStar > LF QStar. Copyright © 2013 Elsevier B.V. All rights reserved.
Nie, Pengcheng; Wu, Di; Sun, Da-Wen; Cao, Fang; Bao, Yidan; He, Yong
2013-01-01
Notoginseng is a classical traditional Chinese medicinal herb of high economic and medical value. Notoginseng powder (NP) can easily be adulterated with Sophora flavescens powder (SFP) or corn flour (CF) because of their similar tastes and appearances and the much lower cost of these adulterants. The objective of this study was to quantify the NP content in adulterated NP using a rapid and non-destructive visible and near-infrared (Vis-NIR) spectroscopy method. Three wavelength ranges, visible spectra, short-wave near-infrared spectra (SNIR) and long-wave near-infrared spectra (LNIR), were used separately to establish models based on two calibration methods, partial least squares regression (PLSR) and least-squares support vector machines (LS-SVM). Competitive adaptive reweighted sampling (CARS) was conducted to identify the most important wavelengths/variables that had the greatest influence on the adulterant quantification throughout the whole wavelength range. The CARS-PLSR models based on LNIR were determined to be the best models for the quantification of NP adulterated with SFP, CF, and their mixtures, with rP values of 0.940, 0.939, and 0.867 for the three models, respectively. The research demonstrated the potential of the Vis-NIR spectroscopy technique for the rapid and non-destructive quantification of NP containing adulterants. PMID:24129019
Ultraviolet, Visible, and Fluorescence Spectroscopy
NASA Astrophysics Data System (ADS)
Penner, Michael H.
Spectroscopy in the ultraviolet-visible (UV-Vis) range is one of the most commonly encountered laboratory techniques in food analysis. Diverse examples, such as the quantification of macrocomponents (total carbohydrate by the phenol-sulfuric acid method), quantification of microcomponents (thiamin by the thiochrome fluorometric procedure), estimates of rancidity (lipid oxidation status by the thiobarbituric acid test), and surveillance testing (enzyme-linked immunoassays), are presented in this text. In each of these cases, the analytical signal on which the assay is based is either the emission or absorption of radiation in the UV-Vis range. This signal may be inherent in the analyte, such as the absorbance of radiation in the visible range by pigments, or a result of a chemical reaction involving the analyte, such as the colorimetric copper-based Lowry method for the analysis of soluble protein.
NASA Astrophysics Data System (ADS)
Krishnan, Karthik; Reddy, Kasireddy V.; Ajani, Bhavya; Yalavarthy, Phaneendra K.
2017-02-01
CT and MR perfusion weighted imaging (PWI) enable quantification of perfusion parameters in stroke studies. These parameters are calculated from the residual impulse response function (IRF) based on a physiological model for tissue perfusion. The standard approach for estimating the IRF is deconvolution using oscillatory-limited singular value decomposition (oSVD) or Frequency Domain Deconvolution (FDD). FDD is widely recognized as the fastest approach currently available for deconvolution of CT Perfusion/MR PWI. In this work, three faster methods are proposed. The first is a direct (model-based) crude approximation of the final perfusion quantities (blood flow, blood volume, mean transit time and delay) using the Welch-Satterthwaite approximation for gamma-fitted concentration time curves (CTC). The second method is a fast, accurate deconvolution method, which we call Analytical Fourier Filtering (AFF). The third is another fast, accurate deconvolution technique using Showalter's method, which we call Analytical Showalter's Spectral Filtering (ASSF). Through systematic evaluation on phantom and clinical data, the proposed methods are shown to be computationally more than twice as fast as FDD. The two deconvolution based methods, AFF and ASSF, are also shown to be quantitatively accurate compared to FDD and oSVD.
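For readers unfamiliar with deconvolution-based perfusion quantification, the sketch below shows the common truncated-SVD variant (a simpler relative of the oSVD and frequency-domain approaches mentioned above): build a lower-triangular convolution matrix from the arterial input function, invert it with small singular values suppressed, and take flow from the peak of the recovered residue function. Curves and the truncation threshold are illustrative only.

```python
# Simplified truncated-SVD deconvolution of a tissue concentration curve with
# an arterial input function (AIF), as used in perfusion analysis.
import numpy as np
from scipy.linalg import toeplitz

dt = 1.0                                       # sampling interval (s)
t = np.arange(0, 60, dt)
aif = (t / 6.0) ** 3 * np.exp(-t / 1.5)        # toy AIF (gamma-variate shape)
residue_true = np.exp(-t / 8.0)                # true residue function, flow = 1
tissue = dt * np.convolve(aif, residue_true)[: t.size]

# Lower-triangular Toeplitz convolution matrix built from the AIF.
A = dt * toeplitz(aif, np.zeros(t.size))

U, s, Vt = np.linalg.svd(A)
s_inv = np.where(s > 0.2 * s.max(), 1.0 / s, 0.0)   # truncate small singular values
residue_est = Vt.T @ (s_inv * (U.T @ tissue))

cbf = residue_est.max()                        # flow estimate = peak of residue function
print(f"estimated flow (relative) = {cbf:.2f}")
```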
Polte, Christian L; Gao, Sinsia A; Johnsson, Åse A; Lagerstrand, Kerstin M; Bech-Hanssen, Odd
2017-06-15
Grading of chronic aortic regurgitation (AR) and mitral regurgitation (MR) by cardiovascular magnetic resonance (CMR) is currently based on thresholds, which are neither modality nor quantification method specific. Accordingly, this study sought to identify CMR-specific and quantification method-specific thresholds for regurgitant volumes (RVols), RVol indexes, and regurgitant fractions (RFs), which denote severe chronic AR or MR with an indication for surgery. The study comprised patients with moderate and severe chronic AR (n = 38) and MR (n = 40). Echocardiography and CMR were performed at baseline and in all operated AR/MR patients (n = 23/25) 10 ± 1 months after surgery. CMR quantification of AR: direct (aortic flow) and indirect method (left ventricular stroke volume [LVSV] - pulmonary stroke volume [PuSV]); MR: 2 indirect methods (LVSV - aortic forward flow [AoFF]; mitral inflow [MiIF] - AoFF). All operated patients had severe regurgitation and benefited from surgery, indicated by a significant postsurgical reduction in end-diastolic volume index and improvement or relief of symptoms. The discriminatory ability between moderate and severe AR was strong for RVol >40 ml, RVol index >20 ml/m², and RF >30% (direct method) and RVol >62 ml, RVol index >31 ml/m², and RF >36% (LVSV-PuSV) with a negative likelihood ratio ≤ 0.2. In MR, the discriminatory ability was very strong for RVol >64 ml, RVol index >32 ml/m², and RF >41% (LVSV-AoFF) and RVol >40 ml, RVol index >20 ml/m², and RF >30% (MiIF-AoFF) with a negative likelihood ratio < 0.1. In conclusion, CMR grading of chronic AR and MR should be based on modality-specific and quantification method-specific thresholds, as they differ largely from recognized guideline criteria, to assure appropriate clinical decision-making and timing of surgery. Copyright © 2017 Elsevier Inc. All rights reserved.
Afonso, J; Lopes, S; Gonçalves, R; Caldeira, P; Lago, P; Tavares de Sousa, H; Ramos, J; Gonçalves, A R; Ministro, P; Rosa, I; Vieira, A I; Dias, C C; Magro, F
2016-10-01
Therapeutic drug monitoring is a powerful strategy known to improve clinical outcomes and to optimise healthcare resources in the treatment of autoimmune diseases. Currently, most of the methods commercially available for the quantification of infliximab (IFX) are ELISA-based, with a turnaround time of approximately 8 h, delaying target dosage adjustment until the following infusion. The aim was to validate the first point-of-care IFX quantification device available on the market - the Quantum Blue Infliximab assay (Buhlmann, Schonenbuch, Switzerland) - by comparing it with two well-established methods. The three methods were used to assay the IFX concentration of spiked samples and of the serum of 299 inflammatory bowel disease (IBD) patients undergoing IFX therapy. The point-of-care assay had an average IFX recovery of 92% and was the most precise among the tested methods. The intraclass correlation coefficients of the point-of-care IFX assay vs. the two established ELISA-based methods were 0.889 and 0.939. Moreover, the accuracy of the point-of-care IFX assay compared with each of the two reference methods was 77% and 83%, and the kappa statistics revealed substantial agreement (0.648 and 0.738). The Quantum Blue IFX assay can successfully replace the commonly used ELISA-based IFX quantification kits. Its ability to deliver results within 15 min makes it ideal for immediate target-concentration-adjusted dosing. Moreover, it is a user-friendly desktop device that does not require specific laboratory facilities or highly specialised personnel. © 2016 John Wiley & Sons Ltd.
Brestrich, Nina; Briskot, Till; Osberghaus, Anna; Hubbuch, Jürgen
2014-07-01
Selective quantification of co-eluting proteins in chromatography is usually performed by offline analytics. This is time-consuming and can lead to late detection of irregularities in chromatography processes. To overcome this analytical bottleneck, a methodology for selective protein quantification in multicomponent mixtures by means of spectral data and partial least squares regression was presented in two previous studies. In this paper, a powerful integration of software and chromatography hardware is introduced that enables the application of this methodology to selective inline quantification of co-eluting proteins in chromatography. A specific setup consisting of a conventional liquid chromatography system, a diode array detector, and a software interface to Matlab® was developed. The established tool for selective inline quantification was successfully applied for a peak deconvolution of a co-eluting ternary protein mixture consisting of lysozyme, ribonuclease A, and cytochrome c on SP Sepharose FF. Compared to common offline analytics based on collected fractions, no loss of information regarding the retention volumes and peak flanks was observed. A comparison between the mass balances of both analytical methods showed that the inline quantification tool can be applied for a rapid determination of pool yields. Finally, the achieved inline peak deconvolution was successfully applied to make product purity-based real-time pooling decisions. This makes the established tool for selective inline quantification a valuable approach for inline monitoring and control of chromatographic purification steps and just-in-time reaction to process irregularities. © 2014 Wiley Periodicals, Inc.
Detection of lead(II) ions with a DNAzyme and isothermal strand displacement signal amplification.
Li, Wenying; Yang, Yue; Chen, Jian; Zhang, Qingfeng; Wang, Yan; Wang, Fangyuan; Yu, Cong
2014-03-15
A DNAzyme-based method for the sensitive and selective quantification of lead(II) ions has been developed. A DNAzyme that requires Pb(2+) for activation was selected. An RNA-containing DNA substrate was cleaved by the DNAzyme in the presence of Pb(2+). The 2',3'-cyclic phosphate of the cleaved 5'-part of the substrate was efficiently removed by Exonuclease III. The remaining part of the single-stranded DNA (9 or 13 bases long) was subsequently used as the primer for the strand displacement amplification reaction (SDAR). The method is highly sensitive: 200 pM lead(II) could easily be detected. A number of interfering ions were tested, and the sensor showed good selectivity. Underground water samples were also tested, which demonstrated the feasibility of the current approach for real sample applications. Our method could feasibly be used to develop new DNAzyme- or aptazyme-based sensing methods for the quantification of other target analytes with high sensitivity and selectivity. © 2013 Elsevier B.V. All rights reserved.
A Set of Handwriting Features for Use in Automated Writer Identification.
Miller, John J; Patterson, Robert Bradley; Gantz, Donald T; Saunders, Christopher P; Walch, Mark A; Buscaglia, JoAnn
2017-05-01
A writer's biometric identity can be characterized through the distribution of physical feature measurements ("writer's profile"); a graph-based system that facilitates the quantification of these features is described. To accomplish this quantification, handwriting is segmented into basic graphical forms ("graphemes"), which are "skeletonized" to yield the graphical topology of the handwritten segment. The graph-based matching algorithm compares the graphemes first by their graphical topology and then by their geometric features. Graphs derived from known writers can be compared against graphs extracted from unknown writings. The process is computationally intensive and relies heavily upon statistical pattern recognition algorithms. This article focuses on the quantification of these physical features and the construction of the associated pattern recognition methods for using the features to discriminate among writers. The graph-based system described in this article has been implemented in a highly accurate and approximately language-independent biometric recognition system of writers of cursive documents. © 2017 American Academy of Forensic Sciences.
An Excel‐based implementation of the spectral method of action potential alternans analysis
Pearman, Charles M.
2014-01-01
Abstract Action potential (AP) alternans has been well established as a mechanism of arrhythmogenesis and sudden cardiac death. Proper interpretation of AP alternans requires a robust method of alternans quantification. Traditional methods of alternans analysis neglect higher order periodicities that may have greater pro‐arrhythmic potential than classical 2:1 alternans. The spectral method of alternans analysis, already widely used in the related study of microvolt T‐wave alternans, has also been used to study AP alternans. Software to meet the specific needs of AP alternans analysis is not currently available in the public domain. An AP analysis tool is implemented here, written in Visual Basic for Applications and using Microsoft Excel as a shell. This performs a sophisticated analysis of alternans behavior allowing reliable distinction of alternans from random fluctuations, quantification of alternans magnitude, and identification of which phases of the AP are most affected. In addition, the spectral method has been adapted to allow detection and quantification of higher order regular oscillations. Analysis of action potential morphology is also performed. A simple user interface enables easy import, analysis, and export of collated results. PMID:25501439
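The spectral method itself is compact: form a beat-to-beat series of some AP measure, take its power spectrum, and compare the power at 0.5 cycles/beat with a neighbouring noise band. The sketch below illustrates this in Python (the tool described in the abstract is Excel/VBA); the beat series and noise-band limits are invented for illustration.

```python
# Sketch of the spectral method of alternans analysis.
import numpy as np

rng = np.random.default_rng(0)
n_beats = 128
# e.g. AP amplitude (mV) measured at the same phase of every beat,
# with a small alternating (every-other-beat) component plus noise
beat_value = 100 + 0.6 * (-1) ** np.arange(n_beats) + rng.normal(0, 0.5, n_beats)

x = beat_value - beat_value.mean()
power = np.abs(np.fft.rfft(x)) ** 2 / n_beats          # periodogram
freq = np.fft.rfftfreq(n_beats, d=1.0)                 # cycles/beat

alternans_power = power[freq == 0.5][0]                # peak at 0.5 cycles/beat
noise_band = power[(freq >= 0.33) & (freq < 0.48)]     # reference noise band
k_score = (alternans_power - noise_band.mean()) / noise_band.std()
alternans_mag = np.sqrt(max(alternans_power - noise_band.mean(), 0.0))

print(f"k-score = {k_score:.1f}, alternans magnitude ~ {alternans_mag:.2f} mV")
```

A k-score well above ~3 is conventionally taken to indicate that the alternans peak stands out from random beat-to-beat fluctuation.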
Sun, Qian; Chang, Lu; Ren, Yanping; Cao, Liang; Sun, Yingguang; Du, Yingfeng; Shi, Xiaowei; Wang, Qiao; Zhang, Lantong
2012-11-01
A novel method based on high-performance liquid chromatography coupled with electrospray ionization tandem mass spectrometry was developed for simultaneous determination of the 11 major active components, including ten flavonoids and one phenolic acid, in Cirsium setosum. Separation was performed on a reversed-phase C(18) column with gradient elution of methanol and 0.1‰ acetic acid (v/v). The identification and quantification of the analytes were achieved on a hybrid quadrupole linear ion trap mass spectrometer. Multiple-reaction monitoring scanning was employed for quantification, with the electrospray ion source polarity switched between positive and negative modes in a single run. Full validation of the assay was carried out, including linearity, precision, accuracy, stability, and limits of detection and quantification. The results demonstrated that the method developed was reliable, rapid, and specific. Twenty-five batches of C. setosum samples from different sources were analysed for the first time using the developed method, and the total contents of the 11 analytes ranged from 1717.460 to 23028.258 μg/g. Among them, the content of linarin was highest, with a mean value of 7340.967 μg/g. Principal component analysis and hierarchical clustering analysis were performed to differentiate and classify the samples, which is helpful for comprehensive evaluation of the quality of C. setosum. © 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
NASA Astrophysics Data System (ADS)
Ojima, Nobutoshi; Okiyama, Natsuko; Okaguchi, Saya; Tsumura, Norimichi; Nakaguchi, Toshiya; Hori, Kimihiko; Miyake, Yoichi
2005-04-01
In the cosmetics industry, skin color is very important because skin color gives a direct impression of the face. In particular, many people suffer from melanin pigmentation such as liver spots and freckles. However, it is very difficult to evaluate melanin pigmentation using conventional colorimetric values because these values contain information on various skin chromophores simultaneously. Therefore, it is necessary to extract information on each skin chromophore independently, as density information. The isolation of a melanin component image from a single skin image based on independent component analysis (ICA) was reported in 2003. However, that work did not provide a quantification method for melanin pigmentation. This paper introduces a quantification method based on the ICA of a skin color image to isolate melanin pigmentation. The image acquisition system we used consists of commercially available equipment such as digital cameras and lighting sources with polarized light. The images taken were analyzed using ICA to extract the melanin component images, and a Laplacian of Gaussian (LoG) filter was applied to extract the pigmented area. As a result, the method worked well for skin images including those showing melanin pigmentation and acne. Finally, the total extracted area corresponded strongly to the subjective rating values for the appearance of pigmentation. Further analysis is needed to recognize the appearance of pigmentation with respect to the size of the pigmented area and its spatial gradation.
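A minimal sketch of the two processing stages named above (ICA unmixing of per-pixel colour values, then LoG filtering to pick out blob-like pigmented spots) is shown below. The image is synthetic, and deciding which independent component corresponds to melanin in practice requires inspection or a reference; this is not the paper's calibrated pipeline.

```python
# Sketch of ICA-based chromophore separation followed by Laplacian-of-Gaussian
# (LoG) spot extraction. Synthetic image; component-to-chromophore assignment
# is assumed for illustration.
import numpy as np
from sklearn.decomposition import FastICA
from scipy.ndimage import gaussian_laplace

rng = np.random.default_rng(0)
h, w = 64, 64
rgb = rng.uniform(0.2, 0.9, size=(h, w, 3))            # stand-in for a skin image

od = -np.log(rgb.reshape(-1, 3))                        # optical density per pixel
ica = FastICA(n_components=2, random_state=0)
components = ica.fit_transform(od)                      # (pixels, 2) source signals

melanin = components[:, 0].reshape(h, w)                # assumed melanin component
log_resp = gaussian_laplace(melanin, sigma=2.0)         # LoG highlights blob-like spots
pigmented = log_resp < log_resp.mean() - 2 * log_resp.std()

print(f"pigmented area: {pigmented.sum()} of {h * w} pixels")
```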
Dias, M Graça; Oliveira, Luísa; Camões, M Filomena G F C; Nunes, Baltazar; Versloot, Pieter; Hulshof, Paul J M
2010-05-21
Three sets of extraction/saponification/HPLC conditions for food carotenoid quantification were technically and economically compared. Samples were analysed for the carotenoids alpha-carotene, beta-carotene, beta-cryptoxanthin, lutein, lycopene, and zeaxanthin. All methods demonstrated good performance in the analysis of a composite food standard reference material for the analytes they are applicable to. Methods using two serially connected C(18) columns and a mobile phase based on acetonitrile achieved better carotenoid separation than the method using a mobile phase based on methanol and one C(18) column. Carotenoids from leafy green vegetable matrices appeared to be better extracted with a mixture of methanol and tetrahydrofuran than with tetrahydrofuran alone. Costs of carotenoid determination in foods were lower for the method with the mobile phase based on methanol. However, for some food matrices and in the case of E-Z isomer separations, this was not technically satisfactory. Food extraction with methanol and tetrahydrofuran with direct evaporation of these solvents, and saponification (when needed) using pyrogallol as antioxidant, combined with an HPLC system using a slight gradient mobile phase based on acetonitrile and a stationary phase composed of two serially connected C(18) columns, was the most technically and economically favourable method. Copyright © 2010. Published by Elsevier B.V.
Quantifying circular RNA expression from RNA-seq data using model-based framework.
Li, Musheng; Xie, Xueying; Zhou, Jing; Sheng, Mengying; Yin, Xiaofeng; Ko, Eun-A; Zhou, Tong; Gu, Wanjun
2017-07-15
Circular RNAs (circRNAs) are a class of non-coding RNAs that are widely expressed in various cell lines and tissues of many organisms. Although the exact function of many circRNAs is largely unknown, the cell type- and tissue-specific circRNA expression has implicated their crucial functions in many biological processes. Hence, quantifying circRNA expression from high-throughput RNA-seq data is becoming increasingly important. Although many model-based methods have been developed to quantify linear RNA expression from RNA-seq data, these methods are not applicable to circRNA quantification. Here, we proposed a novel strategy that transforms circular transcripts to pseudo-linear transcripts and estimates the expression values of both circular and linear transcripts using an existing model-based algorithm, Sailfish. The new strategy can accurately estimate the expression of both linear and circular transcripts from RNA-seq data. Several factors, such as gene length, amount of expression and the ratio of circular to linear transcripts, had impacts on the quantification performance for circular transcripts. In comparison to count-based tools, the new computational framework had superior performance in estimating the amount of circRNA expression from both simulated and real ribosomal-RNA-depleted (rRNA-depleted) RNA-seq datasets. On the other hand, considering circular transcripts in expression quantification from rRNA-depleted RNA-seq data substantially increased the accuracy of linear transcript expression estimates. Our proposed strategy was implemented in a program named Sailfish-cir. Sailfish-cir is freely available at https://github.com/zerodel/Sailfish-cir . tongz@medicine.nevada.edu or wanjun.gu@gmail.com. Supplementary data are available at Bioinformatics online. © The Author (2017). Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com
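The pseudo-linear transformation at the heart of this strategy can be sketched in a few lines: append the first (read length - 1) bases of the circle to its 3' end so that reads spanning the back-splice junction align contiguously. The function below is an illustrative simplification, not the Sailfish-cir implementation.

```python
# Sketch of the pseudo-linear transformation for circular transcripts.
def to_pseudo_linear(circ_seq: str, read_length: int) -> str:
    """Return a pseudo-linear reference sequence for a circular transcript."""
    if read_length > len(circ_seq):
        # very short circles: unroll the circle enough times to cover a read
        reps = read_length // len(circ_seq) + 2
        unrolled = circ_seq * reps
        return unrolled[: len(circ_seq) + read_length - 1]
    # append the start of the circle so back-splice-junction reads map contiguously
    return circ_seq + circ_seq[: read_length - 1]

circ = "ACGTACGGTTCAAGGTTACCGTA"            # hypothetical circular transcript
print(to_pseudo_linear(circ, read_length=8))
# The resulting sequences can be added to the transcriptome FASTA supplied to a
# model-based linear-RNA quantifier alongside the ordinary linear transcripts.
```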
Shivali, Garg; Praful, Lahorkar; Vijay, Gadgil
2012-01-01
Fourier transform infrared (FT-IR) spectroscopy is a technique widely used for detection and quantification of various chemical moieties. This paper describes the use of the FT-IR spectroscopy technique for the quantification of total lactones present in Inula racemosa and Andrographis paniculata. The aim was to validate the FT-IR spectroscopy method for quantification of total lactones in I. racemosa and A. paniculata. Dried and powdered I. racemosa roots and A. paniculata plant were extracted with ethanol and dried to remove ethanol completely. The ethanol extract was analysed in a KBr pellet by FT-IR spectroscopy. The FT-IR spectroscopy method was validated and compared with a known spectrophotometric method for quantification of lactones in A. paniculata. By FT-IR spectroscopy, the amount of total lactones was found to be 2.12 ± 0.47% (n = 3) in I. racemosa and 8.65 ± 0.51% (n = 3) in A. paniculata. The method showed results comparable with a known spectrophotometric method used for quantification of such lactones: 8.42 ± 0.36% (n = 3) in A. paniculata. Limits of detection and quantification for isoalantolactone were 1 µg and 10 µg, respectively; for andrographolide they were 1.5 µg and 15 µg, respectively. Recoveries were over 98%, with good intra- and interday repeatability: RSD ≤ 2%. The FT-IR spectroscopy method proved linear, accurate, precise and specific, with low limits of detection and quantification, for estimation of total lactones, and is less tedious than the UV spectrophotometric method for the compounds tested. This validated FT-IR spectroscopy method is readily applicable for the quality control of I. racemosa and A. paniculata. Copyright © 2011 John Wiley & Sons, Ltd.
Goal-Oriented Probability Density Function Methods for Uncertainty Quantification
2015-12-11
...approximations or data-driven approaches. We investigated the accuracy of analytical techniques based on Kubo-Van Kampen operator cumulant expansions for Langevin equations driven by fractional Brownian motion and other noises.
Wang, Yue; Adalý, Tülay; Kung, Sun-Yuan; Szabo, Zsolt
2007-01-01
This paper presents a probabilistic neural network based technique for unsupervised quantification and segmentation of brain tissues from magnetic resonance images. It is shown that this problem can be solved by distribution learning and relaxation labeling, resulting in an efficient method that may be particularly useful in quantifying and segmenting abnormal brain tissues where the number of tissue types is unknown and the distributions of tissue types heavily overlap. The new technique uses suitable statistical models for both the pixel and context images and formulates the problem in terms of model-histogram fitting and global consistency labeling. The quantification is achieved by probabilistic self-organizing mixtures and the segmentation by a probabilistic constraint relaxation network. The experimental results show the efficient and robust performance of the new algorithm and that it outperforms the conventional classification based approaches. PMID:18172510
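A common, simpler stand-in for this kind of intensity-distribution-based tissue quantification is an EM-fitted Gaussian mixture, from which soft tissue memberships and volume fractions follow directly; the paper's probabilistic self-organizing mixtures plus context-based relaxation labeling are more elaborate. Sketch with simulated intensities:

```python
# Stand-in sketch: Gaussian-mixture (EM) quantification of tissue fractions
# from voxel intensities. Intensities are simulated, not real MR data.
import numpy as np
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
# Simulated voxel intensities for three tissue classes (e.g. CSF, GM, WM)
intensities = np.concatenate([
    rng.normal(40, 6, 2000),
    rng.normal(90, 8, 5000),
    rng.normal(130, 7, 3000),
]).reshape(-1, 1)

gmm = GaussianMixture(n_components=3, random_state=0).fit(intensities)
post = gmm.predict_proba(intensities)           # soft tissue memberships per voxel
fractions = post.mean(axis=0)                   # volume fraction of each class

order = np.argsort(gmm.means_.ravel())          # report classes from darkest to brightest
for name, idx in zip(["class-1 (darkest)", "class-2", "class-3 (brightest)"], order):
    print(f"{name}: mean={gmm.means_[idx, 0]:.1f}, fraction={fractions[idx]:.2%}")
```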
NASA Technical Reports Server (NTRS)
Goldman, A.
2002-01-01
The Langley-D.U. collaboration on the analysis of high resolution infrared atmospheric spectra covered a number of important studies of trace gas identification and quantification from field spectra, and spectral line parameter analysis. The collaborative work included: 1) Quantification and monitoring of trace gases from ground-based spectra available from various locations and seasons and from balloon flights; 2) Identification and preliminary quantification of several isotopic species, including oxygen and sulfur isotopes; 3) Search for new species in the available spectra, including the use of selective coadding of ground-based spectra for high signal to noise; 4) Update of spectroscopic line parameters, by combining laboratory and atmospheric spectra with theoretical spectroscopy methods; 5) Study of trends and correlations of atmospheric trace constituents; and 6) Algorithm development, retrieval intercomparisons and automation of the analysis of NDSC spectra, for both column amounts and vertical profiles.
NASA Astrophysics Data System (ADS)
Akamatsu, G.; Ikari, Y.; Ohnishi, A.; Nishida, H.; Aita, K.; Sasaki, M.; Yamamoto, Y.; Sasaki, M.; Senda, M.
2016-08-01
Amyloid PET is useful for early and/or differential diagnosis of Alzheimer’s disease (AD). Quantification of amyloid deposition using PET has been employed to improve diagnosis and to monitor AD therapy, particularly in research. Although MRI is often used for segmentation of gray matter and for spatial normalization into standard Montreal Neurological Institute (MNI) space where region-of-interest (ROI) template is defined, 3D MRI is not always available in clinical practice. The purpose of this study was to examine the feasibility of PET-only amyloid quantification with an adaptive template and a pre-defined standard ROI template that has been empirically generated from typical cases. A total of 68 subjects who underwent brain 11C-PiB PET were examined. The 11C-PiB images were non-linearly spatially normalized to the standard MNI T1 atlas using the same transformation parameters of MRI-based normalization. The automatic-anatomical-labeling-ROI (AAL-ROI) template was applied to the PET images. All voxel values were normalized by the mean value of cerebellar cortex to generate the SUVR-scaled images. Eleven typical positive images and eight typical negative images were normalized and averaged, respectively, and were used as the positive and negative template. Positive and negative masks which consist of voxels with SUVR ⩾1.7 were extracted from both templates. Empirical PiB-prone ROI (EPP-ROI) was generated by subtracting the negative mask from the positive mask. The 11C-PiB image of each subject was non-rigidly normalized to the positive and negative template, respectively, and the one with higher cross-correlation was adopted. The EPP-ROI was then inversely transformed to individual PET images. We evaluated differences of SUVR between standard MRI-based method and PET-only method. We additionally evaluated whether the PET-only method would correctly categorize 11C-PiB scans as positive or negative. Significant correlation was observed between the SUVRs obtained with AAL-ROI and those with EPP-ROI when MRI-based normalization was used, the latter providing higher SUVR. When EPP-ROI was used, MRI-based method and PET-only method provided almost identical SUVR. All 11C-PiB scans were correctly categorized into positive and negative using a cutoff value of 1.7 as compared to visual interpretation. The 11C-PiB SUVR were 2.30 ± 0.24 and 1.25 ± 0.11 for the positive and negative images. PET-only amyloid quantification method with adaptive templates and EPP-ROI can provide accurate, robust and simple amyloid quantification without MRI.
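The final quantification step described above reduces to simple array arithmetic: scale every voxel by the mean cerebellar-cortex value to obtain SUVR, average within the target ROI, and apply the 1.7 positivity cut-off. The sketch below uses synthetic volumes and masks; the actual pipeline also involves non-rigid normalisation to the adaptive positive/negative templates.

```python
# Sketch of the SUVR step of amyloid PET quantification: cerebellar
# normalisation, ROI averaging, and a 1.7 positivity cut-off. Synthetic data.
import numpy as np

rng = np.random.default_rng(0)
pet = rng.uniform(0.5, 3.0, size=(64, 64, 40))          # stand-in PET volume
cerebellum_mask = np.zeros(pet.shape, dtype=bool)
cerebellum_mask[20:40, 20:40, :5] = True                 # hypothetical cerebellar VOI
target_roi = np.zeros(pet.shape, dtype=bool)
target_roi[16:48, 16:48, 20:30] = True                   # hypothetical cortical ROI

suvr_image = pet / pet[cerebellum_mask].mean()           # voxel-wise SUVR
roi_suvr = suvr_image[target_roi].mean()

label = "amyloid-positive" if roi_suvr >= 1.7 else "amyloid-negative"
print(f"ROI SUVR = {roi_suvr:.2f} -> {label}")
```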
Evaluation of the reliability of maize reference assays for GMO quantification.
Papazova, Nina; Zhang, David; Gruden, Kristina; Vojvoda, Jana; Yang, Litao; Buh Gasparic, Meti; Blejec, Andrej; Fouilloux, Stephane; De Loose, Marc; Taverniers, Isabel
2010-03-01
A reliable PCR reference assay for relative genetically modified organism (GMO) quantification must be specific for the target taxon and amplify uniformly along the commercialised varieties within the considered taxon. Different reference assays for maize (Zea mays L.) are used in official methods for GMO quantification. In this study, we evaluated the reliability of eight existing maize reference assays, four of which are used in combination with an event-specific polymerase chain reaction (PCR) assay validated and published by the Community Reference Laboratory (CRL). We analysed the nucleotide sequence variation in the target genomic regions in a broad range of transgenic and conventional varieties and lines: MON 810 varieties cultivated in Spain and conventional varieties from various geographical origins and breeding history. In addition, the reliability of the assays was evaluated based on their PCR amplification performance. A single base pair substitution, corresponding to a single nucleotide polymorphism (SNP) reported in an earlier study, was observed in the forward primer of one of the studied alcohol dehydrogenase 1 (Adh1) (70) assays in a large number of varieties. The SNP presence is consistent with a poor PCR performance observed for this assay along the tested varieties. The obtained data show that the Adh1 (70) assay used in the official CRL NK603 assay is unreliable. Based on our results from both the nucleotide stability study and the PCR performance test, we can conclude that the Adh1 (136) reference assay (T25 and Bt11 assays) as well as the tested high mobility group protein gene assay, which also form parts of CRL methods for quantification, are highly reliable. Despite the observed uniformity in the nucleotide sequence of the invertase gene assay, the PCR performance test reveals that this target sequence might occur in more than one copy. Finally, although currently not forming a part of official quantification methods, zein and SSIIb assays are found to be highly reliable in terms of nucleotide stability and PCR performance and are proposed as good alternative targets for a reference assay for maize.
Barroso, Pedro José; Martín, Julia; Santos, Juan Luis; Aparicio, Irene; Alonso, Esteban
2018-01-01
In this work, an analytical method, based on sonication-assisted extraction, clean-up by dispersive solid-phase extraction and determination by liquid chromatography-tandem mass spectrometry, has been developed and validated for the simultaneous determination of 15 emerging pollutants in leaves from four ornamental tree species. Target compounds include perfluorinated organic compounds, plasticizers, surfactants, a brominated flame retardant, and preservatives. The method was optimized using Box-Behnken statistical experimental design with response surface methodology and validated in terms of recovery, accuracy, precision, and method detection and quantification limits. Quantification of target compounds was carried out using matrix-matched calibration curves. The highest recoveries were achieved for the perfluorinated organic compounds (mean values up to 87%) and preservatives (up to 88%). The lowest recoveries were achieved for plasticizers (51%) and the brominated flame retardant (63%). Method detection and quantification limits were in the ranges 0.01-0.09 ng/g dry matter (dm) and 0.02-0.30 ng/g dm, respectively, for most of the target compounds. The method was successfully applied to the determination of the target compounds on leaves from four tree species used as urban ornamental trees (Citrus aurantium, Celtis australis, Platanus hispanica, and Jacaranda mimosifolia). Graphical abstract: Analytical method for the biomonitoring of emerging pollutants in outdoor air.
Methods for measuring denitrification: Diverse approaches to a difficult problem
Groffman, Peter M; Altabet, Mary A.; Böhlke, J.K.; Butterbach-Bahl, Klaus; David, Mary B.; Firestone, Mary K.; Giblin, Anne E.; Kana, Todd M.; Nielsen , Lars Peter; Voytek, Mary A.
2006-01-01
Denitrification, the reduction of the nitrogen (N) oxides, nitrate (NO3−) and nitrite (NO2−), to the gases nitric oxide (NO), nitrous oxide (N2O), and dinitrogen (N2), is important to primary production, water quality, and the chemistry and physics of the atmosphere at ecosystem, landscape, regional, and global scales. Unfortunately, this process is very difficult to measure, and existing methods are problematic for different reasons in different places at different times. In this paper, we review the major approaches that have been taken to measure denitrification in terrestrial and aquatic environments and discuss the strengths, weaknesses, and future prospects for the different methods. Methodological approaches covered include (1) acetylene-based methods, (2) 15N tracers, (3) direct N2 quantification, (4) N2:Ar ratio quantification, (5) mass balance approaches, (6) stoichiometric approaches, (7) methods based on stable isotopes, (8) in situ gradients with atmospheric environmental tracers, and (9) molecular approaches. Our review makes it clear that the prospects for improved quantification of denitrification vary greatly in different environments and at different scales. While current methodology allows for the production of accurate estimates of denitrification at scales relevant to water and air quality and ecosystem fertility questions in some systems (e.g., aquatic sediments, well-defined aquifers), methodology for other systems, especially upland terrestrial areas, still needs development. Comparison of mass balance and stoichiometric approaches that constrain estimates of denitrification at large scales with point measurements (made using multiple methods), in multiple systems, is likely to propel more improvement in denitrification methods over the next few years.
PaCeQuant: A Tool for High-Throughput Quantification of Pavement Cell Shape Characteristics
Poeschl, Yvonne; Plötner, Romina
2017-01-01
Pavement cells (PCs) are the most frequently occurring cell type in the leaf epidermis and play important roles in leaf growth and function. In many plant species, PCs form highly complex jigsaw-puzzle-shaped cells with interlocking lobes. Understanding of their development is of high interest for plant science research because of their importance for leaf growth and hence for plant fitness and crop yield. Studies of PC development, however, are limited because robust methods that enable automatic segmentation and quantification of PC shape parameters suited to reflecting their cellular complexity are lacking. Here, we present our new ImageJ-based tool, PaCeQuant, which provides a fully automatic image analysis workflow for PC shape quantification. PaCeQuant automatically detects cell boundaries of PCs from confocal input images and enables manual correction of automatic segmentation results or direct import of manually segmented cells. PaCeQuant simultaneously extracts 27 shape features that include global, contour-based, skeleton-based, and PC-specific object descriptors. In addition, we included a method for classification and analysis of lobes at two-cell and three-cell junctions. We provide an R script for graphical visualization and statistical analysis. We validated PaCeQuant by extensive comparative analysis to manual segmentation and existing quantification tools and demonstrated its usability to analyze PC shape characteristics during development and between different genotypes. PaCeQuant thus provides a platform for robust, efficient, and reproducible quantitative analysis of PC shape characteristics that can easily be applied to study PC development in large data sets. PMID:28931626
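To make the notion of global and contour-based shape descriptors concrete, the sketch below computes a few of the simplest ones (area, perimeter, circularity, solidity) per labelled cell with scikit-image; PaCeQuant's full feature set of 27 descriptors plus lobe statistics is much richer, and the mask here is a toy example.

```python
# Sketch of a handful of global/contour-based cell shape descriptors computed
# per segmented cell from a labelled binary mask (toy mask, not real cells).
import numpy as np
from skimage.measure import label, regionprops

mask = np.zeros((60, 60), dtype=bool)
mask[10:30, 10:40] = True                               # "cell" 1 (rectangle)
mask[40:55, 5:20] = True                                # "cell" 2

for region in regionprops(label(mask)):
    area = region.area
    perimeter = region.perimeter
    circularity = 4 * np.pi * area / perimeter ** 2     # 1.0 for a perfect circle
    solidity = region.solidity                          # area / convex hull area
    print(f"cell {region.label}: area={area}, circularity={circularity:.2f}, "
          f"solidity={solidity:.2f}")
```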
An online sleep apnea detection method based on recurrence quantification analysis.
Nguyen, Hoa Dinh; Wilkins, Brek A; Cheng, Qi; Benjamin, Bruce Allen
2014-07-01
This paper introduces an online sleep apnea detection method based on heart rate complexity as measured by recurrence quantification analysis (RQA) statistics of heart rate variability (HRV) data. RQA statistics can capture nonlinear dynamics of a complex cardiorespiratory system during obstructive sleep apnea. In order to obtain a more robust measurement of the nonstationarity of the cardiorespiratory system, we use several different fixed-amount-of-neighbors thresholds for recurrence plot calculation. We integrate a feature selection algorithm based on conditional mutual information to select the most informative RQA features for classification and hence to speed up the real-time classification process without degrading the performance of the system. Two types of binary classifiers, i.e., support vector machine and neural network, are used to differentiate apnea from normal sleep. A soft decision fusion rule is developed to combine the results of these classifiers in order to improve the classification performance of the whole system. Experimental results show that our proposed method achieves better classification results compared with the previous recurrence analysis-based approach.
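For orientation, the sketch below builds a recurrence plot from an RR-interval series using a fixed-amount-of-nearest-neighbours criterion and computes two classic RQA statistics, recurrence rate and determinism. The embedding, feature selection and classifier stages of the full method are omitted, and the signal is simulated.

```python
# Sketch of recurrence quantification analysis (RQA) on a simulated RR series.
import numpy as np

rng = np.random.default_rng(0)
rr = 0.8 + 0.05 * np.sin(np.arange(200) / 5.0) + rng.normal(0, 0.01, 200)

k = 10                                                  # neighbours per column (FAN criterion)
dist = np.abs(rr[:, None] - rr[None, :])                # 1-D embedding for simplicity
rp = np.zeros_like(dist, dtype=bool)
for j in range(dist.shape[1]):
    nearest = np.argsort(dist[:, j])[:k]                # k nearest neighbours of point j
    rp[nearest, j] = True

recurrence_rate = rp.mean()

# Determinism: fraction of recurrence points lying on diagonal lines >= l_min.
l_min, diag_points = 2, 0
n = rp.shape[0]
for offset in range(-(n - 1), n):
    diag = np.diagonal(rp, offset=offset).astype(int)
    run = 0
    for v in np.append(diag, 0):                        # trailing 0 flushes the last run
        if v:
            run += 1
        else:
            if run >= l_min:
                diag_points += run
            run = 0
determinism = diag_points / rp.sum()

print(f"RR = {recurrence_rate:.3f}, DET = {determinism:.3f}")
```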
Automated quantification of myocardial perfusion SPECT using simplified normal limits.
Slomka, Piotr J; Nishina, Hidetaka; Berman, Daniel S; Akincioglu, Cigdem; Abidov, Aiden; Friedman, John D; Hayes, Sean W; Germano, Guido
2005-01-01
To simplify development of normal limits for myocardial perfusion SPECT (MPS), we implemented a quantification scheme in which normal limits are derived without visual scoring of abnormal scans or optimization of regional thresholds. Normal limits were derived from same-day Tl-201 rest/Tc-99m-sestamibi stress scans of male (n = 40) and female (n = 40) low-likelihood patients. Defect extent, total perfusion deficit (TPD), and regional perfusion extents were derived by comparison to normal limits in polar-map coordinates. MPS scans from 256 consecutive patients without known coronary artery disease, who underwent coronary angiography, were analyzed. The new method of quantification (TPD) was compared with our previously developed quantification system and visual scoring. The receiver operating characteristic area under the curve for detection of 50% or greater stenoses by TPD (0.88 +/- 0.02) was higher than by visual scoring (0.83 +/- 0.03) (P = .039) or standard quantification (0.82 +/- 0.03) (P = .004). For detection of 70% or greater stenoses, it was higher for TPD (0.89 +/- 0.02) than for standard quantification (0.85 +/- 0.02) (P = .014). Sensitivity and specificity were 93% and 79%, respectively, for TPD; 81% and 85%, respectively, for visual scoring; and 80% and 73%, respectively, for standard quantification. The use of stress mode-specific normal limits did not improve performance. Simplified quantification achieves performance better than or equivalent to visual scoring or quantification based on per-segment visual optimization of abnormality thresholds.
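The general idea of extent and severity scoring against normal limits can be sketched as follows: compare each polar-map sample with the per-pixel normal mean and SD, flag samples beyond a z-score threshold (defect extent), and accumulate normalised severity into a single TPD-like score. The threshold, severity scaling and the maps below are illustrative assumptions, not the paper's derivation.

```python
# Sketch of defect extent and a TPD-like severity score from a perfusion polar
# map compared against per-pixel normal limits. All values are illustrative.
import numpy as np

rng = np.random.default_rng(0)
normal_mean = rng.uniform(70, 90, size=(32, 32))        # normal-database mean counts
normal_sd = rng.uniform(5, 8, size=(32, 32))            # normal-database SD
patient = normal_mean + rng.normal(0, 5, size=(32, 32))
patient[8:16, 8:16] -= 35                                # simulated perfusion defect

z = (normal_mean - patient) / normal_sd                  # how far below normal
abnormal = z > 3.0                                       # defect criterion (z-score)

extent = abnormal.mean() * 100                           # % of polar map abnormal
severity = np.clip(z, 0, 10) / 10                        # normalised severity 0..1
tpd = (severity * abnormal).mean() * 100                 # combined extent + severity score

print(f"defect extent = {extent:.1f}%, TPD-like score = {tpd:.1f}%")
```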
Egom, Emmanuel E.; Fitzgerald, Ross; Canning, Rebecca; Pharithi, Rebabonye B.; Murphy, Colin; Maher, Vincent
2017-01-01
Evidence suggests that high-density lipoprotein (HDL) components distinct from cholesterol, such as sphingosine-1-phosphate (S1P), may account for the anti-atherothrombotic effects attributed to this lipoprotein. The current method for the determination of plasma levels of S1P, as well as of levels associated with HDL particles, is still too cumbersome an assay to be practical worldwide. Recently, a simplified protocol based on liquid chromatography-tandem mass spectrometry (LC-MS/MS) for the sensitive and specific quantification of plasma levels of S1P with good accuracy has been reported. This work utilized a triple quadrupole (QqQ)-based LC-MS/MS system. Here we adapt that method for the determination of plasma levels of S1P using a quadrupole time of flight (Q-Tof) based LC-MS system. Calibration curves were linear in the range of 0.05 to 2 µM. The lower limit of quantification (LOQ) was 0.05 µM. The concentration of S1P in human plasma was determined to be 1 ± 0.09 µM (n = 6). The average accuracy over the stated range of the method was found to be 100 ± 5.9%, with precision at the LOQ better than 10% when predicting the calibration standards. The concentration of plasma S1P in the prepared samples was stable for 24 h at room temperature. We have demonstrated the quantification of plasma S1P using Q-Tof based LC-MS with very good sensitivity, accuracy, and precision that can be used for future studies in this field. PMID:28820460
Reproducibility measurements of three methods for calculating in vivo MR-based knee kinematics.
Lansdown, Drew A; Zaid, Musa; Pedoia, Valentina; Subburaj, Karupppasamy; Souza, Richard; Benjamin, C; Li, Xiaojuan
2015-08-01
The aim was to describe three quantification methods for magnetic resonance imaging (MRI)-based knee kinematic evaluation and to report on the reproducibility of these algorithms. T2-weighted, fast-spin-echo images of both knees were obtained in six healthy volunteers. Scans were repeated for each knee after repositioning to evaluate protocol reproducibility. Semiautomatic segmentation defined regions of interest for the tibia and femur. The posterior femoral condyles and diaphyseal axes were defined using the previously defined tibia and femur. All segmentation was performed twice to evaluate segmentation reliability. Anterior tibial translation (ATT) and internal tibial rotation (ITR) were calculated using three methods: a tibial-based registration system, a combined tibiofemoral-based registration method with all manual segmentation, and a combined tibiofemoral-based registration method with automatic definition of condyles and axes. Intraclass correlation coefficients and standard deviations across multiple measures were determined. Reproducibility of segmentation was excellent (ATT = 0.98; ITR = 0.99) for both combined methods. ATT and ITR measurements were also reproducible across multiple scans in the combined registration measurements with manual (ATT = 0.94; ITR = 0.94) or automatic (ATT = 0.95; ITR = 0.94) condyles and axes. The combined tibiofemoral registration with automatic definition of the posterior femoral condyle and diaphyseal axes allows for improved knee kinematics quantification with excellent in vivo reproducibility. © 2014 Wiley Periodicals, Inc.
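Kinematic quantities of this kind ultimately come from estimating the rigid transform of a bone between positions. A standard, generic way to do that for corresponding point sets is the Kabsch/SVD algorithm, sketched below with synthetic points; the paper's image-based registration and anatomical coordinate definitions are more involved, so treat this only as a stand-in for that step.

```python
# Generic Kabsch/SVD rigid registration between two corresponding bone point
# clouds; translation and in-plane rotation are then read off the result.
import numpy as np

def kabsch(P, Q):
    """Rigid transform (R, t) mapping points P onto points Q (rows = points)."""
    Pc, Qc = P - P.mean(axis=0), Q - Q.mean(axis=0)
    U, _, Vt = np.linalg.svd(Pc.T @ Qc)
    d = np.sign(np.linalg.det(Vt.T @ U.T))               # avoid reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = Q.mean(axis=0) - R @ P.mean(axis=0)
    return R, t

rng = np.random.default_rng(0)
tibia_pos1 = rng.normal(size=(200, 3))                    # tibia points, position 1
theta = np.deg2rad(8.0)                                   # simulated internal rotation
true_R = np.array([[np.cos(theta), -np.sin(theta), 0.0],
                   [np.sin(theta),  np.cos(theta), 0.0],
                   [0.0,            0.0,           1.0]])
tibia_pos2 = tibia_pos1 @ true_R.T + np.array([2.5, 0.3, 0.1])   # position 2

R, t = kabsch(tibia_pos1, tibia_pos2)
rotation_deg = np.degrees(np.arctan2(R[1, 0], R[0, 0]))   # rotation about the z-axis
print(f"translation ~ {t[0]:.2f} units, rotation ~ {rotation_deg:.1f} deg")
```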
Petruševska, Marija; Urleb, Uroš; Peternel, Luka
2013-11-01
Excipient-mediated precipitation inhibition is classically determined by quantification of the dissolved compound in solution. In this study, two alternative microtiter-plate-based approaches were evaluated, one based on light scattering (nephelometer) and the other on turbidity (plate reader), both of which quantify the compound precipitate. Following optimization of the nephelometer settings (beam focus, laser gain) and the experimental conditions, 23 excipients were screened for inhibition of the precipitation of the poorly soluble drugs fenofibrate and dipyridamole. The light scattering method resulted in excellent correlation (r > 0.91) between the calculated precipitation inhibitor parameters (PIPs) and the precipitation inhibition index (PI(classical)) obtained by the classical approach for fenofibrate and dipyridamole. Among the evaluated PIPs, AUC100 (nephelometer) resulted in only four false positives and no false negatives. For the turbidity-based method, a good correlation with PI(classical) was obtained for the PIP maximal optical density (OD(max), r = 0.91), although only for fenofibrate. With OD(max) (plate reader), five false positives and two false negatives were identified. In conclusion, the light scattering-based method outperformed the turbidity-based one and could be reliably used for identification of novel precipitation inhibitors. Copyright © 2013 Elsevier B.V. All rights reserved.
Dieringer, Matthias A.; Deimling, Michael; Santoro, Davide; Wuerfel, Jens; Madai, Vince I.; Sobesky, Jan; von Knobelsdorff-Brenkenhoff, Florian; Schulz-Menger, Jeanette; Niendorf, Thoralf
2014-01-01
Introduction: Visual but subjective reading of longitudinal relaxation time (T1) weighted magnetic resonance images is commonly used for the detection of brain pathologies. For this non-quantitative measure, diagnostic quality depends on hardware configuration, imaging parameters, radio frequency transmission field (B1+) uniformity, as well as observer experience. Parametric quantification of the tissue T1 relaxation parameter offsets the propensity for these effects, but is typically time consuming. For this reason, this study examines the feasibility of rapid 2D T1 quantification using a variable flip angle (VFA) approach at magnetic field strengths of 1.5 Tesla, 3 Tesla, and 7 Tesla. These efforts include validation in phantom experiments and application for brain T1 mapping. Methods: T1 quantification included simulations of the Bloch equations to correct for slice profile imperfections, and a correction for B1+. Fast gradient echo acquisitions were conducted using three adjusted flip angles for the proposed T1 quantification approach, which was benchmarked against slice profile uncorrected 2D VFA and an inversion-recovery spin-echo based reference method. Brain T1 mapping was performed in six healthy subjects, one multiple sclerosis patient, and one stroke patient. Results: Phantom experiments showed a mean T1 estimation error of (-63±1.5)% for slice profile uncorrected 2D VFA and (0.2±1.4)% for the proposed approach compared to the reference method. Scan time for single slice T1 mapping including B1+ mapping could be reduced to 5 seconds using an in-plane resolution of (2×2) mm², which equals a scan time reduction of more than 99% compared to the reference method. Conclusion: Our results demonstrate that rapid 2D T1 quantification using a variable flip angle approach is feasible at 1.5T/3T/7T. It represents a valuable alternative for rapid T1 mapping due to the gain in speed versus conventional approaches. This progress may serve to enhance the capabilities of parametric MR based lesion detection and brain tissue characterization. PMID:24621588
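The variable flip angle approach fits the spoiled gradient echo signal equation; with the standard linearisation S/sin(a) = E1 * S/tan(a) + M0*(1 - E1), where E1 = exp(-TR/T1), T1 follows from the fitted slope, and the B1+ map simply rescales the nominal flip angles. The sketch below uses illustrative flip angles, TR and B1+ scale factor with simulated noiseless signals.

```python
# Sketch of variable flip angle (VFA / DESPOT1) T1 estimation from spoiled
# gradient echo signals via the standard linearised fit. Values illustrative.
import numpy as np

TR = 0.005                                    # repetition time (s)
T1_true, M0 = 1.2, 1000.0
b1_scale = 0.95                               # measured B1+ correction factor
nominal_fa = np.deg2rad([3.0, 10.0, 17.0])    # three adjusted flip angles
fa = b1_scale * nominal_fa                    # actual flip angles after B1+ correction

E1 = np.exp(-TR / T1_true)
signal = M0 * np.sin(fa) * (1 - E1) / (1 - E1 * np.cos(fa))   # SPGR signal model

y = signal / np.sin(fa)
x = signal / np.tan(fa)
slope, intercept = np.polyfit(x, y, 1)        # slope = E1, intercept = M0*(1 - E1)

T1_est = -TR / np.log(slope)
print(f"estimated T1 = {T1_est * 1000:.0f} ms (true {T1_true * 1000:.0f} ms)")
```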
Giri, Anupam; Zelinkova, Zuzana; Wenzl, Thomas
2017-12-01
For the implementation of Regulation (EC) No 2065/2003 related to smoke flavourings used or intended for use in or on foods, a method based on solid-phase microextraction (SPME) GC/MS was developed for the characterisation of liquid smoke products. A statistically based experimental design (DoE) was used for method optimisation. The best general conditions to quantitatively analyse the liquid smoke compounds were obtained with a polydimethylsiloxane/divinylbenzene (PDMS/DVB) fibre, 60°C extraction temperature, 30 min extraction time, 250°C desorption temperature, 180 s desorption time, 15 s agitation time, and 250 rpm agitation speed. Under the optimised conditions, 119 wood pyrolysis products including furan/pyran derivatives, phenols, guaiacol, syringol, benzenediol, and their derivatives, cyclic ketones, and several other heterocyclic compounds were identified. The proposed method was repeatable (RSD% <5) and the calibration functions were linear for all compounds under study. Nine isotopically labelled internal standards were used to improve quantification of analytes by compensating for matrix effects that might affect headspace equilibrium and extractability of compounds. The optimised isotope dilution SPME-GC/MS based analytical method proved to be fit for purpose, allowing the rapid identification and quantification of volatile compounds in liquid smoke flavourings.
Mehle, Nataša; Dobnik, David; Ravnikar, Maja; Pompe Novak, Maruša
2018-05-03
RNA viruses have a great potential for high genetic variability and rapid evolution that is generated by mutation and recombination under selection pressure. This is also the case for Potato virus Y (PVY), which comprises a high diversity of different recombinant and non-recombinant strains. Consequently, it is hard to develop reverse transcription real-time quantitative PCR (RT-qPCR) with the same amplification efficiency for all PVY strains, which would enable their balanced quantification; this is especially needed in mixed infections and other studies of pathogenesis. To achieve this, we initially transferred the PVY universal RT-qPCR assay to a reverse transcription droplet digital PCR (RT-ddPCR) format. RT-ddPCR is an absolute quantification method, where a calibration curve is not needed, and it is less prone to inhibitors. The RT-ddPCR developed and validated in this study achieved a dynamic range of quantification over five orders of magnitude, and in terms of its sensitivity, it was comparable to, or even better than, RT-qPCR. RT-ddPCR showed lower measurement variability. We have shown that RT-ddPCR can be used as a reference tool for the evaluation of different RT-qPCR assays. In addition, it can be used for quantification of RNA based on in-house reference materials that can then be used as calibrators in diagnostic laboratories.
Ullrich, Sebastian; Neef, Sylvia K; Schmarr, Hans-Georg
2018-02-01
Low-molecular-weight volatile sulfur compounds such as thiols, sulfides, and disulfides, as well as thioacetates, cause a sulfidic off-flavor in wines even at low concentration levels. The proposed analytical method for quantification of these compounds in wine is based on headspace solid-phase microextraction, followed by gas chromatographic analysis with sulfur-specific detection using a pulsed flame photometric detector. Robust quantification was achieved via a stable isotope dilution assay using commercial and synthesized deuterated isotopic standards. The necessary chromatographic separation of analytes and isotopic standards benefits from the inverse isotope effect realized on an apolar polydimethylsiloxane stationary phase of increased film thickness. Interferences with sulfur-specific detection in wine caused by sulfur dioxide were minimized by addition of propanal. The method provides adequate validation data, with good repeatability and limits of detection and quantification. It suits the requirements of wine quality management, allowing the control of oenological treatments to counteract the eventual formation of excessively high concentrations of such malodorous compounds. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Siebenhaar, Markus; Küllmer, Kai; Fernandes, Nuno Miguel de Barros; Hüllen, Volker; Hopf, Carsten
2015-09-01
Desorption electrospray ionization (DESI) mass spectrometry is an emerging technology for direct therapeutic drug monitoring in dried blood spots (DBS). Current DBS methods require manual application of small molecules as internal standards for absolute drug quantification. With industrial standardization in mind, we superseded the manual addition of standard and built a three-layer setup for robust quantification of salicylic acid directly from DBS. We combined a dioctyl sodium sulfosuccinate weave facilitating sample spreading with a cellulose layer for addition of isotope-labeled salicylic acid as internal standard and a filter paper for analysis of the standard-containing sample by DESI-MS. Using this setup, we developed a quantification method for salicylic acid from whole blood with a validated linear curve range from 10 to 2000 mg/L, a relative standard deviation (RSD%) ≤14%, and determination coefficients of 0.997. The limit of detection (LOD) was 8 mg/L and the lower limit of quantification (LLOQ) was 10 mg/L. Recovery rates in method verification by LC-MS/MS were 97 to 101% for blinded samples. Most importantly, a study in healthy volunteers after administration of a single dose of Aspirin provides evidence to suggest that the three-layer setup may enable individual pharmacokinetic and endpoint testing following blood collection by finger pricking by patients at home. Taken together, our data suggests that DBS-based quantification of drugs by DESI-MS on pre-manufactured three-layer cartridges may be a promising approach for future near-patient therapeutic drug monitoring.
Lin, Na; Chen, Si; Zhang, Hong; Li, Junmin; Fu, Linglin
2018-02-07
Major royal jelly protein 1 (MRJP1) is the most abundant protein in royal jelly (RJ), and the level of MRJP1 has been suggested as a promising parameter for standardization and evaluation of RJ authenticity and quality. Here, a quantitative method was developed for the quantification of MRJP1 in RJ based on a signature peptide and a stable isotope-labeled internal standard peptide FFDYDFGSDER* (R*: ¹³C₆, ¹⁵N₄) by ultraperformance liquid chromatography-tandem mass spectrometry. Recoveries of the established method ranged from 85.33 to 95.80%, and both the intra- and interday precision were RSD < 4.97%. Quantification results showed that the content of MRJP1 in fresh RJ was 41.96-55.01 mg/g. Abnormal levels of MRJP1 were found in three commercial RJs, implying that these samples were of low quality and might be adulterated. Results of the present work suggested that the developed method could be successfully applied to quantify MRJP1 in RJ and also could evaluate the quality of RJ.
Bojolly, Daline; Doyen, Périne; Le Fur, Bruno; Christaki, Urania; Verrez-Bagnis, Véronique; Grard, Thierry
2017-02-01
Bigeye tuna (Thunnus obesus) and yellowfin tuna (Thunnus albacares) are among the most widely used tuna species for canning purposes. Not only substitution but also mixing of tuna species is prohibited by the European regulation for canned tuna products. However, as juveniles of bigeye and yellowfin tunas are very difficult to distinguish, unintentional substitutions may occur during the canning process. In this study, two mitochondrial markers from NADH dehydrogenase subunit 2 and cytochrome c oxidase subunit II genes were used to identify bigeye tuna and yellowfin tuna, respectively, utilizing TaqMan qPCR methodology. Two different qPCR-based methods were developed to quantify the percentage of flesh of each species used for can processing. The first was based on absolute quantification using standard curves constructed with these two markers; the second was based on relative quantification with the universal 12S rRNA gene as the endogenous gene. On the basis of our results, we conclude that our methodology could be applied to authenticate these two closely related tuna species when used in a binary mix in tuna cans.
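The absolute-quantification route described above rests on standard curves. The Python sketch below illustrates that general idea only: the dilution series, Ct values, and the assumption that copy number is proportional to flesh mass are invented placeholders, not the paper's calibration data.

```python
# Hedged sketch of standard-curve (absolute) qPCR quantification: fit Ct vs.
# log10(copies) for each species marker, convert sample Ct values to copy
# numbers, and express them as a percentage mix. All numbers are invented.
import numpy as np

def fit_standard_curve(copies, ct):
    slope, intercept = np.polyfit(np.log10(copies), ct, 1)
    return slope, intercept

def copies_from_ct(ct, slope, intercept):
    return 10 ** ((ct - intercept) / slope)

dilutions = np.array([1e2, 1e3, 1e4, 1e5, 1e6])
ct_bigeye    = np.array([33.1, 29.8, 26.4, 23.0, 19.7])   # ND2 marker (illustrative)
ct_yellowfin = np.array([32.6, 29.3, 25.9, 22.6, 19.2])   # COII marker (illustrative)

b = fit_standard_curve(dilutions, ct_bigeye)
y = fit_standard_curve(dilutions, ct_yellowfin)

n_bigeye = copies_from_ct(27.5, *b)        # Ct measured in a mixed can (toy value)
n_yellowfin = copies_from_ct(24.9, *y)
print("bigeye fraction: %.1f%%" % (100 * n_bigeye / (n_bigeye + n_yellowfin)))
```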
Schultealbert, Caroline; Baur, Tobias; Schütze, Andreas; Sauerwald, Tilman
2018-03-01
Dedicated methods for quantification and identification of reducing gases based on model-based temperature-cycled operation (TCO) using a single commercial MOS gas sensor are presented. During high temperature phases the sensor surface is highly oxidized, yielding a significant sensitivity increase after switching to lower temperatures (differential surface reduction, DSR). For low concentrations, the slope of the logarithmic conductance during this low-temperature phase is evaluated and can directly be used for quantification. For higher concentrations, the time constant for reaching a stable conductance during the same low-temperature phase is evaluated. Both signals represent the reaction rate of the reducing gas on the strongly oxidized surface at this low temperature and provide a linear calibration curve, which is exceptional for MOS sensors. By determining these reaction rates on different low-temperature plateaus and applying pattern recognition, the resulting footprint can be used for identification of different gases. All methods are tested over a wide concentration range from 10 ppb to 100 ppm (4 orders of magnitude) for four different reducing gases (CO, H₂, ammonia and benzene) using randomized gas exposures.
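As a rough illustration of the two DSR read-outs (slope of the logarithmic conductance at low concentrations, time constant of the conductance rise at high concentrations), the following Python sketch computes both from synthetic low-temperature-phase responses. The toy signal model and all constants are assumptions, not the sensor physics or calibration data of the paper.

```python
# Illustrative DSR evaluation on synthetic low-temperature-phase conductance traces.
import numpy as np

def dsr_slope(t, conductance, window=(0.0, 2.0)):
    """Slope of ln(G) over an early window of the low-temperature phase."""
    m = (t >= window[0]) & (t <= window[1])
    slope, _ = np.polyfit(t[m], np.log(conductance[m]), 1)
    return slope

def dsr_time_constant(t, conductance):
    """Time to reach 63% of the total conductance change (simple tau proxy)."""
    g0, g_inf = conductance[0], conductance[-1]
    return t[np.argmax(conductance >= g0 + 0.63 * (g_inf - g0))]

t = np.linspace(0, 30, 600)                      # seconds in the low-temperature phase

# Low-concentration regime: ln(G) rises ~linearly, slope proportional to concentration (toy model).
for c_ppm in (0.05, 0.2, 0.8):
    g = 1e-6 * np.exp(0.1 * c_ppm * t)
    print("slope at %.2f ppm: %.4f" % (c_ppm, dsr_slope(t, g)))

# High-concentration regime: conductance saturates; evaluate the time constant instead.
g_sat = 1e-6 + 5e-5 * (1 - np.exp(-t / 4.0))
print("time constant ~ %.1f s" % dsr_time_constant(t, g_sat))
```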
Inoue, Koichi; Miyazaki, Yasuto; Unno, Keiko; Min, Jun Zhe; Todoroki, Kenichiro; Toyo'oka, Toshimasa
2016-01-01
In this study, we developed the stable isotope dilution hydrophilic interaction liquid chromatography with tandem mass spectrometry (HILIC-MS/MS) technique for the accurate, reasonable and simultaneous quantification of glutamic acid (Glu), glutamine (Gln), pyroglutamic acid (pGlu), γ-aminobutyric acid (GABA) and theanine in mouse brain tissues. The quantification of these analytes was accomplished using stable isotope internal standards and the HILIC separation mode to fully correct for intramolecular cyclization during electrospray ionization. It was shown that linear calibrations were available with high coefficients of correlation (r² > 0.999, range from 10 pmol/mL to 50 mol/mL). For application to theanine intake, the determination of Glu, Gln, pGlu, GABA and theanine in the hippocampus and central cortex tissues was performed using the developed method. In the region of the hippocampus, the concentration levels of Glu and pGlu were significantly reduced during reality-based theanine intake. Conversely, the concentration level of GABA increased. This result showed that transited theanine has an effect on the metabolic balance of Glu analogs in the hippocampus. Copyright © 2015 John Wiley & Sons, Ltd.
Assessment of parametric uncertainty for groundwater reactive transport modeling
Shi, Xiaoqing; Ye, Ming; Curtis, Gary P.; Miller, Geoffery L.; Meyer, Philip D.; Kohler, Matthias; Yabusaki, Steve; Wu, Jichun
2014-01-01
The validity of using Gaussian assumptions for model residuals in uncertainty quantification of a groundwater reactive transport model was evaluated in this study. Least squares regression methods explicitly assume Gaussian residuals, and the assumption leads to Gaussian likelihood functions, model parameters, and model predictions. While the Bayesian methods do not explicitly require the Gaussian assumption, Gaussian residuals are widely used. This paper shows that the residuals of the reactive transport model are non-Gaussian, heteroscedastic, and correlated in time; characterizing them requires using a generalized likelihood function such as the formal generalized likelihood function developed by Schoups and Vrugt (2010). For the surface complexation model considered in this study for simulating uranium reactive transport in groundwater, parametric uncertainty is quantified using the least squares regression methods and Bayesian methods with both Gaussian and formal generalized likelihood functions. While the least squares methods and Bayesian methods with Gaussian likelihood function produce similar Gaussian parameter distributions, the parameter distributions of Bayesian uncertainty quantification using the formal generalized likelihood function are non-Gaussian. In addition, predictive performance of formal generalized likelihood function is superior to that of least squares regression and Bayesian methods with Gaussian likelihood function. The Bayesian uncertainty quantification is conducted using the differential evolution adaptive metropolis (DREAM(zs)) algorithm; as a Markov chain Monte Carlo (MCMC) method, it is a robust tool for quantifying uncertainty in groundwater reactive transport models. For the surface complexation model, the regression-based local sensitivity analysis and Morris- and DREAM(ZS)-based global sensitivity analysis yield almost identical ranking of parameter importance. The uncertainty analysis may help select appropriate likelihood functions, improve model calibration, and reduce predictive uncertainty in other groundwater reactive transport and environmental modeling.
A marker-based watershed method for X-ray image segmentation.
Zhang, Xiaodong; Jia, Fucang; Luo, Suhuai; Liu, Guiying; Hu, Qingmao
2014-03-01
Digital X-ray images are the most frequent modality for both screening and diagnosis in hospitals. To facilitate subsequent analysis such as quantification and computer-aided diagnosis (CAD), it is desirable to exclude the image background. A marker-based watershed segmentation method was proposed to segment the background of X-ray images. The method consisted of six modules: image preprocessing, gradient computation, marker extraction, watershed segmentation from markers, region merging and background extraction. One hundred clinical direct radiograph X-ray images were used to validate the method. Manual thresholding and a multiscale gradient-based watershed method were implemented for comparison. The proposed method yielded a Dice coefficient of 0.964±0.069, which was better than that of manual thresholding (0.937±0.119) and that of the multiscale gradient-based watershed method (0.942±0.098). Special measures were adopted to decrease the computational cost, including removing the few pixels with the highest grayscale values via a percentile cutoff, calculating the gradient magnitude through simple operations, decreasing the number of markers by appropriate thresholding, and merging regions based on simple grayscale statistics. As a result, the processing time was at most 6 s even for a 3072×3072 image on a Pentium 4 PC with a 2.4 GHz CPU (4 cores) and 2 GB RAM, more than twice as fast as the multiscale gradient-based watershed method. The proposed method could be a potential tool for diagnosis and quantification of X-ray images. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
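A conceptually similar marker-based watershed pipeline can be sketched with scikit-image (version ≥0.19 assumed for skimage.segmentation.watershed). The percentile-based marker thresholds and the synthetic phantom are placeholders, and the region-merging module of the original method is omitted; this is not the authors' implementation.

```python
# Marker-based watershed background segmentation sketch (preprocess, gradient,
# markers, watershed, background extraction); region merging is omitted.
import numpy as np
from scipy import ndimage as ndi
from skimage import filters, segmentation

def segment_background(xray, fg_pct=75, bg_pct=25):
    img = ndi.median_filter(xray.astype(float), size=3)        # preprocessing
    grad = filters.sobel(img)                                   # gradient computation
    markers = np.zeros(img.shape, dtype=np.int32)               # marker extraction
    markers[img < np.percentile(img, bg_pct)] = 1               # background seed
    markers[img > np.percentile(img, fg_pct)] = 2               # foreground (body) seed
    labels = segmentation.watershed(grad, markers)              # watershed from markers
    return labels == 1                                          # background mask

# Toy usage on a synthetic "radiograph": a bright disc on a dark, noisy background.
rng = np.random.default_rng(0)
yy, xx = np.mgrid[:256, :256]
phantom = 1000.0 * ((yy - 128) ** 2 + (xx - 128) ** 2 < 80 ** 2) + 50.0
phantom += rng.normal(0, 20, phantom.shape)
mask = segment_background(phantom)
print("background fraction: %.2f" % mask.mean())
```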
2010-01-01
Introduction: Joint effusion is frequently associated with osteoarthritis (OA) flare-up and is an important marker of therapeutic response. This study aimed at developing and validating a fully automated system based on magnetic resonance imaging (MRI) for the quantification of joint effusion volume in knee OA patients. Methods: MRI examinations consisted of two axial sequences: a T2-weighted true fast imaging with steady-state precession and a T1-weighted gradient echo. An automated joint effusion volume quantification system using MRI was developed and validated (a) with calibrated phantoms (cylinder and sphere) and effusion from knee OA patients; (b) with assessment by manual quantification; and (c) by direct aspiration. Twenty-five knee OA patients with joint effusion were included in the study. Results: The automated joint effusion volume quantification was developed as a four-stage sequential process: bone segmentation, filtering of unrelated structures, segmentation of joint effusion, and subvoxel volume calculation. Validation experiments revealed excellent coefficients of variation with the calibrated cylinder (1.4%) and sphere (0.8%) phantoms. Comparison of the OA knee joint effusion volume assessed by the developed automated system and by manual quantification was also excellent (r = 0.98; P < 0.0001), as was the comparison with direct aspiration (r = 0.88; P = 0.0008). Conclusions: The newly developed fully automated MRI-based system provided precise quantification of OA knee joint effusion volume with excellent correlation with data from phantoms, a manual system, and joint aspiration. Such an automated system will be instrumental in improving the reproducibility/reliability of the evaluation of this marker in clinical application. PMID:20846392
Multiscale recurrence quantification analysis of order recurrence plots
NASA Astrophysics Data System (ADS)
Xu, Mengjia; Shang, Pengjian; Lin, Aijing
2017-03-01
In this paper, we propose a new method of multiscale recurrence quantification analysis (MSRQA) to analyze the structure of order recurrence plots. The MSRQA is based on order patterns over a range of time scales. Compared with conventional recurrence quantification analysis (RQA), the MSRQA can show richer and more recognizable information on the local characteristics of diverse systems which successfully describes their recurrence properties. Both synthetic series and stock market indexes exhibit their properties of recurrence at large time scales that quite differ from those at a single time scale. Some systems present more accurate recurrence patterns under large time scales. It demonstrates that the new approach is effective for distinguishing three similar stock market systems and showing some inherent differences.
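The following Python sketch conveys the basic MSRQA idea: coarse-grain the series at each time scale, encode windows as ordinal (order) patterns, mark a recurrence whenever two order patterns coincide, and compute a simple RQA measure per scale. The embedding dimension, scales, and test series are illustrative choices, not those of the paper.

```python
# Multiscale order-pattern recurrence sketch: recurrence rate per time scale.
import numpy as np

def coarse_grain(x, scale):
    n = len(x) // scale
    return x[:n * scale].reshape(n, scale).mean(axis=1)

def order_patterns(x, dim=3):
    """Encode each length-`dim` window by its ordinal (rank) pattern."""
    w = np.lib.stride_tricks.sliding_window_view(x, dim)
    ranks = np.argsort(w, axis=1)
    return ranks @ (dim ** np.arange(dim))      # one integer code per pattern

def recurrence_rate(x, dim=3, scale=1):
    pats = order_patterns(coarse_grain(x, scale), dim)
    rp = pats[:, None] == pats[None, :]         # order recurrence plot
    return rp.mean()

rng = np.random.default_rng(0)
noise = rng.standard_normal(1000)
ar1 = np.zeros(1000)
for i in range(1, 1000):                        # correlated (AR(1)) test series
    ar1[i] = 0.9 * ar1[i - 1] + rng.standard_normal()

for s in (1, 2, 4, 8):
    print("scale %d  white-noise RR %.3f  AR(1) RR %.3f"
          % (s, recurrence_rate(noise, scale=s), recurrence_rate(ar1, scale=s)))
```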
Crack Imaging and Quantification in Aluminum Plates with Guided Wave Wavenumber Analysis Methods
NASA Technical Reports Server (NTRS)
Yu, Lingyu; Tian, Zhenhua; Leckey, Cara A. C.
2015-01-01
Guided wavefield analysis methods for detection and quantification of crack damage in an aluminum plate are presented in this paper. New wavenumber components created by abrupt wave changes at the structural discontinuity are identified in the frequency-wavenumber spectra. It is shown that the new wavenumbers can be used to detect and characterize the crack dimensions. Two imaging based approaches, filter reconstructed imaging and spatial wavenumber imaging, are used to demonstrate how the cracks can be evaluated with wavenumber analysis. The filter reconstructed imaging is shown to be a rapid method to map the plate and any existing damage, but with less precision in estimating crack dimensions; while the spatial wavenumber imaging provides an intensity image of spatial wavenumber values with enhanced resolution of crack dimensions. These techniques are applied to simulated wavefield data, and the simulation based studies show that spatial wavenumber imaging method is able to distinguish cracks of different severities. Laboratory experimental validation is performed for a single crack case to confirm the methods' capabilities for imaging cracks in plates.
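The frequency-wavenumber spectrum that underpins both imaging approaches can be obtained from a line scan of the wavefield with a 2-D FFT, as in the hedged Python sketch below; the synthetic single-mode wave, grid spacings, and velocities are assumptions for illustration, not the simulated or experimental data of the paper.

```python
# Frequency-wavenumber spectrum of a space-time wavefield u(t, x) via 2-D FFT.
import numpy as np

def fk_spectrum(u, dt, dx):
    """u: wavefield array (n_t, n_x). Returns |U(f, k)| plus frequency and wavenumber axes."""
    U = np.fft.fftshift(np.fft.fft2(u))
    f = np.fft.fftshift(np.fft.fftfreq(u.shape[0], dt))   # Hz
    k = np.fft.fftshift(np.fft.fftfreq(u.shape[1], dx))   # cycles per metre
    return np.abs(U), f, k

# Synthetic single-mode wave travelling with phase velocity c (illustrative values).
dt, dx, c, f0 = 1e-6, 1e-3, 5000.0, 250e3                 # s, m, m/s, Hz
t = np.arange(1024) * dt
x = np.arange(250) * dx
u = np.sin(2 * np.pi * f0 * (t[:, None] - x[None, :] / c))

amp, f, k = fk_spectrum(u, dt, dx)
i, j = np.unravel_index(np.argmax(amp), amp.shape)
print("dominant |f| = %.0f Hz, |k| = %.1f 1/m" % (abs(f[i]), abs(k[j])))
# New wavenumber ridges appearing away from the incident-mode dispersion line
# would indicate wave conversion or trapping at structural discontinuities.
```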
A Fatigue Crack Size Evaluation Method Based on Lamb Wave Simulation and Limited Experimental Data
He, Jingjing; Ran, Yunmeng; Liu, Bin; Yang, Jinsong; Guan, Xuefei
2017-01-01
This paper presents a systematic and general method for Lamb wave-based crack size quantification using finite element simulations and Bayesian updating. The method consists of construction of a baseline quantification model using finite element simulation data and Bayesian updating with limited Lamb wave data from target structure. The baseline model correlates two proposed damage sensitive features, namely the normalized amplitude and phase change, with the crack length through a response surface model. The two damage sensitive features are extracted from the first received S0 mode wave package. The model parameters of the baseline model are estimated using finite element simulation data. To account for uncertainties from numerical modeling, geometry, material and manufacturing between the baseline model and the target model, Bayesian method is employed to update the baseline model with a few measurements acquired from the actual target structure. A rigorous validation is made using in-situ fatigue testing and Lamb wave data from coupon specimens and realistic lap-joint components. The effectiveness and accuracy of the proposed method is demonstrated under different loading and damage conditions. PMID:28902148
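A compact Python sketch of the two-stage idea, a baseline response-surface fit on simulation data followed by Metropolis-based Bayesian updating with a few experimental points, is given below. The linear response-surface form, the prior, the noise level, and every number are invented for illustration; the paper's actual model and data are not reproduced here.

```python
# Baseline response surface + Metropolis updating sketch for crack-size quantification.
import numpy as np

rng = np.random.default_rng(1)

def response_surface(theta, amp, phase):
    a0, a1, a2 = theta
    return a0 + a1 * amp + a2 * phase

# Stage 1: baseline fit from (synthetic) simulation data.
amp_sim = rng.uniform(0.5, 1.0, 40)
phase_sim = rng.uniform(0.0, 0.3, 40)
crack_sim = 2.0 - 1.5 * amp_sim + 10.0 * phase_sim + rng.normal(0, 0.05, 40)
X = np.column_stack([np.ones_like(amp_sim), amp_sim, phase_sim])
theta0, *_ = np.linalg.lstsq(X, crack_sim, rcond=None)

# Stage 2: Metropolis update with a few experimental measurements (toy values).
amp_exp, phase_exp = np.array([0.8, 0.7]), np.array([0.12, 0.20])
crack_exp = np.array([2.1, 3.1])          # measured crack lengths (mm, invented)
sigma = 0.1                               # assumed measurement noise (mm)

def log_post(theta):
    resid = crack_exp - response_surface(theta, amp_exp, phase_exp)
    prior = -0.5 * np.sum((theta - theta0) ** 2 / 1.0 ** 2)      # prior around baseline
    return -0.5 * np.sum(resid ** 2) / sigma ** 2 + prior

theta, samples = theta0.copy(), []
for _ in range(20000):
    prop = theta + rng.normal(0, 0.05, 3)
    if np.log(rng.uniform()) < log_post(prop) - log_post(theta):
        theta = prop
    samples.append(theta)
post = np.array(samples)[5000:]            # discard burn-in
print("updated coefficients:", post.mean(axis=0).round(2))
```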
Remily-Wood, Elizabeth R; Benson, Kaaron; Baz, Rachid C; Chen, Y Ann; Hussein, Mohamad; Hartley-Brown, Monique A; Sprung, Robert W; Perez, Brianna; Liu, Richard Z; Yoder, Sean J; Teer, Jamie K; Eschrich, Steven A; Koomen, John M
2014-10-01
Quantitative MS assays for Igs are compared with existing clinical methods in samples from patients with plasma cell dyscrasias, for example, multiple myeloma (MM). Using LC-MS/MS data, Ig constant region peptides, and transitions were selected for LC-MRM MS. Quantitative assays were used to assess Igs in serum from 83 patients. RNA sequencing and peptide-based LC-MRM are used to define peptides for quantification of the disease-specific Ig. LC-MRM assays quantify serum levels of Igs and their isoforms (IgG1-4, IgA1-2, IgM, IgD, and IgE, as well as kappa (κ) and lambda (λ) light chains). LC-MRM quantification has been applied to single samples from a patient cohort and a longitudinal study of an IgE patient undergoing treatment, to enable comparison with existing clinical methods. Proof-of-concept data for defining and monitoring variable region peptides are provided using the H929 MM cell line and two MM patients. LC-MRM assays targeting constant region peptides determine the type and isoform of the involved Ig and quantify its expression; the LC-MRM approach has improved sensitivity compared with the current clinical method, but slightly higher inter-assay variability. Detection of variable region peptides is a promising way to improve Ig quantification, which could produce a dramatic increase in sensitivity over existing methods, and could further complement current clinical techniques. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Ali, Arslan; Haq, Faraz Ul; Ul Arfeen, Qamar; Sharma, Khaga Raj; Adhikari, Achyut; Musharraf, Syed Ghulam
2017-10-01
Diabetes is a major global health problem which requires new studies for its prevention and control. Scoparia dulcis, a herbal product, is widely used for treatment of diabetes. Recent studies demonstrate coixol as a potent and nontoxic insulin secretagog from S. dulcis. This study focuses on developing two quantitative methods of coixol in S. dulcis methanol-based extracts. Quantification of coixol was performed using high-performance liquid chromatography-tandem mass spectrometry (method 1) and high-performance liquid chromatography-ultraviolet detection (method 2) with limits of detection of 0.26 and 11.6 pg/μL, respectively, and limits of quantification of 0.78 and 35.5 pg/μL, respectively. S. dulcis is rich in coixol content with values of 255.5 ± 2.1 mg/kg (method 1) and 220.4 ± 2.9 mg/kg (method 2). Excellent linearity with determination coefficients >0.999 was achieved for calibration curves from 10 to 7500 ng/mL (method 1) and from 175 to 7500 ng/mL (method 2). Good accuracy (bias < -8.6%) and precision (RSD < 8.5%) were obtained for both methods. Thus, they can be employed to analyze coixol in plant extracts and herbal formulations. Copyright © 2017 John Wiley & Sons, Ltd.
Comparative quantification of human intestinal bacteria based on cPCR and LDR/LCR.
Tang, Zhou-Rui; Li, Kai; Zhou, Yu-Xun; Xiao, Zhen-Xian; Xiao, Jun-Hua; Huang, Rui; Gu, Guo-Hao
2012-01-21
To establish a multiple detection method based on comparative polymerase chain reaction (cPCR) and ligase detection reaction (LDR)/ligase chain reaction (LCR) to quantify intestinal bacterial components. Comparative quantification of 16S rDNAs from different intestinal bacterial components was used to quantify multiple intestinal bacteria. The 16S rDNAs of different bacteria were amplified simultaneously by cPCR. LDR/LCR was then examined to perform the genotyping and quantification. Two beneficial (Bifidobacterium, Lactobacillus) and three conditionally pathogenic bacteria (Enterococcus, Enterobacterium and Eubacterium) were used in this detection. With cloned standard bacterial 16S rDNAs, standard curves were prepared to validate the quantitative relations between the ratio of the original concentrations of two templates and the ratio of the fluorescence signals of their final ligation products. Internal controls were added to monitor the whole detection workflow. The quantity ratio between pairs of bacteria was tested. cPCR and LDR revealed clear linear correlations with standard DNAs, but cPCR and LCR did not. In the sample test, the distributions of the quantity ratio between each pair of bacterial species were obtained. There were significant differences among these distributions in the total samples. However, these distributions of the quantity ratio between each pair of bacteria remained stable across groups divided by age or sex. The detection method in this study can be used to conduct multiple intestinal bacteria genotyping and quantification, and to monitor human intestinal health status as well.
Spainhour, John Christian G; Janech, Michael G; Schwacke, John H; Velez, Juan Carlos Q; Ramakrishnan, Viswanathan
2014-01-01
Matrix-assisted laser desorption/ionization time-of-flight (MALDI-TOF) coupled with stable isotope standards (SIS) has been used to quantify native peptides. This MALDI-TOF peptide quantification approach has difficulty quantifying samples containing peptides with ion currents in overlapping spectra. In these overlapping spectra the currents sum together, which modifies the peak heights and makes normal SIS estimation problematic. An approach using Gaussian mixtures based on known physical constants to model the isotopic cluster of a known compound is proposed here. The characteristics of this approach are examined for single and overlapping compounds. The approach is compared to two commonly used SIS quantification methods for single compounds, namely the Peak Intensity method and the Riemann sum area under the curve (AUC) method. For studying the characteristics of the Gaussian mixture method, Angiotensin II, Angiotensin-2-10, and Angiotensin-1-9 and their associated SIS peptides were used. The findings suggest that the Gaussian mixture method has characteristics similar to those of the two comparison methods for estimating the quantity of isolated isotopic clusters for single compounds. All three methods were tested using MALDI-TOF mass spectra collected for peptides of the renin-angiotensin system. The Gaussian mixture method accurately estimated the native to labeled ratio of several isolated angiotensin peptides (5.2% error in ratio estimation) with estimation errors similar to those calculated using the peak intensity and Riemann sum AUC methods (5.9% and 7.7%, respectively). For overlapping angiotensin peptides (where the other two methods are not applicable), the estimation error of the Gaussian mixture was 6.8%, which is within the acceptable range. In summary, for single compounds the Gaussian mixture method is equivalent or marginally superior compared to the existing methods of peptide quantification and is capable of quantifying overlapping (convolved) peptides within the acceptable margin of error.
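To make the Gaussian-mixture idea concrete, the Python sketch below fits two isotopic clusters (native and isotope-labelled peptide) with Gaussian peaks at fixed ~1.003 Da spacings and recovers the native/labelled ratio from the fitted scale factors. The isotope abundances, masses, peak width, and label mass shift are illustrative assumptions, not the physical constants or peptides used by the authors.

```python
# Gaussian-mixture fit of two overlapping isotopic clusters; the ratio of the
# fitted scale factors estimates the native-to-labelled peptide ratio.
import numpy as np
from scipy.optimize import curve_fit

SPACING = 1.00336                           # average isotope spacing (Da), charge 1
ISO = np.array([1.0, 0.55, 0.20, 0.06])     # assumed relative isotope abundances

def cluster(mz, mono, scale, sigma):
    out = np.zeros_like(mz)
    for i, a in enumerate(ISO):
        out += scale * a * np.exp(-0.5 * ((mz - mono - i * SPACING) / sigma) ** 2)
    return out

def two_clusters(mz, s_native, s_labeled, sigma, mono_native, delta_label):
    return (cluster(mz, mono_native, s_native, sigma)
            + cluster(mz, mono_native + delta_label, s_labeled, sigma))

# Synthetic overlapping spectrum: label shifted by only 2 Da so clusters overlap.
mz = np.linspace(1045, 1055, 2000)
truth = two_clusters(mz, 1.0, 0.5, 0.05, 1046.5, 2.0)
noisy = truth + np.random.default_rng(0).normal(0, 0.005, mz.size)

p0 = [0.8, 0.8, 0.06, 1046.5, 2.0]
popt, _ = curve_fit(two_clusters, mz, noisy, p0=p0, maxfev=5000)
print("native/labelled ratio ~ %.2f" % (popt[0] / popt[1]))
```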
Event-based analysis of free-living behaviour.
Granat, Malcolm H
2012-11-01
The quantification of free-living physical activities is important in understanding how physical activity and sedentary behaviour impact on health and also on how interventions might modify free-living behaviour to enhance health. Quantification, and the terminology used, has in many ways been determined by the choice of measurement technique. The inter-related issues around measurement devices and terminology used are explored. This paper proposes a terminology and a systematic approach for the analysis of free-living activity information using event-based activity data. The event-based approach uses a flexible hierarchical classification of events and, dependent on the research question, analysis can then be undertaken on a selection of these events. The quantification of free-living behaviour is therefore the result of the analysis on the patterns of these chosen events. The application of this approach is illustrated with results from a range of published studies by our group showing how event-based analysis provides a flexible yet robust method of addressing the research question(s) and provides a deeper insight into free-living behaviour. It is proposed that it is through event-based analysis we can more clearly understand how behaviour is related to health and also how we can produce more relevant outcome measures.
2011-01-01
Background: Botrytis cinerea is a phytopathogenic fungus responsible for the disease known as gray mold, which causes substantial losses of fruits at postharvest. This fungus is often present as a latent infection, and an apparently healthy fruit can deteriorate suddenly due to the development of this infection. For this reason, rapid and sensitive methods are necessary for its detection and quantification. This article describes the development of an indirect competitive enzyme-linked immunosorbent assay (ELISA) for quantification of B. cinerea in apple (Red Delicious), table grape (pink Moscatel), and pear (William's) tissues. Results: The method was based on competition for the binding site of monoclonal antibodies between B. cinerea antigens present in fruit tissues and B. cinerea purified antigens immobilized by a crosslinking agent onto the surface of the microtiter plates. The method was validated considering parameters such as selectivity, linearity, precision, accuracy and sensitivity. The calculated detection limit was 0.97 μg mL⁻¹ of B. cinerea antigens. The immobilized antigen was perfectly stable for at least 4 months, ensuring the reproducibility of the assay. The fungus was detected and quantified in all of the fruits tested before rot was visible. Results were compared with a DNA quantification method and these studies showed good correlation. Conclusions: The developed method allows detection of B. cinerea in asymptomatic fruits and offers the advantages of low cost, easy operation, and short analysis time, supporting its possible application in the phytosanitary programs of the fruit industry worldwide. PMID:21970317
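Competitive ELISA data of this kind are commonly evaluated with a four-parameter logistic (4PL) calibration. The hedged Python sketch below shows that generic workflow (fit the 4PL, invert it for unknowns); the absorbance values and parameters are invented and do not come from this study.

```python
# Generic competitive-ELISA calibration: 4PL fit and inverse read-out.
import numpy as np
from scipy.optimize import curve_fit

def four_pl(x, a, b, c, d):
    """a: upper asymptote, d: lower asymptote, c: midpoint (IC50), b: slope."""
    return d + (a - d) / (1.0 + (x / c) ** b)

def invert_4pl(y, a, b, c, d):
    return c * ((a - d) / (y - d) - 1.0) ** (1.0 / b)

conc = np.array([0.5, 1, 5, 10, 50, 100, 500])                 # antigen standards (µg/mL, invented)
absorb = np.array([1.82, 1.75, 1.40, 1.15, 0.62, 0.45, 0.22])  # optical densities (invented)

popt, _ = curve_fit(four_pl, conc, absorb, p0=[1.9, 1.0, 20.0, 0.1], maxfev=10000)
print("estimated concentration for OD 0.9: %.1f µg/mL" % invert_4pl(0.9, *popt))
```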
Zhang, Xiao-Hua; Wu, Hai-Long; Wang, Jian-Yao; Tu, De-Zhu; Kang, Chao; Zhao, Juan; Chen, Yao; Miu, Xiao-Xia; Yu, Ru-Qin
2013-05-01
This paper describes the use of second-order calibration for the development of an HPLC-DAD method to quantify nine polyphenols in five kinds of honey samples. The sample treatment procedure was simplified effectively relative to traditional procedures. Baseline drift was also overcome by treating the drift as additional factor(s), alongside the analytes of interest, in the mathematical model. The contents of polyphenols obtained by the alternating trilinear decomposition (ATLD) method have been successfully used to distinguish different types of honey. The method shows good linearity (r>0.99), speed (t<7.60 min), and accuracy, and is promising as a routine strategy for identification and quantification of polyphenols in complex matrices. Copyright © 2012 Elsevier Ltd. All rights reserved.
Miranda, Nahieh Toscano; Sequinel, Rodrigo; Hatanaka, Rafael Rodrigues; de Oliveira, José Eduardo; Flumignan, Danilo Luiz
2017-04-01
Benzene, toluene, ethylbenzene, and xylenes are some of the most hazardous constituents found in commercial gasoline samples; therefore, these components must be monitored to avoid toxicological problems. We propose a new routine method of ultrafast gas chromatography coupled to flame ionization detection for the direct determination of benzene, toluene, ethylbenzene, and xylenes in commercial gasoline. The method is based on external standard calibration to quantify each compound and was validated for linearity, detection and quantification limits, precision, and accuracy. The analysis time was less than 3.2 min, with separation and quantification of all target compounds in commercial gasoline samples. Ultrafast gas chromatography is a promising alternative to official analytical techniques. Government laboratories could consider using this method for quality control. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Sobhanardakani, S; Farmany, A; Abbasi, S; Cheraghi, J; Hushmandfar, R
2013-03-01
A new kinetic method has been developed for the determination of nitrite in fruit juice samples. The method is based on the catalytic effect of nitrite on the oxidation of Nile Blue A (NBA) by KBrO₃ in sulfuric acid medium. The optimum conditions obtained are 1.2 mM sulfuric acid, 0.034 mM NBA, 2.8 × 10⁻³ M KBrO₃, a reaction temperature of 20 °C, and a reaction time of 100 s, with measurement at 595.5 nm. Under the optimized conditions, the method allowed the quantification of nitrite in a range of 0.2-800 μg/mL with a detection limit of 0.02 μg/mL. The method was applied to the determination of nitrite in 15 brands of fruit juice samples.
Luminol-Based Chemiluminescent Signals: Clinical and Non-clinical Application and Future Uses
Khan, Parvez; Idrees, Danish; Moxley, Michael A.; Corbett, John A.; Ahmad, Faizan; von Figura, Guido; Sly, William S.; Waheed, Abdul
2015-01-01
Chemiluminescence (CL) is an important method for quantification and analysis of various macromolecules. A wide range of CL agents such as luminol, hydrogen peroxide, fluorescein, dioxetanes and derivatives of oxalate, and acridinium dyes are used according to their biological specificity and utility. This review describes the application of luminol chemiluminescence (LCL) in forensic, biomedical, and clinical sciences. LCL is a very useful detection method due to its selectivity, simplicity, low cost, and high sensitivity. LCL has a wide range of applications, including quantification and detection of macro- and micromolecules such as proteins, carbohydrates, DNA, and RNA. Luminol-based methods are used in environmental monitoring as biosensors, in the pharmaceutical industry for cellular localization and as biological tracers, and in reporter gene-based assays and several other immunoassays. Here, we also provide information about different compounds that may enhance or inhibit LCL, along with the effect of pH and concentration on LCL. This review covers most of the significant information related to the applications of luminol in different fields. PMID:24752935
Laboureur, Laurent; Guérineau, Vincent; Auxilien, Sylvie; Yoshizawa, Satoko; Touboul, David
2018-02-16
A method based on supercritical fluid chromatography coupled to high resolution mass spectrometry for the profiling of canonical and modified nucleosides was optimized, and compared to classical reverse-phase liquid chromatography in terms of separation, number of detected modified nucleosides, and sensitivity. Limits of detection and quantification were measured using a statistical method, and quantification of twelve nucleosides in a tRNA digest from E. coli was in good agreement with previously reported data. The results highlight the complementarity of both separation techniques to cover the broadest range of nucleoside modifications for forthcoming epigenetic studies. Copyright © 2017 Elsevier B.V. All rights reserved.
López-Heras, Isabel; Madrid, Yolanda; Cámara, Carmen
2014-06-01
In this work, we proposed an analytical approach based on asymmetrical flow field-flow fractionation combined with inductively coupled plasma mass spectrometry (AsFlFFF-ICP-MS) for the characterization and quantification of rutile titanium dioxide nanoparticles (TiO2NPs) in cosmetic and food products. AsFlFFF-ICP-MS separation of TiO2NPs was performed using 0.2% (w/v) SDS, 6% (v/v) methanol at pH 8.7 as the carrier solution. Two problems were addressed during TiO2NPs analysis by AsFlFFF-ICP-MS: size distribution determination and element quantification of the NPs. Two approaches were used for size determination: size calibration using polystyrene latex standards of known sizes and transmission electron microscopy (TEM). A method based on focused sonication for preparing NPs dispersions, followed by an on-line external calibration strategy based on AsFlFFF-ICP-MS using rutile TiO2NPs as standards, is presented here for the first time. The developed method suppressed non-specific interactions between NPs and the membrane, and overcame possible erroneous results obtained when quantification is performed using ionic Ti solutions. The applicability of the quantification method was tested on cosmetic products (moisturizing cream). Regarding validation, at the 95% confidence level, no significant differences were detected between the titanium concentrations in the moisturizing cream obtained after sample mineralization (3865±139 mg Ti/kg sample), by FIA-ICP-MS analysis prior to NPs extraction (3770±24 mg Ti/kg sample), and after using the optimized on-line calibration approach (3699±145 mg Ti/kg sample). Despite the high Ti content found in the studied food products (sugar glass and coffee cream), TiO2NPs were not detected. Copyright © 2014 Elsevier B.V. All rights reserved.
Xiang, Yun; Koomen, John M.
2012-01-01
Protein quantification with liquid chromatography-multiple reaction monitoring mass spectrometry (LC-MRM) has emerged as a powerful platform for assessing panels of biomarkers. In this study, direct infusion, using automated, chip-based nanoelectrospray ionization, coupled with MRM (DI-MRM) is used for protein quantification. Removal of the LC separation step increases the importance of evaluating the ratios between the transitions. Therefore, the effects of solvent composition, analyte concentration, spray voltage, and quadrupole resolution settings on fragmentation patterns have been studied using peptide and protein standards. After DI-MRM quantification was evaluated for standards, quantitative assays for the expression of heat shock proteins (HSPs) were translated from LC-MRM to DI-MRM for implementation in cell line models of multiple myeloma. Requirements for DI-MRM assay development are described. Then, the two methods are compared; criteria for effective DI-MRM analysis are reported based on the analysis of HSP expression in digests of whole cell lysates. The increased throughput of DI-MRM analysis is useful for rapid analysis of large batches of similar samples, such as time course measurements of cellular responses to therapy. PMID:22293045
Drzymala, Sarah S; Weiz, Stefan; Heinze, Julia; Marten, Silvia; Prinz, Carsten; Zimathies, Annett; Garbe, Leif-Alexander; Koch, Matthias
2015-05-01
Established maximum levels for the mycotoxin zearalenone (ZEN) in edible oil require monitoring by reliable analytical methods. Therefore, an automated SPE-HPLC online system based on dynamic covalent hydrazine chemistry has been developed. The SPE step comprises a reversible hydrazone formation by ZEN and a hydrazine moiety covalently attached to a solid phase. Seven hydrazine materials with different properties regarding the resin backbone, pore size, particle size, specific surface area, and loading have been evaluated. As a result, a hydrazine-functionalized silica gel was chosen. The final automated online method was validated and applied to the analysis of three maize germ oil samples including a provisionally certified reference material. Important performance criteria for the recovery (70-120 %) and precision (RSDr <25 %) as set by the Commission Regulation EC 401/2006 were fulfilled: The mean recovery was 78 % and RSDr did not exceed 8 %. The results of the SPE-HPLC online method were further compared to results obtained by liquid-liquid extraction with stable isotope dilution analysis LC-MS/MS and found to be in good agreement. The developed SPE-HPLC online system with fluorescence detection allows a reliable, accurate, and sensitive quantification (limit of quantification, 30 μg/kg) of ZEN in edible oils while significantly reducing the workload. To our knowledge, this is the first report on an automated SPE-HPLC method based on a covalent SPE approach.
Zhao, Xiangsheng; Wei, Jianhe; Yang, Meihua
2018-05-03
Morinda officinalis is an important herbal medicine and functional food, and its main constituents include anthraquinone and iridoid glycosides. Quantification of the main compounds is a necessary step to understand the quality and therapeutic properties of M. officinalis, but this has not yet been performed based on liquid chromatography/tandem mass spectrometry (LC-MS/MS). Analytes were extracted from M. officinalis by a reflux method. Ultrahigh-performance liquid chromatography coupled with triple quadrupole mass spectrometry (UPLC-QqQ-MS) using multiple reaction monitoring (MRM) mode was applied for quantification. Fragmentation pathways of deacetyl asperulosidic acid and rubiadin were investigated based on UPLC with quadrupole time-of-flight tandem mass spectrometry (Q/TOF-MS) in the MSE centroid mode. The method showed good linearity over a wide concentration range (R² ≥ 0.9930). The limits of quantification of the six compounds ranged from 2.6 to 27.57 ng/mL. The intra- and inter-day precisions of the investigated components exhibited an RSD within 4.5%, with mean recovery rates of 95.32-99.86%. Contents of the selected compounds in M. officinalis varied significantly depending on region. The fragmentation pathways of deacetyl asperulosidic acid and rubiadin were proposed. A selective and sensitive method was developed for determining six target compounds in M. officinalis by UPLC-MS/MS. Furthermore, the proposed method will be helpful for quality control and identification of the main compounds of M. officinalis.
An Excel-based implementation of the spectral method of action potential alternans analysis.
Pearman, Charles M
2014-12-01
Action potential (AP) alternans has been well established as a mechanism of arrhythmogenesis and sudden cardiac death. Proper interpretation of AP alternans requires a robust method of alternans quantification. Traditional methods of alternans analysis neglect higher order periodicities that may have greater pro-arrhythmic potential than classical 2:1 alternans. The spectral method of alternans analysis, already widely used in the related study of microvolt T-wave alternans, has also been used to study AP alternans. Software to meet the specific needs of AP alternans analysis is not currently available in the public domain. An AP analysis tool is implemented here, written in Visual Basic for Applications and using Microsoft Excel as a shell. This performs a sophisticated analysis of alternans behavior allowing reliable distinction of alternans from random fluctuations, quantification of alternans magnitude, and identification of which phases of the AP are most affected. In addition, the spectral method has been adapted to allow detection and quantification of higher order regular oscillations. Analysis of action potential morphology is also performed. A simple user interface enables easy import, analysis, and export of collated results. © 2014 The Author. Physiological Reports published by Wiley Periodicals, Inc. on behalf of the American Physiological Society and The Physiological Society.
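The core of the spectral method can be sketched in a few lines: take the power spectrum of a beat-to-beat series of an AP feature, read the alternans power at 0.5 cycles/beat, and compare it with a spectral noise band (k-score). The Python sketch below uses illustrative band limits and synthetic data; the Excel/VBA tool described above implements considerably more (phase-resolved analysis, higher-order periodicities), and the normalization here is one common convention, not necessarily the tool's.

```python
# Spectral-method alternans read-out on a synthetic beat-to-beat APD series.
import numpy as np

def spectral_alternans(series, noise_band=(0.33, 0.48)):
    x = np.asarray(series, float)
    x = x - x.mean()
    spec = np.abs(np.fft.rfft(x) / len(x)) ** 2           # normalized power spectrum
    freqs = np.fft.rfftfreq(len(x), d=1.0)                 # cycles per beat
    alt_power = spec[np.argmin(np.abs(freqs - 0.5))]       # power at the alternans frequency
    noise = spec[(freqs >= noise_band[0]) & (freqs < noise_band[1])]
    k_score = (alt_power - noise.mean()) / noise.std()     # significance of alternans
    magnitude = np.sqrt(max(alt_power - noise.mean(), 0.0))
    return k_score, magnitude

rng = np.random.default_rng(3)
beats = 128
apd = 200 + 2.0 * (-1.0) ** np.arange(beats) + rng.normal(0, 1.0, beats)  # ABAB pattern + noise
k, mag = spectral_alternans(apd)
print("k-score %.1f, alternans magnitude %.2f ms" % (k, mag))
```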
Targeted Quantification of Phosphorylation Dynamics in the Context of EGFR-MAPK Pathway.
Yi, Lian; Shi, Tujin; Gritsenko, Marina A; X'avia Chan, Chi-Yuet; Fillmore, Thomas L; Hess, Becky M; Swensen, Adam C; Liu, Tao; Smith, Richard D; Wiley, H Steven; Qian, Wei-Jun
2018-04-17
Large-scale phosphoproteomics with coverage of over 10,000 sites of phosphorylation has now been routinely achieved with advanced mass spectrometry (MS)-based workflows. However, accurate targeted MS-based quantification of phosphorylation dynamics, an important direction for gaining quantitative understanding of signaling pathways or networks, has been much less investigated. Herein, we report an assessment of the targeted workflow in the context of signal transduction pathways, using the epidermal growth factor receptor (EGFR)-mitogen-activated protein kinase (MAPK) pathway as our model. A total of 43 phosphopeptides from the EGFR-MAPK pathway were selected for the study. The recovery and sensitivity of two commonly used enrichment methods, immobilized metal affinity chromatography (IMAC) and titanium oxide (TiO2), combined with selected reaction monitoring (SRM)-MS were evaluated. The recovery of phosphopeptides by IMAC and TiO2 enrichment was quantified to be 38 ± 5% and 58 ± 20%, respectively, based on internal standards. Moreover, both enrichment methods provided comparable sensitivity from 1 to 100 μg starting peptides. Robust quantification was consistently achieved for most targeted phosphopeptides when starting with 25-100 μg peptides. However, the numbers of quantified targets significantly dropped when peptide samples were in the 1-25 μg range. Finally, IMAC-SRM was applied to quantify signaling dynamics of the EGFR-MAPK pathway in Hs578T cells following 10 ng/mL EGF treatment. The kinetics of phosphorylation clearly revealed early and late phases of phosphorylation, even for very low abundance proteins. These results demonstrate the feasibility of robust targeted quantification of phosphorylation dynamics for specific pathways, even starting with relatively small amounts of protein.
Targeted Quantification of Phosphorylation Dynamics in the Context of EGFR-MAPK Pathway
DOE Office of Scientific and Technical Information (OSTI.GOV)
Yi, Lian; Shi, Tujin; Gritsenko, Marina A.
2018-03-27
Large-scale phosphoproteomics with coverage of over 10,000 sites of phosphorylation has now been routinely achieved with advanced mass spectrometry (MS)-based workflows. However, accurate targeted MS-based quantification of phosphorylation dynamics, an important direction for gaining quantitative understanding of signaling pathways or networks, has been much less investigated. Herein, we report an assessment of the targeted workflow in the context of signal transduction pathways, using the epidermal growth factor receptor (EGFR)–mitogen-activated protein kinase (MAPK) pathway as our model. A total of 43 phosphopeptides from the EGFR–MAPK pathway were selected for the study. The recovery and sensitivity of a workflow consisting of two commonly used enrichment methods, immobilized metal affinity chromatography (IMAC) and titanium oxide (TiO2), combined with selected reaction monitoring (SRM)-MS, were evaluated. The recovery of phosphopeptides by IMAC and TiO2 enrichment was quantified to be 38 ± 5% and 58 ± 20%, respectively, based on internal standards. Moreover, both enrichment methods provided comparable sensitivity from 1-100 μg starting peptides. Robust quantification was consistently achieved for most targeted phosphopeptides when starting with 25-100 μg peptides. However, the numbers of quantified targets significantly dropped when peptide samples were in the 1-25 μg range. Finally, IMAC-SRM was applied to quantify signaling dynamics of the EGFR-MAPK pathway in Hs578T cells following 3 ng/mL EGF treatment. The kinetics of phosphorylation clearly revealed early and late phases of phosphorylation, even for very low abundance proteins. These results demonstrate the feasibility of robust targeted quantification of phosphorylation dynamics for specific pathways, even starting with relatively small amounts of protein.
PaCeQuant: A Tool for High-Throughput Quantification of Pavement Cell Shape Characteristics.
Möller, Birgit; Poeschl, Yvonne; Plötner, Romina; Bürstenbinder, Katharina
2017-11-01
Pavement cells (PCs) are the most frequently occurring cell type in the leaf epidermis and play important roles in leaf growth and function. In many plant species, PCs form highly complex jigsaw-puzzle-shaped cells with interlocking lobes. Understanding of their development is of high interest for plant science research because of their importance for leaf growth and hence for plant fitness and crop yield. Studies of PC development, however, are limited, because robust methods that enable automatic segmentation and quantification of PC shape parameters reflecting their cellular complexity are lacking. Here, we present our new ImageJ-based tool, PaCeQuant, which provides a fully automatic image analysis workflow for PC shape quantification. PaCeQuant automatically detects cell boundaries of PCs from confocal input images and enables manual correction of automatic segmentation results or direct import of manually segmented cells. PaCeQuant simultaneously extracts 27 shape features that include global, contour-based, skeleton-based, and PC-specific object descriptors. In addition, we included a method for classification and analysis of lobes at two-cell junctions and at three-cell junctions. We provide an R script for graphical visualization and statistical analysis. We validated PaCeQuant by extensive comparative analysis with manual segmentation and existing quantification tools and demonstrated its usability to analyze PC shape characteristics during development and between different genotypes. PaCeQuant thus provides a platform for robust, efficient, and reproducible quantitative analysis of PC shape characteristics that can easily be applied to study PC development in large data sets. © 2017 American Society of Plant Biologists. All Rights Reserved.
Garballo-Rubio, A; Soto-Chinchilla, J; Moreno, A; Zafra-Gómez, A
2017-04-01
Today, food security is one of the most important global issues, with food quality control and the identification of contaminants in foods and beverages being crucial for human health and safety. In this paper, a novel single-step method for the simultaneous determination of 3-monochloropropanediol (3-MCPD) and glycidyl esters in samples of winterized and non-winterized fish oil by gas chromatography-tandem mass spectrometry (GC-MS/MS) is validated. The method is based on alkaline hydrolysis of the esters at room temperature, using only 3-MCPD-d5 as internal standard, and a derivatization step with phenylboronic acid (PBA) at 90°C. The use of GC-MS/MS results in a simplified sample treatment and improvement of the limits of quantification and precision of the analytical method, with no need for additional concentration of the extracts. A backflush tee placed between two HP-5 MS UI columns (15 m × 0.25 µm × 0.25 mm) was used in order to minimize matrix effects and the peak shape degradation usually observed in routine analyses. The method was validated in winterized and non-winterized fish oil, achieving limits of quantification of 100 ng g⁻¹ and 50 ng g⁻¹ for 3-MCPD and glycidol, respectively. Method validation was accomplished by comparing our laboratory results with results obtained by an accredited reference laboratory (SGS Germany GmbH) and by calculating the recoveries obtained in an assay with spiked samples. For glycidol quantification, a mathematical equation was developed in order to compensate for the partial conversion of 3-MCPD into glycidol. This expression involves the quantification of 3-MBPD-d5 generated during the hydrolysis reaction. Copyright © 2016 Elsevier B.V. All rights reserved.
WE-AB-204-05: Harmonizing PET/CT Quantification in Multicenter Studies: A Case Study
DOE Office of Scientific and Technical Information (OSTI.GOV)
Marques da Silva, A; Fischer, A
2015-06-15
Purpose: To present the implementation of a strategy to harmonize FDG PET/CT quantification (SUV) performed with different scanner models and manufacturers. Methods: The strategy was based on Boellaard (2011) and the EARL FDG-PET/CT accreditation program, which propose quality control measurements for harmonizing scanner performance. A NEMA IEC Body phantom study was performed using four different devices: PHP-1 (Gemini TF Base, Philips); PHP-2 (Gemini GXL, Philips); GEH (Discovery 600, General Electric); SMS (Biograph Hi-Rez 16, Siemens). The SUV Recovery Coefficient (RC) was calculated using the clinical protocol and other clinically relevant reconstruction parameters. The most appropriate reconstruction parameters (MARP) for SUV harmonization in each scanner are those which achieve the EARL harmonizing standards. They were identified using the lowest root mean square errors (RMSE). To evaluate the strategy's effectiveness, the Maximum Differences (MD) between the clinical and MARP RC values were calculated. Results: The reconstruction parameters that obtained the lowest RMSE are: FBP 5mm (PHP-1); LOR-RAMLA 2i0.008l (PHP-2); VuePointHD 2i32s10mm (GEH); and FORE+OSEM 4i8s6mm (SMS). Thus, to ensure that quantitative PET image measurements are interchangeable between these sites, images must be reconstructed with the above-mentioned parameters. However, a decoupling between the best image for PET/CT qualitative analysis and the best image for quantification studies was observed. The MD showed that the strategy was effective in reducing the variability of SUV quantification for small structures (<17 mm). Conclusion: The SUV harmonization strategy implemented with these devices was effective in reducing the variability of small-structure quantification, minimizing the inter-scanner and inter-institution differences in quantification. However, it is essential that, in addition to the harmonization of quantification, the standardization of the patient preparation methodology be maintained, in order to minimize SUV variability due to biological factors. Financial support by CAPES.
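The parameter-selection logic (lowest RMSE of sphere recovery coefficients against harmonizing targets) can be illustrated with the toy Python sketch below. All RC values, reconstruction labels, and target windows are invented placeholders, not EARL specifications or the study's measurements.

```python
# Toy sketch: pick the reconstruction whose sphere recovery coefficients (RC)
# lie closest (lowest RMSE) to the midpoints of assumed harmonizing RC windows.
import numpy as np

sphere_mm = [10, 13, 17, 22, 28, 37]
# Assumed target RC windows (lower, upper) per sphere -- placeholders only.
targets = np.array([(0.5, 0.8), (0.6, 0.9), (0.7, 1.0),
                    (0.8, 1.1), (0.9, 1.2), (0.9, 1.2)])
target_mid = targets.mean(axis=1)

candidate_recons = {
    "recon A (smooth)": np.array([0.55, 0.68, 0.82, 0.95, 1.02, 1.05]),
    "recon B (sharp)":  np.array([0.75, 0.92, 1.05, 1.15, 1.20, 1.22]),
    "recon C (FBP)":    np.array([0.48, 0.62, 0.78, 0.92, 1.00, 1.03]),
}

def rmse(rc):
    return float(np.sqrt(np.mean((rc - target_mid) ** 2)))

best = min(candidate_recons, key=lambda name: rmse(candidate_recons[name]))
for name, rc in candidate_recons.items():
    print(f"{name:18s} RMSE = {rmse(rc):.3f}")
print("harmonizing choice:", best)
```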
Fan, Jing; Yang, Haowen; Liu, Ming; Wu, Dan; Jiang, Hongrong; Zeng, Xin; Elingarami, Sauli; Li, Zhiyang; Li, Song; Liu, Hongna; He, Nongyue
2015-02-01
In this research, a novel method for relative fluorescent quantification of DNA based on Fe3O4@SiO2@Au gold-coated magnetic nanocomposites (GMNPs) and multiplex ligation-dependent probe amplification (MLPA) has been developed. Core-shell Fe3O4@SiO2@Au GMNPs were synthesized with the help of self-assembly, seed-mediated growth, and chemical reduction methods. By modifying the GMNP surface with streptavidin, we obtained a bead chip that can capture biotinylated probes. We then designed MLPA probes, tagged with biotin or Cy3, and target DNA on the basis of the human APP gene sequence. The products of the thermostable DNA ligase-mediated ligation reactions and PCR amplifications were incubated with SA-GMNPs. After washing, magnetic separation, and spotting, fluorescent scanning showed that our method can be used for relative quantitative analysis of the target DNA over the concentration range of 03004~0.5 µM.
Goldstein, Darlene R
2006-10-01
Studies of gene expression using high-density short oligonucleotide arrays have become a standard in a variety of biological contexts. Of the expression measures that have been proposed to quantify expression in these arrays, multi-chip-based measures have been shown to perform well. As gene expression studies increase in size, however, utilizing multi-chip expression measures is more challenging in terms of computing memory requirements and time. A strategic alternative to exact multi-chip quantification on a full large chip set is to approximate expression values based on subsets of chips. This paper introduces an extrapolation method, Extrapolation Averaging (EA), and a resampling method, Partition Resampling (PR), to approximate expression in large studies. An examination of properties indicates that subset-based methods can perform well compared with exact expression quantification. The focus is on short oligonucleotide chips, but the same ideas apply equally well to any array type for which expression is quantified using an entire set of arrays, rather than for only a single array at a time. Software implementing Partition Resampling and Extrapolation Averaging is under development as an R package for the BioConductor project.
Paul, Atish T; Vir, Sanjay; Bhutani, K K
2008-10-24
A new liquid chromatography-mass spectrometry (LC-MS)-based method coupled with pressurized liquid extraction (PLE) as an efficient sample preparation technique has been developed for the quantification and fingerprint analysis of Solanum xanthocarpum. Optimum separations of the samples were achieved on a Waters MSC-18 XTerra column, using 0.5% (v/v) formic acid in water (A) and acetonitrile (ACN):2-propanol:formic acid (94.5:5:0.5, v/v/v) (B) as the mobile phase. The separation was carried out using linear gradient elution with a flow rate of 1.0 mL/min. The gradient was: 0 min, 20% B; 14 min, 30% B; 20 min, 30% B; 27 min, 60% B, and the column was re-equilibrated to the initial condition (20% B) for 10 min prior to the next injection. The steroidal glycoalkaloids (SGAs), which are the major active constituents, were isolated as pure compounds from the crude methanolic extract of S. xanthocarpum by preparative LC-MS and, after characterization, were used as external standards for the development and validation of the method. Extracts prepared by conventional Soxhlet extraction, PLE and ultrasonication were used for analysis. The method was validated for repeatability, precision (intra- and inter-day variation), accuracy (recovery) and sensitivity (limit of detection and limit of quantitation). The purpose of the work was to develop a validated method which can be used for the quantification of SGAs in commercialized S. xanthocarpum products and for fingerprint analysis in their routine quality control.
López-Rayo, Sandra; Lucena, Juan J; Laghi, Luca; Cremonini, Mauro A
2011-12-28
The application of nuclear magnetic resonance (NMR) for the quality control of fertilizers based on Fe(3+), Mn(2+), and Cu(2+) chelates and complexes is precluded by the strong paramagnetism of these metals. Recently, a method based on the use of ferrocyanide has been described to remove iron from commercial iron chelates based on the o,o-EDDHA [ethylenediamine-N,N'-bis(2-hydroxyphenylacetic) acid] chelating agent for their analysis and quantification by NMR. The present work extended that procedure to other paramagnetic ions, manganese and copper, to other chelating agents, EDTA (ethylenediaminetetraacetic acid) and IDHA [N-(1,2-dicarboxyethyl)-d,l-aspartic acid], and to the complexing agents gluconate and heptagluconate. Results showed that the removal of the paramagnetic ions was complete, allowing us to obtain (1)H NMR spectra characterized by narrow peaks. The quantification of the ligands by NMR and high-performance liquid chromatography showed that they were completely recovered. The NMR analysis enabled detection and quantification of unknown impurities without the need for pure compounds as internal standards.
Ding, Huanjun; Johnson, Travis; Lin, Muqing; Le, Huy Q.; Ducote, Justin L.; Su, Min-Ying; Molloi, Sabee
2013-01-01
Purpose: Quantification of breast density based on three-dimensional breast MRI may provide useful information for the early detection of breast cancer. However, the field inhomogeneity can severely challenge the computerized image segmentation process. In this work, the effect of the bias field in breast density quantification has been investigated with a postmortem study. Methods: T1-weighted images of 20 pairs of postmortem breasts were acquired on a 1.5 T breast MRI scanner. Two computer-assisted algorithms were used to quantify the volumetric breast density. First, standard fuzzy c-means (FCM) clustering was used on raw images with the bias field present. Then, the coherent local intensity clustering (CLIC) method estimated and corrected the bias field during the iterative tissue segmentation process. Finally, FCM clustering was performed on the bias-field-corrected images produced by the CLIC method. The left–right correlation for breasts in the same pair was studied for both segmentation algorithms to evaluate the precision of the tissue classification. Finally, the breast densities measured with the three methods were compared to the gold standard tissue compositions obtained from chemical analysis. The linear correlation coefficient, Pearson's r, was used to evaluate the two image segmentation algorithms and the effect of the bias field. Results: The CLIC method successfully corrected the intensity inhomogeneity induced by the bias field. In left–right comparisons, the CLIC method significantly improved the slope and the correlation coefficient of the linear fitting for the glandular volume estimation. The left–right breast density correlation was also increased from 0.93 to 0.98. When compared with the percent fibroglandular volume (%FGV) from chemical analysis, results after bias field correction from both the CLIC and FCM algorithms showed improved linear correlation. As a result, the Pearson's r increased from 0.86 to 0.92 with the bias field correction. Conclusions: The investigated CLIC method significantly increased the precision and accuracy of breast density quantification using breast MRI images by effectively correcting the bias field. It is expected that a fully automated computerized algorithm for breast density quantification may have great potential in clinical MRI applications. PMID:24320536
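For illustration, a minimal two-class fuzzy c-means loop of the kind used in the first algorithm above is sketched below; it is not the CLIC method, performs no bias-field correction, and the intensities as well as the assignment of the darker class to fibroglandular tissue are assumptions.

```python
# Minimal two-class fuzzy c-means (FCM) sketch for fibroglandular/fat separation on MR
# intensities, illustrating the segmentation step. This is not the CLIC algorithm and
# performs no bias-field correction; it is a textbook FCM loop on synthetic voxels.
import numpy as np

def fuzzy_cmeans(x, n_clusters=2, m=2.0, n_iter=100, tol=1e-5, seed=0):
    x = x.reshape(-1, 1).astype(float)
    rng = np.random.default_rng(seed)
    u = rng.random((x.shape[0], n_clusters))
    u /= u.sum(axis=1, keepdims=True)           # fuzzy memberships sum to 1 per voxel
    for _ in range(n_iter):
        um = u ** m
        centers = (um.T @ x) / um.sum(axis=0)[:, None]
        d = np.abs(x - centers.T) + 1e-12        # voxel-to-center distances
        u_new = 1.0 / (d ** (2 / (m - 1)) * np.sum(d ** (-2 / (m - 1)), axis=1, keepdims=True))
        if np.max(np.abs(u_new - u)) < tol:
            u = u_new
            break
        u = u_new
    return centers.ravel(), u

# hypothetical breast voxel intensities: bright fat + darker fibroglandular tissue (T1w)
rng = np.random.default_rng(1)
voxels = np.concatenate([rng.normal(900, 60, 7000), rng.normal(400, 50, 3000)])
centers, u = fuzzy_cmeans(voxels)
fg_class = int(np.argmin(centers))               # assumption: fibroglandular = darker class
percent_fgv = 100.0 * np.sum(np.argmax(u, axis=1) == fg_class) / voxels.size
print(f"cluster centers: {np.sort(centers)}, %FGV ≈ {percent_fgv:.1f}%")
```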
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wang, Yueqi; Lava, Pascal; Reu, Phillip; ...
2015-12-23
This study presents a theoretical uncertainty quantification of displacement measurements by subset-based 2D-digital image correlation. A generalized solution to estimate the random error of displacement measurement is presented. The obtained solution suggests that the random error of displacement measurements is determined by the image noise, the summation of the intensity gradient in a subset, the subpixel part of displacement, and the interpolation scheme. The proposed method is validated with virtual digital image correlation tests.
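The flavor of such a random-error estimate can be sketched as follows; the formula used here is the commonly cited first-order result (noise variance over the sum of squared intensity gradients in the subset) rather than the paper's generalized solution, which also accounts for the subpixel displacement and interpolation scheme.

```python
# Hedged sketch of a first-order DIC random-error estimate: displacement variance is
# approximated as 2*sigma_noise^2 divided by the sum of squared intensity gradients over
# the subset. The subpixel-position and interpolation terms of the paper are omitted.
import numpy as np

def dic_displacement_std(subset, noise_std):
    """Estimate the random-error standard deviation of the x-displacement for a subset."""
    gx = np.gradient(subset.astype(float), axis=1)   # intensity gradient along x
    ssg = np.sum(gx ** 2)                            # summation of squared gradients
    return np.sqrt(2.0 * noise_std ** 2 / ssg)

rng = np.random.default_rng(0)
speckle = rng.normal(128, 30, size=(31, 31))          # synthetic speckle subset
print("predicted displacement noise (px):", dic_displacement_std(speckle, noise_std=2.0))
```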
NASA Astrophysics Data System (ADS)
Bicanic, D.; Skenderović, H.; Marković, K.; Dóka, O.; Pichler, L.; Pichler, G.; Luterotti, S.
2010-03-01
The combined use of a high-power light-emitting diode (LED) and a compact photoacoustic (PA) detector offers the possibility of rapid (no extraction needed), accurate (precision 1.5%) and inexpensive quantification of lycopene in different products derived from thermally processed tomatoes. The concentration of lycopene in the selected products ranges from a few mg to several tens of mg per 100 g fresh weight. HPLC was used as the well-established reference method.
Mass Median Plume Angle: A novel approach to characterize plume geometry in solution based pMDIs.
Moraga-Espinoza, Daniel; Eshaghian, Eli; Smyth, Hugh D C
2018-05-30
High-speed laser imaging (HSLI) is the preferred technique to characterize the geometry of the plume in pressurized metered dose inhalers (pMDIs). However, current methods do not allow for simulation of inhalation airflow and do not use drug mass quantification to determine plume angles. To address these limitations, a Plume Induction Port Evaluator (PIPE) was designed to characterize the plume geometry based on mass deposition patterns. The method is easily adaptable to current pMDI characterization methodologies, uses similar calculation methods, and can be used under airflow. The effects of airflow and formulation on the plume geometry were evaluated using PIPE and HSLI. Deposition patterns in PIPE were highly reproducible and log-normally distributed. The Mass Median Plume Angle (MMPA) was introduced as a new characterization parameter to describe the effective angle of the droplets deposited in the induction port. Plume angles determined by mass decreased significantly as the ethanol content increased, which correlates with the decrease in vapor pressure of the formulation. Additionally, airflow significantly decreased the angle of the plumes when the cascade impactor was operated under flow. PIPE is an alternative to laser-based characterization methods for evaluating the plume angle of pMDIs based on reliable drug quantification while simulating patient inhalation. Copyright © 2018. Published by Elsevier B.V.
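A sketch of how a mass median angle can be computed from binned deposition data is given below, by analogy with mass median aerodynamic diameter calculations; the angular bins and masses are hypothetical, and the actual PIPE geometry and calculation may differ.

```python
# Hedged sketch: computing a "mass median angle" from drug mass deposited in angular
# bins, defined here (by analogy with MMAD calculations) as the angle below which 50%
# of the deposited mass lies. Bin edges and masses are hypothetical.
import numpy as np

bin_upper_angle_deg = np.array([5, 10, 15, 20, 25, 30])   # upper edge of each angular bin
deposited_mass_ug   = np.array([5, 30, 80, 60, 20, 5])    # drug mass recovered per bin

cum_fraction = np.cumsum(deposited_mass_ug) / deposited_mass_ug.sum()
mmpa = np.interp(0.5, cum_fraction, bin_upper_angle_deg)   # linear interpolation at 50%
print(f"Mass Median Plume Angle ≈ {mmpa:.1f} degrees")
```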
TAPAS: tools to assist the targeted protein quantification of human alternative splice variants.
Yang, Jae-Seong; Sabidó, Eduard; Serrano, Luis; Kiel, Christina
2014-10-15
In proteomes of higher eukaryotes, many alternative splice variants can only be detected by their shared peptides. This makes it highly challenging to use peptide-centric mass spectrometry to distinguish and to quantify protein isoforms resulting from alternative splicing events. We have developed two complementary algorithms based on linear mathematical models to efficiently compute a minimal set of shared and unique peptides needed to quantify a set of isoforms and splice variants. Further, we developed a statistical method to estimate the splice variant abundances based on stable isotope labeled peptide quantities. The algorithms and databases are integrated in a web-based tool, and we have experimentally tested the limits of our quantification method using spiked proteins and cell extracts. The TAPAS server is available at http://davinci.crg.es/tapas/. Contact: luis.serrano@crg.eu or christina.kiel@crg.eu. Supplementary data are available at Bioinformatics online. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
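A generic sketch of the linear-model step, estimating isoform abundances from shared and unique peptide quantities by non-negative least squares, is shown below; it is not the TAPAS algorithm itself, and the peptide-to-isoform mapping and intensities are invented.

```python
# Generic sketch (not the TAPAS algorithm) of estimating splice-variant abundances from
# shared and unique peptide quantities with a linear model: each peptide signal is the
# sum of the abundances of the isoforms containing it, solved by non-negative least
# squares. The mapping matrix and quantities are hypothetical.
import numpy as np
from scipy.optimize import nnls

# rows = peptides, columns = isoforms; 1 if the peptide is present in that isoform
mapping = np.array([
    [1, 1, 0],   # peptide shared by isoforms A and B
    [1, 0, 1],   # peptide shared by isoforms A and C
    [0, 1, 0],   # peptide unique to isoform B
    [0, 0, 1],   # peptide unique to isoform C
    [1, 1, 1],   # peptide shared by all isoforms
], dtype=float)

peptide_quantities = np.array([13.0, 11.2, 2.9, 1.1, 14.1])  # e.g. fmol from labeled standards
abundances, residual = nnls(mapping, peptide_quantities)
for name, a in zip(["isoform A", "isoform B", "isoform C"], abundances):
    print(f"{name}: {a:.2f} fmol")
print("residual norm:", round(residual, 3))
```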
High throughput DNA damage quantification of human tissue with home-based collection device
DOE Office of Scientific and Technical Information (OSTI.GOV)
Costes, Sylvain V.; Tang, Jonathan; Yannone, Steven M.
Kits, methods and systems are described for providing a subject with information regarding the state of that subject's DNA damage, together with the collection, processing and analysis of samples.
Mehranian, Abolfazl; Arabi, Hossein; Zaidi, Habib
2016-04-15
In quantitative PET/MR imaging, attenuation correction (AC) of PET data is markedly challenged by the need of deriving accurate attenuation maps from MR images. A number of strategies have been developed for MRI-guided attenuation correction with different degrees of success. In this work, we compare the quantitative performance of three generic AC methods, including standard 3-class MR segmentation-based, advanced atlas-registration-based and emission-based approaches in the context of brain time-of-flight (TOF) PET/MRI. Fourteen patients referred for diagnostic MRI and (18)F-FDG PET/CT brain scans were included in this comparative study. For each study, PET images were reconstructed using four different attenuation maps derived from CT-based AC (CTAC) serving as reference, standard 3-class MR-segmentation, atlas-registration and emission-based AC methods. To generate 3-class attenuation maps, T1-weighted MRI images were segmented into background air, fat and soft-tissue classes followed by assignment of constant linear attenuation coefficients of 0, 0.0864 and 0.0975 cm(-1) to each class, respectively. A robust atlas-registration based AC method was developed for pseudo-CT generation using local weighted fusion of atlases based on their morphological similarity to target MR images. Our recently proposed MRI-guided maximum likelihood reconstruction of activity and attenuation (MLAA) algorithm was employed to estimate the attenuation map from TOF emission data. The performance of the different AC algorithms in terms of prediction of bones and quantification of PET tracer uptake was objectively evaluated with respect to reference CTAC maps and CTAC-PET images. Qualitative evaluation showed that the MLAA-AC method could sparsely estimate bones and accurately differentiate them from air cavities. It was found that the atlas-AC method can accurately predict bones with variable errors in defining air cavities. Quantitative assessment of bone extraction accuracy based on Dice similarity coefficient (DSC) showed that MLAA-AC and atlas-AC resulted in DSC mean values of 0.79 and 0.92, respectively, in all patients. The MLAA-AC and atlas-AC methods predicted mean linear attenuation coefficients of 0.107 and 0.134 cm(-1), respectively, for the skull compared to reference CTAC mean value of 0.138cm(-1). The evaluation of the relative change in tracer uptake within 32 distinct regions of the brain with respect to CTAC PET images showed that the 3-class MRAC, MLAA-AC and atlas-AC methods resulted in quantification errors of -16.2 ± 3.6%, -13.3 ± 3.3% and 1.0 ± 3.4%, respectively. Linear regression and Bland-Altman concordance plots showed that both 3-class MRAC and MLAA-AC methods result in a significant systematic bias in PET tracer uptake, while the atlas-AC method results in a negligible bias. The standard 3-class MRAC method significantly underestimated cerebral PET tracer uptake. While current state-of-the-art MLAA-AC methods look promising, they were unable to noticeably reduce quantification errors in the context of brain imaging. Conversely, the proposed atlas-AC method provided the most accurate attenuation maps, and thus the lowest quantification bias. Copyright © 2016 Elsevier Inc. All rights reserved.
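The Dice similarity coefficient used above to score bone extraction is straightforward to compute; the sketch below uses synthetic masks in place of the CT-derived reference and the MLAA/atlas predictions.

```python
# Sketch of the Dice similarity coefficient (DSC) used to score bone extraction:
# DSC = 2|A ∩ B| / (|A| + |B|) between a predicted bone mask and the CT-derived
# reference. The masks below are synthetic placeholders.
import numpy as np

def dice(mask_a, mask_b):
    a, b = mask_a.astype(bool), mask_b.astype(bool)
    denom = a.sum() + b.sum()
    return 2.0 * np.logical_and(a, b).sum() / denom if denom else 1.0

rng = np.random.default_rng(0)
reference_bone = rng.random((64, 64, 64)) > 0.9          # CTAC-derived bone voxels
predicted_bone = reference_bone.copy()
flip = rng.random(reference_bone.shape) > 0.98            # perturb ~2% of voxels
predicted_bone[flip] = ~predicted_bone[flip]
print(f"DSC = {dice(predicted_bone, reference_bone):.3f}")
```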
NASA Astrophysics Data System (ADS)
Jones, C. E.; Kato, S.; Nakashima, Y.; Kajii, Y.
2013-12-01
Biogenic emissions supply the largest fraction of non-methane volatile organic compounds (VOC) from the biosphere to the atmospheric boundary layer, and typically comprise a complex mixture of reactive terpenes. Due to this chemical complexity, achieving comprehensive measurements of biogenic VOC (BVOC) in air within a satisfactory time resolution is analytically challenging. To address this, we have developed a novel, fully automated Fast Gas Chromatography (Fast-GC) based technique to provide higher time resolution monitoring of monoterpenes (and selected other C9-C15 terpenes) during plant emission studies and in ambient air. To our knowledge, this is the first study to apply a Fast-GC based separation technique to achieve quantification of terpenes in air. Three chromatography methods have been developed for atmospheric terpene analysis under different sampling scenarios. Each method facilitates chromatographic separation of selected BVOC within a significantly reduced analysis time compared to conventional GC methods, whilst maintaining the ability to quantify individual monoterpene structural isomers. Using this approach, the C10-C15 BVOC composition of single plant emissions may be characterised within a ~ 14 min analysis time. Moreover, in situ quantification of 12 monoterpenes in unpolluted ambient air may be achieved within an ~ 11 min chromatographic separation time (increasing to ~ 19 min when simultaneous quantification of multiple oxygenated C9-C10 terpenoids is required, and/or when concentrations of anthropogenic VOC are significant). This corresponds to a two- to fivefold increase in measurement frequency compared to conventional GC methods. Here we outline the technical details and analytical capability of this chromatographic approach, and present the first in situ Fast-GC observations of 6 monoterpenes and the oxygenated BVOC linalool in ambient air. During this field deployment within a suburban forest ~ 30 km west of central Tokyo, Japan, the Fast-GC limit of detection with respect to monoterpenes was 4-5 ppt, and the agreement between Fast-GC and PTR-MS derived total monoterpene mixing ratios was consistent with previous GC/PTR-MS comparisons. The measurement uncertainties associated with the Fast-GC quantification of monoterpenes are ≤ 10%, while larger uncertainties (up to ~ 25%) are associated with the OBVOC and sesquiterpene measurements.
A subsystem identification method based on the path concept with coupling strength estimation
NASA Astrophysics Data System (ADS)
Magrans, Francesc Xavier; Poblet-Puig, Jordi; Rodríguez-Ferran, Antonio
2018-02-01
For complex geometries, the definition of the subsystems is not a straightforward task. We present here a subsystem identification method based on the direct transfer matrix, which represents the first-order paths. The key ingredient is a cluster analysis of the rows of the powers of the transfer matrix. These powers represent high-order paths in the system and are more affected than low-order paths by damping. Once subsystems are identified, the proposed approach also provides a quantification of the degree of coupling between subsystems. This information is relevant for deciding whether a subsystem may be analysed in a computer model or measured in the laboratory independently of the rest of the subsystems. The two features (subsystem identification and quantification of the degree of coupling) are illustrated by means of numerical examples: plates coupled by means of springs and rooms connected by means of a cavity.
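The key ingredient, clustering the rows of a power of the direct transfer matrix, can be sketched as follows; the toy transfer matrix, the row normalization, the path order and the coupling indicator are illustrative assumptions, not the authors' prescriptions.

```python
# Sketch of the key ingredient described above: cluster the rows of a power of the
# direct transfer matrix to group degrees of freedom into subsystems. The matrix is a
# toy block-structured example; normalization, the power n and the cluster count are
# assumptions.
import numpy as np
from scipy.cluster.vq import kmeans2

rng = np.random.default_rng(0)
block = lambda m: 0.5 * rng.random((m, m)) + 0.5 * np.eye(m)
# toy "direct transfer" matrix: two strongly coupled blocks, weak coupling between them
T = np.block([[block(4), 0.01 * rng.random((4, 4))],
              [0.01 * rng.random((4, 4)), block(4)]])

n = 4                                                    # path order: T**n = n-th order paths
Tn = np.linalg.matrix_power(T, n)
rows = Tn / np.linalg.norm(Tn, axis=1, keepdims=True)    # normalize rows before clustering

centroids, labels = kmeans2(rows, k=2, seed=0, minit="++")
print("subsystem label per degree of freedom:", labels)

# crude coupling-strength indicator between the identified subsystems (illustrative,
# not the paper's measure): off-diagonal share relative to within-subsystem terms
mask = labels[:, None] != labels[None, :]
coupling = Tn[mask].sum() / Tn[~mask].sum()
print(f"inter/intra subsystem ratio ≈ {coupling:.3f}")
```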
USDA-ARS's Scientific Manuscript database
A method for the identification and quantification of citrus limonoid glucosides in juices based upon high performance liquid chromatography (HPLC) separation coupled to post-column reaction with Ehrlich's reagent has been developed. This method utilizes a phenyl stationary phase and an isocratic ...
Sewer infiltration/inflow: long-term monitoring based on diurnal variation of pollutant mass flux.
Bares, V; Stránský, D; Sýkora, P
2009-01-01
The paper deals with a method for quantification of infiltrating groundwater based on the variation of the diurnal pollutant load and continuous water quality and quantity monitoring. Although the method has the potential to separate particular components of the wastewater hydrograph, several aspects of the method should be discussed. Therefore, the paper investigates the cost-effectiveness, the relevance of the pollutant load from surface waters (groundwater) and the influence of the measurement time step. These aspects were studied in an experimental catchment of the Prague sewer system, Czech Republic, over a three-month period. The results indicate a high contribution of parasitic waters to the night minimum discharge. Taking into account the uncertainty of the results and the time-consuming maintenance of the sensor, the principal advantages of the method are evaluated. The study demonstrates the promising potential of the discussed measurement concept for quantification of groundwater infiltrating into the sewer system. It is shown that the conventional approach is sufficient and cost-effective even in those catchments where a significant contribution of foul sewage to the night minima would have been assumed.
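A simplified version of the pollutant mass-flux balance can be sketched as follows; this is not the paper's exact formulation, and the night flow, pollutant flux and assumed foul-sewage concentration are hypothetical.

```python
# Simplified sketch of the diurnal pollutant mass-flux idea (not the paper's exact
# formulation): at the night minimum, the foul-sewage portion of the flow is estimated
# from the measured pollutant mass flux and an assumed foul-sewage concentration, and
# groundwater infiltration is taken as the remainder. All numbers are hypothetical.
night_flow_l_s = 12.0           # measured night minimum discharge [L/s]
cod_flux_g_s = 2.4              # measured pollutant (e.g. COD) mass flux at night [g/s]
cod_foul_mg_l = 600.0           # assumed COD concentration of undiluted foul sewage [mg/L]

foul_flow_l_s = cod_flux_g_s * 1000.0 / cod_foul_mg_l     # g/s -> mg/s, then / (mg/L)
infiltration_l_s = night_flow_l_s - foul_flow_l_s
print(f"estimated foul sewage: {foul_flow_l_s:.1f} L/s, "
      f"infiltration: {infiltration_l_s:.1f} L/s "
      f"({100 * infiltration_l_s / night_flow_l_s:.0f}% of night flow)")
```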
Lee, Da-Sheng
2010-01-01
Chip-based DNA quantification systems are widespread and used in many point-of-care applications. However, instruments for such applications may not be maintained or calibrated regularly. Since machine reliability is a key issue for normal operation, this study presents a system model of a real-time Polymerase Chain Reaction (PCR) machine to analyze the instrument design through numerical experiments. Based on model analysis, a systematic approach was developed to lower the variation of DNA quantification and achieve a robust design for a real-time PCR-on-a-chip system. Accelerated life testing was adopted to evaluate the reliability of the chip prototype. According to the life test plan, this proposed real-time PCR-on-a-chip system was simulated to work continuously for over three years with similar reproducibility in DNA quantification. This not only shows the robustness of the lab-on-a-chip system, but also verifies the effectiveness of our systematic method for achieving a robust design.
NASA Technical Reports Server (NTRS)
Goldman, Aaron
1999-01-01
The Langley-D.U. collaboration on the analysis of high-resolution infrared atmospheric spectra covered a number of important studies of trace gas identification and quantification from field spectra, and of spectral line parameter analysis. The collaborative work included: quantification and monitoring of trace gases from ground-based spectra available from various locations and seasons and from balloon flights; studies toward the identification and quantification of isotopic species, mostly oxygen and sulfur isotopes; searches for new species in the available spectra; updates of spectroscopic line parameters, by combining laboratory and atmospheric spectra with theoretical spectroscopy methods; study of trends in atmospheric trace constituents; and algorithm development, retrieval intercomparisons and automation of the analysis of NDSC spectra, for both column amounts and vertical profiles.
Demeke, Tigst; Dobnik, David
2018-07-01
The number of genetically modified organisms (GMOs) on the market is steadily increasing. Because of regulation of cultivation and trade of GMOs in several countries, there is pressure for their accurate detection and quantification. Today, DNA-based approaches are more popular for this purpose than protein-based methods, and real-time quantitative PCR (qPCR) is still the gold standard in GMO analytics. However, digital PCR (dPCR) offers several advantages over qPCR, making this new technique appealing also for GMO analysis. This critical review focuses on the use of dPCR for the purpose of GMO quantification and addresses parameters which are important for achieving accurate and reliable results, such as the quality and purity of DNA and reaction optimization. Three critical factors are explored and discussed in more depth: correct classification of partitions as positive, correctly determined partition volume, and dilution factor. This review could serve as a guide for all laboratories implementing dPCR. Most of the parameters discussed are applicable to fields other than purely GMO testing. Graphical abstract: There are generally three different options for absolute quantification of genetically modified organisms (GMOs) using digital PCR: droplet- or chamber-based and droplets in chambers. All have in common the distribution of reaction mixture into several partitions, which are all subjected to PCR and scored at the end-point as positive or negative. Based on these results, the GMO content can be calculated.
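The standard digital-PCR Poisson calculation below illustrates the three factors highlighted in the review (positive-partition classification, partition volume and dilution factor); all input values are hypothetical.

```python
# Standard digital-PCR Poisson calculation, illustrating the three factors the review
# highlights: the count of positive partitions, the partition volume and the dilution
# factor. All input values are hypothetical.
import math

n_partitions = 20000          # accepted partitions
n_positive = 3200             # partitions classified as positive
partition_volume_nl = 0.85    # assumed partition volume [nL]
dilution_factor = 10.0        # template dilution prior to partitioning

p = n_positive / n_partitions
lam = -math.log(1.0 - p)                      # mean copies per partition (Poisson)
copies_per_ul = lam / (partition_volume_nl * 1e-3) * dilution_factor
print(f"lambda = {lam:.4f} copies/partition")
print(f"target concentration ≈ {copies_per_ul:.0f} copies/µL in the undiluted sample")
```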
Duran, Maria Carolina; Willenbrock, Saskia; Müller, Jessika-M V; Nolte, Ingo; Feige, Karsten; Murua Escobar, Hugo
2013-04-01
Interleukin-12 (IL-12) and interferon gamma (IFN-γ) are key cytokines in immune-mediated equine melanoma therapy. Currently, a method for accurate simultaneous quantification of these equine cytokines is lacking. Therefore, we sought to establish an assay that allows for accurate and simultaneous quantification of equine IL-12 (eIL-12) and IFN-γ (eIFN-γ). Several antibodies were evaluated for cross-reactivity to eIL-12 and eIFN-γ and were used to establish a bead-based Luminex assay, which was subsequently applied to quantify cytokine concentrations in biological samples. Cytokine detection ranged from 31.5-5,000 pg/ml and 15-10,000 pg/ml for eIL-12 and eIFN-γ, respectively. eIL-12 was detected in supernatants of stimulated peripheral blood mononuclear cells (PBMCs) and supernatants/cell lysates of eIL-12 expression plasmid-transfected cells. Low or undetectable cytokine concentrations were measured in negative controls. In equine serum samples, the mean measured eIL-12 concentration was 1,374 ± 8 pg/ml. The bead-based assay and an eIFN-γ ELISA, both used to measure eIFN-γ concentrations, gave similar results. The results demonstrate, to our knowledge for the first time, that cross-reactive antibody pairs to eIL-12 and eIFN-γ and Luminex bead-based technology allow for accurate, simultaneous and multiplexed quantification of these key cytokines in biological samples.
A Constrained Genetic Algorithm with Adaptively Defined Fitness Function in MRS Quantification
NASA Astrophysics Data System (ADS)
Papakostas, G. A.; Karras, D. A.; Mertzios, B. G.; Graveron-Demilly, D.; van Ormondt, D.
MRS signal quantification is a rather involved procedure and has attracted the interest of the medical engineering community regarding the development of computationally efficient methodologies. Significant contributions based on Computational Intelligence tools, such as Neural Networks (NNs), have demonstrated good performance, but not without drawbacks already discussed by the authors. On the other hand, a preliminary application of Genetic Algorithms (GA) to the peak detection problem encountered in MRS quantification using the Voigt line shape model has already been reported in the literature by the authors. This paper investigates a novel constrained genetic algorithm involving a generic and adaptively defined fitness function, which extends the simple genetic algorithm methodology to the case of noisy signals. The applicability of this new algorithm is scrutinized through experiments on artificial MRS signals corrupted with noise, regarding its signal-fitting capabilities. Although extensive experiments with real-world MRS signals are necessary, the performance shown herein illustrates the method's potential to be established as a generic MRS metabolite quantification procedure.
Xia, Yun; Yan, Shuangqian; Zhang, Xian; Ma, Peng; Du, Wei; Feng, Xiaojun; Liu, Bi-Feng
2017-03-21
Digital loop-mediated isothermal amplification (dLAMP) is an attractive approach for absolute quantification of nucleic acids with high sensitivity and selectivity. Theoretical and numerical analysis of dLAMP provides necessary guidance for the design and analysis of dLAMP devices. In this work, a mathematical model was proposed on the basis of the Monte Carlo method and the theories of Poisson statistics and chemometrics. To examine the established model, we fabricated a spiral chip with 1200 uniform and discrete reaction chambers (9.6 nL) for absolute quantification of pathogenic DNA samples by dLAMP. Under the optimized conditions, dLAMP analysis on the spiral chip realized quantification of nucleic acids spanning over 4 orders of magnitude in concentration, with sensitivity as low as 8.7 × 10-2 copies/μL, in 40 min. The experimental results were consistent with the proposed mathematical model, which could provide a useful guideline for future development of dLAMP devices.
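A minimal Monte Carlo check of the digital quantification principle is sketched below; it is not the authors' model, and simply verifies that randomly distributing molecules over 1200 chambers of 9.6 nL reproduces the Poisson concentration estimate.

```python
# Minimal Monte Carlo sketch (not the authors' model): distribute template molecules at
# random over 1200 equal 9.6 nL chambers, count positive chambers, and recover the input
# concentration with the Poisson estimator used in digital assays.
import numpy as np

rng = np.random.default_rng(0)
n_chambers = 1200
chamber_volume_ul = 9.6e-3                 # 9.6 nL expressed in µL
true_conc_copies_ul = 50.0                 # hypothetical template concentration

n_molecules = rng.poisson(true_conc_copies_ul * chamber_volume_ul * n_chambers)
chamber_of_each = rng.integers(0, n_chambers, size=n_molecules)
positives = np.unique(chamber_of_each).size

lam = -np.log(1.0 - positives / n_chambers)
estimated_conc = lam / chamber_volume_ul
print(f"positive chambers: {positives}/{n_chambers}")
print(f"estimated concentration ≈ {estimated_conc:.1f} copies/µL (true {true_conc_copies_ul})")
```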
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ding, H; Zhou, B; Beidokhti, D
Purpose: To investigate the feasibility of accurate quantification of iodine mass thickness in contrast-enhanced spectral mammography. Methods: Experimental phantom studies were performed on a spectral mammography system based on Si strip photon-counting detectors. Dual-energy images were acquired using 40 kVp and a splitting energy of 34 keV with 3 mm Al pre-filtration. The initial calibration was done with glandular and adipose tissue equivalent phantoms of uniform thicknesses and iodine disk phantoms of various concentrations. A secondary calibration was carried out using the iodine signal obtained from the dual-energy decomposed images and the known background phantom thicknesses and densities. The iodine signal quantification method was validated using phantoms composed of a mixture of glandular and adipose materials, for various breast thicknesses and densities. Finally, the traditional dual-energy weighted subtraction method was also studied as a comparison. The measured iodine signal from both methods was compared to the known iodine concentrations of the disk phantoms to characterize the quantification accuracy. Results: There was good agreement between the iodine mass thicknesses measured using the proposed method and the known values. The root-mean-square (RMS) error was estimated to be 0.2 mg/cm2. The traditional weighted subtraction method also predicted a linear correlation between the measured signal and the known iodine mass thickness. However, the correlation slope and offset values were strongly dependent on the total breast thickness and density. Conclusion: The results of the current study suggest that iodine mass thickness can be accurately quantified with contrast-enhanced spectral mammography. The quantitative information can potentially improve the differentiation between benign and malignant lesions. Grant funding from Philips Medical Systems.
Liu, Jia; Guo, Jinchao; Zhang, Haibo; Li, Ning; Yang, Litao; Zhang, Dabing
2009-11-25
Various polymerase chain reaction (PCR) methods have been developed for the enforcement of genetically modified organism (GMO) labeling policies; among them, event-specific PCR detection based on the flanking sequence of the exogenous integration is the primary trend in GMO detection due to its high specificity. In this study, the 5' and 3' flanking sequences of the exogenous integration of MON89788 soybean were revealed by thermal asymmetric interlaced PCR. Event-specific PCR primers and a TaqMan probe were designed based upon the revealed 5' flanking sequence, and qualitative and quantitative PCR assays were established employing these primers and probes. In qualitative PCR, the limit of detection (LOD) was about 0.01 ng of genomic DNA, corresponding to 10 copies of the haploid soybean genome. In the quantitative PCR assay, the LOD was as low as two haploid genome copies, and the limit of quantification was five haploid genome copies. Furthermore, the developed PCR methods were validated in-house by five researchers, and the validation results indicated that the developed event-specific PCR methods can be used for the identification and quantification of MON89788 soybean and its derivatives.
Yang, Ting; Chen, Fei; Xu, Feifei; Wang, Fengliang; Xu, Qingqing; Chen, Yun
2014-09-25
P-glycoprotein (P-gp) can efflux drugs from cancer cells, and its overexpression is commonly associated with multi-drug resistance (MDR). Thus, the accurate quantification of P-gp would help predict the response to chemotherapy and the prognosis of breast cancer patients. An advanced liquid chromatography-tandem mass spectrometry (LC/MS/MS)-based targeted proteomics assay was developed and validated for monitoring P-gp levels in breast tissue. Tryptic peptide 368IIDNKPSIDSYSK380 was selected as a surrogate analyte for quantification, and immuno-depleted tissue extract was used as a surrogate matrix. Matched pairs of breast tissue samples from 60 patients who were suspected to have drug resistance were subjected to analysis. The levels of P-gp were quantified. Using data from normal tissue, we suggested a P-gp reference interval. The experimental values of tumor tissue samples were compared with those obtained from Western blotting and immunohistochemistry (IHC). The results indicated that the targeted proteomics approach was comparable to IHC but provided a lower limit of quantification (LOQ) and could afford more reliable results at low concentrations than the other two methods. LC/MS/MS-based targeted proteomics may allow the quantification of P-gp in breast tissue in a more accurate manner. Copyright © 2014 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Pagola, Iñigo; Funcia, Ibai; Sánchez, Marcelino; Gil, Javier; González-Vallejo, Victoria; Bedoya, Maxi; Orellana, Guillermo
2017-06-01
The work presented in this paper offers a robust, effective and economically competitive method for online detection and monitoring of the presence of molecular hydrogen in the heat transfer fluids of parabolic trough collector plants. The novel method is based on a specific fluorescent sensor according to the ES2425002 patent ("Method for the detection and quantification of hydrogen in a heat transfer fluid").
NASA Astrophysics Data System (ADS)
Guo, Mengmeng; Wu, Haiyan; Jiang, Tao; Tan, Zhijun; Zhao, Chunxia; Zheng, Guanchao; Li, Zhaoxin; Zhai, Yuxiu
2017-07-01
In this study, we established a comprehensive method for simultaneous identification and quantification of tetrodotoxin (TTX) in fresh pufferfish tissues and pufferfish-based products using liquid chromatography/quadrupole-linear ion trap mass spectrometry (LC-QqLIT-MS). TTX was extracted by 1% acetic acid-methanol, and most of the lipids were then removed by freezing lipid precipitation, followed by purification and concentration using immunoaffinity columns (IACs). Matrix effects were substantially reduced due to the high specificity of the IACs, and thus background interference was avoided. Quantitation was therefore performed using an external calibration curve with standards prepared in the mobile phase. The method was evaluated by fortifying samples at 1, 10, and 100 ng/g; the recoveries ranged from 75.8% to 107%, with a relative standard deviation of less than 15%. The TTX calibration curves were linear over the range of 1-1,000 μg/L, with a detection limit of 0.3 ng/g and a quantification limit of 1 ng/g. Using this method, samples can be further analyzed using an information-dependent acquisition (IDA) experiment, in the positive mode, from a single liquid chromatography-tandem mass spectrometry injection, which can provide an extra level of confirmation by matching the full product ion spectra acquired for a standard sample with those from an enhanced product ion (EPI) library. The scheduled multiple reaction monitoring method enabled TTX to be screened for, and TTX was positively identified using the IDA and EPI spectra. This method was successfully applied to analyze a total of 206 samples of fresh pufferfish tissues and pufferfish-based products. The results from this study show that the proposed method can be used to quantify and identify TTX in a single run with excellent sensitivity and reproducibility, and is suitable for the analysis of complex-matrix pufferfish samples.
NASA Astrophysics Data System (ADS)
Qi, Yulin; Müller, Miriam; Stokes, Caroline S.; Volmer, Dietrich A.
2018-04-01
LC-MS/MS is widely utilized today for quantification of vitamin D in biological fluids. Mass spectrometric assays for vitamin D, however, require very careful method optimization for precise, interference-free and accurate analyses. Here, we explore chemical derivatization and matrix-assisted laser desorption/ionization (MALDI) as a rapid alternative for quantitative measurement of 25-hydroxyvitamin D3 in human serum, and compare it to results from LC-MS/MS. The method implemented an automated imaging step of each MALDI spot to locate areas of high intensity, avoid sweet-spot phenomena, and thus improve precision. There was no statistically significant difference in vitamin D quantification between MALDI-MS/MS and LC-MS/MS: the mean ± standard deviation was 29.4 ± 10.3 ng/mL for MALDI-MS versus 30.3 ± 11.2 ng/mL for LC-MS/MS (P = 0.128) for the sum of the 25-hydroxyvitamin D epimers. The MALDI-based assay avoided time-consuming chromatographic separation steps and was thus much faster than the LC-MS/MS assay. It also consumed less sample, required no organic solvents, and was readily automated. In this proof-of-concept study, MALDI-MS readily demonstrated its potential for mass spectrometric quantification of vitamin D compounds in biological fluids.
Li, Zhao; Liu, Yong; Wei, Qingquan; Liu, Yuanjie; Liu, Wenwen; Zhang, Xuelian; Yu, Yude
2016-01-01
Absolute, precise quantification methods expand the scope of nucleic acids research and have many practical applications. Digital polymerase chain reaction (dPCR) is a powerful method for nucleic acid detection and absolute quantification. However, it requires thermal cycling and accurate temperature control, which are difficult in resource-limited conditions. Accordingly, isothermal methods, such as recombinase polymerase amplification (RPA), are more attractive. We developed a picoliter well array (PWA) chip with 27,000 consistently sized picoliter reactions (314 pL) for isothermal DNA quantification using digital RPA (dRPA) at 39°C. Sample loading using a scraping liquid blade was simple, fast, and required small reagent volumes (i.e., <20 μL). Passivating the chip surface using a methoxy-PEG-silane agent effectively eliminated cross-contamination during dRPA. Our creative optical design enabled wide-field fluorescence imaging in situ and both end-point and real-time analyses of picoliter wells in a 6-cm(2) area. It was not necessary to use scan shooting and stitch serial small images together. Using this method, we quantified serial dilutions of a Listeria monocytogenes gDNA stock solution from 9 × 10(-1) to 4 × 10(-3) copies per well with an average error of less than 11% (N = 15). Overall dRPA-on-chip processing required less than 30 min, which was a 4-fold decrease compared to dPCR, requiring approximately 2 h. dRPA on the PWA chip provides a simple and highly sensitive method to quantify nucleic acids without thermal cycling or precise micropump/microvalve control. It has applications in fast field analysis and critical clinical diagnostics under resource-limited settings.
Amoah, Isaac Dennis; Singh, Gulshan; Stenström, Thor Axel; Reddy, Poovendhree
2017-05-01
It is estimated that over a billion people are infected with soil-transmitted helminths (STHs) globally, with the majority occurring in tropical and subtropical regions of the world. The roundworm (Ascaris lumbricoides), whipworm (Trichuris trichiura), and hookworms (Ancylostoma duodenale and Necator americanus) are the main species infecting people. These infections are mostly acquired through exposure to faecally contaminated water, soil or food, and the risk of infection increases with wastewater and sludge reuse in agriculture. Different methods have been developed for the detection and quantification of STH eggs in environmental samples. However, there is no universally accepted technique, which creates a challenge for comparative assessments of helminth egg concentrations both in different sample matrices and between locations. This review presents a comparison of reported methodologies for the detection of STH eggs, an assessment of the relative performance of available detection methods and a discussion of new emerging techniques that could be applied for detection and quantification. It is based on a literature search using PubMed and Science Direct considering all geographical locations. Original research articles were selected based on their methodology and results sections. Methods reported in these articles were grouped into conventional, molecular and emerging techniques, and the main steps in each method were then compared and discussed. The inclusion of a dissociation step aimed at detaching helminth eggs from particulate matter was found to improve the recovery of eggs. Additionally, the selection and application of flotation solutions that take into account the relative densities of the eggs of different STH species also results in higher egg recovery. Generally, the use of conventional methods was shown to be laborious, time-consuming and prone to human error. The alternative use of nucleic acid-based techniques has improved the sensitivity of detection and made species-specific identification possible. However, these nucleic acid-based methods are expensive and less suitable in regions with limited resources and skills. The loop-mediated isothermal amplification method shows promise for application in these settings due to its simplicity and use of basic equipment. In addition, the development of imaging software for the detection and quantification of STHs shows promise to further reduce the human error associated with the analysis of environmental samples. It may be concluded that there is a need to comparatively assess the performance of different methods to determine their applicability in different settings as well as for use with different sample matrices (wastewater, sludge, compost, soil, vegetables, etc.). Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Rodigast, Maria; Mutzel, Anke; Herrmann, Hartmut
2017-03-01
Methylglyoxal forms oligomeric compounds in the atmospheric aqueous particle phase, which could establish a significant contribution to the formation of aqueous secondary organic aerosol (aqSOA). Thus far, no suitable method for the quantification of methylglyoxal oligomers is available despite the great effort spent for structure elucidation. In the present study a simplified method was developed to quantify heat-decomposable methylglyoxal oligomers as a sum parameter. The method is based on the thermal decomposition of oligomers into methylglyoxal monomers. Formed methylglyoxal monomers were detected using PFBHA (o-(2,3,4,5,6-pentafluorobenzyl)hydroxylamine hydrochloride) derivatisation and gas chromatography-mass spectrometry (GC/MS) analysis. The method development was focused on the heating time (varied between 15 and 48 h), pH during the heating process (pH = 1-7), and heating temperature (50, 100 °C). The optimised values of these method parameters are presented. The developed method was applied to quantify heat-decomposable methylglyoxal oligomers formed during the OH-radical oxidation of 1,3,5-trimethylbenzene (TMB) in the Leipzig aerosol chamber (LEipziger AerosolKammer, LEAK). Oligomer formation was investigated as a function of seed particle acidity and relative humidity. A fraction of heat-decomposable methylglyoxal oligomers of up to 8 % in the produced organic particle mass was found, highlighting the importance of those oligomers formed solely by methylglyoxal for SOA formation. Overall, the present study provides a new and suitable method for quantification of heat-decomposable methylglyoxal oligomers in the aqueous particle phase.
Takeno, Shinya; Bamba, Takeshi; Nakazawa, Yoshihisa; Fukusaki, Eiichiro; Okazawa, Atsushi; Kobayashi, Akio
2008-04-01
Commercial development of trans-1,4-polyisoprene from Eucommia ulmoides Oliver (EU-rubber) requires specific knowledge on selection of high-rubber-content lines and establishment of agronomic cultivation methods for achieving maximum EU-rubber yield. The development can be facilitated by high-throughput and highly sensitive analytical techniques for EU-rubber extraction and quantification. In this paper, we described an efficient EU-rubber extraction method, and validated that the accuracy was equivalent to that of the conventional Soxhlet extraction method. We also described a highly sensitive quantification method for EU-rubber by Fourier transform infrared spectroscopy (FT-IR) and pyrolysis-gas chromatography/mass spectrometry (PyGC/MS). We successfully applied the extraction/quantification method for study of seasonal changes in EU-rubber content and molecular weight distribution.
Ding, Huanjun; Johnson, Travis; Lin, Muqing; Le, Huy Q; Ducote, Justin L; Su, Min-Ying; Molloi, Sabee
2013-12-01
Quantification of breast density based on three-dimensional breast MRI may provide useful information for the early detection of breast cancer. However, the field inhomogeneity can severely challenge the computerized image segmentation process. In this work, the effect of the bias field in breast density quantification has been investigated with a postmortem study. T1-weighted images of 20 pairs of postmortem breasts were acquired on a 1.5 T breast MRI scanner. Two computer-assisted algorithms were used to quantify the volumetric breast density. First, standard fuzzy c-means (FCM) clustering was used on raw images with the bias field present. Then, the coherent local intensity clustering (CLIC) method estimated and corrected the bias field during the iterative tissue segmentation process. Finally, FCM clustering was performed on the bias-field-corrected images produced by the CLIC method. The left-right correlation for breasts in the same pair was studied for both segmentation algorithms to evaluate the precision of the tissue classification. Finally, the breast densities measured with the three methods were compared to the gold standard tissue compositions obtained from chemical analysis. The linear correlation coefficient, Pearson's r, was used to evaluate the two image segmentation algorithms and the effect of the bias field. The CLIC method successfully corrected the intensity inhomogeneity induced by the bias field. In left-right comparisons, the CLIC method significantly improved the slope and the correlation coefficient of the linear fitting for the glandular volume estimation. The left-right breast density correlation was also increased from 0.93 to 0.98. When compared with the percent fibroglandular volume (%FGV) from chemical analysis, results after bias field correction from both the CLIC and FCM algorithms showed improved linear correlation. As a result, the Pearson's r increased from 0.86 to 0.92 with the bias field correction. The investigated CLIC method significantly increased the precision and accuracy of breast density quantification using breast MRI images by effectively correcting the bias field. It is expected that a fully automated computerized algorithm for breast density quantification may have great potential in clinical MRI applications.
Dieringer, Matthias A; Deimling, Michael; Santoro, Davide; Wuerfel, Jens; Madai, Vince I; Sobesky, Jan; von Knobelsdorff-Brenkenhoff, Florian; Schulz-Menger, Jeanette; Niendorf, Thoralf
2014-01-01
Visual but subjective reading of longitudinal relaxation time (T1) weighted magnetic resonance images is commonly used for the detection of brain pathologies. For this non-quantitative measure, diagnostic quality depends on hardware configuration, imaging parameters, radio frequency transmission field (B1+) uniformity, as well as observer experience. Parametric quantification of the tissue T1 relaxation parameter offsets the propensity for these effects, but is typically time consuming. For this reason, this study examines the feasibility of rapid 2D T1 quantification using a variable flip angles (VFA) approach at magnetic field strengths of 1.5 Tesla, 3 Tesla, and 7 Tesla. These efforts include validation in phantom experiments and application for brain T1 mapping. T1 quantification included simulations of the Bloch equations to correct for slice profile imperfections, and a correction for B1+. Fast gradient echo acquisitions were conducted using three adjusted flip angles for the proposed T1 quantification approach that was benchmarked against slice profile uncorrected 2D VFA and an inversion-recovery spin-echo based reference method. Brain T1 mapping was performed in six healthy subjects, one multiple sclerosis patient, and one stroke patient. Phantom experiments showed a mean T1 estimation error of (-63±1.5)% for slice profile uncorrected 2D VFA and (0.2±1.4)% for the proposed approach compared to the reference method. Scan time for single slice T1 mapping including B1+ mapping could be reduced to 5 seconds using an in-plane resolution of (2×2) mm2, which equals a scan time reduction of more than 99% compared to the reference method. Our results demonstrate that rapid 2D T1 quantification using a variable flip angle approach is feasible at 1.5T/3T/7T. It represents a valuable alternative for rapid T1 mapping due to the gain in speed versus conventional approaches. This progress may serve to enhance the capabilities of parametric MR based lesion detection and brain tissue characterization.
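The underlying variable flip angle estimate can be sketched with the standard DESPOT1-style linearization; the slice-profile (Bloch simulation) and B1+ corrections described above are omitted, and the TR, flip angles and T1 are synthetic.

```python
# Sketch of the standard variable-flip-angle (DESPOT1-style) T1 estimate behind the
# approach above: S/sin(a) = E1 * S/tan(a) + M0*(1 - E1), with E1 = exp(-TR/T1).
# Slice-profile and B1+ corrections are omitted; signals are synthetic and noise-free.
import numpy as np

TR_ms, true_T1_ms, M0 = 10.0, 1200.0, 1.0
flip_deg = np.array([3.0, 10.0, 20.0])
alpha = np.deg2rad(flip_deg)

E1 = np.exp(-TR_ms / true_T1_ms)
signal = M0 * np.sin(alpha) * (1 - E1) / (1 - E1 * np.cos(alpha))   # SPGR signal equation

# linear fit: y = S/sin(a), x = S/tan(a); the slope equals E1
y = signal / np.sin(alpha)
x = signal / np.tan(alpha)
slope, intercept = np.polyfit(x, y, 1)
T1_est = -TR_ms / np.log(slope)
print(f"estimated T1 = {T1_est:.0f} ms (true {true_T1_ms:.0f} ms)")
```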
Determination of Microalgal Lipid Content and Fatty Acid for Biofuel Production
Chen, Zhipeng; Wang, Lingfeng
2018-01-01
Biofuels produced from microalgal biomass have received growing worldwide recognition as promising alternatives to conventional petroleum-derived fuels. Among the processes involved, the downstream refinement process for the extraction of lipids from biomass greatly influences the sustainability and efficiency of the entire biofuel system. This review summarizes and compares current techniques for the extraction and measurement of microalgal lipids, including gravimetric methods using organic solvents, CO2-based solvents, ionic liquids and switchable solvents, the Nile red lipid visualization method, the sulfo-phospho-vanillin method, and the thin-layer chromatography method. Each method has its own advantages and disadvantages. For example, the organic-solvent-based gravimetric method is the most widely used and is frequently employed as a reference standard to validate other methods, but it requires large amounts of sample, is time-consuming, makes solvent recovery expensive, and has low selectivity towards the desired products. Pretreatment approaches aimed at disrupting cells and supporting subsequent lipid extraction through bead beating, microwave treatment, ultrasonication, chemical methods, and enzymatic disruption are also introduced. Finally, the principles and procedures for the production and quantification of fatty acids are described in detail, involving the preparation of fatty acid methyl esters and their quantification and composition analysis by gas chromatography.
Hame, Yrjo; Angelini, Elsa D; Hoffman, Eric A; Barr, R Graham; Laine, Andrew F
2014-07-01
The extent of pulmonary emphysema is commonly estimated from CT scans by computing the proportional area of voxels below a predefined attenuation threshold. However, the reliability of this approach is limited by several factors that affect the CT intensity distributions in the lung. This work presents a novel method for emphysema quantification, based on parametric modeling of intensity distributions and a hidden Markov measure field model to segment emphysematous regions. The framework adapts to the characteristics of an image to ensure a robust quantification of emphysema under varying CT imaging protocols, and differences in parenchymal intensity distributions due to factors such as inspiration level. Compared to standard approaches, the presented model involves a larger number of parameters, most of which can be estimated from data, to handle the variability encountered in lung CT scans. The method was applied on a longitudinal data set with 87 subjects and a total of 365 scans acquired with varying imaging protocols. The resulting emphysema estimates had very high intra-subject correlation values. By reducing sensitivity to changes in imaging protocol, the method provides a more robust estimate than standard approaches. The generated emphysema delineations promise advantages for regional analysis of emphysema extent and progression.
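For reference, the standard threshold-based index that the abstract contrasts against (the proportional area or %LAA measure) is sketched below with synthetic Hounsfield-unit data; the hidden Markov measure field model itself is not reproduced.

```python
# Sketch of the standard threshold-based emphysema index the abstract contrasts against:
# the percentage of lung voxels below a fixed attenuation threshold (often %LAA-950).
# The HU values below are synthetic placeholders.
import numpy as np

rng = np.random.default_rng(0)
# synthetic lung voxels in Hounsfield units: mostly normal parenchyma, some emphysema
lung_hu = np.concatenate([rng.normal(-860, 40, 90_000), rng.normal(-975, 15, 10_000)])

threshold_hu = -950
laa_percent = 100.0 * np.mean(lung_hu < threshold_hu)
print(f"%LAA-{abs(threshold_hu)} = {laa_percent:.1f}% of lung voxels")
```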
2011-01-01
Background: Image segmentation is a crucial step in quantitative microscopy that helps to define regions of tissues, cells or subcellular compartments. Depending on the degree of user interaction, segmentation methods can be divided into manual, automated or semi-automated approaches. 3D image stacks usually require automated methods due to their large number of optical sections. However, certain applications benefit from manual or semi-automated approaches. Scenarios include the quantification of 3D images with poor signal-to-noise ratios or the generation of so-called ground truth segmentations that are used to evaluate the accuracy of automated segmentation methods. Results: We have developed Gebiss, an ImageJ plugin for the interactive segmentation, visualisation and quantification of 3D microscopic image stacks. We integrated a variety of existing plugins for threshold-based segmentation and volume visualisation. Conclusions: We demonstrate the application of Gebiss to the segmentation of nuclei in live Drosophila embryos and the quantification of neurodegeneration in Drosophila larval brains. Gebiss was developed as a cross-platform ImageJ plugin and is freely available on the web at http://imaging.bii.a-star.edu.sg/projects/gebiss/. PMID:21668958
Development of a method for detection and quantification of B. brongniartii and B. bassiana in soil
NASA Astrophysics Data System (ADS)
Canfora, L.; Malusà, E.; Tkaczuk, C.; Tartanus, M.; Łabanowska, B. H.; Pinzari, F.
2016-03-01
A culture-independent method based on qPCR was developed for the detection and quantification of two fungal inoculants in soil. The aim was to adapt a genotyping approach based on SSR (Simple Sequence Repeat) markers to the discriminative tracing of two different species of bio-inoculants in soil after their in-field release. Two entomopathogenic fungi, Beauveria bassiana and B. brongniartii, were traced and quantified in soil samples obtained from field trials. These two fungal species were used as biological agents in Poland to control Melolontha melolontha (European cockchafer), whose larvae live in the soil and menace horticultural crops. The specificity of the SSR markers was verified using controls consisting of: i) soil samples containing fungal spores of B. bassiana and B. brongniartii in known dilutions; ii) the DNA of the fungal microorganisms; iii) soil samples singly inoculated with each fungus species. An initial evaluation of the protocol was performed with analyses of soil DNA and mycelial DNA. Further, the simultaneous detection and quantification of B. bassiana and B. brongniartii in soil was achieved in field samples after application of the bio-inoculants. The protocol can be considered a relatively low-cost solution for the detection, identification and traceability of fungal bio-inoculants in soil.
Colletes, T C; Garcia, P T; Campanha, R B; Abdelnur, P V; Romão, W; Coltro, W K T; Vaz, B G
2016-03-07
The analytical performance of paper spray (PS) using a new sample insert approach based on paper with paraffin barriers (PS-PB) is presented. The paraffin barrier is made using a simple, fast and cheap method based on the stamping of paraffin onto a paper surface. Typical operating conditions of paper spray, such as the solvent volume applied to the paper surface and the paper substrate type, are evaluated. A paper substrate with paraffin barriers shows better performance in the analysis of a range of typical analytes when compared to conventional PS-MS using normal paper (PS-NP) and PS-MS using paper with two rounded corners (PS-RC). PS-PB was applied to detect sugars and their inhibitors in sugarcane bagasse liquors from a second-generation ethanol process. Moreover, PS-PB performed excellently for the quantification of glucose in hydrolysis liquors, with good linearity (R(2) = 0.99) and limits of detection (2.77 mmol L(-1)) and quantification (9.27 mmol L(-1)). The results are better than those for PS-NP and PS-RC. PS-PB also performed well when compared with the HPLC-UV method for glucose quantification in hydrolysis liquor samples.
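The linearity, LOD and LOQ figures reported above are of the kind produced by standard calibration statistics (3.3·σ/slope and 10·σ/slope conventions); the sketch below uses synthetic glucose standards, not the paper's data.

```python
# Generic calibration-statistics sketch of the kind behind linearity/LOD/LOQ figures
# (3.3*sigma/slope and 10*sigma/slope conventions). The glucose standard concentrations
# and PS-MS responses below are synthetic, not the paper's data.
import numpy as np

conc_mmol_l = np.array([0.0, 5.0, 10.0, 20.0, 40.0, 80.0])       # glucose standards
response    = np.array([0.02, 0.51, 1.03, 2.01, 4.10, 7.95])      # hypothetical signal

slope, intercept = np.polyfit(conc_mmol_l, response, 1)
pred = slope * conc_mmol_l + intercept
residual_sd = np.sqrt(np.sum((response - pred) ** 2) / (len(response) - 2))
r2 = 1 - np.sum((response - pred) ** 2) / np.sum((response - response.mean()) ** 2)

lod = 3.3 * residual_sd / slope
loq = 10.0 * residual_sd / slope
print(f"R^2 = {r2:.4f}, LOD ≈ {lod:.2f} mmol/L, LOQ ≈ {loq:.2f} mmol/L")
```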
Quantification of Humic Substances in Natural Water Using Nitrogen-Doped Carbon Dots.
Guan, Yan-Fang; Huang, Bao-Cheng; Qian, Chen; Yu, Han-Qing
2017-12-19
Dissolved organic matter (DOM) is ubiquitous in aqueous environments and plays a significant role in pollutant mitigation, transformation and organic geochemical circulation. DOM is also capable of forming carcinogenic byproducts in the disinfection treatment processes of drinking water. Thus, efficient methods for DOM quantification are highly desired. In this work, a novel sensor for rapid and selective detection of humic substances (HS), a key component of DOM, based on fluorescence quenching of nitrogen-doped carbon quantum dots was developed. The experimental results show that the HS detection range could be broadened to 100 mg/L with a detection limit of 0.2 mg/L. Moreover, the detection was effective within a wide pH range of 3.0 to 12.0, and the interference of ions on the HS measurement was negligible. A good detection result for real surface water samples further validated the feasibility of the developed detection method. Furthermore, a nonradiation electron transfer mechanism for quenching the nitrogen-doped carbon-dots fluorescence by HS was elucidated. In addition, we prepared a test paper and proved its effectiveness. This work provides a more efficient method for HS quantification than the frequently used modified Lowry method in terms of sensitivity and detection range.
Chen, Guoqiang; Hoptroff, Michael; Fei, Xiaoqing; Su, Ya; Janssen, Hans-Gerd
2013-11-22
A sensitive and specific ultra-high-performance liquid chromatography-tandem mass spectrometry (UHPLC-MS/MS) method was developed and validated for the measurement of climbazole deposition from hair care products onto artificial skin and human scalp. Deuterated climbazole was used as the internal standard. Atmospheric pressure chemical ionization (APCI) in positive mode was applied for the detection of climbazole. For quantification, multiple reaction monitoring (MRM) transition 293.0>69.0 was monitored for climbazole, and MRM transition 296.0>225.1 for the deuterated climbazole. The linear range ran from 4 to 2000 ng mL(-1). The limit of detection (LOD) and the limit of quantification (LOQ) were 1 ng mL(-1) and 4 ng mL(-1), respectively, which enabled quantification of climbazole on artificial skin and human scalp at ppb level (corresponding to 16 ng cm(-2)). For the sampling of climbazole from human scalp the buffer scrub method using a surfactant-modified phosphate buffered saline (PBS) solution was selected based on a performance comparison of tape stripping, the buffer scrub method and solvent extraction in in vitro studies. Using this method, climbazole deposition in in vitro and in vivo studies was successfully quantified. Copyright © 2013 Elsevier B.V. All rights reserved.
A Novel Weighted Kernel PCA-Based Method for Optimization and Uncertainty Quantification
NASA Astrophysics Data System (ADS)
Thimmisetty, C.; Talbot, C.; Chen, X.; Tong, C. H.
2016-12-01
It has been demonstrated that machine learning methods can be successfully applied to uncertainty quantification for geophysical systems through the use of the adjoint method coupled with kernel PCA-based optimization. In addition, it has been shown through weighted linear PCA how optimization with respect to both observation weights and feature space control variables can accelerate convergence of such methods. Linear machine learning methods, however, are inherently limited in their ability to represent features of non-Gaussian stochastic random fields, as they are based on only the first two statistical moments of the original data. Nonlinear spatial relationships and multipoint statistics leading to the tortuosity characteristic of channelized media, for example, are captured only to a limited extent by linear PCA. With the aim of coupling the kernel-based and weighted methods discussed, we present a novel mathematical formulation of kernel PCA, Weighted Kernel Principal Component Analysis (WKPCA), that both captures nonlinear relationships and incorporates the attribution of significance levels to different realizations of the stochastic random field of interest. We also demonstrate how new instantiations retaining defining characteristics of the random field can be generated using Bayesian methods. In particular, we present a novel WKPCA-based optimization method that minimizes a given objective function with respect to both feature space random variables and observation weights through which optimal snapshot significance levels and optimal features are learned. We showcase how WKPCA can be applied to nonlinear optimal control problems involving channelized media, and in particular demonstrate an application of the method to learning the spatial distribution of material parameter values in the context of linear elasticity, and discuss further extensions of the method to stochastic inversion.
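As a rough illustration of the core computation, the sketch below implements one common formulation of weighted kernel PCA in Python: an RBF kernel over snapshots, weighted centring, and a symmetric weighted eigenproblem. It is a minimal sketch under those assumptions, not the authors' WKPCA (the joint optimization over feature-space variables and observation weights, and the Bayesian generation of new realizations, are not shown); the kernel choice, gamma value and toy data are placeholders.

```python
import numpy as np

def rbf_kernel(X, gamma=1.0):
    # Pairwise squared distances -> Gaussian (RBF) kernel matrix
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-gamma * d2)

def weighted_kernel_pca(X, weights, n_components=2, gamma=1.0):
    """Weighted kernel PCA sketch: rows of X are realizations (snapshots),
    weights are non-negative significance levels that are normalised to 1."""
    w = np.asarray(weights, dtype=float)
    w = w / w.sum()
    K = rbf_kernel(X, gamma)
    # Weighted centring of the kernel matrix
    mean_rows = (K @ np.diag(w)).sum(axis=1, keepdims=True)  # E_w[k(x_i, .)]
    mean_all = float(w @ K @ w)                              # E_w[k(., .)]
    Kc = K - mean_rows - mean_rows.T + mean_all
    # Symmetric weighted eigenproblem: W^(1/2) Kc W^(1/2)
    S = np.sqrt(np.outer(w, w)) * Kc
    vals, vecs = np.linalg.eigh(S)
    idx = np.argsort(vals)[::-1][:n_components]
    return vals[idx], vecs[:, idx]

# toy usage: 50 realizations of a 100-dimensional field, equal weights
X = np.random.rand(50, 100)
eigvals, eigvecs = weighted_kernel_pca(X, np.ones(50), n_components=3)
```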
Liu, Ruolin; Dickerson, Julie
2017-11-01
We propose a novel method and software tool, Strawberry, for transcript reconstruction and quantification from RNA-Seq data under the guidance of genome alignment and independent of gene annotation. Strawberry consists of two modules: assembly and quantification. The novelty of Strawberry is that the two modules use different optimization frameworks but utilize the same data graph structure, which allows a highly efficient, expandable and accurate algorithm for dealing with large data. The assembly module parses aligned reads into splicing graphs, and uses network flow algorithms to select the most likely transcripts. The quantification module uses a latent class model to assign read counts from the nodes of splicing graphs to transcripts. Strawberry simultaneously estimates the transcript abundances and corrects for sequencing bias through an EM algorithm. Based on simulations, Strawberry outperforms Cufflinks and StringTie in terms of both assembly and quantification accuracies. In an evaluation on a real data set, the transcript expression estimated by Strawberry had the highest correlation with Nanostring probe counts, an independent experimental measure of transcript expression. Strawberry is written in C++14, and is available as open source software at https://github.com/ruolin/strawberry under the MIT license.
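The quantification step described above, fractionally assigning ambiguous read counts to transcripts and re-estimating abundances with EM, can be illustrated with a stripped-down sketch. This is not Strawberry's latent class model (no splicing-graph nodes, no sequencing-bias correction); the compatibility matrix and iteration count are illustrative assumptions.

```python
import numpy as np

def em_transcript_abundance(compat, n_iter=200):
    """Toy EM for transcript abundance.
    compat[r, t] = 1 if read (or read class) r is compatible with transcript t.
    Returns relative abundances theta (one entry per transcript)."""
    n_tx = compat.shape[1]
    theta = np.full(n_tx, 1.0 / n_tx)
    for _ in range(n_iter):
        # E-step: fractionally assign each read to its compatible transcripts
        probs = compat * theta
        probs /= probs.sum(axis=1, keepdims=True)
        # M-step: re-estimate abundances from the expected counts
        theta = probs.sum(axis=0)
        theta /= theta.sum()
    return theta

# usage: 3 transcripts; reads compatible with {0}, {0,1}, {1,2}, {2}
compat = np.array([[1, 0, 0],
                   [1, 1, 0],
                   [0, 1, 1],
                   [0, 0, 1]], dtype=float)
print(em_transcript_abundance(compat))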
Carbon Nanotubes Released from an Epoxy-Based Nanocomposite: Quantification and Particle Toxicity.
Schlagenhauf, Lukas; Buerki-Thurnherr, Tina; Kuo, Yu-Ying; Wichser, Adrian; Nüesch, Frank; Wick, Peter; Wang, Jing
2015-09-01
Studies combining both the quantification of free nanoparticle release and the toxicological investigations of the released particles from actual nanoproducts in a real-life exposure scenario are urgently needed, yet very rare. Here, a new measurement method was established to quantify the amount of free-standing and protruding multiwalled carbon nanotubes (MWCNTs) in the respirable fraction of particles abraded from a MWCNT-epoxy nanocomposite. The quantification approach involves the prelabeling of MWCNTs with lead ions, nanocomposite production, abrasion and collection of the inhalable particle fraction, and quantification of free-standing and protruding MWCNTs by measuring the concentration of released lead ions. In vitro toxicity studies for genotoxicity, reactive oxygen species formation, and cell viability were performed using A549 human alveolar epithelial cells and THP-1 monocyte-derived macrophages. The quantification experiment revealed that in the respirable fraction of the abraded particles, approximately 4000 ppm of the MWCNTs were released as exposed MWCNTs (which could contact lung cells upon inhalation) and approximately 40 ppm as free-standing MWCNTs in the worst-case scenario. The release of exposed MWCNTs was lower for nanocomposites containing agglomerated MWCNTs. The toxicity tests revealed that the abraded particles did not induce any acute cytotoxic effects.
Jeanneau, Laurent; Faure, Pierre
2010-09-01
The quantitative multimolecular approach (QMA) based on an exhaustive identification and quantification of molecules from the extractable organic matter (EOM) has been recently developed in order to investigate organic contamination in sediments by a more complete method than the restrictive quantification of target contaminants. Such an approach allows (i) the comparison between natural and anthropogenic inputs, (ii) between modern and fossil organic matter and (iii) the differentiation between several anthropogenic sources. However QMA is based on the quantification of molecules recovered by organic solvent and then analyzed by gas chromatography-mass spectrometry, which represent a small fraction of sedimentary organic matter (SOM). In order to extend the conclusions of QMA to SOM, radiocarbon analyses have been performed on organic extracts and decarbonated sediments. This analysis allows (i) the differentiation between modern biomass (contemporary (14)C) and fossil organic matter ((14)C-free) and (ii) the calculation of the modern carbon percentage (PMC). At the confluence between Fensch and Moselle Rivers, a catchment highly contaminated by both industrial activities and urbanization, PMC values in decarbonated sediments are well correlated with the percentage of natural molecular markers determined by QMA. It highlights that, for this type of contamination by fossil organic matter inputs, the conclusions of QMA can be scaled up to SOM. QMA is an efficient environmental diagnostic tool that leads to a more realistic quantification of fossil organic matter in sediments. Copyright 2010 Elsevier B.V. All rights reserved.
Moraleja, I; Mena, M L; Lázaro, A; Neumann, B; Tejedor, A; Jakubowski, N; Gómez-Gómez, M M; Esteban-Fernández, D
2018-02-01
Laser ablation inductively coupled plasma mass spectrometry (LA-ICP-MS) has emerged as a convenient technique for trace elemental imaging in tissue sections, providing elemental 2D distribution at a quantitative level. For quantification purposes, several approaches have been proposed in the literature in recent years, such as the use of CRMs or matrix-matched standards. The use of Isotope Dilution (ID) for quantification by LA-ICP-MS has also been described, being mainly useful for bulk analysis but not feasible for spatial measurements so far. In this work, a quantification method based on ID analysis was developed by printing isotope-enriched inks onto kidney slices from rats treated with antitumoral Pt-based drugs using a commercial ink-jet device, in order to perform an elemental quantification in different areas from bio-images. For the ID experiments, 194Pt-enriched platinum was used. The methodology was validated by deposition of natural Pt standard droplets with a known amount of Pt onto the surface of a control tissue, where even 50 pg of Pt could be quantified, with recoveries higher than 90%. The amount of Pt present in the whole kidney slices was quantified for cisplatin, carboplatin and oxaliplatin-treated rats. The results obtained were in accordance with those previously reported. The amount of Pt distributed between the medullary and cortical areas was also quantified, observing different behavior for the three drugs. Copyright © 2017 Elsevier B.V. All rights reserved.
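For readers unfamiliar with isotope dilution, the generic single-dilution IDMS relation below shows how the isotope ratio measured in the spiked sample yields the analyte amount. The symbols are generic textbook notation; the choice of 195Pt as the reference isotope is an assumption for illustration and is not stated in the abstract.

```latex
% Generic single isotope-dilution (IDMS) relation, not taken from this paper:
%   n_s   : amount of analyte element in the sample
%   n_sp  : amount of spike added (here, ^{194}Pt-enriched platinum)
%   R_m   : measured isotope ratio (e.g. ^{194}Pt/^{195}Pt) in the blend
%   R_s, R_sp : the same ratio in the sample (natural) and in the spike
%   b_s, b_sp : abundance of the reference isotope (^{195}Pt) in sample and spike
n_s \;=\; n_{sp}\,\frac{b_{sp}}{b_{s}}\;\frac{R_{sp}-R_{m}}{R_{m}-R_{s}}
```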
An Optimized Informatics Pipeline for Mass Spectrometry-Based Peptidomics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Wu, Chaochao; Monroe, Matthew E.; Xu, Zhe
2015-12-26
Comprehensive MS analysis of the peptidome, the intracellular and intercellular products of protein degradation, has the potential to provide novel insights on endogenous proteolytic processing and its utility in disease diagnosis and prognosis. Along with the advances in MS instrumentation, a plethora of proteomics data analysis tools have been applied for direct use in peptidomics; however, an evaluation of the currently available informatics pipelines for peptidomics data analysis has yet to be reported. In this study, we set out by evaluating the results of several popular MS/MS database search engines including MS-GF+, SEQUEST and MS-Align+ for peptidomics data analysis, followed by identification and label-free quantification using the well-established accurate mass and time (AMT) tag and newly developed informed quantification (IQ) approaches, both based on direct LC-MS analysis. Our results demonstrated that MS-GF+ outperformed both SEQUEST and MS-Align+ in identifying peptidome peptides. Using a database established from the MS-GF+ peptide identifications, both the AMT tag and IQ approaches provided significantly deeper peptidome coverage and fewer missing values for each individual data set than the MS/MS methods, while achieving robust label-free quantification. Besides having an excellent correlation with the AMT tag quantification results, IQ also provided slightly higher peptidome coverage than AMT. Taken together, we propose an optimal informatics pipeline combining MS-GF+ for initial database searching with IQ (or AMT) for identification and label-free quantification for high-throughput, comprehensive and quantitative peptidomics analysis.
Lu, Tzong-Shi; Yiao, Szu-Yu; Lim, Kenneth; Jensen, Roderick V; Hsiao, Li-Li
2010-07-01
The identification of differences in protein expression resulting from methodical variations is an essential component of the interpretation of true, biologically significant results. We used the Lowry and Bradford methods, the two most commonly used methods for protein quantification, to assess whether differential protein expression is a result of true biological or methodical variation. Differential protein expression patterns were assessed by western blot following protein quantification by the Lowry and Bradford methods. We observed significant variations in protein concentrations following assessment with the Lowry versus Bradford methods, using identical samples. Greater variations in protein concentration readings were observed over time and in samples with higher concentrations, with the Bradford method. Identical samples quantified using both methods yielded significantly different expression patterns on Western blot. We show for the first time that methodical variations observed in these protein assay techniques can potentially translate into differential protein expression patterns that can be falsely taken to be biologically significant. Our study therefore highlights the pivotal need to carefully consider methodical approaches to protein quantification in techniques that report quantitative differences.
Automatic computational labeling of glomerular textural boundaries
NASA Astrophysics Data System (ADS)
Ginley, Brandon; Tomaszewski, John E.; Sarder, Pinaki
2017-03-01
The glomerulus, a specialized bundle of capillaries, is the blood filtering unit of the kidney. Each human kidney contains about 1 million glomeruli. Structural damage in the glomerular micro-compartments gives rise to several renal conditions, the most severe of which is proteinuria, where excessive blood proteins flow freely into the urine. The sole way to confirm glomerular structural damage in renal pathology is by examining histopathological or immunofluorescence stained needle biopsies under a light microscope. However, this method is extremely tedious and time-consuming, and requires manual scoring of the number and volume of structures. Computational quantification of equivalent features promises to greatly ease this manual burden. The largest obstacle to computational quantification of renal tissue is the ability to recognize complex glomerular textural boundaries automatically. Here we present a computational pipeline that identifies glomerular boundaries with high precision and accuracy. The computational pipeline employs an integrated approach composed of Gabor filtering, Gaussian blurring, statistical F-testing, and distance transform, and performs significantly better than the standard Gabor-based textural segmentation method. Our integrated approach provides mean accuracy/precision of 0.89/0.97 on n = 200 Hematoxylin and Eosin (HE) glomerulus images, and mean accuracy/precision of 0.88/0.94 on n = 200 Periodic Acid Schiff (PAS) glomerulus images. The respective accuracy/precision of the Gabor filter bank-based method is 0.83/0.84 for HE and 0.78/0.80 for PAS. Our method will simplify computational partitioning of glomerular micro-compartments hidden within dense textural boundaries. Automatic quantification of glomeruli will streamline structural analysis in the clinic, and can help realize real-time diagnoses and interventions.
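The texture-energy idea behind this pipeline can be sketched briefly in Python with scikit-image and SciPy. This is a simplified illustration, not the authors' validated pipeline: the statistical F-testing step is replaced here by a crude global threshold, and the filter frequencies, orientations and smoothing sigma are arbitrary assumptions.

```python
import numpy as np
from skimage.filters import gabor, gaussian
from scipy import ndimage

def glomerular_texture_mask(image, frequencies=(0.1, 0.2, 0.3), n_theta=6,
                            blur_sigma=4.0, threshold=None):
    """Gabor filter bank -> Gaussian smoothing of the texture energy ->
    threshold -> distance transform of the resulting mask."""
    energy = np.zeros_like(image, dtype=float)
    for f in frequencies:
        for theta in np.linspace(0, np.pi, n_theta, endpoint=False):
            real, imag = gabor(image, frequency=f, theta=theta)
            energy += real**2 + imag**2          # accumulate filter-bank energy
    energy = gaussian(energy, sigma=blur_sigma)  # suppress pixel-level noise
    if threshold is None:
        threshold = energy.mean()                # crude stand-in for F-testing
    mask = energy > threshold
    dist = ndimage.distance_transform_edt(mask)  # distance to textural boundary
    return mask, dist

# toy usage on a random grayscale image; use a stained glomerulus image in practice
img = np.random.rand(128, 128)
mask, dist = glomerular_texture_mask(img)
```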
A quantitative polymerase chain reaction (qPCR) method for the detection of enterococci fecal indicator bacteria has been shown to be generally applicable for the analysis of temperate fresh (Great Lakes) and marine coastal waters and for providing risk-based determinations of wat...
Mellerup, Anders; Ståhl, Marie
2015-01-01
The aim of this article was to define the sampling level and method combination that captures antibiotic resistance at pig herd level utilizing qPCR antibiotic resistance gene quantification and culture-based quantification of antibiotic resistant coliform indicator bacteria. Fourteen qPCR assays for commonly detected antibiotic resistance genes were developed, and used to quantify antibiotic resistance genes in total DNA from swine fecal samples that were obtained using different sampling and pooling methods. In parallel, the number of antibiotic resistant coliform indicator bacteria was determined in the same swine fecal samples. The results showed that the qPCR assays were capable of detecting differences in antibiotic resistance levels in individual animals that the coliform bacteria colony forming units (CFU) could not. Also, the qPCR assays more accurately quantified antibiotic resistance genes when comparing individual sampling and pooling methods. qPCR on pooled samples was found to be a good representative for the general resistance level in a pig herd compared to the coliform CFU counts. It had significantly reduced relative standard deviations compared to coliform CFU counts in the same samples, and therefore differences in antibiotic resistance levels between samples were more readily detected. To our knowledge, this is the first study to describe sampling and pooling methods for qPCR quantification of antibiotic resistance genes in total DNA extracted from swine feces. PMID:26114765
Crombach, Anton; Cicin-Sain, Damjan; Wotton, Karl R; Jaeger, Johannes
2012-01-01
Understanding the function and evolution of developmental regulatory networks requires the characterisation and quantification of spatio-temporal gene expression patterns across a range of systems and species. However, most high-throughput methods to measure the dynamics of gene expression do not preserve the detailed spatial information needed in this context. For this reason, quantification methods based on image bioinformatics have become increasingly important over the past few years. Most available approaches in this field either focus on the detailed and accurate quantification of a small set of gene expression patterns, or attempt high-throughput analysis of spatial expression through binary pattern extraction and large-scale analysis of the resulting datasets. Here we present a robust, "medium-throughput" pipeline to process in situ hybridisation patterns from embryos of different species of flies. It bridges the gap between high-resolution and high-throughput image processing methods, enabling us to quantify graded expression patterns along the antero-posterior axis of the embryo in an efficient and straightforward manner. Our method is based on a robust enzymatic (colorimetric) in situ hybridisation protocol and rapid data acquisition through wide-field microscopy. Data processing consists of image segmentation, profile extraction, and determination of expression domain boundary positions using a spline approximation. It results in sets of measured boundaries sorted by gene and developmental time point, which are analysed in terms of expression variability or spatio-temporal dynamics. Our method yields integrated time series of spatial gene expression, which can be used to reverse-engineer developmental gene regulatory networks across species. It is easily adaptable to other processes and species, enabling the in silico reconstitution of gene regulatory networks in a wide range of developmental contexts.
Fee, Timothy; Downs, Crawford; Eberhardt, Alan; Zhou, Yong; Berry, Joel
2016-07-01
It is well documented that electrospun tissue engineering scaffolds can be fabricated with variable degrees of fiber alignment to produce scaffolds with anisotropic mechanical properties. Several attempts have been made to quantify the degree of fiber alignment within an electrospun scaffold using image-based methods. However, these methods are limited by the inability to produce a quantitative measure of alignment that can be used to make comparisons across publications. Therefore, we have developed a new approach to quantifying the alignment present within a scaffold from scanning electron microscopic (SEM) images. The alignment is determined by using the Sobel approximation of the image gradient to determine the distribution of gradient angles within an image. These data were fit to a von Mises distribution to find the dispersion parameter κ, which was used as a quantitative measure of fiber alignment. We fabricated four groups of electrospun polycaprolactone (PCL) + Gelatin scaffolds with alignments ranging from κ = 1.9 (aligned) to κ = 0.25 (random) and tested our alignment quantification method on these scaffolds. It was found that our alignment quantification method could distinguish between scaffolds of different alignments more accurately than two other published methods. Additionally, the alignment parameter κ was found to be a good predictor of the mechanical anisotropy of our electrospun scaffolds. The ability to quantify fiber alignment and make direct comparisons of scaffold fiber alignment across publications can reduce ambiguity between published results where cells are cultured on "highly aligned" fibrous scaffolds. This could have important implications for characterizing mechanics and cellular behavior on aligned tissue engineering scaffolds. © 2016 Wiley Periodicals, Inc. J Biomed Mater Res Part A: 104A: 1680-1686, 2016.
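The gradient-angle idea can be sketched compactly in Python. The sketch below is an assumption-laden illustration of the general approach (Sobel gradients, gradient-angle distribution, von Mises fit for the concentration parameter κ), not the published protocol: the edge-strength cutoff and the doubling of angles to handle the axial (period-π) nature of fiber orientation are choices made here for the example.

```python
import numpy as np
from scipy import ndimage, stats

def fiber_alignment_kappa(image):
    """Estimate a fiber-alignment concentration parameter kappa from an
    SEM-like grayscale image (larger kappa = more aligned)."""
    gx = ndimage.sobel(image, axis=1)
    gy = ndimage.sobel(image, axis=0)
    mag = np.hypot(gx, gy)
    angles = np.arctan2(gy, gx)                 # gradient angles in (-pi, pi]
    # keep only strong edges so background noise does not dominate the fit
    strong = mag > np.percentile(mag, 90)
    # orientation is axial (period pi); double the angles to map onto the
    # 2*pi-periodic support of the von Mises distribution
    doubled = np.mod(2.0 * angles[strong], 2.0 * np.pi)
    kappa, loc, scale = stats.vonmises.fit(doubled, fscale=1)
    return kappa

img = np.random.rand(256, 256)                  # toy input; use an SEM image in practice
print(fiber_alignment_kappa(img))
```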
Jeong, Hyun Cheol; Hong, Hee-Do; Kim, Young-Chan; Rhee, Young Kyoung; Choi, Sang Yoon; Kim, Kyung-Tack; Kim, Sung Soo; Lee, Young-Chul; Cho, Chang-Won
2015-01-01
Background: Maltol, a phenolic compound, is produced by the browning reaction during the high-temperature treatment of ginseng. Thus, maltol can be used as a marker for the quality control of various ginseng products manufactured by high-temperature treatment, including red ginseng. For the quantification of maltol in Korean ginseng products, an effective high-performance liquid chromatography-diode array detector (HPLC-DAD) method was developed. Materials and Methods: The HPLC-DAD method for maltol quantification coupled with a liquid-liquid extraction (LLE) method was developed and validated in terms of linearity, precision, and accuracy. An HPLC separation was performed on a C18 column. Results: The LLE methods and HPLC running conditions for maltol quantification were optimized. The calibration curve of maltol exhibited good linearity (R2 = 1.00). The limit of detection for maltol was 0.26 μg/mL, and the limit of quantification was 0.79 μg/mL. The relative standard deviations (RSDs) of the data of the intra- and inter-day experiments were <1.27% and 0.61%, respectively. The results of the recovery test were 101.35–101.75% with an RSD value of 0.21–1.65%. The developed method was applied successfully to quantify the maltol in three ginseng products manufactured by different methods. Conclusion: The results of validation demonstrated that the proposed HPLC-DAD method was useful for the quantification of maltol in various ginseng products. PMID:26246746
Cilia, Giovanni; Cabbri, Riccardo; Maiorana, Giacomo; Cardaio, Ilaria; Dall'Olio, Raffaele; Nanetti, Antonio
2018-04-01
Nosema ceranae is now a widespread honey bee pathogen with high incidence in apiculture. Rapid and reliable detection and quantification methods are a matter of concern for the research community, which nowadays relies mainly on biomolecular techniques such as PCR, RT-PCR or HRMA. The aim of this technical paper is to provide a new qPCR assay, based on the highly conserved protein-coding gene Hsp70, to detect and quantify the microsporidian Nosema ceranae affecting the western honey bee Apis mellifera. The validation steps to assess the efficiency, sensitivity, specificity and robustness of the assay are also described. Copyright © 2018 Elsevier GmbH. All rights reserved.
Bezrukov, Ilja; Schmidt, Holger; Gatidis, Sergios; Mantlik, Frédéric; Schäfer, Jürgen F; Schwenzer, Nina; Pichler, Bernd J
2015-07-01
Pediatric imaging is regarded as a key application for combined PET/MR imaging systems. Because existing MR-based attenuation-correction methods were not designed specifically for pediatric patients, we assessed the impact of 2 potentially influential factors: inter- and intrapatient variability of attenuation coefficients and anatomic variability. Furthermore, we evaluated the quantification accuracy of 3 methods for MR-based attenuation correction without (SEGbase) and with bone prediction using an adult and a pediatric atlas (SEGwBONEad and SEGwBONEpe, respectively) on PET data of pediatric patients. The variability of attenuation coefficients between and within pediatric (5-17 y, n = 17) and adult (27-66 y, n = 16) patient collectives was assessed on volumes of interest (VOIs) in CT datasets for different tissue types. Anatomic variability was assessed on SEGwBONEad/pe attenuation maps by computing mean differences to CT-based attenuation maps for regions of bone tissue, lungs, and soft tissue. PET quantification was evaluated on VOIs with physiologic uptake and on 80% isocontour VOIs with elevated uptake in the thorax and abdomen/pelvis. Inter- and intrapatient variability of the bias was assessed for each VOI group and method. Statistically significant differences in mean VOI Hounsfield unit values and linear attenuation coefficients between adult and pediatric collectives were found in the lungs and femur. The prediction of attenuation maps using the pediatric atlas showed a reduced error in bone tissue and better delineation of bone structure. Evaluation of PET quantification accuracy showed statistically significant mean errors in mean standardized uptake values of -14% ± 5% and -23% ± 6% in bone marrow and femur-adjacent VOIs with physiologic uptake for SEGbase, which could be reduced to 0% ± 4% and -1% ± 5% using SEGwBONEpe attenuation maps. Bias in soft-tissue VOIs was less than 5% for all methods. Lung VOIs showed high SDs in the range of 15% for all methods. For VOIs with elevated uptake, mean and SD were less than 5% except in the thorax. The use of a dedicated atlas for the pediatric patient collective resulted in improved attenuation map prediction in osseous regions and reduced interpatient bias variation in femur-adjacent VOIs. For the lungs, in which intrapatient variation was higher for the pediatric collective, a patient- or group-specific attenuation coefficient might improve attenuation map accuracy. Mean errors of -14% and -23% in bone marrow and femur-adjacent VOIs can affect PET quantification in these regions when bone tissue is ignored. © 2015 by the Society of Nuclear Medicine and Molecular Imaging, Inc.
Impact of time-of-flight PET on quantification errors in MR imaging-based attenuation correction.
Mehranian, Abolfazl; Zaidi, Habib
2015-04-01
Time-of-flight (TOF) PET/MR imaging is an emerging imaging technology with great capabilities offered by TOF to improve image quality and lesion detectability. We assessed, for the first time, the impact of TOF image reconstruction on PET quantification errors induced by MR imaging-based attenuation correction (MRAC) using simulation and clinical PET/CT studies. Standard 4-class attenuation maps were derived by segmentation of CT images of 27 patients undergoing PET/CT examinations into background air, lung, soft-tissue, and fat tissue classes, followed by the assignment of predefined attenuation coefficients to each class. For each patient, 4 PET images were reconstructed: non-TOF and TOF both corrected for attenuation using reference CT-based attenuation correction and the resulting 4-class MRAC maps. The relative errors between non-TOF and TOF MRAC reconstructions were compared with their reference CT-based attenuation correction reconstructions. The bias was locally and globally evaluated using volumes of interest (VOIs) defined on lesions and normal tissues and CT-derived tissue classes containing all voxels in a given tissue, respectively. The impact of TOF on reducing the errors induced by metal-susceptibility and respiratory-phase mismatch artifacts was also evaluated using clinical and simulation studies. Our results show that TOF PET can remarkably reduce attenuation correction artifacts and quantification errors in the lungs and bone tissues. Using classwise analysis, it was found that the non-TOF MRAC method results in an error of -3.4% ± 11.5% in the lungs and -21.8% ± 2.9% in bones, whereas its TOF counterpart reduced the errors to -2.9% ± 7.1% and -15.3% ± 2.3%, respectively. The VOI-based analysis revealed that the non-TOF and TOF methods resulted in an average overestimation of 7.5% and 3.9% in or near lung lesions (n = 23) and underestimation of less than 5% for soft tissue and in or near bone lesions (n = 91). Simulation results showed that as TOF resolution improves, artifacts and quantification errors are substantially reduced. TOF PET substantially reduces artifacts and improves significantly the quantitative accuracy of standard MRAC methods. Therefore, MRAC should be less of a concern on future TOF PET/MR scanners with improved timing resolution. © 2015 by the Society of Nuclear Medicine and Molecular Imaging, Inc.
Bhandari, Pamita; Kumar, Neeraj; Singh, Bikram; Singh, Virendra; Kaur, Inderjeet
2009-08-01
A high performance liquid chromatographic method using a silica-based monolithic column coupled with evaporative light scattering detector (HPLC-ELSD) was developed and validated for simultaneous quantification of bacosides (bacoside A, bacopaside I, bacoside A(3), bacopaside II, bacopaside X, bacopasaponin C) and apigenin in Bacopa monnieri. The chromatographic resolution was achieved on a Chromolith RP-18 (100x4.6 mm) column with acetonitrile/water (30:70) as mobile phase in isocratic elution at a flow rate of 0.7 mL/min. The drift tube temperature of the ELSD was set to 95 degrees C, and the nitrogen flow rate was 2.0 SLM (standard liter per minute). The calibration curves revealed a good linear relationship (r(2) > 0.9988) within the test ranges. The detection limits (S/N = 3) and the quantification limits (S/N = 10) for the compounds were in the range of 0.54-6.06 and 1.61-18.78 microg/mL, respectively. Satisfactory average recovery was observed in the range of 95.8-99.0%. The method showed good reproducibility for the quantification of these compounds in B. monnieri with intra- and inter-day precision of less than 0.69 and 0.67%, respectively. The validated method was successfully applied to quantify analytes in nine accessions of B. monnieri and thus provides a new basis for overall quality assessment of B. monnieri.
NASA Astrophysics Data System (ADS)
Zhang, Xing; Chen, Beibei; He, Man; Zhang, Yiwen; Xiao, Guangyang; Hu, Bin
2015-04-01
The absolute quantification of glycoproteins in complex biological samples is a challenge and of great significance. Herein, 4-mercaptophenylboronic acid functionalized magnetic beads were prepared to selectively capture glycoproteins, while antibody conjugated gold and silver nanoparticles were synthesized as element tags to label two different glycoproteins. Based on that, a new approach of magnetic immunoassay-inductively coupled plasma mass spectrometry (ICP-MS) was established for simultaneous quantitative analysis of glycoproteins. Taking biomarkers of alpha-fetoprotein (AFP) and carcinoembryonic antigen (CEA) as two model glycoproteins, experimental parameters involved in the immunoassay procedure were carefully optimized and analytical performance of the proposed method was evaluated. The limits of detection (LODs) for AFP and CEA were 0.086 μg L(-1) and 0.054 μg L(-1) with the relative standard deviations (RSDs, n = 7, c = 5 μg L(-1)) of 6.5% and 6.2% for AFP and CEA, respectively. Linear range for both AFP and CEA was 0.2-50 μg L(-1). To validate the applicability of the proposed method, human serum samples were analyzed, and the obtained results were in good agreement with that obtained by the clinical chemiluminescence immunoassay. The developed method exhibited good selectivity and sensitivity for the simultaneous determination of AFP and CEA, and extended the applicability of metal nanoparticle tags based on ICP-MS methodology in multiple glycoprotein quantifications.
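For context, the detection and quantification limits quoted here are conventionally derived from the calibration slope and blank variability. The expressions below are the common 3-sigma / 10-sigma criteria; whether this particular study used exactly these criteria is an assumption.

```latex
% Conventional detection/quantification limit definitions (assumed, not stated):
%   s_blank : standard deviation of the blank (or low-level) signal
%   m       : slope of the calibration curve (signal per unit concentration)
\mathrm{LOD} = \frac{3\,s_{\mathrm{blank}}}{m}, \qquad
\mathrm{LOQ} = \frac{10\,s_{\mathrm{blank}}}{m}
```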
Measuring Mass-Based Hygroscopicity of Atmospheric Particles through in situ Imaging
DOE Office of Scientific and Technical Information (OSTI.GOV)
Piens, Dominique Y.; Kelly, Stephen T.; Harder, Tristan
Quantifying how atmospheric particles interact with water vapor is critical for understanding the effects of aerosols on climate. We present a novel method to measure the mass-based hygroscopicity of particles while characterizing their elemental and carbon functional group compositions. Since mass-based hygroscopicity is insensitive to particle geometry, it is advantageous for probing the hygroscopic behavior of atmospheric particles, which can have irregular morphologies. Combining scanning electron microscopy with energy dispersive X-ray analysis (SEM/EDX), scanning transmission X-ray microscopy (STXM) analysis, and in situ STXM humidification experiments, this method was validated using laboratory-generated, atmospherically relevant particles. Then, the hygroscopicity and elemental composition of 15 complex atmospheric particles were analyzed by leveraging quantification of C, N, and O from STXM, and complementary elemental quantification from SEM/EDX. We found three types of hygroscopic responses, and correlated high hygroscopicity with Na and Cl content. The mixing state determined for 158 particles broadly agreed with those of the humidified particles, indicating the potential to infer the atmospheric hygroscopic behavior from a selected subset of particles. These methods offer unique quantitative capabilities to characterize and correlate the hygroscopicity and chemistry of individual submicron atmospheric particles.
Kang, Homan; Jeong, Sinyoung; Jo, Ahla; Chang, Hyejin; Yang, Jin-Kyoung; Jeong, Cheolhwan; Kyeong, San; Lee, Youn Woo; Samanta, Animesh; Maiti, Kaustabh Kumar; Cha, Myeong Geun; Kim, Taek-Keun; Lee, Sukmook; Jun, Bong-Hyun; Chang, Young-Tae; Chung, Junho; Lee, Ho-Young; Jeong, Dae Hong; Lee, Yoon-Sik
2018-02-01
The immunotargeting ability of antibodies may differ significantly between in vitro and in vivo settings. To select antibody leads with high affinity and specificity, it is necessary to perform in vivo validation of antibody candidates following in vitro antibody screening. Herein, a robust in vivo validation of anti-tetraspanin-8 antibody candidates against human colon cancer using a ratiometric quantification method is reported. The validation is performed on a single mouse and analyzed by multiplexed surface-enhanced Raman scattering using ultrasensitive and near infrared (NIR)-active surface-enhanced resonance Raman scattering nanoprobes (NIR-SERRS dots). The NIR-SERRS dots are composed of NIR-active labels and Au/Ag hollow-shell assembled silica nanospheres. Of the NIR-SERRS dots, 93% are detectable at the single-particle level, and their signal intensity is 100-fold stronger than that from nonresonant molecule-labeled spherical Au NPs (80 nm). The result of SERRS-based antibody validation is comparable to that of the conventional method using single-photon-emission computed tomography. The NIR-SERRS-based strategy is an alternative validation method that provides cost-effective and accurate multiplexing measurements for antibody-based drug development. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Liu, Yingchun; Liu, Zhongbo; Sun, Guoxiang; Wang, Yan; Ling, Junhong; Gao, Jiayue; Huang, Jiahao
2015-01-01
A combination method of multi-wavelength fingerprinting and multi-component quantification by high performance liquid chromatography (HPLC) coupled with diode array detector (DAD) was developed and validated to monitor and evaluate the quality consistency of herbal medicines (HM) in the classical preparation Compound Bismuth Aluminate tablets (CBAT). The validation results demonstrated that our method met the requirements of fingerprint analysis and quantification analysis with suitable linearity, precision, accuracy, limits of detection (LOD) and limits of quantification (LOQ). In the fingerprint assessments, rather than using conventional qualitative "Similarity" as a criterion, the simple quantified ratio fingerprint method (SQRFM) was recommended, which has an important quantified fingerprint advantage over the "Similarity" approach. SQRFM qualitatively and quantitatively offers the scientific criteria for traditional Chinese medicines (TCM)/HM quality pyramid and warning gate in terms of three parameters. In order to combine the comprehensive characterization of multi-wavelength fingerprints, an integrated fingerprint assessment strategy based on information entropy was set up involving a super-information characteristic digitized parameter of fingerprints, which reveals the total entropy value and absolute information amount about the fingerprints and, thus, offers an excellent method for fingerprint integration. The correlation results between quantified fingerprints and quantitative determination of 5 marker compounds, including glycyrrhizic acid (GLY), liquiritin (LQ), isoliquiritigenin (ILG), isoliquiritin (ILQ) and isoliquiritin apioside (ILA), indicated that multi-component quantification could be replaced by quantified fingerprints. The Fenton reaction was employed to determine the antioxidant activities of CBAT samples in vitro, and they were correlated with HPLC fingerprint components using the partial least squares regression (PLSR) method. In summary, the method of multi-wavelength fingerprints combined with antioxidant activities has been proved to be a feasible and scientific procedure for monitoring and evaluating the quality consistency of CBAT.
Quantification of fibre polymerization through Fourier space image analysis
Nekouzadeh, Ali; Genin, Guy M.
2011-01-01
Quantification of changes in the total length of randomly oriented and possibly curved lines appearing in an image is a necessity in a wide variety of biological applications. Here, we present an automated approach based upon Fourier space analysis. Scaled, band-pass filtered power spectral densities of greyscale images are integrated to provide a quantitative measurement of the total length of lines of a particular range of thicknesses appearing in an image. A procedure is presented to correct for changes in image intensity. The method is most accurate for two-dimensional processes with fibres that do not occlude one another. PMID:24959096
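A bare-bones version of this Fourier-space idea can be sketched in Python: compute the 2D power spectral density, keep an annulus of spatial frequencies corresponding to a chosen range of fibre thicknesses, and integrate. This is an illustrative sketch of the general approach, not the published algorithm; the band radii, DC removal and scaling are arbitrary assumptions, and the intensity-correction step described in the abstract is omitted.

```python
import numpy as np

def fiber_psd_measure(image, r_min=0.05, r_max=0.25):
    """Integrate the band-pass-filtered power spectral density of a grayscale
    image over a radial frequency annulus (in cycles per pixel)."""
    img = image - image.mean()                           # remove DC offset
    psd = np.abs(np.fft.fftshift(np.fft.fft2(img)))**2
    fy = np.fft.fftshift(np.fft.fftfreq(img.shape[0]))
    fx = np.fft.fftshift(np.fft.fftfreq(img.shape[1]))
    r = np.hypot(*np.meshgrid(fy, fx, indexing="ij"))    # radial spatial frequency
    band = (r >= r_min) & (r <= r_max)                   # band-pass annulus
    return psd[band].sum() / img.size                    # scaled integrated PSD

img = np.random.rand(256, 256)                           # toy image; use a fibre image in practice
print(fiber_psd_measure(img))
```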
NASA Astrophysics Data System (ADS)
Gao, Simon S.; Liu, Li; Bailey, Steven T.; Flaxel, Christina J.; Huang, David; Li, Dengwang; Jia, Yali
2016-07-01
Quantification of choroidal neovascularization (CNV) as visualized by optical coherence tomography angiography (OCTA) may have importance clinically when diagnosing or tracking disease. Here, we present an automated algorithm to quantify the vessel skeleton of CNV as vessel length. Initial segmentation of the CNV on en face angiograms was achieved using saliency-based detection and thresholding. A level set method was then used to refine vessel edges. Finally, a skeleton algorithm was applied to identify vessel centerlines. The algorithm was tested on nine OCTA scans from participants with CNV and comparisons of the algorithm's output to manual delineation showed good agreement.
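The skeleton-based length measure can be illustrated with a short scikit-image sketch. Otsu thresholding stands in here for the saliency-based detection and level-set refinement described above, and the pixel size is an assumed scan scale, so this is a toy analogue rather than the authors' algorithm.

```python
import numpy as np
from skimage.filters import threshold_otsu, gaussian
from skimage.morphology import skeletonize, remove_small_objects

def cnv_vessel_length(en_face_angiogram, pixel_size_mm=0.01):
    """Segment vessels on an en face angiogram, skeletonize, and report an
    approximate total centerline length in millimetres."""
    smoothed = gaussian(en_face_angiogram, sigma=1.0)
    mask = smoothed > threshold_otsu(smoothed)
    mask = remove_small_objects(mask, min_size=50)   # drop isolated noise blobs
    skeleton = skeletonize(mask)
    return skeleton.sum() * pixel_size_mm            # length approximated by pixel count

angio = np.random.rand(304, 304)                     # toy en face angiogram
print(cnv_vessel_length(angio))
```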
Analysis of illegal peptide drugs via HILIC-DAD-MS.
Janvier, Steven; De Sutter, Evelien; Wynendaele, Evelien; De Spiegeleer, Bart; Vanhee, Celine; Deconinck, Eric
2017-11-01
Biopharmaceuticals have established themselves as highly efficient medicines, and are still one of the fastest growing parts of the health-product industry. Unfortunately, the introduction of these promising new drugs went hand in hand with the creation of a black market for illegal and counterfeit biotechnology drugs. Particularly popular are the lyophilised peptides with a molecular weight of less than 5 kDa. Most of them are meant for subcutaneous injection and are easily accessible via the internet. In recent years, different methods based on reversed phase liquid chromatography have been developed to detect and quantify these peptides. The emergence of more polar peptides, however, requires the introduction of other separation techniques. Therefore, we set out to develop and validate an analytical method based on hydrophilic interaction liquid chromatography (HILIC) to identify and quantify the most frequently encountered illegal peptides on the European market. For this objective, five different HILIC columns were selected and screened for their chromatographic performance. Among those columns, the ZIC HILIC column showed the best performance under the tested screening conditions in terms of resolution and symmetry factor for the targeted peptide set. Hence, the operational conditions were further optimised for the identification of illegal preparations via mass spectrometry (MS) and quantification via UV. Validation was performed via accuracy profiles based on the ISO 17025 guideline. The obtained validated HILIC method allows for the detection and quantification of the most frequently encountered illegal peptides on the internet in a total run time of 35 min including post-gradient equilibration and an online cleaning step. Combined with a previously developed RPLC method, the ZIC HILIC system allows for the detection and quantification of a wide spectrum of illicit peptide drugs available on the internet. Furthermore, the developed method could also be envisaged for the detection of new emerging polar peptide drugs. Copyright © 2017 Elsevier B.V. All rights reserved.
Rapid method for the quantification of hydroquinone concentration: chemiluminescent analysis.
Chen, Tung-Sheng; Liou, Show-Yih; Kuo, Wei-Wen; Wu, Hsi-Chin; Jong, Gwo-Ping; Wang, Hsueh-Fang; Shen, Chia-Yao; Padma, V Vijaya; Huang, Chih-Yang; Chang, Yen-Lin
2015-11-01
Topical hydroquinone serves as a skin whitener and is usually available in cosmetics or on prescription based on the hydroquinone concentration. Quantification of hydroquinone content therefore becomes an important issue in topical agents. High-performance liquid chromatography (HPLC) is the commonest method for determining hydroquinone content in topical agents, but this method is time-consuming and uses many solvents that can become an environmental issue. We report a rapid method for quantifying hydroquinone content by chemiluminescent analysis. Hydroquinone induces the production of hydrogen peroxide in the presence of basic compounds. Hydrogen peroxide induced by hydroquinone oxidized light-emitting materials such as lucigenin, resulted in the production of ultra-weak chemiluminescence that was detected by a chemiluminescence analyzer. The intensity of the chemiluminescence was found to be proportional to the hydroquinone concentration. We suggest that the rapid (measurement time, 60 s) and virtually solvent-free (solvent volume, <2 mL) chemiluminescent method described here for quantifying hydroquinone content may be an alternative to HPLC analysis. Copyright © 2015 John Wiley & Sons, Ltd.
Large scale systematic proteomic quantification from non-metastatic to metastatic colorectal cancer
NASA Astrophysics Data System (ADS)
Yin, Xuefei; Zhang, Yang; Guo, Shaowen; Jin, Hong; Wang, Wenhai; Yang, Pengyuan
2015-07-01
A systematic proteomic quantification of formalin-fixed, paraffin-embedded (FFPE) colorectal cancer tissues from stage I to stage IIIC was performed on a large scale. The label-free method identified 1017 proteins, 338 of which showed quantitative changes, while the iTRAQ method quantified 341 proteins with significant expression changes among 6294 proteins. According to gene ontology (GO) annotation and ingenuity pathway analysis (IPA), we found that the expression of migration-related proteins increased and that of binding- and adhesion-related proteins decreased during colorectal cancer development. We focused on integrin alpha 5 (ITA5) of the integrin family, which was consistent with the metastasis-related pathway. The expression level of ITA5 decreased in metastatic tissues, and this result was further verified by Western blotting. Two other cell migration-related proteins, vitronectin (VTN) and actin-related protein (ARP3), were also shown to be up-regulated by both mass spectrometry (MS)-based quantification and Western blotting. To date, our results constitute one of the largest datasets in colorectal cancer proteomics research. Our strategy reveals a disease-driven omics pattern for metastatic colorectal cancer.
Holzhauser, Thomas; Kleiner, Kornelia; Janise, Annabella; Röder, Martin
2014-11-15
A novel method to quantify species or DNA on the basis of a competitive quantitative real-time polymerase chain reaction (cqPCR) was developed. Potentially allergenic peanut in food served as one example. Based on an internal competitive DNA sequence for normalisation of DNA extraction and amplification, the cqPCR was threshold-calibrated against 100 mg/kg incurred peanut in milk chocolate. No external standards were necessary. The competitive molecule successfully served as calibrator for quantification, matrix normalisation, and inhibition control. Although designed for verification of a virtual threshold of 100 mg/kg, the method allowed quantification of 10-1,000 mg/kg peanut incurred in various food matrices and without further matrix adaptation: on the basis of four PCR replicates per sample, mean recovery of 10-1,000 mg/kg peanut in chocolate, vanilla ice cream, cookie dough, cookie, and muesli was 87% (range: 39-147%) in comparison to 199% (range: 114-237%) by three commercial ELISA kits. Copyright © 2014 Elsevier Ltd. All rights reserved.
NASA Astrophysics Data System (ADS)
Greve, Tanja Maria; Kamp, Søren; Jemec, Gregor B. E.
2013-03-01
Accurate documentation of disease severity is a prerequisite for clinical research and the practice of evidence-based medicine. The quantification of skin diseases such as psoriasis currently relies heavily on clinical scores. Although these clinical scoring methods are well established and very useful in quantifying disease severity, they require an extensive clinical experience and carry a risk of subjectivity. We explore the opportunity to use in vivo near-infrared (NIR) spectra as an objective and noninvasive method for local disease severity assessment in 31 psoriasis patients in whom selected plaques were scored clinically. A partial least squares (PLS) regression model was used to analyze and predict the severity scores on the NIR spectra of psoriatic and uninvolved skin. The correlation between predicted and clinically assigned scores was R=0.94 (RMSE=0.96), suggesting that in vivo NIR provides accurate clinical quantification of psoriatic plaques. Hence, NIR may be a practical solution to clinical severity assessment of psoriasis, providing a continuous, linear, numerical value of severity.
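The modelling step, regressing clinical severity scores on NIR spectra with partial least squares, can be illustrated with scikit-learn. The array shapes, number of latent components and cross-validation scheme below are placeholders and are not the study's actual settings; only the general PLS-regression idea is shown.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict

# Illustrative only: X would hold in vivo NIR spectra (one row per plaque)
# and y the clinician-assigned local severity scores.
n_plaques, n_wavelengths = 31, 600
X = np.random.rand(n_plaques, n_wavelengths)
y = np.random.uniform(0, 12, n_plaques)

pls = PLSRegression(n_components=5)
y_pred = cross_val_predict(pls, X, y, cv=5).ravel()   # cross-validated predictions
r = np.corrcoef(y, y_pred)[0, 1]
rmse = np.sqrt(np.mean((y - y_pred) ** 2))
print(f"R = {r:.2f}, RMSE = {rmse:.2f}")              # cf. R = 0.94, RMSE = 0.96 in the paper
```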
Iron deposition quantification: Applications in the brain and liver.
Yan, Fuhua; He, Naying; Lin, Huimin; Li, Ruokun
2018-06-13
Iron has long been implicated in many neurological and other organ diseases. It is known that over and above the normal increases in iron with age, in certain diseases there is an excessive iron accumulation in the brain and liver. MRI is a noninvasive means by which to image the various structures in the brain in three dimensions and quantify iron over the volume of the object of interest. The quantification of iron can provide information about the severity of iron-related diseases as well as quantify changes in iron for patient follow-up and treatment monitoring. This article provides an overview of current MRI-based methods for iron quantification, specifically for the brain and liver, including signal intensity ratio, R2, R2*, R2', phase, susceptibility-weighted imaging and quantitative susceptibility mapping (QSM). Although there are numerous approaches to measuring iron, R2 and R2* are currently the preferred methods for imaging the liver, and QSM has become the preferred approach for imaging iron in the brain. Technical Efficacy: Stage 5. J. Magn. Reson. Imaging 2018. © 2018 International Society for Magnetic Resonance in Medicine.
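For readers unfamiliar with R2* mapping, the generic mono-exponential signal model below shows how R2* is obtained from a multi-echo gradient-echo acquisition; it is a standard textbook relation rather than a formula taken from this review.

```latex
% Mono-exponential decay model commonly used for R2* mapping (generic relation):
%   S(TE) : signal at echo time TE,  S_0 : signal at TE = 0
S(\mathrm{TE}) = S_0\, e^{-R_2^{*}\,\mathrm{TE}}, \qquad R_2^{*} = \frac{1}{T_2^{*}}
% Fitting ln S(TE) against TE over several echoes gives R2* as the negative
% slope; higher tissue iron shortens T2* and therefore increases R2*.
```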
Engel, A; Plöger, M; Mulac, D; Langer, K
2014-01-30
Nanoparticles composed of poly(DL-lactide-co-glycolide) (PLGA) represent promising colloidal drug carriers for improved drug targeting. Although most research activities focus on intravenous application of these carriers, peroral administration has been described to improve the bioavailability of poorly soluble drugs. Based on these insights, the manuscript describes a model tablet formulation for PLGA-nanoparticles and especially its analytical characterisation with regard to a nanosized drug carrier. Besides physico-chemical tablet characterisation according to pharmacopoeias, the main goal of the study was the development of a suitable analytical method for the quantification of nanoparticle release from tablets. An analytical flow field-flow fractionation (AF4) method was established and validated which enables determination of nanoparticle content in solid dosage forms as well as quantification of particle release during dissolution testing. For particle detection, a multi-angle light scattering (MALS) detector was coupled to the AF4 system. After dissolution testing, the presence of unaltered PLGA-nanoparticles was successfully proved by dynamic light scattering and scanning electron microscopy. Copyright © 2013 Elsevier B.V. All rights reserved.
Zhang, Weihua; Yi, Jing; Mekarski, Pawel; Ungar, Kurt; Hauck, Barry; Kramer, Gary H
2011-06-01
The purpose of this study is to investigate the possibility of verifying depleted uranium (DU), natural uranium (NU), low enriched uranium (LEU) and high enriched uranium (HEU) using a newly developed digital gamma-gamma coincidence spectroscopy system. The system consists of two NaI(Tl) scintillators and the XIA LLC Digital Gamma Finder (DGF)/Pixie-4 software and card package. The results demonstrate that the spectroscopy provides an effective method of (235)U and (238)U quantification based on the count rate of their gamma-gamma coincidence counting signatures. The main advantages of this approach over conventional gamma spectrometry include the low background continuum near the coincident signatures of (235)U and (238)U, less interference from other radionuclides owing to the gamma-gamma coincidence counting, and region-of-interest (ROI) image analysis for uranium enrichment determination. Compared to conventional gamma spectrometry, the method offers the additional advantage of requiring minimal calibrations for (235)U and (238)U quantification at different sample geometries. Crown Copyright © 2011. Published by Elsevier Ltd. All rights reserved.
Ahn, Sung Hee; Bae, Yong Jin; Moon, Jeong Hee; Kim, Myung Soo
2013-09-17
We propose to divide matrix suppression in matrix-assisted laser desorption ionization into two parts, normal and anomalous. In quantification of peptides, the normal effect can be accounted for by constructing the calibration curve in the form of peptide-to-matrix ion abundance ratio versus concentration. The anomalous effect forbids reliable quantification and is noticeable when matrix suppression is larger than 70%. With this 70% rule, matrix suppression becomes a guideline for reliable quantification, rather than a nuisance. A peptide in a complex mixture can be quantified even in the presence of large amounts of contaminants, as long as matrix suppression is below 70%. The theoretical basis for the quantification method using a peptide as an internal standard is presented together with its weaknesses. A systematic method to improve quantification of high concentration analytes has also been developed.
Cai, Yicun; He, Yuping; Lv, Rong; Chen, Hongchao; Wang, Qiang; Pan, Liangwen
2017-01-01
Meat products often consist of meat from multiple animal species, and adulteration and mislabeling of food products can negatively affect consumers. Therefore, a cost-effective and reliable method for identification and quantification of animal species in meat products is required. In this study, we developed a duplex droplet digital PCR (dddPCR) detection and quantification system to simultaneously identify and quantify the source of meat in samples containing a mixture of beef (Bos taurus) and pork (Sus scrofa) in a single digital PCR reaction tube. Mixed meat samples of known composition were used to test the accuracy and applicability of this method. The limit of detection (LOD) and the limit of quantification (LOQ) of this detection and quantification system were also determined. We conclude that our dddPCR detection and quantification system is suitable for quality control and routine analyses of meat products.
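Droplet digital PCR quantification rests on a Poisson correction of the fraction of positive droplets; the sketch below shows that generic calculation and how it could yield a species fraction in a duplex beef/pork assay. The droplet volume and counts are typical illustrative values, not figures from this study.

```python
import math

def ddpcr_concentration(n_positive, n_total, partition_volume_ul=0.00085):
    """Standard Poisson correction used in droplet digital PCR (generic formula):
    mean copies per droplet from the fraction of positive droplets, then copies
    per microliter of reaction. The droplet volume (0.85 nL) is a typical value
    and is an assumption here."""
    p = n_positive / n_total
    lam = -math.log(1.0 - p)               # mean target copies per droplet
    return lam / partition_volume_ul       # copies per microliter of reaction mix

# toy duplex readout: beef and pork targets counted in the same reaction
beef = ddpcr_concentration(n_positive=5200, n_total=18000)
pork = ddpcr_concentration(n_positive=1300, n_total=18000)
print(f"beef fraction of total copies ~ {beef / (beef + pork):.1%}")
```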
Usability of calcium carbide gas pressure method in hydrological sciences
NASA Astrophysics Data System (ADS)
Arsoy, S.; Ozgur, M.; Keskin, E.; Yilmaz, C.
2013-10-01
Soil moisture is a key engineering variable with major influence on ecological and hydrological processes as well as on climate, weather, agricultural, civil and geotechnical applications. Methods for quantification of soil moisture are classified into three main groups: (i) measurement with remote sensing, (ii) estimation via (soil water balance) simulation models, and (iii) measurement in the field (ground based). Remote sensing and simulation modeling require rapid ground truthing with one of the ground-based methods. The calcium carbide gas pressure (CCGP) method is a rapid measurement procedure for obtaining soil moisture and relies on the chemical reaction of the calcium carbide reagent with the water in soil pores. However, the method is overlooked in hydrological science applications. Therefore, the purpose of this study is to evaluate the usability of the CCGP method in comparison with standard oven-drying and dielectric methods in terms of accuracy, time efficiency, operational ease, cost effectiveness and safety for quantification of soil moisture over a wide range of soil types. The research involved over 250 tests that were carried out on 15 different soil types. It was found that the accuracy of the method is mostly within a ±1% soil moisture deviation range relative to oven-drying, and that the CCGP method has significant advantages over dielectric methods in terms of accuracy, cost, operational ease and time efficiency for the purpose of ground truthing.
de Kinkelder, R; van der Veen, R L P; Verbaak, F D; Faber, D J; van Leeuwen, T G; Berendschot, T T J M
2011-01-01
Purpose: Accurate assessment of the amount of macular pigment (MPOD) is necessary to investigate the role of carotenoids and their assumed protective functions. High repeatability and reliability are important to monitor patients in studies investigating the influence of diet and supplements on MPOD. We evaluated the Macuscope (Macuvision Europe Ltd., Lapworth, Solihull, UK), a recently introduced device for measuring MPOD using the technique of heterochromatic flicker photometry (HFP). We determined agreement with another HFP device (QuantifEye; MPS 9000 series: Tinsley Precision Instruments Ltd., Croydon, Essex, UK) and a fundus reflectance method. Methods: The right eyes of 23 healthy subjects (mean age 33.9±15.1 years) were measured. We determined agreement with QuantifEye and correlation with a fundus reflectance method. Repeatability of QuantifEye was assessed in 20 other healthy subjects (mean age 32.1±7.3 years). Repeatability was also compared with measurements by a fundus reflectance method in 10 subjects. Results: We found low agreement between test and retest measurements with Macuscope. The average difference and the limits of agreement were −0.041±0.32. We found high agreement between test and retest measurements of QuantifEye (−0.02±0.18) and the fundus reflectance method (−0.04±0.18). MPOD data obtained by Macuscope and QuantifEye showed poor agreement: −0.017±0.44. For Macuscope and the fundus reflectance method, the correlation coefficient was r=0.05 (P=0.83). A significant correlation of r=0.87 (P<0.001) was found between QuantifEye and the fundus reflectance method. Conclusions: Because repeatability of Macuscope measurements was low (i.e., wide limits of agreement) and MPOD values correlated poorly with the fundus reflectance method, and agreed poorly with QuantifEye, the tested Macuscope protocol seems less suitable for studying MPOD. PMID:21057522
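The agreement figures quoted above (mean difference together with limits of agreement) are the typical output of a Bland-Altman analysis. A minimal sketch, assuming the conventional definition of the 95% limits of agreement as bias ± 1.96 SD and using invented MPOD readings:

```python
import numpy as np

def limits_of_agreement(method_a, method_b):
    """Bland-Altman style mean difference and 95% limits of agreement."""
    diff = np.asarray(method_a) - np.asarray(method_b)
    bias = diff.mean()
    half_width = 1.96 * diff.std(ddof=1)
    return bias, bias - half_width, bias + half_width

# Hypothetical test/retest MPOD readings for a handful of subjects.
test   = [0.41, 0.35, 0.52, 0.28, 0.47]
retest = [0.39, 0.40, 0.49, 0.30, 0.44]
print(limits_of_agreement(test, retest))
```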
Baldelli, Sara; Marrubini, Giorgio; Cattaneo, Dario; Clementi, Emilio; Cerea, Matteo
2017-10-01
The application of Quality by Design (QbD) principles in clinical laboratories can help to develop an analytical method through a systematic approach, providing a significant advance over the traditional heuristic and empirical methodology. In this work, we applied for the first time the QbD concept in the development of a method for drug quantification in human plasma using elvitegravir as the test molecule. The goal of the study was to develop a fast and inexpensive quantification method, with precision and accuracy as requested by the European Medicines Agency guidelines on bioanalytical method validation. The method was divided into operative units, and for each unit critical variables affecting the results were identified. A risk analysis was performed to select critical process parameters that should be introduced in the design of experiments (DoEs). Different DoEs were used depending on the phase of advancement of the study. Protein precipitation and high-performance liquid chromatography-tandem mass spectrometry were selected as the techniques to be investigated. For every operative unit (sample preparation, chromatographic conditions, and detector settings), a model based on factors affecting the responses was developed and optimized. The obtained method was validated and clinically applied with success. To the best of our knowledge, this is the first investigation thoroughly addressing the application of QbD to the analysis of a drug in a biological matrix applied in a clinical laboratory. The extensive optimization process generated a robust method compliant with its intended use. The performance of the method is continuously monitored using control charts.
Fischedick, Justin T; Glas, Ronald; Hazekamp, Arno; Verpoorte, Rob
2009-01-01
Cannabis and cannabinoid-based medicines are currently under serious investigation for legitimate development as medicinal agents, necessitating new low-cost, high-throughput analytical methods for quality control. The goal of this study was to develop and validate, according to ICH guidelines, a simple rapid HPTLC method for the quantification of Δ9-tetrahydrocannabinol (Δ9-THC) and qualitative analysis of other main neutral cannabinoids found in cannabis. The method was developed and validated with the use of pure cannabinoid reference standards and two medicinal cannabis cultivars. Accuracy was determined by comparing results obtained from the HPTLC method with those obtained from a validated HPLC method. Δ9-THC gives linear calibration curves in the range of 50-500 ng at 206 nm with a linear regression of y = 11.858x + 125.99 and r² = 0.9968. Results have shown that the HPTLC method is reproducible and accurate for the quantification of Δ9-THC in cannabis. The method is also useful for the qualitative screening of the main neutral cannabinoids found in cannabis cultivars.
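Given the reported calibration (y = 11.858x + 125.99), back-calculating the amount of Δ9-THC spotted on the plate is a one-line inversion. In the sketch below it is assumed that y is the densitometric peak response and x the amount in ng; the sample response value is hypothetical.

```python
# Reported calibration for Delta9-THC at 206 nm: y = 11.858*x + 125.99 (r^2 = 0.9968),
# where x is assumed to be the amount spotted (ng) and y the densitometric peak response.
SLOPE, INTERCEPT = 11.858, 125.99

def thc_amount_ng(peak_response):
    """Back-calculate the Delta9-THC amount (ng) from a measured peak response."""
    return (peak_response - INTERCEPT) / SLOPE

# Hypothetical peak response giving a result within the validated 50-500 ng range.
print(f"{thc_amount_ng(3200.0):.1f} ng Delta9-THC")
```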
Alagandula, Ravali; Zhou, Xiang; Guo, Baochuan
2017-01-15
Liquid chromatography/tandem mass spectrometry (LC/MS/MS) is the gold standard of urine drug testing. However, current LC-based methods are time consuming, limiting the throughput of MS-based testing and increasing the cost. This is particularly problematic for quantification of drugs such as phenobarbital, which is often analyzed in a separate run because it must be negatively ionized. This study examined the feasibility of using a dilute-and-shoot flow-injection method without LC separation to quantify drugs, with phenobarbital as a model system. Briefly, a urine sample containing phenobarbital was first diluted 10 times, followed by flow injection of the diluted sample into the mass spectrometer. Quantification and detection of phenobarbital were achieved by an electrospray negative ionization MS/MS system operated in the multiple reaction monitoring (MRM) mode with the stable-isotope-labeled drug as internal standard. The dilute-and-shoot flow-injection method developed was linear over a dynamic range of 50-2000 ng/mL of phenobarbital with a correlation coefficient > 0.9996. The coefficients of variation and relative errors for intra- and inter-assays at four quality control (QC) levels (50, 125, 445 and 1600 ng/mL) were 3.0% and 5.0%, respectively. The total run time to quantify one sample was 2 min, and the sensitivity and specificity of the method did not deteriorate even after 1200 consecutive injections. Our method can accurately and robustly quantify phenobarbital in urine without LC separation. Because of its 2 min run time, the method can process 720 samples per day. This feasibility study shows that the dilute-and-shoot flow-injection method can be a general approach for fast analysis of drugs in urine. Copyright © 2016 John Wiley & Sons, Ltd.
NASA Astrophysics Data System (ADS)
Rana, Sachin; Ertekin, Turgay; King, Gregory R.
2018-05-01
Reservoir history matching is frequently viewed as an optimization problem which involves minimizing the misfit between simulated and observed data. Many gradient- and evolutionary-strategy-based optimization algorithms have been proposed to solve this problem, which typically require a large number of numerical simulations to find feasible solutions. Therefore, a new methodology referred to as GP-VARS is proposed in this study which uses forward and inverse Gaussian process (GP) based proxy models combined with a novel application of variogram analysis of response surface (VARS) based sensitivity analysis to efficiently solve high-dimensional history matching problems. An empirical Bayes approach is proposed to optimally train GP proxy models for any given data. The history matching solutions are found via Bayesian optimization (BO) on forward GP models and via predictions of the inverse GP model in an iterative manner. An uncertainty quantification method using MCMC sampling in conjunction with the GP model is also presented to obtain a probabilistic estimate of reservoir properties and estimated ultimate recovery (EUR). An application of the proposed GP-VARS methodology on the PUNQ-S3 reservoir is presented in which it is shown that GP-VARS provides history match solutions in approximately four times fewer numerical simulations than the differential evolution (DE) algorithm. Furthermore, a comparison of uncertainty quantification results obtained by GP-VARS, EnKF and other previously published methods shows that the P50 estimate of oil EUR obtained by GP-VARS is in close agreement with the true value for the PUNQ-S3 reservoir.
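GP-VARS itself is not reproduced here, but the core loop of a GP-proxy-assisted history match (fit a Gaussian process to misfits from completed simulations, then propose the next run by optimizing an acquisition function) can be sketched as follows. The toy misfit function, kernel choice, and lower-confidence-bound acquisition are all illustrative assumptions, not the authors' settings.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

rng = np.random.default_rng(0)

# Stand-in "simulator": history-match misfit as a function of two reservoir parameters.
def misfit(x):
    return (x[:, 0] - 0.3) ** 2 + 2.0 * (x[:, 1] - 0.7) ** 2

X = rng.uniform(0, 1, size=(15, 2))            # parameter sets already simulated
y = misfit(X)                                  # their misfit values

gp = GaussianProcessRegressor(ConstantKernel() * RBF(), normalize_y=True).fit(X, y)

# One Bayesian-optimization step: propose the candidate minimizing a lower confidence bound.
candidates = rng.uniform(0, 1, size=(2000, 2))
mu, sd = gp.predict(candidates, return_std=True)
next_run = candidates[np.argmin(mu - 1.0 * sd)]
print("next simulation to run at:", next_run)
```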
Fluorescent quantification of melanin.
Fernandes, Bruno; Matamá, Teresa; Guimarães, Diana; Gomes, Andreia; Cavaco-Paulo, Artur
2016-11-01
Melanin quantification is reportedly performed by absorption spectroscopy, commonly at 405 nm. Here, we propose the implementation of fluorescence spectroscopy for melanin assessment. In a typical in vitro assay to assess melanin production in response to an external stimulus, absorption spectroscopy clearly overvalues melanin content. This method is also incapable of distinguishing non-melanotic/amelanotic control cells from those that are actually capable of performing melanogenesis. Therefore, fluorescence spectroscopy is the best method for melanin quantification as it proved to be highly specific and accurate, detecting even small variations in the synthesis of melanin. This method can also be applied to the quantification of melanin in more complex biological matrices like zebrafish embryos and human hair. © 2016 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.
Collender, Philip A.; Kirby, Amy E.; Addiss, David G.; Freeman, Matthew C.; Remais, Justin V.
2015-01-01
Limiting the environmental transmission of soil-transmitted helminths (STH), which infect 1.5 billion people worldwide, will require sensitive, reliable, and cost effective methods to detect and quantify STH in the environment. We review the state of the art of STH quantification in soil, biosolids, water, produce, and vegetation with respect to four major methodological issues: environmental sampling; recovery of STH from environmental matrices; quantification of recovered STH; and viability assessment of STH ova. We conclude that methods for sampling and recovering STH require substantial advances to provide reliable measurements for STH control. Recent innovations in the use of automated image identification and developments in molecular genetic assays offer considerable promise for improving quantification and viability assessment. PMID:26440788
Smart sensor for real-time quantification of common symptoms present in unhealthy plants.
Contreras-Medina, Luis M; Osornio-Rios, Roque A; Torres-Pacheco, Irineo; Romero-Troncoso, Rene de J; Guevara-González, Ramon G; Millan-Almaraz, Jesus R
2012-01-01
Plant responses to physiological function disorders are called symptoms and they are caused principally by pathogens and nutritional deficiencies. Plant symptoms are commonly used as indicators of the health and nutrition status of plants. Nowadays, the most popular method to quantify plant symptoms is based on visual estimations, consisting of evaluations that raters give based on their observation of plant symptoms; however, this method is inaccurate and imprecise because of its obvious subjectivity. Computational vision has been employed in plant symptom quantification because of its accuracy and precision. Nevertheless, the systems developed so far lack in-situ, real-time and multi-symptom analysis. There exist methods to obtain information about the health and nutritional status of plants based on reflectance and chlorophyll fluorescence, but they use expensive equipment and are frequently destructive. Therefore, systems capable of quantifying plant symptoms while overcoming the aforementioned disadvantages, which can serve as indicators of health and nutrition in plants, are desirable. This paper reports an FPGA-based smart sensor able to perform non-destructive, real-time and in-situ analysis of leaf images to quantify multiple symptoms presented by diseased and malnourished plants; this system can serve as an indicator of the health and nutrition of plants. The effectiveness of the proposed smart sensor was successfully tested by analyzing diseased and malnourished plants.
Quantification of Global DNA Methylation Levels by Mass Spectrometry.
Fernandez, Agustin F; Valledor, Luis; Vallejo, Fernando; Cañal, Maria Jesús; Fraga, Mario F
2018-01-01
Global DNA methylation was classically considered the relative percentage of 5-methylcytosine (5mC) with respect to total cytosine (C). Early approaches were based on the use of high-performance separation technologies and UV detection. However, the recent development of protocols using mass spectrometry for detection has increased sensitivity and permitted the precise identification of peak compounds based on their molecular masses. This allows work to be conducted with much less genomic DNA starting material and also permits quantification of 5-hydroxymethylcytosine (5hmC), a recently identified form of methylated cytosine that could play an important role in active DNA demethylation. Here, we describe the protocol that we currently use in our laboratory to analyze 5mC and 5hmC by mass spectrometry. The protocol, which is based on the method originally developed by Le and colleagues using Ultra Performance Liquid Chromatography (UPLC) and mass spectrometry (triple quadrupole (QqQ)) detection, allows for the rapid and accurate quantification of relative global 5mC and 5hmC levels starting from just 1 μg of genomic DNA.
Liu, Kehui; Zhang, Jiyang; Fu, Bin; Xie, Hongwei; Wang, Yingchun; Qian, Xiaohong
2014-07-01
Precise protein quantification is essential in comparative proteomics. Currently, quantification bias is inevitable when using a proteotypic peptide-based quantitative proteomics strategy, owing to differences in peptide measurability. To improve quantification accuracy, we proposed an "empirical rule for linearly correlated peptide selection (ERLPS)" in quantitative proteomics in our previous work. However, a systematic evaluation of the general application of ERLPS in quantitative proteomics under diverse experimental conditions needed to be conducted. In this study, the practical workflow of ERLPS is explicitly illustrated; different experimental variables, such as different MS systems, sample complexities, sample preparations, elution gradients, matrix effects, loading amounts, and other factors, were comprehensively investigated to evaluate the applicability, reproducibility, and transferability of ERLPS. The results demonstrated that ERLPS was highly reproducible and transferable within appropriate loading amounts and that linearly correlated response peptides should be selected for each specific experiment. ERLPS was applied to proteome samples from yeast to mouse and human, and to quantitative methods from label-free to 18O/16O-labeled and SILAC analysis, and enabled accurate measurements for all proteotypic peptide-based quantitative proteomics over a large dynamic range. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Chaouachi, Maher; El Malki, Redouane; Berard, Aurélie; Romaniuk, Marcel; Laval, Valérie; Brunel, Dominique; Bertheau, Yves
2008-03-26
The labeling of products containing genetically modified organisms (GMO) is linked to their quantification since a threshold for the presence of fortuitous GMOs in food has been established. This threshold is calculated from a combination of two absolute quantification values: one for the specific GMO target and the second for an endogenous reference gene specific to the taxon. Thus, the development of reliable methods to quantify GMOs using endogenous reference genes in complex matrixes such as food and feed is needed. Plant identification can be difficult in the case of closely related taxa, which moreover are subject to introgression events. Based on the homology of beta-fructosidase sequences obtained from public databases, two couples of consensus primers were designed for the detection, quantification, and differentiation of four Solanaceae: potato (Solanum tuberosum), tomato (Solanum lycopersicum), pepper (Capsicum annuum), and eggplant (Solanum melongena). Sequence variability was studied first using lines and cultivars (intraspecies sequence variability), then using taxa involved in gene introgressions, and finally, using taxonomically close taxa (interspecies sequence variability). This study allowed us to design four highly specific TaqMan-MGB probes. A duplex real time PCR assay was developed for simultaneous quantification of tomato and potato. For eggplant and pepper, only simplex real time PCR tests were developed. The results demonstrated the high specificity and sensitivity of the assays. We therefore conclude that beta-fructosidase can be used as an endogenous reference gene for GMO analysis.
Fiamegkos, I; Cordeiro, F; Robouch, P; Vélez, D; Devesa, V; Raber, G; Sloth, J J; Rasmussen, R R; Llorente-Mirandes, T; Lopez-Sanchez, J F; Rubio, R; Cubadda, F; D'Amato, M; Feldmann, J; Raab, A; Emteborg, H; de la Calle, M B
2016-12-15
A collaborative trial was conducted to determine the performance characteristics of an analytical method for the quantification of inorganic arsenic (iAs) in food. The method is based on (i) solubilisation of the protein matrix with concentrated hydrochloric acid to denature proteins and allow the release of all arsenic species into solution, and (ii) subsequent extraction of the inorganic arsenic present in the acid medium using chloroform followed by back-extraction to acidic medium. The final detection and quantification is done by flow injection hydride generation atomic absorption spectrometry (FI-HG-AAS). The seven test items used in this exercise were reference materials covering a broad range of matrices: mussels, cabbage, seaweed (hijiki), fish protein, rice, wheat, mushrooms, with concentrations ranging from 0.074 to 7.55 mg kg⁻¹. The relative standard deviation for repeatability (RSDr) ranged from 4.1 to 10.3%, while the relative standard deviation for reproducibility (RSDR) ranged from 6.1 to 22.8%. Copyright © 2016 The Authors. Published by Elsevier Ltd. All rights reserved.
Gantner, Martin; Schwarzmann, Günter; Sandhoff, Konrad; Kolter, Thomas
2014-12-01
Within recent years, ganglioside patterns have been increasingly analyzed by MS. However, internal standards for calibration are only available for gangliosides GM1, GM2, and GM3. For this reason, we prepared homologous internal standards bearing nonnatural fatty acids of the major mammalian brain gangliosides GM1, GD1a, GD1b, GT1b, and GQ1b, and of the tumor-associated gangliosides GM2 and GD2. The fatty acid moieties were incorporated after selective chemical or enzymatic deacylation of bovine brain gangliosides. For modification of the sphingoid bases, we developed a new synthetic method based on olefin cross metathesis. This method was used for the preparation of a lyso-GM1 and a lyso-GM2 standard. The total yield of this method was 8.7% for the synthesis of d17:1-lyso-GM1 from d20:1/18:0-GM1 in four steps. The title compounds are currently used as calibration substances for MS quantification and are also suitable for functional studies. Copyright © 2014 by the American Society for Biochemistry and Molecular Biology, Inc.
Breast density quantification with cone-beam CT: A post-mortem study
Johnson, Travis; Ding, Huanjun; Le, Huy Q.; Ducote, Justin L.; Molloi, Sabee
2014-01-01
Forty post-mortem breasts were imaged with a flat-panel based cone-beam x-ray CT system at 50 kVp. The feasibility of breast density quantification has been investigated using standard histogram thresholding and an automatic segmentation method based on the fuzzy c-means algorithm (FCM). The breasts were chemically decomposed into water, lipid, and protein immediately after image acquisition was completed. The percent fibroglandular volume (%FGV) from chemical analysis was used as the gold standard for breast density comparison. Both image-based segmentation techniques showed good precision in breast density quantification with high linear coefficients between the right and left breast of each pair. When comparing with the gold standard using %FGV from chemical analysis, Pearson’s r-values were estimated to be 0.983 and 0.968 for the FCM clustering and the histogram thresholding techniques, respectively. The standard error of the estimate (SEE) was also reduced from 3.92% to 2.45% by applying the automatic clustering technique. The results of the postmortem study suggested that breast tissue can be characterized in terms of water, lipid and protein contents with high accuracy by using chemical analysis, which offers a gold standard for breast density studies comparing different techniques. In the investigated image segmentation techniques, the FCM algorithm had high precision and accuracy in breast density quantification. In comparison to conventional histogram thresholding, it was more efficient and reduced inter-observer variation. PMID:24254317
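For context, a bare-bones fuzzy c-means segmentation of voxel intensities into two clusters, with %FGV taken as the fraction of voxels assigned to the higher-attenuation (fibroglandular-like) cluster, might look like the sketch below. The intensity values, cluster seeding, and fixed iteration count are illustrative assumptions and not the authors' implementation.

```python
import numpy as np

def fcm_two_class(values, m=2.0, n_iter=50):
    """Minimal fuzzy c-means (2 clusters) on voxel intensities; returns memberships and centers."""
    v = np.asarray(values, dtype=float)
    centers = np.array([v.min(), v.max()])          # adipose-like and fibroglandular-like seeds
    for _ in range(n_iter):
        d = np.abs(v[None, :] - centers[:, None]) + 1e-12
        u = 1.0 / np.sum((d[:, None, :] / d[None, :, :]) ** (2.0 / (m - 1.0)), axis=1)
        centers = (u ** m @ v) / (u ** m).sum(axis=1)
    return u, centers

# Hypothetical CT voxel intensities (arbitrary units) from a segmented breast volume.
rng = np.random.default_rng(1)
voxels = np.concatenate([rng.normal(-80, 15, 7000), rng.normal(40, 15, 3000)])
u, centers = fcm_two_class(voxels)
fg = np.argmax(centers)                             # cluster with the higher attenuation
percent_fgv = 100.0 * (np.argmax(u, axis=0) == fg).mean()
print(f"%FGV ≈ {percent_fgv:.1f}%")
```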
Forment, Josep V.; Jackson, Stephen P.
2016-01-01
Protein accumulation on chromatin has traditionally been studied using immunofluorescence microscopy or biochemical cellular fractionation followed by western immunoblot analysis. As a way to improve the reproducibility of this kind of analysis, make it easier to quantify and allow a streamlined application in high-throughput screens, we recently combined a classical immunofluorescence microscopy detection technique with flow cytometry. In addition to the features described above, and by combining it with detection of both DNA content and DNA replication, this method allows unequivocal and direct assignment of the cell-cycle distribution of protein association with chromatin without the need for cell culture synchronization. Furthermore, it is relatively quick (no more than a working day from sample collection to quantification), requires less starting material compared to standard biochemical fractionation methods and overcomes the need for flat, adherent cell types that are required for immunofluorescence microscopy. PMID:26226461
Effects of bioirrigation of non-biting midges (Diptera: Chironomidae) on lake sediment respiration
Baranov, Viktor; Lewandowski, Jörg; Romeijn, Paul; Singer, Gabriel; Krause, Stefan
2016-01-01
Bioirrigation, the transport of fluids into the sediment matrix due to the activities of organisms such as bloodworms (larvae of Diptera, Chironomidae), has substantial impacts on sediment respiration in lakes. However, previous quantifications of the bioirrigation impacts of Chironomidae have been limited by technical challenges such as the difficulty of separating faunal and bacterial respiration. This paper describes a novel method based on the bioreactive tracer resazurin for measuring respiration in situ in non-sealed systems with constant oxygen supply. Applying this new method in microcosm experiments revealed that bioirrigation enhanced sediment respiration by up to 2.5 times. The new method yields lower oxygen consumption than previously reported, as it is only sensitive to aerobic heterotrophic respiration and not to other processes causing oxygen decrease. Hence it decouples the quantification of respiration of animals and inorganic oxygen consumption from microbial respiration in sediment. PMID:27256514
NASA Astrophysics Data System (ADS)
Benítez, Hernán D.; Ibarra-Castanedo, Clemente; Bendada, AbdelHakim; Maldague, Xavier; Loaiza, Humberto; Caicedo, Eduardo
2008-01-01
It is well known that thermographic non-destructive testing methods based on thermal contrast are strongly affected by non-uniform heating at the surface. Hence, the results obtained from these methods depend considerably on the chosen reference point. The differential absolute contrast (DAC) method was developed to eliminate the need to determine a reference point, by defining the thermal contrast with respect to an ideal sound area. Although very useful at early times, the DAC accuracy decreases when the heat front approaches the sample rear face. We propose a new DAC version that explicitly introduces the sample thickness using thermal quadrupoles theory, and show that the new DAC's range of validity extends to long times while preserving its validity at short times. This new contrast is used for defect quantification in composite, Plexiglas™ and aluminum samples.
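The thickness-corrected DAC proposed in the abstract is not reproduced here, but the classical DAC it improves on is easy to state: the sound-area temperature at time t is extrapolated from an early reference frame t' via the 1D cooling law T ∝ 1/√t, so ΔT_DAC(t) = T(t) − √(t'/t)·T(t'). A hypothetical single-pixel sketch, with an invented cooling curve:

```python
import numpy as np

def classical_dac(temps, times, t_ref_index):
    """Classical differential absolute contrast for a single pixel.

    temps -- surface temperature rise above ambient at each frame
    times -- acquisition times (s) after the flash
    t_ref_index -- index of an early reference frame (before defect effects appear)
    """
    temps = np.asarray(temps, dtype=float)
    times = np.asarray(times, dtype=float)
    t_ref, temp_ref = times[t_ref_index], temps[t_ref_index]
    return temps - np.sqrt(t_ref / times) * temp_ref

# Hypothetical cooling curve of one pixel after a flash (arbitrary units),
# with a small defect signature appearing around t = 2 s.
t = np.linspace(0.05, 5.0, 100)
temperature = 1.0 / np.sqrt(np.pi * t) + 0.02 * np.exp(-(t - 2.0) ** 2)
print(classical_dac(temperature, t, t_ref_index=2).max())
```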
Ramirez-Sanchez, Israel; Maya, Lisandro; Ceballos, Guillermo; Villarreal, Francisco
2010-12-01
Polyphenolic compounds of the flavonoid family are abundantly present in cacao seeds and cocoa products. Results from studies using cocoa products indicate beneficial effects of flavanols on cardiovascular endpoints. Evidence indicates that (-)-epicatechin is the main cacao flavanol associated with cardiovascular effects, so the accurate quantification of its content in cacao seeds or cocoa products is important. Common methods for the quantification of phenolic content in cocoa products are based on the reaction of phenols with colorimetric reagents such as the Folin-Ciocalteu (FC) reagent. In this study, we compared the FC method of phenolic determination using two different standards (gallic acid and (-)-epicatechin) to construct calibration curves. We compare these results with those obtained from a simple fluorometric method (Ex(280)/Em(320) nm) used to determine catechin/(-)-epicatechin content in samples of cacao seeds and cocoa products. Values obtained from the FC determination of polyphenols yield an overestimation of phenol (flavonoid) content when gallic acid is used as the standard. Moreover, (-)-epicatechin is a more reliable standard because of its abundance in cacao seeds and cocoa products. The use of fluorometric spectra yields a simple and highly quantitative means for more precise and rapid quantification of cacao catechins. Fluorometric values are essentially in agreement with those reported using more cumbersome methods. In conclusion, the use of fluorescence emission spectra is a quick, practical and suitable means of quantifying catechins in cacao seeds and cocoa products.
A preliminary study for fully automated quantification of psoriasis severity using image mapping
NASA Astrophysics Data System (ADS)
Mukai, Kazuhiro; Iyatomi, Hitoshi
2014-03-01
Psoriasis is a common chronic skin disease that seriously detracts from patients' quality of life (QoL). Since there is no known permanent cure so far, maintaining an appropriate disease condition is necessary, and quantification of severity is therefore important. In clinical practice, the psoriasis area and severity index (PASI) is commonly used for this purpose; however, it is often subjective and troublesome. A fully automatic computer-assisted area and severity index (CASI) was proposed to provide an objective quantification of skin disease. It investigates the size and density of erythema based on digital image analysis; however, it does not consider various confounding effects caused by different geometrical conditions during clinical follow-up (i.e. variability in direction and distance between camera and patient). In this study, we proposed an image alignment method for clinical images and investigated quantifying the severity of psoriasis under clinical follow-up combined with the idea of CASI. The proposed method finds geometrically corresponding points on the patient's body (ROI) between images with the Scale Invariant Feature Transform (SIFT) and performs an affine transform to map the pixel values of one image onto the other. In this study, clinical images from 7 patients with psoriasis lesions on their trunk under clinical follow-up were used. In each series, our image alignment algorithm aligns images to the geometry of the first image. Our proposed method aligned images appropriately on visual assessment and confirmed that psoriasis areas were properly extracted using the CASI approach. Although we cannot compare PASI and CASI directly due to their different definitions of the ROI, we confirmed that there is a large correlation between those scores with our image quantification method.
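A generic version of the described alignment step (SIFT keypoints, ratio-test matching, robust affine estimation, and warping the follow-up image onto the baseline) can be written with OpenCV as below; the matcher settings and ratio threshold are common defaults, not values reported by the authors. The returned image can then be compared directly with the baseline when extracting erythema areas.

```python
import cv2
import numpy as np

def align_to_baseline(baseline_gray, followup_gray):
    """Align a follow-up image to the baseline using SIFT matches and an affine mapping."""
    sift = cv2.SIFT_create()
    kp1, des1 = sift.detectAndCompute(baseline_gray, None)
    kp2, des2 = sift.detectAndCompute(followup_gray, None)

    matcher = cv2.BFMatcher(cv2.NORM_L2)
    matches = matcher.knnMatch(des2, des1, k=2)
    good = [m for m, n in matches if m.distance < 0.75 * n.distance]   # Lowe ratio test

    src = np.float32([kp2[m.queryIdx].pt for m in good]).reshape(-1, 1, 2)
    dst = np.float32([kp1[m.trainIdx].pt for m in good]).reshape(-1, 1, 2)
    affine, _ = cv2.estimateAffine2D(src, dst, method=cv2.RANSAC)

    h, w = baseline_gray.shape
    return cv2.warpAffine(followup_gray, affine, (w, h))
```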
Boundary fitting based segmentation of fluorescence microscopy images
NASA Astrophysics Data System (ADS)
Lee, Soonam; Salama, Paul; Dunn, Kenneth W.; Delp, Edward J.
2015-03-01
Segmentation is a fundamental step in quantifying characteristics, such as volume, shape, and orientation of cells and/or tissue. However, quantification of these characteristics still poses a challenge due to the unique properties of microscopy volumes. This paper proposes a 2D segmentation method that utilizes a combination of adaptive and global thresholding, potentials, z direction refinement, branch pruning, end point matching, and boundary fitting methods to delineate tubular objects in microscopy volumes. Experimental results demonstrate that the proposed method achieves better performance than an active contours based scheme.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Silva-Rodríguez, Jesús, E-mail: jesus.silva.rodriguez@sergas.es; Aguiar, Pablo, E-mail: pablo.aguiar.fernandez@sergas.es; Servicio de Medicina Nuclear, Complexo Hospitalario Universidade de Santiago de Compostela
Purpose: Current procedure guidelines for whole body [18F]fluoro-2-deoxy-D-glucose (FDG)-positron emission tomography (PET) state that studies with visible dose extravasations should be rejected for quantification protocols. Our work is focused on the development and validation of methods for estimating extravasated doses in order to correct standard uptake value (SUV) values for this effect in clinical routine. Methods: One thousand three hundred sixty-seven consecutive whole body FDG-PET studies were visually inspected looking for extravasation cases. Two methods for estimating the extravasated dose were proposed and validated in different scenarios using Monte Carlo simulations. All visible extravasations were retrospectively evaluated using a manual ROI based method. In addition, the 50 patients with higher extravasated doses were also evaluated using a threshold-based method. Results: Simulation studies showed that the proposed methods for estimating extravasated doses allow us to compensate the impact of extravasations on SUV values with an error below 5%. The quantitative evaluation of patient studies revealed that paravenous injection is a relatively frequent effect (18%) with a small fraction of patients presenting considerable extravasations ranging from 1% to a maximum of 22% of the injected dose. A criterion based on the extravasated volume and maximum concentration was established in order to identify this fraction of patients that might be corrected for paravenous injection effect. Conclusions: The authors propose the use of a manual ROI based method for estimating the effectively administered FDG dose and then correct SUV quantification in those patients fulfilling the proposed criterion.
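Once the extravasated activity has been estimated, correcting the SUV amounts to replacing the nominal injected dose with the effectively administered dose in the SUV denominator. The sketch below shows that generic rescaling with hypothetical numbers; it is not necessarily the exact formulation used by the authors.

```python
def corrected_suv(suv_measured, injected_dose_mbq, extravasated_dose_mbq):
    """Rescale an SUV using the effectively administered dose.

    Assumes the measured SUV was computed with the nominal injected dose in the
    denominator, so removing the extravasated fraction increases the corrected SUV.
    """
    effective_dose = injected_dose_mbq - extravasated_dose_mbq
    return suv_measured * injected_dose_mbq / effective_dose

# Hypothetical example: 5% of a 300 MBq FDG dose left at the injection site.
print(corrected_suv(suv_measured=2.4, injected_dose_mbq=300.0, extravasated_dose_mbq=15.0))
```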
Xu, Leilei; Wang, Fang; Xu, Ying; Wang, Yi; Zhang, Cuiping; Qin, Xue; Yu, Hongxiu; Yang, Pengyuan
2015-12-07
As a key post-translational modification mechanism, protein acetylation plays critical roles in regulating and/or coordinating cell metabolism. Acetylation is a prevalent modification process in enzymes. Protein acetylation modification occurs in sub-stoichiometric amounts; therefore extracting biologically meaningful information from these acetylation sites requires an adaptable, sensitive, specific, and robust method for their quantification. In this work, we combine immunoassays and multiple reaction monitoring-mass spectrometry (MRM-MS) technology to develop an absolute quantification for acetylation modification. With this hybrid method, we quantified the acetylation level of metabolic enzymes, which could demonstrate the regulatory mechanisms of the studied enzymes. The development of this quantitative workflow is a pivotal step for advancing our knowledge and understanding of the regulatory effects of protein acetylation in physiology and pathophysiology.
Dikow, Nicola; Nygren, Anders Oh; Schouten, Jan P; Hartmann, Carolin; Krämer, Nikola; Janssen, Bart; Zschocke, Johannes
2007-06-01
Standard methods used for genomic methylation analysis allow the detection of complete absence of either methylated or non-methylated alleles but are usually unable to detect changes in the proportion of methylated and unmethylated alleles. We compare two methods for quantitative methylation analysis, using the chromosome 15q11-q13 imprinted region as model. Absence of the non-methylated paternal allele in this region leads to Prader-Willi syndrome (PWS) whilst absence of the methylated maternal allele results in Angelman syndrome (AS). A proportion of AS is caused by mosaic imprinting defects which may be missed with standard methods and require quantitative analysis for their detection. Sequence-based quantitative methylation analysis (SeQMA) involves quantitative comparison of peaks generated through sequencing reactions after bisulfite treatment. It is simple, cost-effective and can be easily established for a large number of genes. However, our results support previous suggestions that methods based on bisulfite treatment may be problematic for exact quantification of methylation status. Methylation-specific multiplex ligation-dependent probe amplification (MS-MLPA) avoids bisulfite treatment. It detects changes in both CpG methylation as well as copy number of up to 40 chromosomal sequences in one simple reaction. Once established in a laboratory setting, the method is more accurate, reliable and less time consuming.
Hawkins, Cory A; Rud, Anna; Guthrie, Margaret L; Dietz, Mark L
2015-06-26
The separation of nine N,N'-dialkylimidazolium-based ionic liquids (ILs) by an isocratic hydrophilic interaction high-performance liquid chromatographic method using an unmodified silica column was investigated. The chosen analytical conditions using a 90:10 acetonitrile-ammonium formate buffer mobile phase on a high-purity, unmodified silica column were found to be efficient, robust, and sensitive for the determination of ILs in a variety of solutions. The retention window (k' = 2-11) was narrower than that of previous methods, resulting in a 7-min runtime for the nine IL homologues. The lower limit of quantification of the method, 2-3 μmol L(-1), was significantly lower than those reported previously for HPLC-UV methods. The effects of systematically modifying the IL cation alkyl chain length, column temperature, and mobile-phase water and buffer content on solute retention were examined. Cation exchange was identified as the dominant retention mechanism for most of the solutes, with a distinct (single methylene group) transition to a dominant partitioning mode at the highest solute polarity. Copyright © 2015 Elsevier B.V. All rights reserved.
Qian, Kuangnan; Edwards, Kathleen E; Dechert, Gary J; Jaffe, Stephen B; Green, Larry A; Olmstead, William N
2008-02-01
We report a new method for rapid measurement of total acid number (TAN) and TAN boiling point (BP) distribution for petroleum crude and products. The technology is based on negative ion electrospray ionization mass spectrometry (ESI-MS) for selective ionization of petroleum acid and quantification of acid structures and molecular weight distributions. A chip-based nanoelectrospray system enables microscale (<200 mg) and higher throughput (20 samples/h) measurement. Naphthenic acid structures were assigned based on nominal masses of a set of predefined acid structures. Stearic acid is used as an internal standard to calibrate ESI-MS response factors for quantification purposes. With the use of structure-property correlations, boiling point distributions of TAN values can be calculated from the composition. The rapid measurement of TAN BP distributions by ESI is demonstrated for a series of high-TAN crudes and distillation cuts. TAN values determined by the technique agree well with those by the titration method. The distributed properties compare favorably with those measured by distillation and measurement of TAN of corresponding cuts.
Microstructural Effects on Initiation Behavior in HMX
NASA Astrophysics Data System (ADS)
Molek, Christopher; Welle, Eric; Hardin, Barrett; Vitarelli, Jim; Wixom, Ryan; Samuels, Philip
Understanding the role microstructure plays on ignition and growth behavior has been the subject of a significant body of research within the detonation physics community. The pursuit of this understanding is important because safety and performance characteristics have been shown to strongly correlate to particle morphology. Historical studies have often correlated bulk powder characteristics to the performance or safety characteristics of pressed materials. We believe that a clearer and more relevant correlation is made between the pressed microstructure and the observed detonation behavior. This type of assessment is possible, as techniques now exist for the quantification of the pressed microstructures. Our talk will report on experimental efforts that correlate directly measured microstructural characteristics to initiation threshold behavior of HMX based materials. The internal microstructures were revealed using an argon ion cross-sectioning technique. This technique enabled the quantification of density and interface area of the pores within the pressed bed using methods of stereology. These bed characteristics are compared to the initiation threshold behavior of three HMX based materials using an electric gun based test method. Finally, a comparison of experimental threshold data to supporting theoretical efforts will be made.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Tsuchiya, Hikaru; Tanaka, Keiji, E-mail: tanaka-kj@igakuken.or.jp; Saeki, Yasushi, E-mail: saeki-ys@igakuken.or.jp
2013-06-28
Highlights: •The parallel reaction monitoring method was applied to ubiquitin quantification. •The ubiquitin PRM method is highly sensitive even in biological samples. •Using the method, we revealed that Ufd4 assembles the K29-linked ubiquitin chain. -- Abstract: Ubiquitylation is an essential posttranslational protein modification that is implicated in a diverse array of cellular functions. Although cells contain eight structurally distinct types of polyubiquitin chains, the detailed function of several chain types, including K29-linked chains, has remained largely unclear. Current mass spectrometry (MS)-based quantification methods are highly inefficient for low-abundance atypical chains, such as K29- and M1-linked chains, in complex mixtures that typically contain highly abundant proteins. In this study, we applied parallel reaction monitoring (PRM), a quantitative, high-resolution MS method, to quantify ubiquitin chains. The ubiquitin PRM method allows us to quantify 100 attomole amounts of all possible ubiquitin chains in cell extracts. Furthermore, we quantified ubiquitylation levels of ubiquitin-proline-β-galactosidase (Ub-P-βgal), a historically known model substrate of the ubiquitin fusion degradation (UFD) pathway. In wild-type cells, Ub-P-βgal is modified with ubiquitin chains consisting of 21% K29- and 78% K48-linked chains. In contrast, K29-linked chains are not detected in UFD4 knockout cells, suggesting that Ufd4 assembles the K29-linked ubiquitin chain(s) on Ub-P-βgal in vivo. Thus, the ubiquitin PRM is a novel, useful, quantitative method for analyzing the highly complicated ubiquitin system.
Serum protein measurement using a tapered fluorescent fibre-optic evanescent wave-based biosensor
NASA Astrophysics Data System (ADS)
Preejith, P. V.; Lim, C. S.; Chia, T. F.
2006-12-01
A novel method to measure the total serum protein concentration is described in this paper. The method is based on the principles of fibre-optic evanescent wave spectroscopy. The biosensor applies a fluorescent dye-immobilized porous glass coating on a multi-mode optical fibre. The evanescent wave's intensity at the fibre-optic core-cladding interface is used to monitor the protein-induced changes in the sensor element. The sensor offers a rapid, single-step method for quantifying protein concentrations without destroying the sample. This unique sensing method presents a sensitive and accurate platform for the quantification of protein.
Kim, Joo-Hwan; Kim, Jin Ho; Wang, Pengbin; Park, Bum Soo; Han, Myung-Soo
2016-01-01
The identification and quantification of Heterosigma akashiwo cysts in sediments by light microscopy can be difficult due to the small size and morphology of the cysts, which are often indistinguishable from those of other types of algae. Quantitative real-time PCR (qPCR) based assays represent a potentially efficient method for quantifying the abundance of H. akashiwo cysts, although standard curves must be based on cyst DNA rather than on vegetative cell DNA due to differences in gene copy number and DNA extraction yield between these two cell types. Furthermore, qPCR on sediment samples can be complicated by the presence of extracellular DNA debris. To solve these problems, we constructed a cyst-based standard curve and developed a simple method for removing DNA debris from sediment samples. This cyst-based standard curve was compared with a standard curve based on vegetative cells, as vegetative cells may have twice the gene copy number of cysts. To remove DNA debris from the sediment, we developed a simple method involving dilution with distilled water and heating at 75°C. A total of 18 sediment samples were used to evaluate this method. Cyst abundance determined using the qPCR assay without DNA debris removal yielded results up to 51-fold greater than with direct counting. By contrast, a highly significant correlation was observed between cyst abundance determined by direct counting and the qPCR assay in conjunction with DNA debris removal (r2 = 0.72, slope = 1.07, p < 0.001). Therefore, this improved qPCR method should be a powerful tool for the accurate quantification of H. akashiwo cysts in sediment samples.
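A cyst-based standard curve of the kind described is, in practice, a linear fit of Cq against log10(cyst number); sample abundances are then read back off the fitted line. A minimal sketch with invented calibration points:

```python
import numpy as np

# Hypothetical cyst-based standard curve: Cq measured for known cyst numbers per reaction.
cysts = np.array([10, 100, 1000, 10000])
cq    = np.array([33.1, 29.6, 26.2, 22.8])

slope, intercept = np.polyfit(np.log10(cysts), cq, 1)
efficiency = 10 ** (-1.0 / slope) - 1.0           # amplification efficiency from the slope

def cysts_from_cq(sample_cq):
    """Estimate cysts per reaction from a sample Cq via the standard curve."""
    return 10 ** ((sample_cq - intercept) / slope)

print(f"PCR efficiency ≈ {efficiency:.0%}, sample estimate: {cysts_from_cq(27.5):.0f} cysts")
```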
A SIMPLE METHOD FOR THE EXTRACTION AND QUANTIFICATION OF PHOTOPIGMENTS FROM SYMBIODINIUM SPP.
John E. Rogers and Dragoslav Marcovich. Submitted. Simple Method for the Extraction and Quantification of Photopigments from Symbiodinium spp.. Limnol. Oceanogr. Methods. 19 p. (ERL,GB 1192).
We have developed a simple, mild extraction procedure using methanol which, when...
Sun, Xiaohong; Ouyang, Yue; Chu, Jinfang; Yan, Jing; Yu, Yan; Li, Xiaoqiang; Yang, Jun; Yan, Cunyu
2014-04-18
A sensitive and reliable in-advance stable isotope labeling strategy was developed for the simultaneous relative quantification of 8 acidic plant hormones in sub-milligram amounts of plant material. Bromocholine bromide (BETA) and its deuterated counterpart D9-BETA were used to derivatize control and sample extracts individually in advance; the extracts were then combined and subjected to solid-phase extraction (SPE) purification followed by UPLC-MS/MS analysis. Relative quantification of target compounds was obtained by calculating the peak area ratios of BETA/D9-BETA labeled plant hormones. The in-advance stable isotope labeling strategy enables internal-standard-based relative quantification of multiple classes of plant hormones without requiring an internal standard for every analyte, with sensitivity enhanced by 1-3 orders of magnitude. Meanwhile, the in-advance labeling contributes to higher sample throughput and greater reliability. The method was successfully applied to determine 8 plant hormones in 0.8 mg DW (dry weight) of seedlings and 4 plant hormones from a single seed of Arabidopsis thaliana. The results show the potential of the method for relative quantification of multiple plant hormones in tiny plant tissues or organs, which will advance knowledge of the crosstalk mechanisms of plant hormones. Copyright © 2014 Elsevier B.V. All rights reserved.
Krajewska, Maryla; Smith, Layton H.; Rong, Juan; Huang, Xianshu; Hyer, Marc L.; Zeps, Nikolajs; Iacopetta, Barry; Linke, Steven P.; Olson, Allen H.; Reed, John C.; Krajewski, Stan
2009-01-01
Cell death is of broad physiological and pathological importance, making quantification of biochemical events associated with cell demise a high priority for experimental pathology. Fibrosis is a common consequence of tissue injury involving necrotic cell death. Using tissue specimens from experimental mouse models of traumatic brain injury, cardiac fibrosis, and cancer, as well as human tumor specimens assembled in tissue microarray (TMA) format, we undertook computer-assisted quantification of specific immunohistochemical and histological parameters that characterize processes associated with cell death. In this study, we demonstrated the utility of image analysis algorithms for color deconvolution, colocalization, and nuclear morphometry to characterize cell death events in tissue specimens: (a) subjected to immunostaining for detecting cleaved caspase-3, cleaved poly(ADP-ribose)-polymerase, cleaved lamin-A, phosphorylated histone H2AX, and Bcl-2; (b) analyzed by terminal deoxyribonucleotidyl transferase–mediated dUTP nick end labeling assay to detect DNA fragmentation; and (c) evaluated with Masson's trichrome staining. We developed novel algorithm-based scoring methods and validated them using TMAs as a high-throughput format. The proposed computer-assisted scoring methods for digital images by brightfield microscopy permit linear quantification of immunohistochemical and histochemical stainings. Examples are provided of digital image analysis performed in automated or semiautomated fashion for successful quantification of molecular events associated with cell death in tissue sections. (J Histochem Cytochem 57:649–663, 2009) PMID:19289554
Pérez-Castaño, Estefanía; Sánchez-Viñas, Mercedes; Gázquez-Evangelista, Domingo; Bagur-González, M Gracia
2018-01-15
This paper describes and discusses the application of chromatographic fingerprints of trimethylsilyl (TMS) derivatives of 4,4'-desmethylsterols (obtained from an off-line HPLC-GC-FID system) to the quantification of extra virgin olive oil in commercial vinaigrettes, salad dressings and in-house reference materials (i-HRM) using two different Partial Least Squares Regression (PLS-R) multivariate quantification methods. Several data pre-processing steps were carried out, the full sequence being: (i) internal normalization; (ii) sampling based on the Nyquist theorem; (iii) internal correlation optimized shifting, icoshift; (iv) baseline correction; (v) mean centering; and (vi) selection of zones. The first model corresponds to a matrix of dimensions 'n×911' variables and the second one to a matrix of dimensions 'n×431' variables. It has to be highlighted that the proposed two PLS-R models allow the quantification of extra virgin olive oil in binary blends, foodstuffs, etc., when the percentage present is greater than 25%. Copyright © 2017 Elsevier Ltd. All rights reserved.
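A PLS-R calibration of the second model's size ('n×431' fingerprint variables against reference EVOO percentages) can be prototyped with scikit-learn as below. The data are simulated purely so the example runs; the number of latent variables and the cross-validation scheme are assumptions, not the authors' settings.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict

# Hypothetical pre-processed fingerprint matrix X (n samples x 431 retained variables)
# and reference extra virgin olive oil percentages y for calibration blends.
rng = np.random.default_rng(7)
X = rng.normal(size=(30, 431))
y = rng.uniform(25, 100, size=30)
X[:, :50] += y[:, None] / 100.0                     # inject a correlated signal for the demo

pls = PLSRegression(n_components=5)
y_cv = cross_val_predict(pls, X, y, cv=5).ravel()   # cross-validated predictions
rmsecv = np.sqrt(np.mean((y - y_cv) ** 2))
print(f"RMSECV ≈ {rmsecv:.1f} % EVOO")

pls.fit(X, y)
print("predicted EVOO content of one sample:", float(pls.predict(X[:1]).ravel()[0]))
```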
Recommended Immunological Assays to Screen for Ricin-Containing Samples
Simon, Stéphanie; Worbs, Sylvia; Avondet, Marc-André; Tracz, Dobryan M.; Dano, Julie; Schmidt, Lisa; Volland, Hervé; Dorner, Brigitte G.; Corbett, Cindi R.
2015-01-01
Ricin, a toxin from the plant Ricinus communis, is one of the most toxic biological agents known. Due to its availability, toxicity, ease of production and absence of curative treatments, ricin has been classified by the Centers for Disease Control and Prevention (CDC) as a category B biological weapon and it is scheduled as a List 1 compound in the Chemical Weapons Convention. An international proficiency test (PT) was conducted to evaluate the detection and quantification capabilities of 17 expert laboratories. One goal of this exercise was to analyse the laboratories' capacity to detect and differentiate ricin and the less toxic, but highly homologous, protein R. communis agglutinin (RCA120). Six analytical strategies are presented in this paper based on immunological assays (four immunoenzymatic assays and two immunochromatographic tests). Using these immunological methods, "dangerous" samples containing ricin and/or RCA120 were successfully identified. Based on the different antibodies used, the detection and quantification of ricin and RCA120 were successful. The ricin PT highlighted the performance of different immunological approaches that are exemplarily recommended for highly sensitive and precise quantification of ricin. PMID:26703725
Chen, Si; Weddell, Jared; Gupta, Pavan; Conard, Grace; Parkin, James; Imoukhuede, Princess I
2017-01-01
Nanosensor-based detection of biomarkers can improve medical diagnosis; however, a critical factor in nanosensor development is deciding which biomarker to target, as most diseases present several biomarkers. Biomarker-targeting decisions can be informed via an understanding of biomarker expression. Currently, immunohistochemistry (IHC) is the accepted standard for profiling biomarker expression. While IHC provides a relative mapping of biomarker expression, it does not provide cell-by-cell readouts of biomarker expression or absolute biomarker quantification. Flow cytometry overcomes both these IHC challenges by offering biomarker expression on a cell-by-cell basis, and when combined with calibration standards, providing quantitation of biomarker concentrations: this is known as qFlow cytometry. Here, we outline the key components for applying qFlow cytometry to detect biomarkers within the angiogenic vascular endothelial growth factor receptor family. The key aspects of the qFlow cytometry methodology include: antibody specificity testing, immunofluorescent cell labeling, saturation analysis, fluorescent microsphere calibration, and quantitative analysis of both ensemble and cell-by-cell data. Together, these methods enable high-throughput quantification of biomarker expression.
Lee, Da-Sheng
2010-01-01
Chip-based DNA quantification systems are widespread, and used in many point-of-care applications. However, instruments for such applications may not be maintained or calibrated regularly. Since machine reliability is a key issue for normal operation, this study presents a system model of the real-time Polymerase Chain Reaction (PCR) machine to analyze the instrument design through numerical experiments. Based on model analysis, a systematic approach was developed to lower the variation of DNA quantification and achieve a robust design for a real-time PCR-on-a-chip system. Accelerated life testing was adopted to evaluate the reliability of the chip prototype. According to the life test plan, this proposed real-time PCR-on-a-chip system was simulated to work continuously for over three years with similar reproducibility in DNA quantification. This not only shows the robustness of the lab-on-a-chip system, but also verifies the effectiveness of our systematic method for achieving a robust design. PMID:22315563
Fingerprinting and quantification of GMOs in the agro-food sector.
Taverniers, I; Van Bockstaele, E; De Loose, M
2003-01-01
Most strategies for analyzing GMOs in plants and derived food and feed products are based on the polymerase chain reaction (PCR) technique. In conventional PCR methods, a 'known' sequence between two specific primers is amplified. By contrast, with the 'anchor PCR' technique, unknown sequences adjacent to a known sequence can be amplified. Because T-DNA/plant border sequences are amplified, anchor PCR is the perfect tool for unique identification of transgenes, including non-authorized GMOs. In this work, anchor PCR was applied to characterize the 'transgene locus' and to clarify the complete molecular structure of at least six different commercial transgenic plants. Based on sequences of T-DNA/plant border junctions obtained by anchor PCR, event-specific primers were developed. The junction fragments, together with endogenous reference gene targets, were cloned in plasmids. The latter were then used as event-specific calibrators in real-time PCR, a new technique for the accurate relative quantification of GMOs. We demonstrate here the importance of anchor PCR for identification and the usefulness of plasmid DNA calibrators in quantification strategies for GMOs throughout the agro-food sector.
Methods and systems for detecting abnormal digital traffic
Goranson, Craig A [Kennewick, WA; Burnette, John R [Kennewick, WA
2011-03-22
Aspects of the present invention encompass methods and systems for detecting abnormal digital traffic by assigning characterizations of network behaviors according to knowledge nodes and calculating a confidence value based on the characterizations from at least one knowledge node and on weighting factors associated with the knowledge nodes. The knowledge nodes include a characterization model based on prior network information. At least one of the knowledge nodes should not be based on fixed thresholds or signatures. The confidence value includes a quantification of the degree of confidence that the network behaviors constitute abnormal network traffic.
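The patent text above does not spell out the combination rule, so the sketch below simply assumes a normalized weighted average of per-node characterizations as the confidence value; the node names, scores, and weights are hypothetical.

```python
from typing import Dict

def confidence(characterizations: Dict[str, float], weights: Dict[str, float]) -> float:
    """Weighted combination of per-knowledge-node characterizations (each in 0..1).

    Returns a value in [0, 1] quantifying the degree of confidence that the
    observed behavior constitutes abnormal network traffic.
    """
    total_weight = sum(weights.values())
    return sum(weights[node] * characterizations[node] for node in weights) / total_weight

# Hypothetical knowledge nodes scoring one traffic flow.
scores  = {"flow_volume_model": 0.9, "port_entropy_model": 0.4, "timing_model": 0.7}
weights = {"flow_volume_model": 2.0, "port_entropy_model": 1.0, "timing_model": 1.0}
print(f"abnormality confidence: {confidence(scores, weights):.2f}")
```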
Robust approaches to quantification of margin and uncertainty for sparse data
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hund, Lauren; Schroeder, Benjamin B.; Rumsey, Kelin
Characterizing the tails of probability distributions plays a key role in quantification of margins and uncertainties (QMU), where the goal is characterization of low probability, high consequence events based on continuous measures of performance. When data are collected using physical experimentation, probability distributions are typically fit using statistical methods based on the collected data, and these parametric distributional assumptions are often used to extrapolate about the extreme tail behavior of the underlying probability distribution. In this project, we characterize the risk associated with such tail extrapolation. Specifically, we conducted a scaling study to demonstrate the large magnitude of the risk; then, we developed new methods for communicating risk associated with tail extrapolation from unvalidated statistical models; lastly, we proposed a Bayesian data-integration framework to mitigate tail extrapolation risk through integrating additional information. We conclude that decision-making using QMU is a complex process that cannot be achieved using statistical analyses alone.
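The tail-extrapolation risk discussed above can be illustrated with a small simulation: fitting a normal distribution to a sparse sample from a heavier-tailed distribution and extrapolating an extreme quantile. The distributions and sample size below are arbitrary illustrative choices, not the study's scaling experiment.

```python
import numpy as np
from scipy import stats

true_dist = stats.t(df=3)                            # heavy-tailed "reality"
sample = true_dist.rvs(size=50, random_state=42)     # sparse experimental data

mu, sigma = np.mean(sample), np.std(sample, ddof=1)
q = 0.999
extrapolated = stats.norm(mu, sigma).ppf(q)          # fitted-model tail claim
actual = true_dist.ppf(q)                            # true tail quantile

print(f"normal-fit 99.9% quantile: {extrapolated:.2f}")
print(f"true 99.9% quantile:       {actual:.2f}")    # substantially larger
```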
Dewaraja, Yuni K.; Frey, Eric C.; Sgouros, George; Brill, A. Bertrand; Roberson, Peter; Zanzonico, Pat B.; Ljungberg, Michael
2012-01-01
In internal radionuclide therapy, a growing interest in voxel-level estimates of tissue-absorbed dose has been driven by the desire to report radiobiologic quantities that account for the biologic consequences of both spatial and temporal nonuniformities in these dose estimates. This report presents an overview of 3-dimensional SPECT methods and requirements for internal dosimetry at both regional and voxel levels. Combined SPECT/CT image-based methods are emphasized, because the CT-derived anatomic information allows one to address multiple technical factors that affect SPECT quantification while facilitating the patient-specific voxel-level dosimetry calculation itself. SPECT imaging and reconstruction techniques for quantification in radionuclide therapy are not necessarily the same as those designed to optimize diagnostic imaging quality. The current overview is intended as an introduction to an upcoming series of MIRD pamphlets with detailed radionuclide-specific recommendations intended to provide best-practice SPECT quantification–based guidance for radionuclide dosimetry. PMID:22743252
NASA Astrophysics Data System (ADS)
Zhang, Xuefeng; Liu, Bo; Wang, Jieqiong; Zhang, Zhe; Shi, Kaibo; Wu, Shuanglin
2014-08-01
Commonly used petrological quantification methods are visual estimation, counting, and image analyses. In this article, an Adobe Photoshop-based analysis method (PSQ) is recommended for quantifying rock textural data and porosities. Adobe Photoshop provides versatile tools for selecting an area of interest, and the pixel count of a selection can be read and used to calculate its area percentage. Therefore, Adobe Photoshop can be used to rapidly quantify textural components, such as the content of grains, cements, and porosities, including total porosity and porosities of different genetic types. This method was named Adobe Photoshop Quantification (PSQ). The workflow of the PSQ method is introduced using oolitic dolomite samples from the Triassic Feixianguan Formation, northeastern Sichuan Basin, China, as an example. The method was tested by comparison with Folk's and Shvetsov's "standard" diagrams. In both cases, there is close agreement between the "standard" percentages and those determined by the PSQ method, with small counting and operator errors, small standard deviations, and high confidence levels. The porosities quantified by PSQ were evaluated against those determined by the whole-rock helium gas expansion method to test the specimen errors. The results show that the porosities quantified by PSQ correlate well with the porosities determined by the conventional helium gas expansion method. The generally small discrepancies (mostly ranging from -3% to 3%) are caused by microporosities, which lead to a systematic underestimation of about 2%, and/or by macroporosities, which cause underestimation or overestimation depending on the case. Adobe Photoshop can thus be used to quantify rock textural components and porosities. The method has been tested to be precise and accurate, and it is time saving compared with the usual methods.
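The pixel-count logic behind PSQ can be sketched as follows; a simple intensity threshold stands in for Photoshop's interactive selection tools, and the synthetic image is only for illustration.

```python
import numpy as np

def area_percentage(image: np.ndarray, selection_mask: np.ndarray) -> float:
    """Area fraction (%) of the selected pixels relative to the whole image."""
    return 100.0 * selection_mask.sum() / image.size

# Synthetic grayscale thin-section image; pores assumed darker than grains.
rng = np.random.default_rng(1)
image = rng.integers(0, 256, size=(512, 512))
pore_mask = image < 40           # stand-in for an interactive Photoshop selection
print(f"porosity estimate: {area_percentage(image, pore_mask):.1f}%")
```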
Seibert, Cathrin; Davidson, Brian R; Fuller, Barry J; Patterson, Laurence H; Griffiths, William J; Wang, Yuqin
2009-04-01
Here we report the identification and approximate quantification of cytochrome P450 (CYP) proteins in human liver microsomes as determined by nano-LC-MS/MS with application of the exponentially modified protein abundance index (emPAI) algorithm during database searching. Protocols based on 1D-gel protein separation and 2D-LC peptide separation gave comparable results. In total, 18 CYP isoforms were unambiguously identified based on unique peptide matches. Further, we have determined the absolute quantity of two CYP enzymes (2E1 and 1A2) in human liver microsomes using stable-isotope dilution mass spectrometry, where microsomal proteins were separated by 1D-gel electrophoresis, digested with trypsin in the presence of either a CYP2E1- or 1A2-specific stable-isotope labeled tryptic peptide and analyzed by LC-MS/MS. Using multiple reaction monitoring (MRM) for the isotope-labeled tryptic peptides and their natural unlabeled analogues quantification could be performed over the range of 0.1-1.5 pmol on column. Liver microsomes from four individuals were analyzed for CYP2E1 giving values of 88-200 pmol/mg microsomal protein. The CYP1A2 content of microsomes from a further three individuals ranged from 165 to 263 pmol/mg microsomal protein. Although, in this proof-of-concept study for CYP quantification, the two CYP isoforms were quantified from different samples, there are no practical reasons to prevent multiplexing the method to allow the quantification of multiple CYP isoforms in a single sample.
Barrera-Escorcia, Guadalupe; Wong-Chang, Irma; Fernández-Rendón, Carlos Leopoldo; Botello, Alfonso Vázquez; Gómez-Gil, Bruno; Lizárraga-Partida, Marcial Leonardo
2016-11-01
Oysters can accumulate potentially pathogenic water bacteria. The objective of this study was to compare two procedures for quantifying Vibrio species present in oysters, to determine the most sensitive method. We analyzed oyster samples from the Gulf of Mexico commercialized in Mexico City. The samples were inoculated in tubes with alkaline peptone water (APW), based on three tubes and four dilutions (10(-1) to 10(-4)). From these tubes, the first quantification of Vibrio species was performed (most probable number (MPN) from tubes), and bacteria were inoculated by streaking on thiosulfate-citrate-bile salts-sucrose (TCBS) Petri dishes. Colonies were isolated for a second quantification (MPN from dishes). Polymerase chain reaction (PCR) was used to determine species with specific primers: ompW for Vibrio cholerae, tlh for Vibrio parahaemolyticus, and VvhA for Vibrio vulnificus. Simultaneously, the sanitary quality of the oysters was determined. The quantification of V. parahaemolyticus was significantly higher in APW tubes than in TCBS dishes. Regarding V. vulnificus counts, the differences between the two approaches were not significant. In contrast, the MPNs of V. cholerae obtained from dishes were higher than from tubes. MPN quantification by PCR of V. parahaemolyticus and V. vulnificus from APW was sensitive and is recommended for the detection of both species. In contrast, to quantify V. cholerae, it was necessary to isolate colonies on TCBS prior to PCR. Culturing in APW at 42 °C could be an alternative to avoid colony isolation. The MPNs of V. cholerae from dishes were associated with the poor sanitary quality of the samples.
Quantifying differences in land use emission estimates implied by definition discrepancies
NASA Astrophysics Data System (ADS)
Stocker, B. D.; Joos, F.
2015-11-01
The quantification of CO2 emissions from anthropogenic land use and land use change (eLUC) is essential to understand the drivers of the atmospheric CO2 increase and to inform climate change mitigation policy. Reported values in synthesis reports are commonly derived from different approaches (observation-driven bookkeeping and process-modelling), but recent work has emphasized that inconsistencies between methods may imply substantial differences in eLUC estimates. However, a consistent quantification is lacking and no concise modelling protocol for the separation of primary and secondary components of eLUC has been established. Here, we review differences between eLUC quantification methods and apply an Earth System Model (ESM) of Intermediate Complexity to quantify them. We find that the magnitude of effects due to merely conceptual differences between ESM- and offline vegetation model-based quantifications is ~20% today. Under a future business-as-usual scenario, differences tend to increase further due to slowing land conversion rates and an increasing impact of altered environmental conditions on land-atmosphere fluxes. We establish how coupled Earth System Models may be applied to separate secondary component fluxes of eLUC arising from the replacement of potential C sinks/sources and the land use feedback, and show that secondary fluxes derived from offline vegetation models are conceptually and quantitatively not identical to either of these, nor to their sum. Therefore, we argue that synthesis studies should resort to the "least common denominator" of different methods, following the bookkeeping approach where only primary land use emissions are quantified under the assumption of constant environmental boundary conditions.
Deng, Ning; Li, Zhenye; Pan, Chao; Duan, Huilong
2015-01-01
The study of complex proteomes places greater demands on quantification methods that use mass spectrometry technology. In this paper, we present a mass spectrometry label-free quantification tool for complex proteomes, called freeQuant, which effectively integrates quantification with functional analysis. freeQuant consists of two well-integrated modules: label-free quantification and functional analysis with biomedical knowledge. freeQuant supports label-free quantitative analysis that makes full use of tandem mass spectrometry (MS/MS) spectral counts, protein sequence length, shared peptides, and ion intensity. It adopts spectral counting for quantitative analysis and introduces a new method for handling shared peptides to accurately evaluate the abundance of isoforms. For proteins with low abundance, MS/MS total ion count coupled with spectral count is included to ensure accurate protein quantification. Furthermore, freeQuant supports large-scale functional annotation for complex proteomes. Mitochondrial proteomes from the mouse heart, the mouse liver, and the human heart were used to evaluate the usability and performance of freeQuant. The evaluation showed that the quantitative algorithms implemented in freeQuant improve the accuracy of quantification and provide a better dynamic range.
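A minimal sketch of spectral-count-based label-free quantification in the spirit of the description above is given below, using a length-normalized spectral count (NSAF-style) measure; the actual freeQuant handling of shared peptides and ion intensities is more elaborate, and the counts shown are invented.

```python
import numpy as np

proteins = ["P1", "P2", "P3"]
spectral_counts = np.array([120.0, 45.0, 300.0])   # MS/MS spectra matched per protein
lengths = np.array([450.0, 220.0, 1200.0])         # protein sequence lengths (residues)

saf = spectral_counts / lengths                    # length-normalized spectral counts
nsaf = saf / saf.sum()                             # normalized relative abundance

for name, value in zip(proteins, nsaf):
    print(f"{name}: relative abundance {value:.3f}")
```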
Collender, Philip A; Kirby, Amy E; Addiss, David G; Freeman, Matthew C; Remais, Justin V
2015-12-01
Limiting the environmental transmission of soil-transmitted helminths (STHs), which infect 1.5 billion people worldwide, will require sensitive, reliable, and cost-effective methods to detect and quantify STHs in the environment. We review the state-of-the-art of STH quantification in soil, biosolids, water, produce, and vegetation with regard to four major methodological issues: environmental sampling; recovery of STHs from environmental matrices; quantification of recovered STHs; and viability assessment of STH ova. We conclude that methods for sampling and recovering STHs require substantial advances to provide reliable measurements for STH control. Recent innovations in the use of automated image identification and developments in molecular genetic assays offer considerable promise for improving quantification and viability assessment. Copyright © 2015 Elsevier Ltd. All rights reserved.
Comparison of tissue processing methods for microvascular visualization in axolotls.
Montoro, Rodrigo; Dickie, Renee
2017-01-01
The vascular system, the pipeline for oxygen and nutrient delivery to tissues, is essential for vertebrate development, growth, injury repair, and regeneration. With their capacity to regenerate entire appendages throughout their lifespan, axolotls are an unparalleled model for vertebrate regeneration, but they lack many of the molecular tools that facilitate vascular imaging in other animal models. The determination of vascular metrics requires high quality image data for the discrimination of vessels from background tissue. Quantification of the vasculature using perfused, cleared specimens is well-established in mammalian systems, but has not been widely employed in amphibians. The objective of this study was to optimize tissue preparation methods for the visualization of the microvascular network in axolotls, providing a basis for the quantification of regenerative angiogenesis. To accomplish this aim, we performed intracardiac perfusion of pigment-based contrast agents and evaluated aqueous and non-aqueous clearing techniques. The methods were verified by comparing the quality of the vascular images and the observable vascular density across treatment groups. Simple and inexpensive, these tissue processing techniques will be of use in studies assessing vascular growth and remodeling within the context of regeneration. Advantages of this method include: •Higher contrast of the vasculature within the 3D context of the surrounding tissue •Enhanced detection of microvasculature facilitating vascular quantification •Compatibility with other labeling techniques.
Fang, Yu-Hua Dean; Chiu, Shao-Chieh; Lu, Chin-Song; Weng, Yi-Hsin
2015-01-01
Purpose. We aimed at improving the existing methods for the fully automatic quantification of striatal uptake of [99mTc]-TRODAT with SPECT imaging. Procedures. A normal [99mTc]-TRODAT template was first formed based on 28 healthy controls. Images from PD patients (n = 365) and nPD subjects (28 healthy controls and 33 essential tremor patients) were spatially normalized to the normal template. We performed an inverse transform on the predefined striatal and reference volumes of interest (VOIs) and applied the transformed VOIs to the original image data to calculate the striatal-to-reference ratio (SRR). The diagnostic performance of the SRR was determined through receiver operating characteristic (ROC) analysis. Results. The SRR measured with our new and automatic method demonstrated excellent diagnostic performance with 92% sensitivity, 90% specificity, 92% accuracy, and an area under the curve (AUC) of 0.94. For the evaluation of the mean SRR and the clinical duration, a quadratic function fit the data with R2 = 0.84. Conclusions. We developed and validated a fully automatic method for the quantification of the SRR in a large study sample. This method has an excellent diagnostic performance and exhibits a strong correlation between the mean SRR and the clinical duration in PD patients. PMID:26366413
New class of radioenzymatic assay for the quantification of p-tyramine and phenylethylamine
DOE Office of Scientific and Technical Information (OSTI.GOV)
Henry, D.P.; Van Huysse, J.W.; Bowsher, R.R.
Radioenzymatic assays are widely used for the quantification of a number of biogenic amines. All previous procedures have utilized methyltransferases derived from mammalian tissues. In this assay for the quantification of the trace aralkylamines p-tyramine (p-tym) and phenylethylamine (PEA), tyramine N-methyltransferase, an enzyme isolated from sprouted barley roots, was used. The enzyme was specific for phenylethylamines. Of 26 structurally related compounds, only p-tym, PEA, m-tym and amphetamine were substrates in vitro. Theoretical maximal methylation of substrates occurred at 10-20 °C. When TLC was used to separate the radiolabeled reaction products, a specific method was developed for p-tym and PEA. The assay had a sensitivity of 0.8 and 2.8 pg/tube, respectively, with a C.V. < 5% and was applicable to human plasma and urine. Assay throughput is similar to that of other TLC-based radioenzymatic assays.
Addressing matrix effects in ligand-binding assays through the use of new reagents and technology.
Chilewski, Shannon D; Mora, Johanna R; Gleason, Carol; DeSilva, Binodh
2014-04-01
Ligand-binding assays (LBAs) used in the quantification of biotherapeutics for pharmacokinetic determinations rely on interactions between reagents (antibodies or target molecule) and the biotherapeutic. Most LBAs do not employ an analyte extraction procedure and are therefore susceptible to matrix interference. Here, we present a case study on the development of an LBA for the quantification of a PEGylated domain antibody where matrix interference was observed. The assay used to support the single ascending dose study was a plate-based electrochemiluminescent assay with a lower limit of quantification of 80 ng/mL. To meet the sensitivity requirements of future studies, new reagents and the Gyrolab™ Workstation were evaluated. Assay sensitivity improved nearly threefold in the final method utilizing new antibody reagents, a buffer containing blockers to human anti-animal antibodies, and the Gyrolab Workstation. Experimental data indicate that all of the factors changed played a role in overcoming matrix effects.
Experimental validation of 2D uncertainty quantification for DIC.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Reu, Phillip L.
Because digital image correlation (DIC) has become such an important and standard tool in the toolbox of experimental mechanicists, a complete uncertainty quantification of the method is needed. It should be remembered that each DIC setup and series of images will have a unique uncertainty based on the calibration quality and the image and speckle quality of the analyzed images. Any pretest work done with a calibrated DIC stereo-rig to quantify the errors using known shapes and translations, while useful, does not necessarily reveal the uncertainty of a later test. This is particularly true with high-speed applications, where actual test images are often less than ideal. Work has previously been completed on the mathematical underpinnings of DIC uncertainty quantification and is already published; this paper presents corresponding experimental work used to check the validity of the uncertainty equations.
Le Troter, Arnaud; Fouré, Alexandre; Guye, Maxime; Confort-Gouny, Sylviane; Mattei, Jean-Pierre; Gondin, Julien; Salort-Campana, Emmanuelle; Bendahan, David
2016-04-01
Atlas-based segmentation is a powerful method for automatic structural segmentation of several sub-structures in many organs. However, such an approach has rarely been used in the context of muscle segmentation, and so far no study has assessed such a method for the automatic delineation of individual muscles of the quadriceps femoris (QF). In the present study, we have evaluated a fully automated multi-atlas method and a semi-automated single-atlas method for the segmentation and volume quantification of the four muscles of the QF and for the QF as a whole. The study was conducted in 32 young healthy males, using high-resolution magnetic resonance images (MRI) of the thigh. The multi-atlas-based segmentation method was conducted in 25 subjects. Different non-linear registration approaches based on free-form deformable (FFD) and symmetric diffeomorphic normalization (SyN) algorithms were assessed. Optimal parameters of two fusion methods, i.e., STAPLE and STEPS, were determined on the basis of the highest Dice similarity index (DSI), considering manual segmentation (MSeg) as the ground truth. Validation and reproducibility of this pipeline were determined using another MRI dataset recorded in seven healthy male subjects on the basis of additional metrics such as muscle volume similarity values, intraclass coefficient, and coefficient of variation. Both non-linear registration methods (FFD and SyN) were also evaluated as part of a single-atlas strategy in order to assess longitudinal muscle volume measurements. The multi- and single-atlas approaches were compared for the segmentation and volume quantification of the four muscles of the QF and for the QF as a whole. Considering each muscle of the QF, the DSI of the multi-atlas-based approach was high (0.87 ± 0.11), and the best results were obtained with the combination of two deformation fields resulting from the SyN registration method and the STEPS fusion algorithm. The optimal variables for the FFD and SyN registration methods were four templates and a kernel standard deviation ranging between 5 and 8. The segmentation process using the single-atlas-based method was more robust, with DSI values higher than 0.9. In terms of muscle volume measurements, the multi-atlas-based strategy provided acceptable results for the QF muscle as a whole but highly variable results for individual muscles. In contrast, the performance of the single-atlas-based pipeline for individual muscles was highly comparable to MSeg, thereby indicating that this method would be adequate for longitudinal tracking of muscle volume changes in healthy subjects. In the present study, we demonstrated that both multi-atlas and single-atlas approaches were relevant for the segmentation of individual muscles of the QF in healthy subjects. Considering muscle volume measurements, the single-atlas method provided promising perspectives regarding longitudinal quantification of individual muscle volumes.
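The Dice similarity index used as the evaluation metric above can be computed as in the following sketch, where the masks are toy boolean arrays rather than real muscle segmentations.

```python
import numpy as np

def dice(auto_mask: np.ndarray, manual_mask: np.ndarray) -> float:
    """Dice similarity index: 2|A∩B| / (|A| + |B|) on boolean voxel masks."""
    intersection = np.logical_and(auto_mask, manual_mask).sum()
    denom = auto_mask.sum() + manual_mask.sum()
    return 2.0 * intersection / denom if denom else 1.0

# Toy 3D masks standing in for one muscle label (automatic vs. manual).
rng = np.random.default_rng(2)
manual = rng.random((40, 64, 64)) > 0.6
auto = manual.copy()
auto[:, :2, :] = ~auto[:, :2, :]        # perturb a few voxels to mimic segmentation errors
print(f"DSI = {dice(auto, manual):.3f}")
```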
DOE Office of Scientific and Technical Information (OSTI.GOV)
Du, Dan; Chen, Aiqiong; Xie, Yunying
2011-05-15
A new sandwich-like electrochemical immunosensor has been developed for quantification of organophosphorylated acetylcholinesterase (OP-AChE), an exposure biomarker of organophosphate pesticides and nerve agents. Zirconia nanoparticles (ZrO2 NPs) were anchored on a screen printed electrode (SPE) to preferentially capture OP-AChE adducts by metal chelation with phospho-moieties, which were selectively recognized by a lead phosphate-apoferritin labeled anti-AChE antibody (LPA-anti-AChE). The sandwich-like immunoreactions were performed among ZrO2 NPs, OP-AChE and LPA-anti-AChE to form the ZrO2/OP-AChE/LPA-anti-AChE complex, and the released lead ions were detected on a disposable SPE. The binding affinity was investigated by both square wave voltammetry (SWV) and quartz crystal microbalance (QCM) measurements. The proposed immunosensor yielded a linear response current over a broad OP-AChE concentration range from 0.05 nM to 10 nM, with a detection limit of 0.02 nM, which is sensitive enough for monitoring of low-dose exposure to OPs. This method avoids the drawback of the unavailability of a commercial OP-specific antibody and amplifies the detection signal by using apoferritin-encoded metallic phosphate nanoparticle tags. This nanoparticle-based immunosensor offers a new method for rapid, sensitive, selective and inexpensive quantification of phosphorylated adducts for monitoring of OP pesticide and nerve agent exposures.
Talio, María C; Zambrano, Karen; Kaplan, Marcos; Acosta, Mariano; Gil, Raúl A; Luconi, Marta O; Fernández, Liliana P
2015-10-01
A new environmentally friendly methodology based on fluorescent signal enhancement of rhodamine B dye is proposed for quantification of Pb(II) traces using a preconcentration step based on the coacervation phenomenon. A cationic surfactant (cetyltrimethylammonium bromide, CTAB) and potassium iodide were chosen for this aim. The coacervate phase was collected on a filter paper disk, and the solid surface fluorescence signal was measured in a spectrofluorometer. Experimental variables that influence the preconcentration step and the fluorimetric sensitivity were optimized using univariate assays. The calibration graph using zeroth-order regression was linear from 7.4×10(-4) to 3.4 μg L(-1) with a correlation coefficient of 0.999. Under the optimal conditions, a limit of detection of 2.2×10(-4) μg L(-1) and a limit of quantification of 7.4×10(-4) μg L(-1) were obtained. The method showed good sensitivity and adequate selectivity, with good tolerance to foreign ions, and was applied to the determination of trace amounts of Pb(II) in refill solutions for e-cigarettes, with satisfactory results validated by ICP-MS. The proposed method represents an innovative application of coacervation processes and of paper filters to solid surface fluorescence methodology. Copyright © 2015 Elsevier B.V. All rights reserved.
Solid-phase microextraction method development for headspace analysis of volatile flavor compounds.
Roberts, D D; Pollien, P; Milo, C
2000-06-01
Solid-phase microextraction (SPME) fibers were evaluated for their ability to adsorb volatile flavor compounds under various conditions with coffee and aqueous flavored solutions. Experiments comparing different fibers showed that poly(dimethylsiloxane)/divinylbenzene had the highest overall sensitivity. Carboxen/poly(dimethylsiloxane) was the most sensitive to small molecules and acids. As the concentrations of compounds increased, the quantitative linear range was exceeded as shown by competition effects with 2-isobutyl-3-methoxypyrazine at concentrations above 1 ppm. A method based on a short-time sampling of the headspace (1 min) was shown to better represent the equilibrium headspace concentration. Analysis of coffee brew with a 1-min headspace adsorption time was verified to be within the linear range for most compounds and thus appropriate for relative headspace quantification. Absolute quantification of volatiles, using isotope dilution assays (IDA), is not subject to biases caused by excess compound concentrations or complex matrices. The degradation of coffee aroma volatiles during storage was followed by relative headspace measurements and absolute quantifications. Both methods gave similar values for 3-methylbutanal, 4-ethylguaiacol, and 2,3-pentanedione. Acetic acid, however, gave higher values during storage upon relative headspace measurements due to concurrent pH decreases that were not seen with IDA.
Torey, Angeline; Sasidharan, Sreenivasan; Yeng, Chen; Latha, Lachimanan Yoga
2010-05-10
Quality control standardization of the various medicinal plants used in traditional medicine is becoming more important today in view of the commercialization of formulations based on these plants. An attempt at standardization of Cassia spectabilis leaf has been carried out with respect to authenticity, assay and chemical constituent analysis. The authentication involved many parameters, including gross morphology, microscopy of the leaves and functional group analysis by Fourier Transform Infrared (FTIR) spectroscopy. The assay part of the standardization involved determination of the minimum inhibitory concentration (MIC) of the extract, which could help assess the chemical effects and establish curative values. The MIC of the C. spectabilis leaf extracts was investigated using the Broth Dilution Method. The extracts showed a MIC value of 6.25 mg/mL, independent of the extraction time. The chemical constituent aspect of standardization involves quantification of the main chemical components in C. spectabilis. The GC-MS method used for quantification of 2,4-(1H,3H)-pyrimidinedione in the extract was rapid, accurate, precise, linear (R(2) = 0.8685), rugged and robust. Hence this method was suitable for quantification of this component in C. spectabilis. The standardization of C. spectabilis is needed to facilitate marketing of medicinal plants, with a view to promoting the export of valuable Malaysian Traditional Medicinal plants such as C. spectabilis.
Varga-Szemes, Akos; Simor, Tamas; Lenkey, Zsofia; van der Geest, Rob J; Kirschner, Robert; Toth, Levente; Brott, Brigitta C.; Elgavish, Ada; Elgavish, Gabriel A.
2014-01-01
Purpose. To study the feasibility of a myocardial infarct (MI) quantification method (Signal Intensity-based Percent Infarct Mapping, SI-PIM) that is able to evaluate not only the size but also the density distribution of the MI. Methods. In 14 male swine, MI was generated by 90 minutes of closed-chest balloon occlusion followed by reperfusion. Seven (n=7) or 56 (n=7) days after reperfusion, Gd-DTPA bolus- and continuous-infusion-enhanced Late Gadolinium Enhancement (LGE) MRI and R1-mapping were carried out, and post mortem triphenyl-tetrazolium-chloride (TTC) staining was performed. MI was quantified using binary (2 or 5 standard deviations, SD), SI-PIM and R1-PIM methods. Infarct Fraction (IF) and Infarct-Involved Voxel Fraction (IIVF) were determined by each MRI method. The bias of each method was assessed relative to the TTC technique. Results. The accuracy of MI quantification did not depend on the method of contrast administration or the age of the MI. IFs obtained by either of the two PIM methods were statistically not different from the IFs derived from the TTC measurements at either MI age. IFs obtained from the binary 2SD method overestimated the IF obtained from TTC. IIVF did not vary among the three different PIM methods, but with the binary methods the IIVF gradually decreased with increasing threshold limit. Conclusions. The advantage of SI-PIM over the conventional binary method is its ability to represent not only the IF but also the density distribution of the MI. Since the SI-PIM methods are based on a single LGE acquisition, the bolus-data-based SI-PIM method can effortlessly be incorporated into the clinical image post-processing procedure. PMID:24718787
Quantification of mixed chimerism by real time PCR on whole blood-impregnated FTA cards.
Pezzoli, N; Silvy, M; Woronko, A; Le Treut, T; Lévy-Mozziconacci, A; Reviron, D; Gabert, J; Picard, C
2007-09-01
This study investigated quantification of chimerism in sex-mismatched transplantations by quantitative real-time PCR (RQ-PCR) using FTA paper for blood sampling. First, we demonstrate that the quantification of DNA from EDTA-blood deposited on an FTA card is accurate and reproducible. Second, we show that the fraction of recipient cells detected by RQ-PCR was concordant between the FTA method and the salting-out method, the reference DNA extraction method. Furthermore, the sensitivity of detection of recipient cells is similar for the two methods. Our results show that this innovative method can be used for MC assessment by RQ-PCR.
NASA Astrophysics Data System (ADS)
Ramanjaneyulu, P. S.; Sayi, Y. S.; Ramakumar, K. L.
2008-08-01
Quantification of boron in diverse materials of relevance in nuclear technology is essential in view of its high thermal neutron absorption cross section. A simple and sensitive method has been developed for the determination of boron in uranium-aluminum-silicon alloy, based on leaching of boron with 6 M HCl and H2O2, its selective separation by solvent extraction with 2-ethylhexane-1,3-diol, and quantification by spectrophotometry using curcumin. The method has been evaluated by the standard addition method and validated by inductively coupled plasma-atomic emission spectroscopy. The relative standard deviation and absolute detection limit of the method are 3.0% (at the 1σ level) and 12 ng, respectively. All possible sources of uncertainty in the methodology have been individually assessed, following the International Organization for Standardization guidelines. The combined uncertainty is calculated employing uncertainty propagation formulae. The expanded uncertainty in the measurement at the 95% confidence level (coverage factor 2) is 8.840%.
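A minimal sketch of the ISO-style uncertainty propagation mentioned above is shown below: independent relative uncertainty components are combined in quadrature and expanded with a coverage factor of 2. The component names and magnitudes are invented for illustration.

```python
import math

components = {                 # relative standard uncertainties (fractions), invented
    "weighing":      0.010,
    "leaching":      0.025,
    "extraction":    0.020,
    "calibration":   0.018,
    "repeatability": 0.015,
}

combined = math.sqrt(sum(u ** 2 for u in components.values()))
expanded = 2.0 * combined      # coverage factor k = 2 (~95% confidence level)
print(f"combined relative uncertainty: {100 * combined:.2f}%")
print(f"expanded uncertainty (k = 2):  {100 * expanded:.2f}%")
```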
Establishment of a method for determination of arsenic species in seafood by LC-ICP-MS.
Zmozinski, Ariane V; Llorente-Mirandes, Toni; López-Sánchez, José F; da Silva, Márcia M
2015-04-15
An analytical method for determination of arsenic species (inorganic arsenic (iAs), methylarsonic acid (MA), dimethylarsinic acid (DMA), arsenobetaine (AB), trimethylarsine oxide (TMAO) and arsenocholine (AC)) in Brazilian and Spanish seafood samples is reported. This study was focused on extraction and quantification of inorganic arsenic (iAs), the most toxic form. Arsenic speciation was carried out via LC with both anionic and cationic exchange with ICP-MS detection (LC-ICP-MS). The detection limits (LODs), quantification limits (LOQs), precision and accuracy for arsenic species were established. The proposed method was evaluated using eight reference materials (RMs). Arsenobetaine was the main species found in all samples. The total and iAs concentration in 22 seafood samples and RMs ranged between 0.27-35.2 and 0.02-0.71 mg As kg(-1), respectively. Recoveries ranging from 100% to 106% for iAs, based on spikes, were achieved. The proposed method provides reliable iAs data for future risk assessment analysis. Copyright © 2014 Elsevier Ltd. All rights reserved.
Ratiometric Raman Spectroscopy for Quantification of Protein Oxidative Damage
Jiang, Dongping; Yanney, Michael; Zou, Sige; Sygula, Andrzej
2009-01-01
A novel ratiometric Raman spectroscopic (RMRS) method has been developed for quantitative determination of protein carbonyl levels. Oxidized bovine serum albumin (BSA) and oxidized lysozyme were used as model proteins to demonstrate this method. The technique involves conjugation of protein carbonyls with dinitrophenyl hydrazine (DNPH), followed by drop coating deposition Raman (DCDR) spectral acquisition. The RMRS method is easy to implement as it requires only one conjugation reaction and a single spectral acquisition, and does not require sample calibration. Characteristic peaks from both the protein and DNPH moieties are obtained in a single spectral acquisition, allowing the protein carbonyl level to be calculated from the peak intensity ratio. Detection sensitivity for the RMRS method is ~0.33 pmol carbonyl/measurement. Fluorescence- and/or immunoassay-based techniques only detect a signal from the labeling molecule and thus yield no structural or quantitative information for the modified protein, while the RMRS technique provides protein identification and protein carbonyl quantification in a single experiment. PMID:19457432
Groves, Kate; Cryar, Adam; Walker, Michael; Quaglia, Milena
2018-01-01
Assessing the recovery of food allergens from solid processed matrixes is one of the most difficult steps that needs to be overcome to enable the accurate quantification of protein allergens by immunoassay and MS. A feasibility study is described herein applying International System of Units (SI)-traceably quantified milk protein solutions to assess recovery by an improved extraction method. Untargeted MS analysis suggests that this novel extraction method can be further developed to provide high recoveries for a broad range of food allergens. A solution of α-casein was traceably quantified to the SI for the content of α-S1 casein. Cookie dough was prepared by spiking a known amount of the SI-traceable quantified solution into a mixture of flour, sugar, and soya spread, followed by baking. A novel method for the extraction of protein food allergens from solid matrixes based on proteolytic digestion was developed, and its performance was compared with the performance of methods reported in the literature.
Sun, Juan; Li, Weixi; Zhang, Yan; Hu, Xuexu; Wu, Li; Wang, Bujun
2016-12-15
A method based on QuEChERS (quick, easy, cheap, effective, rugged, and safe) purification combined with ultrahigh performance liquid chromatography tandem mass spectrometry (UPLC-MS/MS) was optimized for the simultaneous quantification of 25 mycotoxins in cereals. Samples were extracted with a solution containing 80% acetonitrile and 0.1% formic acid and purified with QuEChERS before being separated on a C18 column. Mass spectrometry was conducted using positive electrospray ionization (ESI+) and multiple reaction monitoring (MRM) modes. The method gave good linear relations, with regression coefficients ranging from 0.9950 to 0.9999. The detection limits ranged from 0.03 to 15.0 µg·kg(-1), and the average recovery at three different concentrations ranged from 60.2% to 115.8%, with relative standard deviations (RSD%) varying from 0.7% to 19.6% for the 25 mycotoxins. The method is simple, rapid, accurate, and an improvement over existing published methods.
Cereal β-glucan quantification with calcofluor-application to cell culture supernatants.
Rieder, Anne; Knutsen, Svein H; Ballance, Simon; Grimmer, Stine; Airado-Rodríguez, Diego
2012-11-06
The specific binding of the fluorescent dye calcofluor to cereal β-glucan results in increased fluorescence intensity of the formed complex and is in use for the quantification of β-glucan above a critical molecular weight (MW) by flow injection analysis. In this study, this method was applied in a fast and easy batch mode. In order to emphasize the spectral information of the emission spectra of the calcofluor/β-glucan complexes, derivative signals were calculated. A linear relationship was found between the amplitude of the second derivative signals and the β-glucan concentration between 0.1 and 0.4 μg/mL. The low detection limit of this new method (0.045 μg/mL) enabled its use to study the transport of cereal β-glucans over differentiated Caco-2 cell monolayers. Additionally, the method was applied to quantify β-glucan in arabinoxylan samples, and the results correlated well with data from an enzyme-based method. Copyright © 2012 Elsevier Ltd. All rights reserved.
Biniarz, Piotr; Łukaszewicz, Marcin
2017-06-01
The rapid and accurate quantification of biosurfactants in biological samples is challenging. In contrast to the orcinol method for rhamnolipids, no simple biochemical method is available for the rapid quantification of lipopeptides. Various liquid chromatography (LC) methods are promising tools for relatively fast and exact quantification of lipopeptides. Here, we report strategies for the quantification of the lipopeptides pseudofactin and surfactin in bacterial cultures using different high- (HPLC) and ultra-performance liquid chromatography (UPLC) systems. We tested three strategies for sample pretreatment prior to LC analysis. In direct analysis (DA), bacterial cultures were injected directly and analyzed via LC. As a modification, we diluted the samples with methanol and detected an increase in lipopeptide recovery in the presence of methanol. Therefore, we suggest this simple modification as a tool for increasing the accuracy of LC methods. We also tested freeze-drying followed by solvent extraction (FDSE) as an alternative for the analysis of "heavy" samples. In FDSE, the bacterial cultures were freeze-dried, and the resulting powder was extracted with different solvents. Then, the organic extracts were analyzed via LC. Here, we determined the influence of the extracting solvent on lipopeptide recovery. HPLC methods allowed us to quantify pseudofactin and surfactin with run times of 15 and 20 min per sample, respectively, whereas UPLC quantification was as fast as 4 and 5.5 min per sample, respectively. Our methods provide highly accurate measurements and high recovery levels for lipopeptides. At the same time, UPLC-MS provides the possibility to identify lipopeptides and their structural isoforms.
Schmerberg, Claire M; Liang, Zhidan; Li, Lingjun
2015-01-21
Food consumption is an important behavior that is regulated by an intricate array of neuropeptides (NPs). Although many feeding-related NPs have been identified in mammals, precise mechanisms are unclear and difficult to study in mammals, as current methods are not highly multiplexed and require extensive a priori knowledge about analytes. New advances in data-independent acquisition (DIA) MS/MS and the open-source quantification software Skyline have opened up the possibility to identify hundreds of compounds and quantify them from a single DIA MS/MS run. An untargeted DIA MS(E) quantification method using Skyline software for multiplexed, discovery-driven quantification was developed and found to produce linear calibration curves for peptides at physiologically relevant concentrations using a protein digest as internal standard. By using this method, preliminary relative quantification of the crab Cancer borealis neuropeptidome (<2 kDa, 137 peptides from 18 families) was possible in microdialysates from 8 replicate feeding experiments. Of these NPs, 55 were detected with an average mass error below 10 ppm. The time-resolved profiles of relative concentration changes for 6 are shown, and there is great potential for the use of this method in future experiments to aid in correlation of NP changes with behavior. This work presents an unbiased approach to winnowing candidate NPs related to a behavior of interest in a functionally relevant manner, and demonstrates the success of such a UPLC-MS(E) quantification method using the open source software Skyline.
Uclés, A; Ulaszewska, M M; Hernando, M D; Ramos, M J; Herrera, S; García, E; Fernández-Alba, A R
2013-07-01
This work introduces a liquid chromatography-electrospray ionization-hybrid quadrupole/time-of-flight mass spectrometry (LC-ESI-QTOF-MS)-based method for qualitative and quantitative analysis of poly(amidoamine) (PAMAM) dendrimers of generations 0 to 3 in an aqueous matrix. The multiple charging of PAMAM dendrimers generated by ESI has provided key advantages in dendrimer identification through assignment of the charge state from high-resolution isotopic clusters. The isotopic distribution, governed by the abundances of the isotopes (12)C and (13)C, yielded valuable and complementary data for confident characterization. A mass accuracy below 3.8 ppm for the most abundant isotopes (diagnostic ions) provided unambiguous identification of PAMAM dendrimers. Validation of the LC-ESI-QTOF-MS method and evaluation of the matrix effect enabled reliable and reproducible quantification. The validation parameters, limits of quantification in the range of 0.012 to 1.73 μM depending on the generation, good linear range (R > 0.996), repeatability (RSD < 13.4%), and reproducibility (RSD < 10.9%), demonstrated the suitability of the method for the quantification of dendrimers in aqueous matrices (water and wastewater). The added selectivity achieved by the multiple-charging phenomenon represents a clear advantage in screening aqueous mixtures, because the matrix had no significant effect on ionization, as evidenced by the absence of sensitivity loss for most generations of PAMAM dendrimers.
Dziadosz, Marek
2018-01-01
The aim of this work was to develop a fast, cost-effective and time-saving liquid chromatography-tandem mass spectrometry (LC-MS/MS) analytical method for the analysis of ethylene glycol (EG) in human serum. For this purpose, the formation/fragmentation of an EG adduct ion with sodium and sodium acetate was applied in the positive electrospray mode for signal detection. Adduct identification was performed with appropriate infusion experiments based on analyte solutions prepared at different concentrations. Corresponding analyte adduct ions and adduct ion fragments could be identified both for EG and the deuterated internal standard (EG-D4). Protein precipitation was used for sample preparation. The analysis of the supernatant was performed with a Luna 5 μm C18(2) 100 Å, 150 mm × 2 mm analytical column and a mobile phase consisting of 95% A (H2O/methanol = 95/5, v/v) and 5% B (H2O/methanol = 3/97, v/v), both with 10 mmol L(-1) ammonium acetate and 0.1% acetic acid. Method linearity was examined in the range of 100-4000 μg/mL, and the calculated limit of detection/quantification was 35/98 μg/mL. However, on the basis of the signal-to-noise ratio, quantification was recommended at a limit of 300 μg/mL. Additionally, the examined precision, accuracy, stability, selectivity and matrix effect demonstrated that the method is a practicable alternative for EG quantification in human serum. In comparison to other methods based on liquid chromatography, the strategy presented here made EG analysis possible for the first time without analyte derivatisation. Copyright © 2017 Elsevier B.V. All rights reserved.
Rapid quantification and sex determination of forensic evidence materials.
Andréasson, Hanna; Allen, Marie
2003-11-01
DNA quantification of forensic evidence is very valuable for optimal use of the available biological material. Moreover, sex determination is of great importance as additional information in criminal investigations as well as in identification of missing persons, no-suspect cases, and ancient DNA studies. While routine forensic DNA analysis based on short tandem repeat markers includes a marker for sex determination, analysis of samples containing scarce amounts of DNA is often based on mitochondrial DNA, and sex determination is not performed. In order to allow quantification and simultaneous sex determination on minute amounts of DNA, an assay based on real-time PCR analysis of a marker within the human amelogenin gene has been developed. The sex determination is based on melting curve analysis, while an externally standardized kinetic analysis allows quantification of the nuclear DNA copy number in the sample. This real-time DNA quantification assay has proven to be highly sensitive, enabling quantification of single DNA copies. Although certain limitations were apparent, the system is a rapid, cost-effective, and flexible assay for analysis of forensic casework samples.
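A minimal sketch of externally standardized real-time PCR quantification, as described above, is given below: a log-linear standard curve (Ct versus log10 copy number) is fitted to external standards and used to estimate the copy number of a sample. The Ct values are illustrative, not assay data.

```python
import numpy as np

std_copies = np.array([1e5, 1e4, 1e3, 1e2, 1e1])     # external standards, copies/reaction
std_ct = np.array([18.1, 21.5, 24.9, 28.3, 31.8])    # measured Ct values (illustrative)

slope, intercept = np.polyfit(np.log10(std_copies), std_ct, deg=1)
efficiency = 10 ** (-1.0 / slope) - 1.0              # amplification efficiency estimate

def ct_to_copies(ct: float) -> float:
    """Estimate the copy number of an unknown from its Ct via the standard curve."""
    return 10 ** ((ct - intercept) / slope)

print(f"amplification efficiency ~ {efficiency:.0%}")
print(f"sample at Ct 27.0 -> ~{ct_to_copies(27.0):.0f} copies")
```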
Histogram analysis for smartphone-based rapid hematocrit determination
Jalal, Uddin M.; Kim, Sang C.; Shim, Joon S.
2017-01-01
A novel and rapid analysis technique using histograms has been proposed for the colorimetric quantification of blood hematocrits. A smartphone-based "Histogram" app for the detection of hematocrits has been developed, integrating the smartphone's embedded camera with a microfluidic chip via a custom-made optical platform. The developed histogram analysis is effective in the automatic detection of the sample channel, including auto-calibration, and can analyze single-channel as well as multi-channel images. Furthermore, the method is advantageous for the quantification of blood hematocrit under both uniform and varying optical conditions. The rapid determination of blood hematocrit provides substantial information regarding physiological disorders, and the use of such reproducible, cost-effective, and standard techniques may effectively help with the diagnosis and prevention of a number of human diseases. PMID:28717569
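A rough sketch of a histogram-based colorimetric readout in the spirit of the app described above follows; the synthetic image values and the linear hematocrit calibration are assumptions for illustration only.

```python
import numpy as np

def channel_readout(channel_roi: np.ndarray, bins: int = 256) -> float:
    """Histogram-weighted mean intensity of the detection-channel region."""
    hist, edges = np.histogram(channel_roi, bins=bins, range=(0, 255))
    centers = 0.5 * (edges[:-1] + edges[1:])
    return float((hist * centers).sum() / hist.sum())

# Synthetic 8-bit grayscale region of the microfluidic channel.
rng = np.random.default_rng(3)
roi = np.clip(rng.normal(loc=140, scale=12, size=(60, 200)), 0, 255)
intensity = channel_readout(roi)

# Hypothetical linear calibration of hematocrit (%) against mean intensity.
slope, intercept = -0.35, 92.0
print(f"mean intensity {intensity:.1f} -> hematocrit ~ {slope * intensity + intercept:.1f}%")
```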
Silva, Cristina; Fresco, Paula; Monteiro, Joaquim; Rama, Ana Cristina Ribeiro
2013-08-01
Evidence-Based Practice requires health care decisions to be based on the best available evidence. The "Information Mastery" model proposes that clinicians should use sources of information whose relevance and validity have already been evaluated, provided at the point of care. Drug databases (DB) allow easy and fast access to information and have the benefit of more frequent content updates. Relevant information, in the context of drug therapy, is that which supports safe and effective use of medicines. Accordingly, the European Guideline on the Summary of Product Characteristics (EG-SmPC) was used as a standard to evaluate the inclusion of relevant information contents in DB. The aim was to develop and test a method to evaluate the relevancy of DB contents by assessing the inclusion of information items deemed relevant for effective and safe drug use. The method comprised: hierarchical organisation and selection of the principles defined in the EG-SmPC; definition of criteria to assess inclusion of selected information items; creation of a categorisation and quantification system that allows score calculation; calculation of relative differences (RD) of scores for comparison with an "ideal" database, defined as the one that achieves the best quantification possible for each of the information items; and a pilot test on a sample of 9 drug databases, using 10 drugs frequently associated in the literature with morbidity-mortality and also widely consumed in Portugal. The main outcome measure was the calculation of individual and global scores for clinically relevant information items of drug monographs in databases, using the categorisation and quantification system created. A--Method development: selection of sections, subsections, relevant information items and corresponding requisites; a system to categorise and quantify their inclusion; and the score and RD calculation procedure. B--Pilot test: scores were calculated for the 9 databases; globally, all databases evaluated differed significantly from the "ideal" database; some DB performed better, but performance was inconsistent at the subsection level within the same DB. The method developed allows quantification of the inclusion of relevant information items in DB and comparison with an "ideal" database. It is necessary to consult diverse DB in order to find all the relevant information needed to support clinical drug use.
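The relative-difference comparison against the "ideal" database described above can be sketched as follows; the database names and scores are invented.

```python
def relative_difference(db_score: float, ideal_score: float) -> float:
    """Relative difference of a database score against the 'ideal' database score."""
    return (db_score - ideal_score) / ideal_score

ideal = 100.0                                            # the ideal database meets every requisite
db_scores = {"DB-A": 78.5, "DB-B": 64.0, "DB-C": 91.0}   # invented scores

for name, score in db_scores.items():
    print(f"{name}: RD = {relative_difference(score, ideal):+.1%}")
```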
NASA Astrophysics Data System (ADS)
Mallah, Muhammad Ali; Sherazi, Syed Tufail Hussain; Bhanger, Muhammad Iqbal; Mahesar, Sarfaraz Ahmed; Bajeer, Muhammad Ashraf
2015-04-01
A transmission FTIR spectroscopic method was developed for the direct, inexpensive and fast quantification of paracetamol content in solid pharmaceutical formulations. In this method, the paracetamol content is analyzed directly, without solvent extraction. KBr pellets were prepared for the acquisition of FTIR spectra in transmission mode. Two chemometric models, simple Beer's law and partial least squares, were employed over the spectral region of 1800-1000 cm-1 for quantification of the paracetamol content, with a regression coefficient (R2) of 0.999. The limits of detection and quantification using FTIR spectroscopy were 0.005 mg g-1 and 0.018 mg g-1, respectively. An interference study was also carried out to check the effect of the excipients; there was no significant interference from the sample matrix. The results clearly demonstrate the sensitivity of the transmission FTIR spectroscopic method for pharmaceutical analysis. The method is green in the sense that it does not require large volumes of hazardous solvents or long run times and avoids prior sample preparation.
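A minimal sketch of a partial least squares calibration of the kind described above is shown below, using simulated spectra over an 1800-1000 cm-1-like region; a real model would be built from measured FTIR spectra of formulations with known paracetamol content.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(4)
n_samples, n_points = 30, 400                 # 400 points across the fingerprint region
content = rng.uniform(0.1, 1.0, size=n_samples)                   # paracetamol content (a.u.)
peak = np.exp(-0.5 * ((np.arange(n_points) - 180) / 15.0) ** 2)   # simulated band shape
spectra = content[:, None] * peak + rng.normal(0, 0.01, (n_samples, n_points))

pls = PLSRegression(n_components=3)
pls.fit(spectra, content)
print(f"R^2 on calibration set: {pls.score(spectra, content):.4f}")
print("predicted content of first 3 samples:", pls.predict(spectra[:3]).ravel())
```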
A Pragmatic Smoothing Method for Improving the Quality of the Results in Atomic Spectroscopy
NASA Astrophysics Data System (ADS)
Bennun, Leonardo
2017-07-01
A new smoothing method is presented for improving the identification and quantification of spectral functions, based on prior knowledge of the signals that are expected to be quantified. These signals are used as weighting coefficients in the smoothing algorithm. This smoothing method was conceived to be applied in atomic and nuclear spectroscopies, preferably to techniques where net counts are proportional to acquisition time, such as particle-induced X-ray emission (PIXE) and other X-ray fluorescence spectroscopic methods. This algorithm, when properly applied, does not distort the shape or the intensity of the signal, so it is well suited for all kinds of spectroscopic techniques. This method is extremely effective at reducing high-frequency noise in the signal, much more so than a single rectangular smooth of the same width. As with all smoothing techniques, the proposed method improves the precision of the results, but in this case we also found a systematic improvement in their accuracy. We still have to evaluate the improvement in the quality of the results when this method is applied to real experimental data. We expect better characterization of the net area quantification of the peaks, and smaller detection and quantification limits. We have applied this method to signals that obey Poisson statistics, but with the same ideas and criteria, it could be applied to time series. In the general case, when this algorithm is applied to experimental results, the characteristic functions required for this weighted smoothing method should be obtained from a system with strong stability. If the sought signals are not perfectly clean, this method should be applied with care.
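As a rough illustration of the idea of weighting a smooth by the expected signal, the sketch below convolves a Poisson-noisy synthetic spectrum with a normalized Gaussian peak model; this is a simplified stand-in and not the authors' exact algorithm, which is designed to avoid distorting peak shape and intensity.

```python
import numpy as np

def signal_weighted_smooth(counts: np.ndarray, kernel: np.ndarray) -> np.ndarray:
    """Convolve the spectrum with the normalized expected-signal kernel."""
    k = kernel / kernel.sum()
    return np.convolve(counts, k, mode="same")

channels = np.arange(1024)
expected_peak = np.exp(-0.5 * (np.arange(-10, 11) / 3.0) ** 2)   # assumed line shape

# Synthetic spectrum obeying Poisson statistics: flat background plus one line.
rng = np.random.default_rng(5)
truth = 20 + 400 * np.exp(-0.5 * ((channels - 512) / 3.0) ** 2)
spectrum = rng.poisson(truth).astype(float)

smoothed = signal_weighted_smooth(spectrum, expected_peak)
print("net peak area (raw):     ", spectrum[500:525].sum() - 20 * 25)
print("net peak area (smoothed):", round(smoothed[500:525].sum() - 20 * 25, 1))
```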
Quantitative Method for Simultaneous Analysis of Acetaminophen and 6 Metabolites.
Lammers, Laureen A; Achterbergh, Roos; Pistorius, Marcel C M; Romijn, Johannes A; Mathôt, Ron A A
2017-04-01
Hepatotoxicity after ingestion of high-dose acetaminophen [N-acetyl-para-aminophenol (APAP)] is caused by the metabolites of the drug. To gain more insight into factors influencing susceptibility to APAP hepatotoxicity, quantification of APAP and its metabolites is important. A few methods have been developed to simultaneously quantify APAP and its most important metabolites; however, these methods require comprehensive sample preparation and long run times. The aim of this study was to develop and validate a simplified but sensitive method for the simultaneous quantification of acetaminophen, the main metabolites acetaminophen glucuronide and acetaminophen sulfate, and 4 cytochrome P450-mediated metabolites by using liquid chromatography with mass spectrometric (LC-MS) detection. The method was developed and validated in human plasma and entailed a single sample-preparation procedure, enabling quick processing of the samples, followed by an LC-MS method with a chromatographic run time of 9 minutes. The method was validated for selectivity, linearity, accuracy, imprecision, dilution integrity, recovery, process efficiency, ionization efficiency, and carryover effect. The method showed good selectivity without matrix interferences. For all analytes, the mean process efficiency was >86%, and the mean ionization efficiency was >94%. Furthermore, the accuracy was between 90.3% and 112% for all analytes, and the within- and between-run imprecision were <20% for the lower limit of quantification and <14.3% for the middle level and upper limit of quantification. The method presented here enables the simultaneous quantification of APAP and 6 of its metabolites. It is less time consuming than previously reported methods because it requires only a single, simple sample-preparation step followed by an LC-MS method with a short run time. Therefore, this analytical method is useful for both clinical and research purposes.
Takabatake, Reona; Masubuchi, Tomoko; Futo, Satoshi; Minegishi, Yasutaka; Noguchi, Akio; Kondo, Kazunari; Teshima, Reiko; Kurashima, Takeyo; Mano, Junichi; Kitta, Kazumi
2016-01-01
A novel real-time PCR-based analytical method was developed for the event-specific quantification of a genetically modified (GM) maize, 3272. We first attempted to obtain genome DNA from this maize using a DNeasy Plant Maxi kit and a DNeasy Plant Mini kit, which have been widely utilized in our previous studies, but DNA extraction yields from 3272 were markedly lower than those from non-GM maize seeds. However, lowering of DNA extraction yields was not observed with GM quicker or Genomic-tip 20/G. We chose GM quicker for evaluation of the quantitative method. We prepared a standard plasmid for 3272 quantification. The conversion factor (Cf), which is required to calculate the amount of a genetically modified organism (GMO), was experimentally determined for two real-time PCR instruments, the Applied Biosystems 7900HT (the ABI 7900) and the Applied Biosystems 7500 (the ABI7500). The determined Cf values were 0.60 and 0.59 for the ABI 7900 and the ABI 7500, respectively. To evaluate the developed method, a blind test was conducted as part of an interlaboratory study. The trueness and precision were evaluated as the bias and reproducibility of the relative standard deviation (RSDr). The determined values were similar to those in our previous validation studies. The limit of quantitation for the method was estimated to be 0.5% or less, and we concluded that the developed method would be suitable and practical for detection and quantification of 3272.
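As a rough sketch of how a conversion factor of this kind is typically used in event-specific real-time PCR quantification (this is the generally used GM-ratio calculation, not necessarily the exact expression applied in this study; the variable names and numbers are illustrative):

```python
def gmo_percent(event_copies: float, endogenous_copies: float, cf: float) -> float:
    """Estimate GMO content (%) from real-time PCR copy-number measurements.

    event_copies      -- copies measured with the event-specific assay (e.g., 3272)
    endogenous_copies -- copies of the taxon-specific endogenous reference gene
    cf                -- experimentally determined conversion factor (e.g., 0.60)
    """
    return (event_copies / endogenous_copies) / cf * 100.0

# Illustrative numbers only:
print(gmo_percent(event_copies=120, endogenous_copies=40000, cf=0.60))  # ~0.5 %
```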
Silva-Rodríguez, Jesús; Aguiar, Pablo; Sánchez, Manuel; Mosquera, Javier; Luna-Vega, Víctor; Cortés, Julia; Garrido, Miguel; Pombar, Miguel; Ruibal, Alvaro
2014-05-01
Current procedure guidelines for whole body [18F]fluoro-2-deoxy-D-glucose (FDG)-positron emission tomography (PET) state that studies with visible dose extravasations should be rejected for quantification protocols. Our work focuses on the development and validation of methods for estimating extravasated doses in order to correct standard uptake values (SUV) for this effect in clinical routine. One thousand three hundred sixty-seven consecutive whole body FDG-PET studies were visually inspected looking for extravasation cases. Two methods for estimating the extravasated dose were proposed and validated in different scenarios using Monte Carlo simulations. All visible extravasations were retrospectively evaluated using a manual ROI based method. In addition, the 50 patients with the highest extravasated doses were also evaluated using a threshold-based method. Simulation studies showed that the proposed methods for estimating extravasated doses allow us to compensate for the impact of extravasations on SUV values with an error below 5%. The quantitative evaluation of patient studies revealed that paravenous injection is a relatively frequent effect (18%), with a small fraction of patients presenting considerable extravasations ranging from 1% to a maximum of 22% of the injected dose. A criterion based on the extravasated volume and maximum concentration was established in order to identify the fraction of patients that might be corrected for the paravenous injection effect. The authors propose the use of a manual ROI based method for estimating the effectively administered FDG dose and then correcting SUV quantification in those patients fulfilling the proposed criterion.
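A simplified sketch of the kind of correction implied here is given below; the body-weight SUV normalization and the simple subtraction of the extravasated fraction from the injected dose are assumptions for illustration, not the authors' exact estimator.

```python
def corrected_suv(tissue_conc_bq_ml: float,
                  injected_dose_bq: float,
                  extravasated_fraction: float,
                  body_weight_g: float) -> float:
    """SUV computed with the effectively administered dose.

    extravasated_fraction -- fraction of the injected dose estimated to remain
                             at the injection site (e.g., from a ROI-based method).
    """
    effective_dose = injected_dose_bq * (1.0 - extravasated_fraction)
    return tissue_conc_bq_ml / (effective_dose / body_weight_g)

# Example: a 5 % extravasation raises the computed SUV by about 5 % after correction.
print(corrected_suv(5000.0, 370e6, 0.05, 75000.0))
```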
Validated method for quantification of genetically modified organisms in samples of maize flour.
Kunert, Renate; Gach, Johannes S; Vorauer-Uhl, Karola; Engel, Edwin; Katinger, Hermann
2006-02-08
Sensitive and accurate testing for trace amounts of biotechnology-derived DNA from plant material is the prerequisite for detection of 1% or 0.5% genetically modified ingredients in food products or raw materials thereof. Compared with ELISA detection of expressed proteins, real-time PCR (RT-PCR) amplification offers easier sample preparation and lower detection limits. Of the different DNA preparation methods, the CTAB method was chosen for its high flexibility in starting material and its generation of sufficient DNA of relevant quality. Previous RT-PCR data generated with the SYBR green detection method showed that this approach is highly sensitive to sample matrices and genomic DNA content, which influences the interpretation of results. Therefore, this paper describes real-time DNA quantification based on the TaqMan probe method, showing high accuracy and sensitivity with detection limits lower than 18 copies per sample, applicable and comparable to highly purified plasmid standards as well as complex matrices of genomic DNA samples. The results were evaluated with ValiData for homogeneity of variance, linearity, accuracy of the standard curve, and standard deviation.
NASA Astrophysics Data System (ADS)
Meyer, Jesse G.; D'Souza, Alexandria K.; Sorensen, Dylan J.; Rardin, Matthew J.; Wolfe, Alan J.; Gibson, Bradford W.; Schilling, Birgit
2016-11-01
Post-translational modification of lysine residues by NƐ-acylation is an important regulator of protein function. Many large-scale protein acylation studies have assessed relative changes of lysine acylation sites after antibody enrichment using mass spectrometry-based proteomics. Although relative acylation fold-changes are important, this does not reveal site occupancy, or stoichiometry, of individual modification sites, which is critical to understand functional consequences. Recently, methods for determining lysine acetylation stoichiometry have been proposed based on ratiometric analysis of endogenous levels to those introduced after quantitative per-acetylation of proteins using stable isotope-labeled acetic anhydride. However, in our hands, we find that these methods can overestimate acetylation stoichiometries because of signal interferences when endogenous levels of acylation are very low, which is especially problematic when using MS1 scans for quantification. In this study, we sought to improve the accuracy of determining acylation stoichiometry using data-independent acquisition (DIA). Specifically, we use SWATH acquisition to comprehensively collect both precursor and fragment ion intensity data. The use of fragment ions for stoichiometry quantification not only reduces interferences but also allows for determination of site-level stoichiometry from peptides with multiple lysine residues. We also demonstrate the novel extension of this method to measurements of succinylation stoichiometry using deuterium-labeled succinic anhydride. Proof of principle SWATH acquisition studies were first performed using bovine serum albumin for both acetylation and succinylation occupancy measurements, followed by the analysis of more complex samples of E. coli cell lysates. Although overall site occupancy was low (<1%), some proteins contained lysines with relatively high acetylation occupancy.
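The ratiometric idea behind such stoichiometry measurements can be written compactly. In the expression below, I_endo and I_chem denote the signals of the endogenously (light) acylated form and the chemically per-acylated (isotope-labeled) form of the same site; the notation is illustrative rather than the authors' own.

```latex
\text{occupancy} \;=\; \frac{I_{\text{endo}}}{I_{\text{endo}} + I_{\text{chem}}}
```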
Air Quality Cumulative Effects Assessment for U.S. Air Force Bases.
1998-01-29
forecasted activities. Consideration of multimedia effects and transmedia impacts is important, however, in CEA. Any quantification method developed...substantive areas, such as water quality, ecology, planning, archeology, and landscape architecture? 9. Are there public concerns due to the impact risks of...methods developed for CEA should consider multimedia effects and transmedia impacts. Portions of this research can be used, or modified, to address other
Xenopoulos, Alex; Fadgen, Keith; Murphy, Jim; Skilton, St. John; Prentice, Holly; Stapels, Martha
2012-01-01
Assays for identification and quantification of host-cell proteins (HCPs) in biotherapeutic proteins over 5 orders of magnitude in concentration are presented. The HCP assays consist of two types: HCP identification using comprehensive online two-dimensional liquid chromatography coupled with high resolution mass spectrometry (2D-LC/MS), followed by high-throughput HCP quantification by liquid chromatography, multiple reaction monitoring (LC-MRM). The former is described as a “discovery” assay, the latter as a “monitoring” assay. Purified biotherapeutic proteins (e.g., monoclonal antibodies) were digested with trypsin after reduction and alkylation, and the digests were fractionated using reversed-phase (RP) chromatography at high pH (pH 10) by a step gradient in the first dimension, followed by a high-resolution separation at low pH (pH 2.5) in the second dimension. As peptides eluted from the second dimension, a quadrupole time-of-flight mass spectrometer was used to detect the peptides and their fragments simultaneously by alternating the collision cell energy between a low and an elevated energy (MSE methodology). The MSE data was used to identify and quantify the proteins in the mixture using a proven label-free quantification technique (“Hi3” method). The same data set was mined to subsequently develop target peptides and transitions for monitoring the concentration of selected HCPs on a triple quadrupole mass spectrometer in a high-throughput manner (20 min LC-MRM analysis). This analytical methodology was applied to the identification and quantification of low-abundance HCPs in six samples of PTG1, a recombinant chimeric anti-phosphotyrosine monoclonal antibody (mAb). Thirty three HCPs were identified in total from the PTG1 samples among which 21 HCP isoforms were selected for MRM monitoring. The absolute quantification of three selected HCPs was undertaken on two different LC-MRM platforms after spiking isotopically labeled peptides in the samples. Finally, the MRM quantitation results were compared with TOF-based quantification based on the Hi3 peptides, and the TOF and MRM data sets correlated reasonably well. The results show that the assays provide detailed valuable information to understand the relative contributions of purification schemes to the nature and concentrations of HCP impurities in biopharmaceutical samples, and the assays can be used as generic methods for HCP analysis in the biopharmaceutical industry. PMID:22327428
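To illustrate the label-free "Hi3" (Top3) idea mentioned above, the sketch below averages the three most intense peptide signals per protein and scales them against a spiked standard; the data structures and numbers are hypothetical, and this is not the vendors' or the authors' implementation.

```python
import numpy as np

def hi3(peptide_intensities):
    """Mean of the three most intense peptide signals of a protein."""
    top = sorted(peptide_intensities, reverse=True)[:3]
    return float(np.mean(top))

def absolute_amount(target_peptides, standard_peptides, standard_fmol):
    """Scale the target protein's Hi3 signal against a spiked protein standard."""
    return standard_fmol * hi3(target_peptides) / hi3(standard_peptides)

# Hypothetical intensities for one HCP and a spiked standard (50 fmol on column):
hcp = [2.1e5, 1.8e5, 9.0e4, 4.0e4]
std = [6.3e6, 5.9e6, 4.4e6, 1.0e6]
print(absolute_amount(hcp, std, standard_fmol=50.0))   # approximate fmol of the HCP
```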
Onofrejová, Lucia; Farková, Marta; Preisler, Jan
2009-04-13
The application of an internal standard in quantitative analysis is desirable in order to correct for variations in sample preparation and instrumental response. In mass spectrometry of organic compounds, the internal standard is preferably labelled with a stable isotope, such as (18)O, (15)N or (13)C. In this study, a method for the quantification of fructo-oligosaccharides using matrix-assisted laser desorption/ionisation time-of-flight mass spectrometry (MALDI TOF MS) was proposed and tested on raftilose, a partially hydrolysed inulin with a degree of polymerisation of 2-7. The tetraoligosaccharide nystose, which is chemically identical to the raftilose tetramer, was used as an internal standard rather than an isotope-labelled analyte. Two mathematical approaches used for data processing, conventional calculations and artificial neural networks (ANN), were compared. The conventional data processing relies on the assumption of a constant oligomer dispersion profile that changes after the addition of the internal standard, together with some simple numerical calculations. In contrast, ANN was found to compensate for a non-linear MALDI response and variations in the oligomer dispersion profile with raftilose concentration. As a result, the application of ANN led to lower quantification errors and excellent day-to-day repeatability compared with the conventional data analysis. The developed method is feasible for MS quantification of raftilose in the range of 10-750 pg with errors below 7%. The content of raftilose was determined in dietary cream; the application can be extended to other similar polymers. It should be stressed that no special optimisation of the MALDI process was carried out: a common MALDI matrix and sample preparation were used, and only basic parameters, such as sampling and laser energy, were optimised prior to quantification.
Koehler, Christian J; Arntzen, Magnus Ø; Thiede, Bernd
2015-05-15
Stable isotopic labeling techniques are useful for quantitative proteomics. A cost-effective and convenient method for diethylation by reductive amination was established. The impact of using either carbon-13 or deuterium on quantification accuracy and precision was investigated using diethylation. We established an effective approach for stable isotope labeling by diethylation of the amino groups of peptides. The approach was validated using matrix-assisted laser desorption/ionization time-of-flight (MALDI-TOF) and nanospray liquid chromatography/electrospray ionization (nanoLC/ESI)-ion trap/orbitrap instruments for mass spectrometric analysis, as well as MaxQuant for quantitative data analysis. Reaction conditions with low reagent costs, high yields and minor side reactions were established for diethylation. Furthermore, we showed that diethylation can be applied to up to sixplex labeling. For duplex experiments, we compared diethylation in the analysis of the proteome of HeLa cells using acetaldehyde-(13)C(2)/(12)C(2) and acetaldehyde-(2)H(4)/(1)H(4). Equal numbers of proteins could be identified and quantified; however, (13)C(4)/(12)C(4)-diethylation revealed a lower variance of quantitative peptide ratios within proteins, resulting in a higher precision of quantified proteins and fewer falsely regulated proteins. The results were compared with dimethylation, which showed minor effects because of the lower number of deuterium atoms. The described approach for diethylation of primary amines is a cost-effective and accurate method for up to sixplex relative quantification of proteomes. (13)C(4)/(12)C(4)-diethylation enables duplex quantification based on chemical labeling without using deuterium, which reduces the identification of false negatives and increases the quality of the quantification results. Copyright © 2015 John Wiley & Sons, Ltd.
USDA-ARS?s Scientific Manuscript database
High performance liquid chromatography of dabsyl derivatives of amino acids was employed for quantification of physiologic amino acids in cucurbits. This method is particularly useful because the dabsyl derivatives of glutamine and citrulline are sufficiently separated to allow quantification of ea...
Are LOD and LOQ Reliable Parameters for Sensitivity Evaluation of Spectroscopic Methods?
Ershadi, Saba; Shayanfar, Ali
2018-03-22
The limit of detection (LOD) and the limit of quantification (LOQ) are common parameters to assess the sensitivity of analytical methods. In this study, the LOD and LOQ of previously reported terbium sensitized analysis methods were calculated by different methods, and the results were compared with sensitivity parameters [lower limit of quantification (LLOQ)] of U.S. Food and Drug Administration guidelines. The details of the calibration curve and standard deviation of blank samples of three different terbium-sensitized luminescence methods for the quantification of mycophenolic acid, enrofloxacin, and silibinin were used for the calculation of LOD and LOQ. A comparison of LOD and LOQ values calculated by various methods and LLOQ shows a considerable difference. The significant difference of the calculated LOD and LOQ with various methods and LLOQ should be considered in the sensitivity evaluation of spectroscopic methods.
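For reference, the most common calibration-curve-based definitions (e.g., in ICH guidance) express these limits in terms of the standard deviation of the blank (or of the regression residuals), sigma, and the calibration slope, S; this is the generic formulation, not necessarily every variant compared in the study.

```latex
\mathrm{LOD} = \frac{3.3\,\sigma}{S}, \qquad \mathrm{LOQ} = \frac{10\,\sigma}{S}
```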
EPRI/NRC-RES fire human reliability analysis guidelines.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lewis, Stuart R.; Cooper, Susan E.; Najafi, Bijan
2010-03-01
During the 1990s, the Electric Power Research Institute (EPRI) developed methods for fire risk analysis to support its utility members in the preparation of responses to Generic Letter 88-20, Supplement 4, 'Individual Plant Examination - External Events' (IPEEE). This effort produced a Fire Risk Assessment methodology for operations at power that was used by the majority of U.S. nuclear power plants (NPPs) in support of the IPEEE program and several NPPs overseas. Although these methods were acceptable for accomplishing the objectives of the IPEEE, EPRI and the U.S. Nuclear Regulatory Commission (NRC) recognized that they required upgrades to support current requirements for risk-informed, performance-based (RI/PB) applications. In 2001, EPRI and the USNRC's Office of Nuclear Regulatory Research (RES) embarked on a cooperative project to improve the state-of-the-art in fire risk assessment to support a new risk-informed environment in fire protection. This project produced a consensus document, NUREG/CR-6850 (EPRI 1011989), entitled 'Fire PRA Methodology for Nuclear Power Facilities', which addressed fire risk for at-power operations. NUREG/CR-6850 developed high level guidance on the process for identification and inclusion of human failure events (HFEs) into the fire PRA (FPRA), and a methodology for assigning quantitative screening values to these HFEs. It outlined the initial considerations of performance shaping factors (PSFs) and related fire effects that may need to be addressed in developing best-estimate human error probabilities (HEPs). However, NUREG/CR-6850 did not describe a methodology to develop best-estimate HEPs given the PSFs and the fire-related effects. In 2007, EPRI and RES embarked on another cooperative project to develop explicit guidance for estimating HEPs for human failure events under fire generated conditions, building upon existing human reliability analysis (HRA) methods. This document provides a methodology and guidance for conducting a fire HRA. This process includes identification and definition of post-fire human failure events, qualitative analysis, quantification, recovery, dependency, and uncertainty. This document provides three approaches to quantification: screening, scoping, and detailed HRA. Screening is based on the guidance in NUREG/CR-6850, with some additional guidance for scenarios with long time windows. Scoping is a new approach to quantification developed specifically to support the iterative nature of fire PRA quantification. Scoping is intended to provide less conservative HEPs than screening, but requires fewer resources than a detailed HRA analysis. For detailed HRA quantification, guidance has been developed on how to apply existing methods to assess post-fire HEPs.
Diagnostics of Tree Diseases Caused by Phytophthora austrocedri Species.
Mulholland, Vincent; Elliot, Matthew; Green, Sarah
2015-01-01
We present methods for the detection and quantification of four Phytophthora species that are pathogenic on trees: Phytophthora ramorum, Phytophthora kernoviae, Phytophthora lateralis, and Phytophthora austrocedri. Nucleic acid extraction methods are presented for phloem tissue from trees, soil, and pure cultures on agar plates. Real-time PCR methods are presented, including primer and probe sets for each species as well as general advice on real-time PCR setup and data analysis. A method for sequence-based identification, useful for pure cultures, is also included.
Romero, Peggy; Miller, Ted; Garakani, Arman
2009-12-01
Current methods to assess neurodegradation in dorsal root ganglion cultures as a model for neurodegenerative diseases are imprecise and time-consuming. Here we describe two new methods to quantify neuroprotection in these cultures. The neurite quality index (NQI) builds upon earlier manual methods, incorporating additional morphological events to increase sensitivity for the detection of early degeneration events. Neurosight is a machine vision-based method that recapitulates many of the strengths of NQI while enabling high-throughput screening applications at decreased cost.
van der Ham, M; Albersen, M; de Koning, T J; Visser, G; Middendorp, A; Bosma, M; Verhoeven-Duif, N M; de Sain-van der Velden, M G M
2012-01-27
Since vitamin B6 is essential for normal functioning of the central nervous system, there is growing need for sensitive analysis of B6 vitamers in cerebrospinal fluid (CSF). This manuscript describes the development and validation of a rapid, sensitive and accurate method for quantification of the vitamin B6 vitamers pyridoxal (PL), pyridoxamine (PM), pyridoxine (PN), pyridoxic acid (PA), pyridoxal 5'-phosphate (PLP), pyridoxamine 5'-phosphate (PMP) and pyridoxine 5'-phosphate (PNP) in human CSF. The method is based on ultra performance liquid chromatography-tandem mass spectrometry (UPLC-MS/MS) with a simple sample preparation procedure of protein precipitation using 50 g L(-1) trichloroacetic acid containing stable isotope labeled internal standards: PL-D(3) for PL and PM, PN-(13)C(4) for PN, PA-D(2) for PA and PLP-D(3) for the phosphorylated vitamers. B6 vitamers were separated (Acquity HSS-T3 UPLC column) with a buffer containing acetic acid, heptafluorobutyric acid and acetonitrile. Positive electrospray ionization was used to monitor transitions m/z 168.1→150.1 (PL), 169.1→134.1 (PM), 170.1→134.1 (PN), 184.1→148.1 (PA), 248.1→150.1 (PLP), 249.1→232.1 (PMP) and 250.1→134.1 (PNP). The method was validated at three concentration levels for each B6 vitamer in CSF. Recoveries of the internal standards were between 93% and 96%. Intra- and inter-assay variations were below 20%. Accuracy tests showed deviations from 3% (PN) to 39% (PMP). Limits of quantification were in the range of 0.03-5.37 nM. Poor results were obtained for quantification of PNP. The method was applied to CSF samples of 20 subjects and two patients on pyridoxine supplementation. Using minimal CSF volumes this method is suitable for implementation in a routine diagnostic setting. Copyright © 2011 Elsevier B.V. All rights reserved.
Measuring mass-based hygroscopicity of atmospheric particles through in situ imaging
DOE Office of Scientific and Technical Information (OSTI.GOV)
Piens, Dominique S.; Kelly, Stephen T.; Harder, Tristan H.
2016-04-18
Quantifying how atmospheric particles interact with water vapor is critical for understanding the effects of aerosols on climate. We present a novel method to measure the mass-based hygroscopicity of particles while characterizing their elemental and carbon functional group compositions. Since mass-based hygroscopicity is insensitive to particle geometry, it is advantageous for probing the hygroscopic behavior of atmospheric particles, which can have irregular morphologies. Combining scanning electron microscopy with energy dispersive X-ray analysis (SEM/EDX), scanning transmission X-ray microscopy (STXM) analysis, and in situ STXM humidification experiments, this method was validated using laboratory-generated, atmospherically relevant particles. Then, the hygroscopicity and elemental composition of 15 complex atmospheric particles were analyzed by leveraging quantification of C, N, and O from STXM, and complementary elemental quantification from SEM/EDX. We found three types of hygroscopic responses, and correlated high hygroscopicity with Na and Cl content. The mixing state of 158 other particles from the sample broadly agreed with those of the humidified particles, indicating the potential to infer atmospheric hygroscopic behavior from a selected subset of particles. As a result, these methods offer unique quantitative capabilities to characterize and correlate the hygroscopicity and chemistry of individual submicrometer atmospheric particles.
Li, P; Jia, J W; Jiang, L X; Zhu, H; Bai, L; Wang, J B; Tang, X M; Pan, A H
2012-04-27
To ensure the implementation of genetically modified organism (GMO)-labeling regulations, an event-specific detection method was developed based on the junction sequence of an exogenous integrant in the transgenic carnation variety Moonlite. The 5'-transgene integration sequence was isolated by thermal asymmetric interlaced PCR. Based upon the 5'-transgene integration sequence, event-specific primers and a TaqMan probe were designed to amplify fragments spanning the exogenous DNA and the carnation genomic DNA. Qualitative and quantitative PCR assays were developed employing the designed primers and probe. The detection limit of the qualitative PCR assay was 0.05% for Moonlite in 100 ng total carnation genomic DNA, corresponding to about 79 copies of the carnation haploid genome; the limits of detection and quantification of the quantitative PCR assay were estimated to be 38 and 190 copies of haploid carnation genomic DNA, respectively. Carnation samples with different contents of genetically modified components were quantified, and the biases between the observed and true values for three samples were lower than the acceptance criterion (<25%) of the GMO detection method. These results indicated that these event-specific methods would be useful for the identification and quantification of the GMO carnation Moonlite.
Rønning, Helene Thorsen; Einarsen, Kristin; Asp, Tone Normann
2006-06-23
A simple and rapid method for the determination and confirmation of chloramphenicol in several food matrices with LC-MS/MS was developed. Following addition of d5-chloramphenicol as internal standard, meat, seafood, egg, honey and milk samples were extracted with acetonitrile. Chloroform was then added to remove water. After evaporation, the residues were reconstituted in methanol/water (3+4) before injection. Urine and plasma samples were, after addition of the internal standard, applied to a Chem Elut extraction cartridge, eluted with ethyl acetate, and washed with hexane. These samples were also reconstituted in methanol/water (3+4) after evaporation. Using an MRM acquisition method in negative ionization mode, the transitions 321-->152, 321-->194 and 326-->157 were used for quantification, confirmation and the internal standard, respectively. Quantification of chloramphenicol-positive samples, regardless of matrix, could be achieved with a common water-based calibration curve. The validation of the method was based on EU Decision 2002/657, and different ways of calculating CCalpha and CCbeta were evaluated. The common CCalpha and CCbeta for all matrices were 0.02 and 0.04 microg/kg for the 321-->152 ion transition, and 0.02 and 0.03 microg/kg for the 321-->194 ion transition. At a fortification level of 0.1 microg/kg the within-laboratory reproducibility was below 25%.
Pires, F; Arcos-Martinez, M Julia; Dias-Cabral, A Cristina; Vidal, Juan C; Castillo, Juan R
2018-04-17
Human cytomegalovirus (HCMV) is a herpes virus that can cause severe infections. Still, the available methods for its diagnosis have the main disadvantage of requiring a long time to perform. In this work, a simple magnetic particle-based enzyme immunoassay (mpEIA) for the quantification of glycoprotein B of human cytomegalovirus (gB-HCMV) in urine samples is proposed. The immunosensor scheme is based on the analyte protein gB-HCMV sandwiched between a primary monoclonal antibody (MBs-PrG-mAb1) and a secondary anti-gB-HCMV antibody labelled with horseradish peroxidase (Ab2-HRP) to allow spectrophotometric detection. The mpEIA analytical performance was tested in urine samples, showing a linear dependence between gB-HCMV concentration and the absorbance signal at 450 nm in a range of concentrations from 90 to 700 pg mL-1. The calculated detection limit for gB-HCMV was 90 ± 2 pg mL-1 and the RSD was about 6.7% in urine samples. The immunosensor showed good selectivity against other viruses from the Herpesviridae family, namely varicella zoster and Epstein-Barr viruses. The recoveries of spiked human urine samples at 0.30-0.50 ng mL-1 concentration levels of gB-HCMV ranged from 91% to 105%. The proposed mpEIA method was validated following the guidelines of the European Medicines Agency (EMEA-2014), and allows rapid, successful and easy quantification of gB-HCMV in urine samples. Copyright © 2018 Elsevier B.V. All rights reserved.
USDA-ARS?s Scientific Manuscript database
Background: Until now, antioxidant based initiatives for preventing dementia have lacked a means to detect deficiency or measure pharmacologic effect in the human brain in situ. Objective: Our objective was to apply a novel method to measure key human brain antioxidant concentrations throughout the ...
Analyses of fungal spores or conidia in indoor dust samples can be useful for determining the contamination status of building interiors and in signaling instances where potentially harmful exposures of building occupants to these organisms may exist. A recently developed method ...
Chen, Hsiao-Wei; Wu, Chun-Feng; Chu, Lichieh Julie; Chiang, Wei-Fang; Wu, Chih-Ching; Yu, Jau-Song; Tsai, Cheng-Han; Liang, Kung-Hao; Chang, Yu-Sun; Wu, Maureen; Ou Yang, Wei-Ting
2017-01-01
Multiple (selected) reaction monitoring (MRM/SRM) of peptides is a growing technology for target protein quantification because it is more robust, precise, accurate, high-throughput, and multiplex-capable than antibody-based techniques. The technique has been applied clinically to the large-scale quantification of multiple target proteins in different types of fluids. However, previous MRM-based studies have placed less focus on sample-preparation workflow and analytical performance in the precise quantification of proteins in saliva, a noninvasively sampled body fluid. In this study, we evaluated the analytical performance of a simple and robust multiple reaction monitoring (MRM)-based targeted proteomics approach incorporating liquid chromatography with mass spectrometry detection (LC-MRM/MS). This platform was used to quantitatively assess the biomarker potential of a group of 56 salivary proteins that have previously been associated with human cancers. To further enhance the development of this technology for assay of salivary samples, we optimized the workflow for salivary protein digestion and evaluated quantification performance, robustness and technical limitations in analyzing clinical samples. Using a clinically well-characterized cohort of two independent clinical sample sets (total n = 119), we quantitatively characterized these protein biomarker candidates in saliva specimens from controls and oral squamous cell carcinoma (OSCC) patients. The results clearly showed a significant elevation of most targeted proteins in saliva samples from OSCC patients compared with controls. Overall, this platform was capable of assaying the most highly multiplexed panel of salivary protein biomarkers, highlighting the clinical utility of MRM in oral cancer biomarker research. PMID:28235782
Xu, Feifei; Yang, Ting; Sheng, Yuan; Zhong, Ting; Yang, Mi; Chen, Yun
2014-12-05
As one of the most studied post-translational modifications (PTM), protein phosphorylation plays an essential role in almost all cellular processes. Current methods are able to predict and determine thousands of phosphorylation sites, whereas stoichiometric quantification of these sites is still challenging. Liquid chromatography coupled with tandem mass spectrometry (LC-MS/MS)-based targeted proteomics is emerging as a promising technique for site-specific quantification of protein phosphorylation using proteolytic peptides as surrogates of proteins. However, several issues may limit its application, one of which relates to the phosphopeptides with different phosphorylation sites and the same mass (i.e., isobaric phosphopeptides). While employment of site-specific product ions allows for these isobaric phosphopeptides to be distinguished and quantified, site-specific product ions are often absent or weak in tandem mass spectra. In this study, linear algebra algorithms were employed as an add-on to targeted proteomics to retrieve information on individual phosphopeptides from their common spectra. To achieve this simultaneous quantification, a LC-MS/MS-based targeted proteomics assay was first developed and validated for each phosphopeptide. Given the slope and intercept of calibration curves of phosphopeptides in each transition, linear algebraic equations were developed. Using a series of mock mixtures prepared with varying concentrations of each phosphopeptide, the reliability of the approach to quantify isobaric phosphopeptides containing multiple phosphorylation sites (≥ 2) was discussed. Finally, we applied this approach to determine the phosphorylation stoichiometry of heat shock protein 27 (HSP27) at Ser78 and Ser82 in breast cancer cells and tissue samples.
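A minimal sketch of the linear-algebra step described above is given below, assuming each shared transition's signal is a linear (slope/intercept) function of the concentrations of the co-eluting isobaric phosphopeptides; the matrix values are hypothetical and the setup is illustrative rather than the authors' exact formulation.

```python
import numpy as np

# Rows: monitored transitions; columns: isobaric phosphopeptides (e.g., pSer78, pSer82).
# slopes[j, i] and intercepts[j] come from each phosphopeptide's calibration curve.
slopes = np.array([[1.8e4, 0.2e4],
                   [0.3e4, 2.1e4],
                   [1.0e4, 1.1e4]])          # hypothetical calibration slopes
intercepts = np.array([500.0, 300.0, 400.0])  # hypothetical calibration intercepts
measured = np.array([9.7e4, 8.9e4, 9.3e4])    # signals observed in the mixed sample

# Solve slopes @ c = measured - intercepts for the concentration vector c (least squares).
c, *_ = np.linalg.lstsq(slopes, measured - intercepts, rcond=None)
print(dict(zip(["pSer78", "pSer82"], c)))
```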
Montgomery, Jill D; Hensler, Heather R; Jacobson, Lisa P; Jenkins, Frank J
2008-07-01
The aim of the present study was to determine whether the Alpha DigiDoc RT system would be an effective method of quantifying immunohistochemical staining compared with a manual counting method, which is considered the gold standard. Two readers counted 31 samples by both methods. Bland-Altman analysis for concordance showed no statistically significant difference between the two methods. Thus, the Alpha DigiDoc RT system is an effective, low-cost method for quantifying immunohistochemical data.
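For readers unfamiliar with the concordance analysis cited here, a generic Bland-Altman calculation (bias and 95% limits of agreement) is sketched below with made-up counts; it follows the standard formulation and is not the study's own code.

```python
import numpy as np

def bland_altman(a, b):
    """Return bias and 95% limits of agreement between two paired measurement sets."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    diff = a - b
    bias = diff.mean()
    sd = diff.std(ddof=1)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

manual = [112, 98, 143, 120, 87, 131]     # hypothetical manual counts
digital = [109, 101, 139, 124, 85, 128]   # hypothetical image-analysis counts
print(bland_altman(manual, digital))
```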
Takabatake, Reona; Akiyama, Hiroshi; Sakata, Kozue; Onishi, Mari; Koiwa, Tomohiro; Futo, Satoshi; Minegishi, Yasutaka; Teshima, Reiko; Mano, Junichi; Furui, Satoshi; Kitta, Kazumi
2011-01-01
A novel real-time PCR-based analytical method was developed for the event-specific quantification of a genetically modified (GM) soybean event; A2704-12. During the plant transformation, DNA fragments derived from pUC19 plasmid were integrated in A2704-12, and the region was found to be A2704-12 specific. The pUC19-derived DNA sequences were used as primers for the specific detection of A2704-12. We first tried to construct a standard plasmid for A2704-12 quantification using pUC19. However, non-specific signals appeared with both qualitative and quantitative PCR analyses using the specific primers with pUC19 as a template, and we then constructed a plasmid using pBR322. The conversion factor (C(f)), which is required to calculate the amount of the genetically modified organism (GMO), was experimentally determined with two real-time PCR instruments, the Applied Biosystems 7900HT and the Applied Biosystems 7500. The determined C(f) values were both 0.98. The quantitative method was evaluated by means of blind tests in multi-laboratory trials using the two real-time PCR instruments. The limit of quantitation for the method was estimated to be 0.1%. The trueness and precision were evaluated as the bias and reproducibility of relative standard deviation (RSD(R)), and the determined bias and RSD(R) values for the method were each less than 20%. These results suggest that the developed method would be suitable for practical analyses for the detection and quantification of A2704-12.
Jamzad, Amoon; Setarehdan, Seyed Kamaledin
2014-04-01
The twinkling artifact is an undesired phenomenon within color Doppler sonograms that usually appears at the site of internal calcifications. Since the appearance of the twinkling artifact is correlated with the roughness of the calculi, noninvasive roughness estimation of internal stones may be considered a potential twinkling artifact application. This article proposes a novel quantitative approach for the measurement and analysis of twinkling artifact data for roughness estimation. A phantom was developed with 7 quantified levels of roughness. The Doppler system was initially calibrated by the proposed procedure to facilitate the analysis. A total of 1050 twinkling artifact images were acquired from the phantom, and 32 novel numerical measures were introduced and computed for each image. The measures were then ranked on the basis of roughness quantification ability using different methods. The performance of the proposed twinkling artifact-based surface roughness quantification method was finally investigated for different combinations of features and classifiers. Eleven features were shown to be the most efficient numerical twinkling artifact measures for roughness characterization. The linear classifier outperformed other methods for twinkling artifact classification. The pixel count measures produced better results than the other categories. The sequential selection method showed higher accuracy than other individual rankings. The best roughness recognition average accuracy of 98.33% was obtained with the first 5 principal components and the linear classifier. The proposed twinkling artifact analysis method could recognize the phantom surface roughness with an average accuracy of 98.33%. This method may also be applicable for noninvasive calculi characterization in treatment management.
Smart Sensor for Real-Time Quantification of Common Symptoms Present in Unhealthy Plants
Contreras-Medina, Luis M.; Osornio-Rios, Roque A.; Torres-Pacheco, Irineo; Romero-Troncoso, Rene de J.; Guevara-González, Ramon G.; Millan-Almaraz, Jesus R.
2012-01-01
Plant responses to physiological function disorders are called symptoms, and they are caused principally by pathogens and nutritional deficiencies. Plant symptoms are commonly used as indicators of the health and nutrition status of plants. Nowadays, the most popular method to quantify plant symptoms is based on visual estimation, consisting of evaluations that raters give based on their observation of plant symptoms; however, this method is inaccurate and imprecise because of its obvious subjectivity. Computer vision has been employed in plant symptom quantification because of its accuracy and precision. Nevertheless, the systems developed so far lack in-situ, real-time and multi-symptom analysis. There are methods to obtain information about the health and nutritional status of plants based on reflectance and chlorophyll fluorescence, but they use expensive equipment and are frequently destructive. Therefore, systems capable of quantifying plant symptoms while overcoming the aforementioned disadvantages, which could serve as indicators of plant health and nutrition, are desirable. This paper reports an FPGA-based smart sensor able to perform non-destructive, real-time and in-situ analysis of leaf images to quantify multiple symptoms presented by diseased and malnourished plants; this system can serve as an indicator of plant health and nutrition. The effectiveness of the proposed smart sensor was successfully tested by analyzing diseased and malnourished plants. PMID:22368496
Evaluation of two methods to determine glyphosate and AMPA in soils of Argentina
NASA Astrophysics Data System (ADS)
De Geronimo, Eduardo; Lorenzon, Claudio; Iwasita, Barbara; Faggioli, Valeria; Aparicio, Virginia; Costa, Jose Luis
2017-04-01
Argentine agricultural production is fundamentally based on a technological package combining no-tillage with a dependence on glyphosate applications to control weeds in transgenic crops (soybean, maize and cotton). Therefore, glyphosate is the most widely used herbicide in the country, where 180 to 200 million liters are applied every year. Due to its widespread use, it is important to assess its impact on the environment, and reliable analytical methods are therefore mandatory. The glyphosate molecule exhibits unique physical and chemical characteristics that make its quantification difficult, especially in soils with high organic matter content, such as the central eastern Argentine soils, where strong interferences are normally observed. The objective of this work was to compare two methods for extraction and quantification of glyphosate and AMPA in samples of 8 representative soils of Argentina. The first analytical method (method 1) was based on the use of phosphate buffer as the extracting solution and dichloromethane to minimize matrix organic content. In the second method (method 2), potassium hydroxide was used to extract the analytes, followed by a clean-up step using solid phase extraction (SPE) to minimize strong interferences. Sensitivity, recoveries, matrix effects and robustness were evaluated. Both methodologies involved derivatization with 9-fluorenyl-methyl-chloroformate (FMOC) in borate buffer and detection based on ultra-high-pressure liquid chromatography coupled to tandem mass spectrometry (UHPLC-MS/MS). Recoveries obtained from soil samples spiked at 0.1 and 1 mg kg-1 were satisfactory for both methods (70%-120%). However, there was a remarkable difference regarding the matrix effect, with the SPE clean-up step (method 2) being insufficient to remove the interferences, whereas the dilution and clean-up with dichloromethane (method 1) were more effective at minimizing ionic suppression. Moreover, method 1 had fewer steps in the sample-processing protocol than method 2. This can be highly valuable in routine laboratory work because it reduces potential undesired errors such as loss of analyte or sample contamination. In addition, the substitution of SPE by another alternative involved a considerable reduction of analytical costs in method 1. We conclude that method 1 seems to be simpler and cheaper than method 2, as well as reliable for quantifying glyphosate in Argentinean soils. We hope that this experience can be useful to simplify the protocols of glyphosate quantification and contribute to the understanding of the fate of this herbicide in the environment.
Isak, I; Patel, M; Riddell, M; West, M; Bowers, T; Wijeyekoon, S; Lloyd, J
2016-08-01
Fourier transform infrared (FTIR) spectroscopy was used in this study for the rapid quantification of polyhydroxyalkanoates (PHA) in mixed and pure culture bacterial biomass. Three different statistical analysis methods (regression, partial least squares (PLS) and nonlinear) were applied to the FTIR data, and the results were plotted against the PHA values measured with the reference gas chromatography technique. All methods predicted PHA content in mixed culture biomass with comparable efficiency, as indicated by similar residual values. The PHA in these cultures ranged from low to medium concentration (0-44 wt% of dried biomass content). However, for the analysis of the combined mixed and pure culture biomass with PHA concentrations ranging from low to high (0-93% of dried biomass content), the PLS method was the most efficient. This paper reports, for the first time, the use of a single calibration model constructed with a combination of mixed and pure cultures covering a wide PHA range for predicting PHA content in biomass. Currently, no one universal method exists for processing FTIR data for polyhydroxyalkanoate (PHA) quantification. This study compares three different methods of analysing FTIR data for quantification of PHAs in biomass. A new data-processing approach was proposed and the results were compared against existing literature methods. Most publications report PHA quantification over a medium range in pure cultures; in our study, however, we encompassed both mixed and pure culture biomass containing a broader range of PHA in the calibration curve. The resulting prediction model is useful for rapid quantification of a wider range of PHA content in biomass. © 2016 The Society for Applied Microbiology.
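As an illustration of the PLS calibration strategy described above (not the authors' exact model; the spectra, component count, and cross-validation settings are hypothetical placeholders), a scikit-learn sketch might look like this:

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict

# X: FTIR absorbance spectra (n_samples x n_wavenumbers), y: PHA wt% from reference GC.
rng = np.random.default_rng(1)
X = rng.random((40, 600))          # placeholder spectra; use real absorbance data
y = rng.uniform(0, 93, 40)         # placeholder GC-measured PHA content (wt%)

pls = PLSRegression(n_components=5)                    # component count chosen by cross-validation
y_cv = cross_val_predict(pls, X, y, cv=5).ravel()      # cross-validated predictions
rmsecv = float(np.sqrt(np.mean((y - y_cv) ** 2)))
pls.fit(X, y)                                          # final calibration model
print("RMSECV:", rmsecv)
```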
Quantification of localized vertebral deformities using a sparse wavelet-based shape model.
Zewail, R; Elsafi, A; Durdle, N
2008-01-01
Medical experts often examine hundreds of spine x-ray images to determine the existence of various pathologies. Common pathologies of interest are anterior osteophytes, disc space narrowing, and wedging. By careful inspection of the outline shapes of the vertebral bodies, experts are able to identify and assess vertebral abnormalities with respect to the pathology under investigation. In this paper, we present a novel method for quantification of vertebral deformation using a sparse shape model. Using wavelets and independent component analysis (ICA), we construct a sparse shape model that benefits from the approximation power of wavelets and the capability of ICA to capture higher-order statistics in wavelet space. The new model is able to capture localized pathology-related shape deformations, hence it allows for quantification of vertebral shape variations. We investigate the capability of the model to predict localized pathology-related deformations. Next, using support-vector machines, we demonstrate the diagnostic capabilities of the method through the discrimination of anterior osteophytes in lumbar vertebrae. Experiments were conducted using a set of 150 contours from digital x-ray images of the lumbar spine. Each vertebra is labeled as normal or abnormal. Results reported in this work focus on anterior osteophytes as the pathology of interest.
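A rough sketch of this kind of pipeline (wavelet decomposition of contour coordinates, ICA in wavelet space, SVM classification) is given below; the wavelet family, decomposition level, number of components, and data arrays are all assumptions for illustration, not the paper's configuration.

```python
import numpy as np
import pywt
from sklearn.decomposition import FastICA
from sklearn.svm import SVC

def contour_to_wavelet_features(contour_xy, wavelet="db4", level=3):
    """Flatten a vertebral contour (N x 2) into a vector of wavelet coefficients."""
    coeffs = []
    for axis in range(2):                        # x and y coordinate signals
        for c in pywt.wavedec(contour_xy[:, axis], wavelet, level=level):
            coeffs.append(c)
    return np.concatenate(coeffs)

# contours: list of (N x 2) arrays; labels: 1 = osteophyte, 0 = normal (placeholders).
rng = np.random.default_rng(2)
contours = [rng.random((64, 2)) for _ in range(150)]
labels = rng.integers(0, 2, 150)

X = np.vstack([contour_to_wavelet_features(c) for c in contours])
ica = FastICA(n_components=10, random_state=0)   # sparse shape-model coordinates
Z = ica.fit_transform(X)
clf = SVC(kernel="linear").fit(Z, labels)
print(clf.score(Z, labels))
```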
NASA Astrophysics Data System (ADS)
Firdaus, M. Lutfi; Puspita, Melfi; Alwi, Wiwit; Ghufira, Nurhamidah, Elvia, Rina
2017-11-01
In the present study, activated carbon prepared from palm oil husk was used as an adsorbent to remove the synthetic dyes Reactive Red 120 (RR) and Direct Green 26 (DG) from aqueous solution. The effects of solution pH, contact time, adsorbent weight, dye concentration, and temperature on adsorption were evaluated based on batch experiments, along with determination of the adsorption isotherm, kinetic, and thermodynamic parameters. Visible spectrophotometry was used for quantification of dye concentration, in conjunction with digital image colorimetry as a novel quantification method. Compared with visible spectrophotometry, the results of digital image colorimetry were accurate; in addition, improved sensitivity was achieved with this new colorimetric method. At equilibrium, dye adsorption onto activated carbon followed the Freundlich model, with adsorption capacities for RR and DG of 32 and 27 mg/g, respectively. The adsorption kinetics followed a pseudo-second-order model, with thermodynamic parameters ΔG°, ΔH°, and ΔS° of -1.8 to -3.8 kJ/mol, -13.5 to -24.38 kJ/mol, and 0.001 J/mol, respectively. Therefore, the adsorption process was exothermic and spontaneous, with an increase in the disorder or entropy of the system.
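For reference, the standard forms of the models named above are given below (generic textbook expressions, with q_e the equilibrium adsorption capacity, C_e the equilibrium dye concentration, q_t the capacity at time t, and K_F, n, k_2 fitted constants):

```latex
q_e = K_F\,C_e^{1/n}
\qquad
\frac{t}{q_t} = \frac{1}{k_2 q_e^{2}} + \frac{t}{q_e}
\qquad
\Delta G^\circ = \Delta H^\circ - T\,\Delta S^\circ
```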
Sarais, Giorgia; Caboni, Pierluigi; Sarritzu, Erika; Russo, Mariateresa; Cabras, Paolo
2008-05-14
Neem-based insecticides containing azadirachtin and related azadirachtoids are widely used in agriculture. Here, we report an analytical method for the rapid and accurate quantification of the insecticide azadirachtin A and B and other azadirachtoids such as salannin, nimbin, and their deacetylated analogues on tomatoes and peaches. Azadirachtoids were extracted from fruits and vegetables with acetonitrile. Using high-performance liquid chromatography/electrospray ionization tandem mass spectrometer, azadirachtoids were selectively detected monitoring the multiple reaction transitions of sodium adduct precursor ions. For azadirachtin A, calibration was linear over a working range of 1-1000 microg/L with r > 0.996. The limit of detection and limit of quantification for azadirachtin A were 0.4 and 0.8 microg/kg, respectively. The presence of interfering compounds in the peach and tomato extracts was evaluated and found to be minimal. Because of the linear behavior, it was concluded that the multiple reaction transitions of sodium adduct ions can be used for analytical purposes, that is, for the identification and quantification of azadirachtin A and B and related azadirachtoids in fruit and vegetable extracts at trace levels.
Stringlike Pulse Quantification Study by Pulse Wave in 3D Pulse Mapping
Chung, Yu-Feng; Yeh, Cheng-Chang; Si, Xiao-Chen; Chang, Chien-Chen; Hu, Chung-Shing; Chu, Yu-Wen
2012-01-01
Background: A stringlike pulse is highly related to hypertension, and many classification approaches have been proposed in which the differentiation pulse wave (dPW) can effectively classify the stringlike pulse indicating hypertension. Unfortunately, the dPW method cannot distinguish the spring stringlike pulse from the stringlike pulse so labeled by physicians in clinics. Design: Using a Bi-Sensing Pulse Diagnosis Instrument (BSPDI), this study proposed a novel Plain Pulse Wave (PPW) to classify a stringlike pulse based on an array of pulse signals, mimicking a Traditional Chinese Medicine physician's finger-reading skill. Results: By comparing PPWs at different pulse-taking positions, the phase delay Δθ and correlation coefficient r can be elucidated as quantification parameters of the stringlike pulse. As a result, the recognition rates of a hypertensive stringlike pulse, spring stringlike pulse, and non-stringlike pulse are 100%, 100%, and 77% for PPW and 70%, 0%, and 59% for dPW, respectively. Conclusions: Integrating dPW and PPW can unify the classification of the stringlike pulse, including the hypertensive stringlike pulse and the spring stringlike pulse. Hence, the proposed novel method, PPW, enhances quantification of the stringlike pulse. PMID:23057481
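A generic way to compute such a phase delay and correlation coefficient between two pulse waveforms is sketched below; this is an illustrative cross-correlation approach with made-up signals and sampling rate, not necessarily the BSPDI signal processing.

```python
import numpy as np

def phase_delay_and_r(sig_a, sig_b, fs_hz, pulse_rate_hz):
    """Phase delay (degrees, at the pulse fundamental) and Pearson r between two signals."""
    sig_a = sig_a - sig_a.mean()
    sig_b = sig_b - sig_b.mean()
    xcorr = np.correlate(sig_a, sig_b, mode="full")
    lag = np.argmax(xcorr) - (len(sig_b) - 1)            # sample lag at the correlation maximum
    delay_s = lag / fs_hz
    phase_deg = 360.0 * pulse_rate_hz * delay_s
    r = np.corrcoef(sig_a, sig_b)[0, 1]
    return phase_deg, r

fs = 500.0                                               # assumed sampling rate (Hz)
t = np.arange(0, 5, 1 / fs)
a = np.sin(2 * np.pi * 1.2 * t)                          # ~72 bpm synthetic pulse
b = np.sin(2 * np.pi * 1.2 * t - np.deg2rad(20))         # delayed copy of the same pulse
print(phase_delay_and_r(a, b, fs, pulse_rate_hz=1.2))
```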
Hua, Marti Z; Feng, Shaolong; Wang, Shuo; Lu, Xiaonan
2018-08-30
We report the development of a molecularly imprinted polymers-surface-enhanced Raman spectroscopy (MIPs-SERS) method for rapid detection and quantification of a herbicide residue 2,4-dichlorophenoxyacetic acid (2,4-D) in milk. MIPs were synthesized via bulk polymerization and utilized as solid phase extraction sorbent to selectively extract and enrich 2,4-D from milk. Silver nanoparticles were synthesized to facilitate the collection of SERS spectra of the extracts. Based on the characteristic band intensity of 2,4-D (391 cm -1 ), the limit of detection was 0.006 ppm and the limit of quantification was 0.008 ppm. A simple logarithmic working range (0.01-1 ppm) was established, satisfying the sensitivity requirement referring to the maximum residue level of 2,4-D in milk in both Europe and North America. The overall test of 2,4-D for each milk sample required only 20 min including sample preparation. This MIPs-SERS method has potential for practical applications in detecting 2,4-D in agri-foods. Copyright © 2018 Elsevier Ltd. All rights reserved.
Dubois, Mathieu; Tarres, Adrienne; Goldmann, Till; Empl, Anna Maria; Donaubauer, Alfred; Seefelder, Walburga
2012-05-04
The presence of fatty acid esters of monochloropropanediol (MEs) in food is a recent concern raised due to the carcinogenicity of their hydrolysable moieties 2- and 3-monochloropropanediol (2- and 3-MCPD). Several indirect methods for the quantification of MEs have been developed and are commonly in use today; however, significant discrepancies among the analytical results obtained are challenging their reliability. The aim of the present study was therefore to test the trueness of an indirect method by comparing it with a newly developed direct method, using palm oil and palm olein as examples. The indirect method was based on ester cleavage under acidic conditions, derivatization of the liberated 2- and 3-MCPD with heptafluorobutyryl imidazole, and GC-MS determination. The direct method comprised two extraction procedures targeting 2- and 3-MCPD monoesters (co-extracting glycidyl esters as well) by the use of double solid phase extraction (SPE), and 2- and 3-MCPD diesters by the use of a silica gel column, respectively. Detection was carried out by liquid chromatography coupled to time-of-flight mass spectrometry (LC-ToF-MS). Accurate quantification of the intact compounds was assured by means of matrix-matched standard addition on extracts. Analysis of 22 palm oil and 7 palm olein samples (2- plus 3-MCPD contamination ranged from 0.3 to 8.8 μg/g) by both methods revealed no significant bias. Both methods were therefore considered comparable in terms of results; however, the indirect method was shown to require fewer analytical standards, to be less tedious, and furthermore to be applicable to all types of vegetable oils, and it is hence recommended for routine application. Copyright © 2012 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
McCracken, Katherine E.; Angus, Scott V.; Reynolds, Kelly A.; Yoon, Jeong-Yeol
2016-06-01
Smartphone image-based sensing of microfluidic paper analytical devices (μPADs) offers low-cost and mobile evaluation of water quality. However, consistent quantification is a challenge due to variable environmental, paper, and lighting conditions, especially across large multi-target μPADs. Compensations must be made for variations between images to achieve reproducible results without a separate lighting enclosure. We thus developed a simple method using triple-reference point normalization and a fast Fourier transform (FFT)-based pre-processing scheme to quantify consistent reflected light intensity signals under variable lighting and channel conditions. This technique was evaluated using various light sources, lighting angles, imaging backgrounds, and imaging heights. Further testing evaluated its handling of absorbance, quenching, and relative scattering intensity measurements from assays detecting four water contaminants - Cr(VI), total chlorine, caffeine, and E. coli K12 - at similar wavelengths using the green channel of RGB images. Between assays, this algorithm reduced error from μPAD surface inconsistencies and cross-image lighting gradients. Although the algorithm could not completely remove the anomalies arising from point shadows within channels or some non-uniform background reflections, it still afforded order-of-magnitude quantification and stable assay specificity under these conditions, offering one route toward improving smartphone quantification of μPAD assays for in-field water quality monitoring.
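One plausible reading of the pre-processing described above is a low-frequency (FFT-based) illumination estimate followed by normalization against reference regions; the sketch below is an illustrative flat-field-style implementation with hypothetical reference-spot coordinates, not the authors' published algorithm.

```python
import numpy as np

def fft_lowpass_background(img, keep_frac=0.02):
    """Estimate a smooth illumination background by keeping only the lowest
    spatial frequencies of the image's 2-D FFT."""
    F = np.fft.fftshift(np.fft.fft2(img))
    h, w = img.shape
    yy, xx = np.ogrid[:h, :w]
    r = np.hypot(yy - h / 2, xx - w / 2)
    F[r > keep_frac * max(h, w)] = 0          # zero out high spatial frequencies
    return np.real(np.fft.ifft2(np.fft.ifftshift(F)))

def normalize_with_references(img, ref_regions):
    """Divide out the illumination estimate, then rescale so the mean of the
    reference regions (e.g., three printed reference spots) equals 1."""
    flat = img / np.clip(fft_lowpass_background(img), 1e-6, None)
    ref_mean = np.mean([flat[r0:r1, c0:c1].mean() for (r0, r1, c0, c1) in ref_regions])
    return flat / ref_mean

# Hypothetical green-channel image and reference-spot coordinates:
green = np.random.default_rng(3).random((480, 640))
refs = [(10, 30, 10, 30), (10, 30, 600, 620), (440, 460, 300, 320)]
normalized = normalize_with_references(green, refs)
```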
NASA Astrophysics Data System (ADS)
Griffiths, John R.; Chicooree, Navin; Connolly, Yvonne; Neffling, Milla; Lane, Catherine S.; Knapman, Thomas; Smith, Duncan L.
2014-05-01
Protein modifications by ubiquitination and SUMOylation occur throughout the cell and are responsible for numerous cellular functions such as apoptosis, DNA replication and repair, and gene transcription. Current methods for the identification of such modifications using mass spectrometry predominantly rely upon tryptic isopeptide tag generation followed by database searching, with in vitro genetic mutation of SUMO routinely required. We have recently described a novel approach to ubiquitin and SUMO modification detection based upon the diagnostic a' and b' ions released from the isopeptide tags upon collision-induced dissociation of reductively methylated Ubl isopeptides (RUbI) using formaldehyde. Here, we significantly extend those studies by combining data-independent acquisition (DIA) with alternative labeling reagents to improve diagnostic ion coverage and enable relative quantification of modified peptides from both MS and MS/MS signals. Model synthetic ubiquitin and SUMO-derived isopeptides were labeled with mTRAQ reagents (Δ0, Δ4, and Δ8) and subjected to LC-MS/MS with SWATH acquisition. Novel diagnostic ions were generated upon CID, which facilitated the selective detection of these modified peptides. Simultaneous MS-based and MS/MS-based relative quantification was demonstrated for both Ub and SUMO-derived isopeptides across three channels in a background of mTRAQ-labeled Escherichia coli digest.
Dependence of quantitative accuracy of CT perfusion imaging on system parameters
NASA Astrophysics Data System (ADS)
Li, Ke; Chen, Guang-Hong
2017-03-01
Deconvolution is a popular method to calculate parametric perfusion parameters from four-dimensional CT perfusion (CTP) source images. During the deconvolution process, the four-dimensional space is squeezed into three-dimensional space by removing the temporal dimension, and prior knowledge is often used to suppress noise associated with the process. These additional complexities confound the understanding of the deconvolution-based CTP imaging system and how its quantitative accuracy depends on the parameters and sub-operations involved in the image formation process. Meanwhile, there has been a strong clinical need to answer this question, as physicians often rely heavily on the quantitative values of perfusion parameters to make diagnostic decisions, particularly during an emergent clinical situation (e.g. diagnosis of acute ischemic stroke). The purpose of this work was to develop a theoretical framework that quantitatively relates the quantification accuracy of parametric perfusion parameters with CTP acquisition and post-processing parameters. This goal was achieved with the help of a cascaded systems analysis for deconvolution-based CTP imaging systems. Based on the cascaded systems analysis, the quantitative relationship between regularization strength, source image noise, arterial input function, and the quantification accuracy of perfusion parameters was established. The theory could potentially be used to guide developments of CTP imaging technology for better quantification accuracy and lower radiation dose.
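As a sketch of the deconvolution step discussed above (not the cascaded-systems analysis itself), the toy example below recovers a flow-scaled residue function from a synthetic tissue curve and arterial input function using truncated singular value decomposition, a common regularized deconvolution for CTP; the truncation threshold plays the role of the regularization strength analysed in the paper. All curves, time steps, and threshold values are illustrative assumptions.

import numpy as np

def tsvd_deconvolve(aif, tissue, dt, sv_threshold=0.05):
    """Truncated-SVD deconvolution: solve A k = c for the flow-scaled residue
    function k(t), where A is the causal convolution matrix built from the AIF."""
    n = len(aif)
    A = dt * np.array([[aif[i - j] if i >= j else 0.0 for j in range(n)]
                       for i in range(n)])
    U, s, Vt = np.linalg.svd(A)
    s_inv = np.where(s > sv_threshold * s.max(), 1.0 / s, 0.0)   # regularization
    k = Vt.T @ np.diag(s_inv) @ U.T @ tissue
    return k, k.max()            # k.max() serves as the flow (perfusion) estimate

# Illustrative synthetic curves: gamma-variate AIF, exponential residue function.
dt = 1.0
t = np.arange(40) * dt
aif = (t ** 2) * np.exp(-t / 3.0)
true_flow, mtt = 0.02, 6.0
residue = true_flow * np.exp(-t / mtt)
tissue = dt * np.convolve(aif, residue)[:len(t)]

k_est, flow_est = tsvd_deconvolve(aif, tissue, dt)
print(f"recovered flow ~ {flow_est:.4f} (true {true_flow})")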
Grain growth prediction based on data assimilation by implementing 4DVar on multi-phase-field model
NASA Astrophysics Data System (ADS)
Ito, Shin-ichi; Nagao, Hiromichi; Kasuya, Tadashi; Inoue, Junya
2017-12-01
We propose a method to predict grain growth based on data assimilation by using a four-dimensional variational method (4DVar). When implemented on a multi-phase-field model, the proposed method allows us to calculate the predicted grain structures and uncertainties in them that depend on the quality and quantity of the observational data. We confirm through numerical tests involving synthetic data that the proposed method correctly reproduces the true phase-field assumed in advance. Furthermore, it successfully quantifies uncertainties in the predicted grain structures, where such uncertainty quantifications provide valuable information to optimize the experimental design.
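To make the 4DVar idea concrete on something far simpler than a multi-phase-field model, the sketch below assimilates noisy observations of a scalar exponential-decay model by minimizing the usual 4DVar cost function (background misfit plus observation misfit over the assimilation window) with respect to the initial state. The model, error variances, and observation times are invented for illustration and have nothing to do with grain growth.

import numpy as np
from scipy.optimize import minimize

# Toy forward model: x(t) = x0 * exp(-lam * t), observed directly (H = identity).
lam = 0.3
t_obs = np.array([1.0, 2.0, 4.0, 6.0])

def forward(x0, t):
    return x0 * np.exp(-lam * t)

# Synthetic truth, noisy observations, and a prior ("background") guess.
rng = np.random.default_rng(0)
x_true = 5.0
sigma_obs, sigma_bkg = 0.2, 1.0
y_obs = forward(x_true, t_obs) + rng.normal(0.0, sigma_obs, t_obs.size)
x_bkg = 4.0

def cost(x0):
    """4DVar cost: background term + observation misfit over the window."""
    jb = (x0 - x_bkg) ** 2 / sigma_bkg ** 2
    jo = np.sum((forward(x0, t_obs) - y_obs) ** 2) / sigma_obs ** 2
    return 0.5 * (jb + jo)

res = minimize(lambda v: cost(v[0]), x0=[x_bkg])
# Crude posterior uncertainty from the inverse Hessian returned by BFGS.
x_analysis = res.x[0]
x_std = float(np.sqrt(res.hess_inv[0, 0]))
print(f"analysis x0 = {x_analysis:.3f} ± {x_std:.3f} (truth {x_true})")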
Aravind, S G; Arimboor, Ranjith; Rangan, Meena; Madhavan, Soumya N; Arumughan, C
2008-11-04
Application of modern scientific knowledge coupled with sensitive analytical techniques is important for the quality evaluation and standardization of polyherbal formulations. Semecarpus anacardium, an important medicinal plant with a wide range of therapeutic properties, is frequently used in a large number of traditional herbal preparations. Tetrahydroamentoflavone (THA), a major bioactive biflavonoid, was selected as a chemical marker of S. anacardium, and RP-semi-preparative HPLC conditions were optimized for its isolation. An HPTLC analytical method was developed for the fingerprinting of S. anacardium flavonoids and quantification of tetrahydroamentoflavone. The method was validated in terms of linearity, LOD, LOQ, precision and accuracy, and compared with an RP-HPLC-DAD method. The methods were demonstrated for the chemical fingerprinting of S. anacardium plant parts and some commercial polyherbal formulations, and the amount of tetrahydroamentoflavone was quantified. HPTLC analysis showed that S. anacardium seed contained approximately 10 g kg(-1) of tetrahydroamentoflavone. The methods were able to identify and quantify tetrahydroamentoflavone from complex mixtures of phytochemicals and could be extended to the marker-based standardization of polyherbal formulations containing S. anacardium.
de la Calle, Maria B; Devesa, Vicenta; Fiamegos, Yiannis; Vélez, Dinoraz
2017-09-01
The European Food Safety Authority (EFSA) underlined in its Scientific Opinion on Arsenic in Food that, in order to support a sound assessment of dietary exposure to inorganic arsenic, information about the distribution of arsenic species in various food types must be generated. A method, previously validated in a collaborative trial, has been applied to determine inorganic arsenic in a wide variety of food matrices, covering grains, mushrooms and food of marine origin (31 samples in total). The method is based on detection by flow injection-hydride generation-atomic absorption spectrometry of the iAs selectively extracted into chloroform after digestion of the proteins with concentrated HCl. The method is characterized by a limit of quantification of 10 µg/kg dry weight, which allowed quantification of inorganic arsenic in a large number of food matrices. Information is provided about the performance scores given to results obtained with this method, as reported by different laboratories in several proficiency tests. The percentage of satisfactory results obtained with the discussed method is higher than that of the results obtained with other analytical approaches.
Ulu, Sevgi Tatar
2012-01-01
A sensitive spectrofluorimetric method was developed for the determination of tizanidine in human plasma, urine and pharmaceutical preparations. The method is based on the reaction of tizanidine with 1-dimethylaminonaphthalene-5-sulphonyl chloride (dansyl chloride) in an alkaline medium to form a highly fluorescent derivative that was measured at 511 nm after excitation at 383 nm. The different experimental parameters affecting the fluorescence intensity of tizanidine were carefully studied and optimized. The fluorescence-concentration plots were rectilinear over the ranges 50-500 and 20-300 ng/mL for plasma and urine, respectively, with detection limits of 1.81 and 0.54 ng/mL and quantification limits of 5.43 and 1.62 ng/mL for plasma and urine, respectively. The method presents good performance in terms of linearity, detection and quantification limits, precision, accuracy and specificity. The proposed method was successfully applied for the determination of tizanidine in pharmaceutical preparations. The results obtained were compared with a reference method using t- and F-tests. Copyright © 2011 John Wiley & Sons, Ltd.
Yang, Jian-Yi; Peng, Zhen-Ling; Yu, Zu-Guo; Zhang, Rui-Jie; Anh, Vo; Wang, Desheng
2009-04-21
In this paper, we intend to predict protein structural classes (alpha, beta, alpha+beta, or alpha/beta) for low-homology data sets. Two widely used data sets were employed: 1189 (containing 1092 proteins) and 25PDB (containing 1673 proteins), with sequence homologies of 40% and 25%, respectively. We propose to decompose the chaos game representation of proteins into two kinds of time series. Then, a novel and powerful nonlinear analysis technique, recurrence quantification analysis (RQA), is applied to analyze these time series. For a given protein sequence, a total of 16 characteristic parameters can be calculated with RQA, and these are treated as the feature representation of the protein sequence. Based on this feature representation, the structural class of each protein is predicted with Fisher's linear discriminant algorithm. The jackknife test is used to test and compare our method with other existing methods. The overall accuracies with the step-by-step procedure are 65.8% and 64.2% for the 1189 and 25PDB data sets, respectively. With the widely used one-against-others procedure, we compare our method with five other existing methods. Notably, the overall accuracies of our method are 6.3% and 4.1% higher for the two data sets, respectively. Furthermore, only 16 parameters are used in our method, fewer than in other methods. This suggests that the current method may play a complementary role to the existing methods and is promising for the prediction of protein structural classes.
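A minimal sketch of recurrence quantification analysis as referred to above, applied to an invented one-dimensional time series rather than an actual chaos-game-representation decomposition of a protein sequence: it builds a thresholded recurrence matrix and computes two of the classic RQA parameters, recurrence rate and determinism (no embedding, and the main diagonal is included for brevity).

import numpy as np

def rqa(series, radius=0.1, min_line=2):
    """Recurrence rate and determinism of a 1-D series (simplified RQA)."""
    x = np.asarray(series, dtype=float)
    dist = np.abs(x[:, None] - x[None, :])
    R = (dist <= radius * x.std()).astype(int)     # recurrence matrix
    n = len(x)
    recurrence_rate = R.sum() / (n * n)

    # Count recurrent points lying on diagonal lines of length >= min_line.
    diag_points = 0
    for k in range(-(n - 1), n):                   # main diagonal included, for simplicity
        run = 0
        for v in np.append(np.diagonal(R, offset=k), 0):   # trailing 0 flushes the last run
            if v:
                run += 1
            else:
                if run >= min_line:
                    diag_points += run
                run = 0
    determinism = diag_points / max(R.sum(), 1)
    return recurrence_rate, determinism

# Illustrative use on a noisy sine wave.
t = np.linspace(0, 8 * np.pi, 300)
rr, det = rqa(np.sin(t) + 0.05 * np.random.default_rng(1).normal(size=t.size))
print(f"recurrence rate = {rr:.3f}, determinism = {det:.3f}")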
NASA Astrophysics Data System (ADS)
Jones, C. E.; Kato, S.; Nakashima, Y.; Kajii, Y.
2014-05-01
Biogenic emissions supply the largest fraction of non-methane volatile organic compounds (VOC) from the biosphere to the atmospheric boundary layer, and typically comprise a complex mixture of reactive terpenes. Due to this chemical complexity, achieving comprehensive measurements of biogenic VOC (BVOC) in air within a satisfactory time resolution is analytically challenging. To address this, we have developed a novel, fully automated Fast Gas Chromatography (Fast-GC) based technique to provide higher time resolution monitoring of monoterpenes (and selected other C9-C15 terpenes) during plant emission studies and in ambient air. To our knowledge, this is the first study to apply a Fast-GC based separation technique to achieve quantification of terpenes in ambient air. Three chromatography methods have been developed for atmospheric terpene analysis under different sampling scenarios. Each method facilitates chromatographic separation of selected BVOC within a significantly reduced analysis time compared to conventional GC methods, whilst maintaining the ability to quantify individual monoterpene structural isomers. Using this approach, the C9-C15 BVOC composition of single plant emissions may be characterised within a 14.5 min analysis time. Moreover, in-situ quantification of 12 monoterpenes in unpolluted ambient air may be achieved within an 11.7 min chromatographic separation time (increasing to 19.7 min when simultaneous quantification of multiple oxygenated C9-C10 terpenoids is required, and/or when concentrations of anthropogenic VOC are significant). These analysis times potentially allow for a twofold to fivefold increase in measurement frequency compared to conventional GC methods. Here we outline the technical details and analytical capability of this chromatographic approach, and present the first in-situ Fast-GC observations of 6 monoterpenes and the oxygenated BVOC (OBVOC) linalool in ambient air. During this field deployment within a suburban forest ~30 km west of central Tokyo, Japan, the Fast-GC limit of detection with respect to monoterpenes was 4-5 ppt, and the agreement between Fast-GC and PTR-MS derived total monoterpene mixing ratios was consistent with previous GC/PTR-MS comparisons. The measurement uncertainties associated with the Fast-GC quantification of monoterpenes are ≤ 12%, while larger uncertainties (up to ~25%) are associated with the OBVOC and sesquiterpene measurements.
Xu, Feng; Beyazoglu, Turker; Hefner, Evan; Gurkan, Umut Atakan
2011-01-01
Cellular alignment plays a critical role in functional, physical, and biological characteristics of many tissue types, such as muscle, tendon, nerve, and cornea. Current efforts toward regeneration of these tissues include replicating the cellular microenvironment by developing biomaterials that facilitate cellular alignment. To assess the functional effectiveness of the engineered microenvironments, one essential criterion is quantification of cellular alignment. Therefore, there is a need for rapid, accurate, and adaptable methodologies to quantify cellular alignment for tissue engineering applications. To address this need, we developed an automated method, binarization-based extraction of alignment score (BEAS), to determine cell orientation distribution in a wide variety of microscopic images. This method combines a sequenced application of median and band-pass filters, locally adaptive thresholding approaches and image processing techniques. Cellular alignment score is obtained by applying a robust scoring algorithm to the orientation distribution. We validated the BEAS method by comparing the results with the existing approaches reported in literature (i.e., manual, radial fast Fourier transform-radial sum, and gradient based approaches). Validation results indicated that the BEAS method resulted in statistically comparable alignment scores with the manual method (coefficient of determination R2=0.92). Therefore, the BEAS method introduced in this study could enable accurate, convenient, and adaptable evaluation of engineered tissue constructs and biomaterials in terms of cellular alignment and organization. PMID:21370940
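For orientation, a stripped-down sketch of how an orientation distribution and a single alignment score can be derived from an image; this is not the BEAS pipeline itself (the median/band-pass filtering and locally adaptive thresholding steps are omitted), and the synthetic test image, gradient-based orientation estimate, and resultant-vector score are illustrative choices only.

import numpy as np
from scipy import ndimage

def orientation_distribution(img, n_bins=36):
    """Histogram of local structure orientations estimated from image gradients."""
    gy = ndimage.sobel(img, axis=0)
    gx = ndimage.sobel(img, axis=1)
    theta = np.arctan2(gy, gx) % np.pi            # orientations folded to [0, pi)
    weight = np.hypot(gx, gy)                     # weight by gradient magnitude
    hist, edges = np.histogram(theta, bins=n_bins, range=(0, np.pi), weights=weight)
    return hist / hist.sum(), edges

def alignment_score(hist, edges):
    """Circular resultant length on doubled angles: 1 = perfectly aligned, 0 = isotropic."""
    centers = 0.5 * (edges[:-1] + edges[1:])
    return float(np.abs(np.sum(hist * np.exp(2j * centers))))

# Illustrative use: a horizontally striped image should score close to 1.
yy, xx = np.mgrid[0:128, 0:128]
stripes = np.sin(yy / 4.0)
hist, edges = orientation_distribution(stripes)
print(f"alignment score = {alignment_score(hist, edges):.2f}")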
Petrova, Darinka Todorova; Cocisiu, Gabriela Ariadna; Eberle, Christoph; Rhode, Karl-Heinz; Brandhorst, Gunnar; Walson, Philip D; Oellerich, Michael
2013-09-01
The aim of this study was to develop a novel method for automated quantification of cell-free hemoglobin (fHb) based on the hemolysis index (HI; Roche Diagnostics). The novel fHb method based on the HI was correlated with fHb measured using the triple wavelength methods of both Harboe [fHb, g/L = (0.915 * HI + 2.634)/100] and Fairbanks et al. [fHb, g/L = (0.917 * HI + 2.131)/100]. fHb concentrations were estimated from the HI using the Roche Modular automated platform in self-made and commercially available quality controls, as well as samples from a proficiency testing scheme (INSTAND). The fHb results obtained from the Roche automated HI were then compared to results obtained using the traditional spectrophotometric assays for one hundred plasma samples with varying degrees of hemolysis, lipemia and/or bilirubinemia. The novel method using automated HI quantification on the Roche Modular clinical chemistry platform correlated well with results using the classical methods in the 100 patient samples (Harboe: r = 0.9284; Fairbanks et al.: r = 0.9689), and recovery was good for self-made controls. However, commercially available quality controls showed poor recovery due to an unidentified matrix problem. The novel method produced reliable determination of fHb in samples without interferences. However, poor recovery using commercially available fHb quality control samples currently greatly limits its usefulness. © 2013.
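The two HI-to-fHb conversion formulas quoted above can be written directly as code; this tiny sketch only re-expresses the equations given in the abstract and assumes the HI value is already available from the analyser (the example HI of 50 is an arbitrary illustrative input).

def fhb_harboe(hi):
    """Cell-free hemoglobin (g/L) from the hemolysis index, Harboe-based formula."""
    return (0.915 * hi + 2.634) / 100.0

def fhb_fairbanks(hi):
    """Cell-free hemoglobin (g/L) from the hemolysis index, Fairbanks-based formula."""
    return (0.917 * hi + 2.131) / 100.0

print(fhb_harboe(50), fhb_fairbanks(50))   # both ~0.48 g/L for an HI of 50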
A conceptually and computationally simple method for the definition, display, quantification, and comparison of the shapes of three-dimensional mathematical molecular models is presented. Molecular or solvent-accessible volume and surface area can also be calculated. Algorithms, ...
Otani, Kyoko; Nakazono, Akemi; Salgo, Ivan S; Lang, Roberto M; Takeuchi, Masaaki
2016-10-01
Echocardiographic determination of left heart chamber volumetric parameters by using manual tracings during multiple beats is tedious in atrial fibrillation (AF). The aim of this study was to determine the usefulness of fully automated left chamber quantification software with single-beat three-dimensional transthoracic echocardiographic data sets in patients with AF. Single-beat full-volume three-dimensional transthoracic echocardiographic data sets were prospectively acquired during consecutive multiple cardiac beats (≥10 beats) in 88 patients with AF. In protocol 1, left ventricular volumes, left ventricular ejection fraction, and maximal left atrial volume were validated using automated quantification against the manual tracing method in identical beats in 10 patients. In protocol 2, automated quantification-derived averaged values from multiple beats were compared with the corresponding values obtained from the indexed beat in all patients. Excellent correlations of left chamber parameters between automated quantification and the manual method were observed (r = 0.88-0.98) in protocol 1. The time required for the analysis with the automated quantification method (5 min) was significantly less compared with the manual method (27 min) (P < .0001). In protocol 2, there were excellent linear correlations between the averaged left chamber parameters and the corresponding values obtained from the indexed beat (r = 0.94-0.99), and test-retest variability of left chamber parameters was low (3.5%-4.8%). Three-dimensional transthoracic echocardiography with fully automated quantification software is a rapid and reliable way to measure averaged values of left heart chamber parameters during multiple consecutive beats. Thus, it is a potential new approach for left chamber quantification in patients with AF in daily routine practice. Copyright © 2016 American Society of Echocardiography. Published by Elsevier Inc. All rights reserved.
Normalized Quantitative Western Blotting Based on Standardized Fluorescent Labeling.
Faden, Frederik; Eschen-Lippold, Lennart; Dissmeyer, Nico
2016-01-01
Western blot (WB) analysis is the most widely used method to monitor expression of proteins of interest in protein extracts of high complexity derived from diverse experimental setups. WB allows the rapid and specific detection of a target protein, such as non-tagged endogenous proteins as well as protein-epitope tag fusions, depending on the availability of specific antibodies. To generate quantitative data from independent samples within one experiment and to allow accurate inter-experimental quantification, a reliable and reproducible method to standardize and normalize WB data is indispensable. To date, it is a standard procedure to normalize individual bands of immunodetected proteins of interest from a WB lane to other individual bands of so-called housekeeping proteins of the same sample lane. These are usually detected by an independent antibody or colorimetric detection and do not reflect the real total protein of a sample. Housekeeping proteins, assumed to be constitutively expressed largely independently of developmental and environmental states, can greatly differ in their expression under these various conditions. Therefore, they actually do not represent a reliable reference to normalize the target protein's abundance to the total amount of protein contained in each lane of a blot. Here, we demonstrate the Smart Protein Layers (SPL) technology, a combination of fluorescent standards and a stain-free fluorescence-based visualization of total protein in gels and after transfer via WB. SPL allows a rapid and highly sensitive protein visualization and quantification with a sensitivity comparable to conventional silver staining and a 1000-fold higher dynamic range. For normalization, standardization and quantification of protein gels and WBs, a sample-dependent bi-fluorescent standard reagent is applied and, for accurate quantification of data derived from different experiments, a second calibration standard is used. Together, the precise quantification of protein expression by lane-to-lane, gel-to-gel, and blot-to-blot comparisons is facilitated, especially with respect to experiments in the area of proteostasis dealing with highly variable protein levels and involving protein degradation mutants and treatments modulating protein abundance.
Aerosol-type retrieval and uncertainty quantification from OMI data
NASA Astrophysics Data System (ADS)
Kauppi, Anu; Kolmonen, Pekka; Laine, Marko; Tamminen, Johanna
2017-11-01
We discuss uncertainty quantification for aerosol-type selection in satellite-based atmospheric aerosol retrieval. The retrieval procedure uses precalculated aerosol microphysical models stored in look-up tables (LUTs) and top-of-atmosphere (TOA) spectral reflectance measurements to solve the aerosol characteristics. The forward model approximations cause systematic differences between the modelled and observed reflectance. Acknowledging this model discrepancy as a source of uncertainty allows us to produce more realistic uncertainty estimates and assists the selection of the most appropriate LUTs for each individual retrieval. This paper focuses on the aerosol microphysical model selection and characterisation of uncertainty in the retrieved aerosol type and aerosol optical depth (AOD). The concept of model evidence is used as a tool for model comparison. The method is based on a Bayesian inference approach, in which all uncertainties are described as a posterior probability distribution. When there is no single best-matching aerosol microphysical model, we use a statistical technique based on Bayesian model averaging to combine AOD posterior probability densities of the best-fitting models to obtain an averaged AOD estimate. We also determine the shared evidence of the best-matching models of a certain main aerosol type in order to quantify how plausible it is that it represents the underlying atmospheric aerosol conditions. The developed method is applied to Ozone Monitoring Instrument (OMI) measurements using a multiwavelength approach for retrieving the aerosol type and AOD estimate with uncertainty quantification for cloud-free over-land pixels. Several larger pixel set areas were studied in order to investigate the robustness of the developed method. We evaluated the retrieved AOD by comparison with ground-based measurements at example sites. We found that the uncertainty of AOD expressed by the posterior probability distribution reflects the difficulty in model selection. The posterior probability distribution can provide a comprehensive characterisation of the uncertainty in this kind of problem for aerosol-type selection. As a result, the proposed method can account for the model error and also include the model selection uncertainty in the total uncertainty budget.
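The Bayesian model averaging step described above can be sketched as follows: given, for each candidate aerosol microphysical model, an evidence value and a Gaussian approximation of its AOD posterior, the averaged estimate is the evidence-weighted mixture, and the shared evidence of a main aerosol type is the summed weight of its members. All numbers, model names, and the Gaussian form below are invented assumptions, not OMI retrieval output.

import numpy as np

# Hypothetical candidates: (name, main aerosol type, evidence, AOD mean, AOD std).
models = [
    ("WA1", "weakly absorbing", 0.50, 0.42, 0.05),
    ("WA2", "weakly absorbing", 0.30, 0.47, 0.06),
    ("BB1", "biomass burning",  0.15, 0.60, 0.08),
    ("DU1", "dust",             0.05, 0.55, 0.10),
]

evidences = np.array([m[2] for m in models])
weights = evidences / evidences.sum()            # posterior model probabilities

# Model-averaged AOD (mixture mean) and its spread (law of total variance).
means = np.array([m[3] for m in models])
stds = np.array([m[4] for m in models])
aod_bma = np.sum(weights * means)
var_bma = np.sum(weights * (stds ** 2 + (means - aod_bma) ** 2))
print(f"BMA AOD = {aod_bma:.3f} ± {np.sqrt(var_bma):.3f}")

# "Shared evidence" of each main aerosol type = summed weight of its models.
for atype in {m[1] for m in models}:
    w = sum(w_i for m, w_i in zip(models, weights) if m[1] == atype)
    print(f"{atype}: shared evidence {w:.2f}")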
Rapid quantification of plant-powdery mildew interactions by qPCR and conidiospore counts.
Weßling, Ralf; Panstruga, Ralph
2012-08-31
The powdery mildew disease represents a valuable patho-system to study the interaction between plant hosts and obligate biotrophic fungal pathogens. Numerous discoveries have been made on the basis of the quantitative evaluation of plant-powdery mildew interactions, especially in the context of hyper-susceptible and/or resistant plant mutants. However, the presently available methods to score the pathogenic success of powdery mildew fungi are laborious and thus not well suited for medium- to high-throughput analysis. Here we present two new protocols that allow the rapid quantitative assessment of powdery mildew disease development. One procedure depends on quantitative polymerase chain reaction (qPCR)-based evaluation of fungal biomass, while the other relies on the quantification of fungal conidiospores. We validated both techniques using the powdery mildew pathogen Golovinomyces orontii on a set of hyper-susceptible and resistant Arabidopsis thaliana mutants and found that both cover a wide dynamic range of one to two (qPCR) and four to five (quantification of conidia) orders of magnitude, respectively. The two approaches yield reproducible results and are easy to perform without specialized equipment. The qPCR and spore count assays rapidly and reproducibly quantify powdery mildew pathogenesis. Our methods are performed at later stages of infection and discern mutant phenotypes accurately. The assays therefore complement currently used procedures of powdery mildew quantification and can overcome some of their limitations. In addition, they can easily be adapted to other plant-powdery mildew patho-systems.
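As a generic illustration of how qPCR-based fungal biomass scoring of this kind is commonly evaluated (the exact calculation used in the protocol above may differ), the sketch below converts Cq values into relative fungal biomass by the standard 2^-ΔΔCq approach, normalizing each sample to a plant reference gene and expressing the result relative to a wild-type control; the Cq values are hypothetical.

def relative_fungal_biomass(cq_fungus, cq_plant, cq_fungus_ctrl, cq_plant_ctrl):
    """Relative fungal biomass by the 2^-ddCq method (assumes ~100% PCR efficiency)."""
    d_sample = cq_fungus - cq_plant              # fungal target normalized to plant reference
    d_control = cq_fungus_ctrl - cq_plant_ctrl
    return 2.0 ** -(d_sample - d_control)

# Hypothetical Cq values: a hyper-susceptible mutant vs the wild-type control.
print(relative_fungal_biomass(cq_fungus=22.1, cq_plant=18.0,
                              cq_fungus_ctrl=25.3, cq_plant_ctrl=18.2))   # ~8-fold more fungus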
Semi-automated quantification and neuroanatomical mapping of heterogeneous cell populations.
Mendez, Oscar A; Potter, Colin J; Valdez, Michael; Bello, Thomas; Trouard, Theodore P; Koshy, Anita A
2018-07-15
Our group studies the interactions between cells of the brain and the neurotropic parasite Toxoplasma gondii. Using an in vivo system that allows us to permanently mark and identify brain cells injected with Toxoplasma protein, we have identified that Toxoplasma-injected neurons (TINs) are heterogeneously distributed throughout the brain. Unfortunately, standard methods to quantify and map heterogeneous cell populations onto a reference brain atlas are time consuming and prone to user bias. We developed a novel MATLAB-based semi-automated quantification and mapping program to allow the rapid and consistent mapping of heterogeneously distributed cells on to the Allen Institute Mouse Brain Atlas. The system uses two-threshold background subtraction to identify and quantify cells of interest. We demonstrate that we reliably quantify and neuroanatomically localize TINs with low intra- or inter-observer variability. In a follow up experiment, we show that specific regions of the mouse brain are enriched with TINs. The procedure we use takes advantage of simple immunohistochemistry labeling techniques, use of a standard microscope with a motorized stage, and low cost computing that can be readily obtained at a research institute. To our knowledge there is no other program that uses such readily available techniques and equipment for mapping heterogeneous populations of cells across the whole mouse brain. The quantification method described here allows reliable visualization, quantification, and mapping of heterogeneous cell populations in immunolabeled sections across whole mouse brains. Copyright © 2018 Elsevier B.V. All rights reserved.
Two-dimensional grid-free compressive beamforming.
Yang, Yang; Chu, Zhigang; Xu, Zhongming; Ping, Guoli
2017-08-01
Compressive beamforming realizes the direction-of-arrival (DOA) estimation and strength quantification of acoustic sources by solving an underdetermined system of equations relating microphone pressures to a source distribution via compressive sensing. The conventional method assumes DOAs of sources to lie on a grid. Its performance degrades due to basis mismatch when the assumption is not satisfied. To overcome this limitation for measurements with plane microphone arrays, a two-dimensional grid-free compressive beamforming is developed. First, a continuum-based atomic norm minimization is defined to denoise the measured pressure and thus obtain the pressure from sources. Next, a positive semidefinite program is formulated to approximate the atomic norm minimization. Subsequently, a reasonably fast algorithm based on the alternating direction method of multipliers is presented to solve the positive semidefinite program. Finally, the matrix enhancement and matrix pencil method is introduced to process the obtained pressure and reconstruct the source distribution. Both simulations and experiments demonstrate that under certain conditions, the grid-free compressive beamforming can provide high-resolution and low-contamination imaging, allowing accurate and fast estimation of two-dimensional DOAs and quantification of source strengths, even with non-uniform arrays and noisy measurements.
Entropy based quantification of Ki-67 positive cell images and its evaluation by a reader study
NASA Astrophysics Data System (ADS)
Niazi, M. Khalid Khan; Pennell, Michael; Elkins, Camille; Hemminger, Jessica; Jin, Ming; Kirby, Sean; Kurt, Habibe; Miller, Barrie; Plocharczyk, Elizabeth; Roth, Rachel; Ziegler, Rebecca; Shana'ah, Arwa; Racke, Fred; Lozanski, Gerard; Gurcan, Metin N.
2013-03-01
Presence of Ki-67, a nuclear protein, is typically used to measure cell proliferation. The quantification of the Ki-67 proliferation index is performed visually by the pathologist; however, this is subject to inter- and intra-reader variability. Automated techniques utilizing digital image analysis by computers have emerged. The large variations in specimen preparation, staining, and imaging, as well as true biological heterogeneity of tumor tissue, often result in variable intensities in Ki-67 stained images. These variations affect the performance of currently developed methods. To optimize the segmentation of Ki-67 stained cells, one should define a data-dependent transformation that will account for these color variations instead of defining a fixed linear transformation to separate different hues. To address these issues in images of tissue stained with Ki-67, we propose a methodology that exploits the intrinsic properties of the CIE L*a*b* color space to translate this complex problem into an automatic entropy-based thresholding problem. The developed method was evaluated through two reader studies with pathology residents and expert hematopathologists. Agreement between the proposed method and the expert pathologists was good (CCC = 0.80).
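A minimal sketch of entropy-based (Kapur-style maximum-entropy) thresholding on a single channel, as a stand-in for the CIE L*a*b*-based pipeline described above; the channel choice, histogram binning, and synthetic "stained" test image are illustrative assumptions rather than the published method.

import numpy as np

def max_entropy_threshold(channel, n_bins=256):
    """Kapur-style maximum-entropy threshold on a single image channel in [0, 1]."""
    hist, edges = np.histogram(channel, bins=n_bins, range=(0.0, 1.0))
    p = hist / hist.sum()
    best_t, best_h = 0, -np.inf
    for t in range(1, n_bins - 1):
        p0, p1 = p[:t].sum(), p[t:].sum()
        if p0 == 0 or p1 == 0:
            continue
        q0, q1 = p[:t] / p0, p[t:] / p1
        h0 = -np.sum(q0[q0 > 0] * np.log(q0[q0 > 0]))   # entropy below the threshold
        h1 = -np.sum(q1[q1 > 0] * np.log(q1[q1 > 0]))   # entropy above the threshold
        if h0 + h1 > best_h:
            best_h, best_t = h0 + h1, t
    return edges[best_t]

# Illustrative bimodal image: dim background with a brighter "stained" region.
rng = np.random.default_rng(2)
img = np.clip(0.2 + 0.05 * rng.normal(size=(64, 64)), 0, 1)
img[20:30, 20:30] = np.clip(0.8 + 0.05 * rng.normal(size=(10, 10)), 0, 1)
thr = max_entropy_threshold(img)
print(f"threshold = {thr:.2f}; fraction above threshold = {(img > thr).mean():.3f}")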
Kangas, Michael J; Burks, Raychelle M; Atwater, Jordyn; Lukowicz, Rachel M; Garver, Billy; Holmes, Andrea E
2018-02-01
With the increasing availability of digital imaging devices, colorimetric sensor arrays are rapidly becoming a simple, yet effective tool for the identification and quantification of various analytes. Colorimetric arrays utilize colorimetric data from many colorimetric sensors, with the multidimensional nature of the resulting data necessitating the use of chemometric analysis. Herein, an 8-sensor colorimetric array was used to analyze selected acidic and basic samples (0.5-10 M) to determine which chemometric methods are best suited for classification and quantification of analytes within clusters. PCA, HCA, and LDA were used to visualize the data set. All three methods showed well-separated clusters for each of the acid or base analytes and moderate separation between analyte concentrations, indicating that the sensor array can be used to identify and quantify samples. Furthermore, PCA could be used to determine which sensors showed the most effective analyte identification. LDA, KNN, and HQI were used for identification of analyte and concentration. HQI and KNN correctly identified the analytes in all cases, while LDA correctly identified 95 of 96 analytes. Additional studies demonstrated that controlling for solvent and image effects was unnecessary for all chemometric methods utilized in this study.
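The chemometric workflow described above can be sketched with scikit-learn on a made-up data matrix (rows = samples, columns = the 8 sensor responses); the class structure, noise level, and train/test split are invented stand-ins for the measured colorimetric data, and HQI is omitted here.

import numpy as np
from sklearn.decomposition import PCA
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.neighbors import KNeighborsClassifier
from sklearn.model_selection import train_test_split

# Synthetic "sensor array" data: 3 analyte classes x 40 replicates x 8 sensors.
rng = np.random.default_rng(0)
centers = rng.normal(0, 3, size=(3, 8))
X = np.vstack([c + rng.normal(0, 0.5, size=(40, 8)) for c in centers])
y = np.repeat([0, 1, 2], 40)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.25, random_state=0, stratify=y)

# Unsupervised visualization / sensor importance: PCA scores and explained variance.
pca = PCA(n_components=2).fit(X_tr)
print("explained variance:", pca.explained_variance_ratio_.round(2))

# Supervised identification: LDA and KNN classification accuracy on held-out samples.
lda = LinearDiscriminantAnalysis().fit(X_tr, y_tr)
knn = KNeighborsClassifier(n_neighbors=3).fit(X_tr, y_tr)
print("LDA accuracy:", lda.score(X_te, y_te))
print("KNN accuracy:", knn.score(X_te, y_te))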
Lee, Bai Qin; Wan Mohamed Radzi, Che Wan Jasimah Bt; Khor, Sook Mei
2016-02-05
This paper reports the application of hexamethyldisilazane-trimethylsilyl trifluoromethanesulfonate (HMDS-TMSOTf) for the simultaneous silylation of 3-monochloro-1,2-propanediol (3-MCPD) and 1,3-dichloropropanol (1,3-DCP) in solid and liquid food samples. 3-MCPD and 1,3-DCP are chloropropanols that have been established as Group 2B carcinogens in clinical testing. They can be found in heat-processed food, especially when an extended high-temperature treatment is required. However, the current AOAC detection method is time-consuming and expensive. Thus, HMDS-TMSOTf was used in this study to provide a safer and more cost-effective alternative to the HFBI method. Three important steps are involved in the quantification of 3-MCPD and 1,3-DCP: extraction, derivatization and quantification. The optimization of the derivatization process, which focused on the catalyst volume, derivatization temperature, and derivatization time, was performed based on findings from both Box-Behnken modeling and a real experimental setup. With the optimized conditions, the newly developed method was used for actual food sample quantification and the results were compared with those obtained via the standard AOAC method. The developed method required fewer samples and reagents but achieved lower limits of quantification (0.0043 mg L(-1) for 1,3-DCP and 0.0011 mg L(-1) for 3-MCPD) and detection (0.0028 mg L(-1) for 1,3-DCP and 0.0008 mg L(-1) for 3-MCPD). All the detected concentrations are below the maximum tolerable limit of 0.02 mg L(-1). The percentage of recovery obtained from food sample analysis was between 83% and 96%. The new procedure was validated against the AOAC method and showed comparable performance. The HMDS-TMSOTf derivatization strategy is capable of simultaneously derivatizing 1,3-DCP and 3-MCPD at room temperature, and it also serves as a rapid, sensitive, and accurate analytical method for food sample analysis. Copyright © 2015 Elsevier B.V. All rights reserved.
UFLC-ESI-MS/MS analysis of multiple mycotoxins in medicinal and edible Areca catechu.
Liu, Hongmei; Luo, Jiaoyang; Kong, Weijun; Liu, Qiutao; Hu, Yichen; Yang, Meihua
2016-05-01
A robust, sensitive and reliable ultra-fast liquid chromatography-electrospray ionization tandem mass spectrometry (UFLC-ESI-MS/MS) method was optimized and validated for simultaneous identification and quantification of eleven mycotoxins in medicinal and edible Areca catechu, based on one-step extraction without any further clean-up. Separation and quantification were performed in both positive and negative modes under multiple reaction monitoring (MRM) in a single run, with zearalanone (ZAN) as internal standard. The chromatographic conditions and MS/MS parameters were carefully optimized. Matrix-matched calibration was recommended to reduce matrix effects and improve accuracy, showing good linearity within wide concentration ranges. Limits of quantification (LOQ) were lower than 50 μg kg(-1), while limits of detection (LOD) were in the range of 0.1-20 μg kg(-1). The accuracy of the developed method was validated through recovery experiments, with recoveries ranging from 85% to 115% with relative standard deviation (RSD) ≤14.87% at low level, from 75% to 119% with RSD ≤14.43% at medium level and from 61% to 120% with RSD ≤13.18% at high level, respectively. Finally, the developed multi-mycotoxin method was applied for screening of these mycotoxins in 24 commercial samples. Only aflatoxin B2 and zearalenone were found in 2 samples. This is the first report on the application of UFLC-ESI(+/-)-MS/MS for multi-class mycotoxins in A. catechu. The developed method, with many advantages including simple pretreatment, rapid determination and high sensitivity, is a proposed candidate for large-scale detection and quantification of multiple mycotoxins in other complex matrixes. Copyright © 2016 Elsevier Ltd. All rights reserved.
Rieger, Benedikt; Zimmer, Fabian; Zapp, Jascha; Weingärtner, Sebastian; Schad, Lothar R
2017-11-01
To develop an implementation of the magnetic resonance fingerprinting (MRF) paradigm for quantitative imaging using echo-planar imaging (EPI) for simultaneous assessment of T1 and T2*. The proposed MRF method (MRF-EPI) is based on the acquisition of 160 gradient-spoiled EPI images with rapid, parallel-imaging accelerated, Cartesian readout and a measurement time of 10 s per slice. Contrast variation is induced using an initial inversion pulse, and varying the flip angles, echo times, and repetition times throughout the sequence. Joint quantification of T1 and T2* is performed using dictionary matching with integrated B1+ correction. The quantification accuracy of the method was validated in phantom scans and in vivo in 6 healthy subjects. Joint T1 and T2* parameter maps acquired with MRF-EPI in phantoms are in good agreement with reference measurements, showing deviations under 5% and 4% for T1 and T2*, respectively. In vivo baseline images were visually free of artifacts. In vivo relaxation times are in good agreement with gold-standard techniques (deviation T1: 4 ± 2%, T2*: 4 ± 5%). The visual quality was comparable to the in vivo gold standard, despite substantially shortened scan times. The proposed MRF-EPI method provides fast and accurate T1 and T2* quantification. This approach offers a rapid supplement to the non-Cartesian MRF portfolio, with potentially increased usability and robustness. Magn Reson Med 78:1724-1733, 2017. © 2016 International Society for Magnetic Resonance in Medicine.
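The dictionary-matching step of MRF can be illustrated with a deliberately simplified sketch: signal evolutions are simulated here with a toy inversion-recovery-times-T2*-decay model over an invented acquisition schedule (not the actual gradient-spoiled EPI signal equations or B1+ correction of the paper), a dictionary is built on a (T1, T2*) grid, and the best match is found by maximum normalized inner product.

import numpy as np

# Toy acquisition schedule: varying "recovery" and echo times across 160 frames.
n_frames = 160
rec_times = np.linspace(0.1, 4.0, n_frames)            # seconds (illustrative)
echo_times = 0.01 + 0.02 * (np.arange(n_frames) % 5)   # seconds (illustrative)

def simulate(t1, t2s):
    """Toy signal model: inversion-recovery longitudinal term times T2* decay."""
    return (1.0 - 2.0 * np.exp(-rec_times / t1)) * np.exp(-echo_times / t2s)

# Dictionary over a (T1, T2*) grid, with unit-norm entries for matching.
t1_grid = np.linspace(0.3, 2.5, 45)
t2s_grid = np.linspace(0.01, 0.12, 40)
entries = [(t1, t2s) for t1 in t1_grid for t2s in t2s_grid]
D = np.array([simulate(t1, t2s) for t1, t2s in entries])
D_norm = D / np.linalg.norm(D, axis=1, keepdims=True)

# "Measured" fingerprint: a ground-truth signal plus noise, matched by inner product.
rng = np.random.default_rng(3)
truth = (1.1, 0.045)
meas = simulate(*truth) + 0.02 * rng.normal(size=n_frames)
meas_norm = meas / np.linalg.norm(meas)
best = entries[int(np.argmax(D_norm @ meas_norm))]
print(f"matched T1 = {best[0]:.2f} s, T2* = {best[1] * 1000:.0f} ms (truth {truth})")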
Standardless quantification by parameter optimization in electron probe microanalysis
NASA Astrophysics Data System (ADS)
Limandri, Silvina P.; Bonetto, Rita D.; Josa, Víctor Galván; Carreras, Alejo C.; Trincavelli, Jorge C.
2012-11-01
A method for standardless quantification by parameter optimization in electron probe microanalysis is presented. The method consists of minimizing the quadratic differences between an experimental spectrum and an analytical function proposed to describe it, by optimizing the parameters involved in the analytical prediction. This algorithm, implemented in the software POEMA (Parameter Optimization in Electron Probe Microanalysis), allows the determination of the elemental concentrations, along with their uncertainties. The method was tested on a set of 159 elemental constituents corresponding to 36 spectra of standards (mostly minerals) that include trace elements. The results were compared with those obtained with the commercial software GENESIS Spectrum® for standardless quantification. The quantifications performed with the method proposed here are better in 74% of the cases studied. In addition, the performance of the proposed method is compared with the first-principles standardless analysis procedure DTSA for a different data set, which excludes trace elements. The relative deviations with respect to the nominal concentrations are lower than 0.04, 0.08 and 0.35 in 66% of the cases for POEMA, GENESIS and DTSA, respectively.
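The core idea, minimizing the quadratic difference between an experimental spectrum and a parametric analytical description, can be illustrated with a generic least-squares sketch (this is not the POEMA spectrum model): a toy spectrum of two Gaussian peaks on a linear background is fitted with scipy.optimize.least_squares, and rough parameter uncertainties are derived from the Jacobian at the solution. All peak positions, amplitudes, and noise levels are invented.

import numpy as np
from scipy.optimize import least_squares

def model(params, e):
    """Toy analytical spectrum: linear background plus two Gaussian peaks."""
    b0, b1, a1, mu1, s1, a2, mu2, s2 = params
    return (b0 + b1 * e
            + a1 * np.exp(-0.5 * ((e - mu1) / s1) ** 2)
            + a2 * np.exp(-0.5 * ((e - mu2) / s2) ** 2))

# Synthetic "experimental" spectrum with Poisson-like counting noise.
e = np.linspace(0.5, 10.0, 500)
true = [50, -3, 400, 1.74, 0.06, 250, 6.40, 0.09]
rng = np.random.default_rng(4)
spectrum = rng.poisson(np.clip(model(true, e), 1, None)).astype(float)

# Fit all parameters by minimizing the quadratic difference to the spectrum.
p0 = [40, -2, 300, 1.7, 0.08, 200, 6.5, 0.08]
fit = least_squares(lambda p: model(p, e) - spectrum, p0)

# Rough 1-sigma uncertainties from the Jacobian and residual variance.
dof = len(e) - len(p0)
cov = np.linalg.inv(fit.jac.T @ fit.jac) * (fit.fun @ fit.fun) / dof
print("fitted peak amplitudes ~", fit.x[2], fit.x[5])
print("uncertainties ~", np.sqrt(np.diag(cov))[[2, 5]])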
Powder X-ray diffraction method for the quantification of cocrystals in the crystallization mixture.
Padrela, Luis; de Azevedo, Edmundo Gomes; Velaga, Sitaram P
2012-08-01
The solid state purity of cocrystals critically affects their performance. Thus, it is important to accurately quantify the purity of cocrystals in the final crystallization product. The aim of this study was to develop a powder X-ray diffraction (PXRD) quantification method for investigating the purity of cocrystals. The method developed was employed to study the formation of indomethacin-saccharin (IND-SAC) cocrystals by mechanochemical methods. Pure IND-SAC cocrystals were geometrically mixed with 1:1 w/w mixture of indomethacin/saccharin in various proportions. An accurately measured amount (550 mg) of the mixture was used for the PXRD measurements. The most intense, non-overlapping, characteristic diffraction peak of IND-SAC was used to construct the calibration curve in the range 0-100% (w/w). This calibration model was validated and used to monitor the formation of IND-SAC cocrystals by liquid-assisted grinding (LAG). The IND-SAC cocrystal calibration curve showed excellent linearity (R(2) = 0.9996) over the entire concentration range, displaying limit of detection (LOD) and limit of quantification (LOQ) values of 1.23% (w/w) and 3.74% (w/w), respectively. Validation results showed excellent correlations between actual and predicted concentrations of IND-SAC cocrystals (R(2) = 0.9981). The accuracy and reliability of the PXRD quantification method depend on the methods of sample preparation and handling. The crystallinity of the IND-SAC cocrystals was higher when larger amounts of methanol were used in the LAG method. The PXRD quantification method is suitable and reliable for verifying the purity of cocrystals in the final crystallization product.
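The calibration-curve part of such a PXRD quantification can be sketched generically: fit peak intensity against cocrystal content by ordinary least squares and estimate LOD and LOQ from the residual standard deviation and slope (LOD = 3.3 s/slope, LOQ = 10 s/slope, the usual ICH-style estimates). The intensity values below are invented; the paper's own LOD/LOQ came from its measured calibration data.

import numpy as np

# Hypothetical calibration data: % (w/w) IND-SAC cocrystal vs characteristic peak intensity.
fraction = np.array([0, 10, 25, 50, 75, 90, 100], dtype=float)
intensity = np.array([12, 410, 995, 2010, 3050, 3620, 4015], dtype=float)

# Ordinary least-squares line and residual standard deviation.
slope, intercept = np.polyfit(fraction, intensity, 1)
residuals = intensity - (slope * fraction + intercept)
s_res = np.sqrt(np.sum(residuals ** 2) / (len(fraction) - 2))

lod = 3.3 * s_res / slope
loq = 10.0 * s_res / slope
r2 = 1 - np.sum(residuals ** 2) / np.sum((intensity - intensity.mean()) ** 2)
print(f"R^2 = {r2:.4f}, LOD = {lod:.2f} % (w/w), LOQ = {loq:.2f} % (w/w)")

# Predicting cocrystal content of an unknown from its measured peak intensity.
unknown_intensity = 1500.0
print(f"predicted content = {(unknown_intensity - intercept) / slope:.1f} % (w/w)")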
Quantification of organ motion based on an adaptive image-based scale invariant feature method
DOE Office of Scientific and Technical Information (OSTI.GOV)
Paganelli, Chiara; Peroni, Marta; Baroni, Guido
2013-11-15
Purpose: The availability of corresponding landmarks in IGRT image series allows quantifying the inter- and intrafractional motion of internal organs. In this study, an approach for the automatic localization of anatomical landmarks is presented, with the aim of describing the nonrigid motion of anatomo-pathological structures in radiotherapy treatments according to local image contrast. Methods: An adaptive scale invariant feature transform (SIFT) was developed from the integration of a standard 3D SIFT approach with a local image-based contrast definition. The robustness and invariance of the proposed method to shape-preserving and deformable transforms were analyzed in a CT phantom study. The application of contrast transforms to the phantom images was also tested, in order to verify the variation of the local adaptive measure in relation to the modification of image contrast. The method was also applied to a lung 4D CT dataset, relying on manual feature identification by an expert user as ground truth. The 3D residual distance between matches obtained in adaptive-SIFT was then computed to verify the internal motion quantification with respect to the expert user. Extracted corresponding features in the lungs were used as regularization landmarks in a multistage deformable image registration (DIR) mapping the inhale vs exhale phase. The residual distances between the warped manual landmarks and their reference position in the inhale phase were evaluated, in order to provide a quantitative indication of the registration performed with the three different point sets. Results: The phantom study confirmed the method invariance and robustness properties to shape-preserving and deformable transforms, showing residual matching errors below the voxel dimension. The adapted SIFT algorithm on the 4D CT dataset provided automated and accurate detection of peak-to-peak breathing motion. The proposed method resulted in reduced residual errors with respect to standard SIFT, providing a motion description comparable to expert manual identification, as confirmed by DIR. Conclusions: The application of the method to a 4D lung CT patient dataset demonstrated adaptive-SIFT potential as an automatic tool to detect landmarks for DIR regularization and internal motion quantification. Future works should include the optimization of the computational cost and the application of the method to other anatomical sites and image modalities.
NASA Technical Reports Server (NTRS)
Benek, John A.; Luckring, James M.
2017-01-01
A NATO symposium held in 2008 identified many promising sensitivity analysis and uncertainty quantification technologies, but the maturity and suitability of these methods for realistic applications was not known. The STO Task Group AVT-191 was established to evaluate the maturity and suitability of various sensitivity analysis and uncertainty quantification methods for application to realistic problems of interest to NATO. The program ran from 2011 to 2015, and the work was organized into four discipline-centric teams: external aerodynamics, internal aerodynamics, aeroelasticity, and hydrodynamics. This paper presents an overview of the AVT-191 program content.
NASA Technical Reports Server (NTRS)
Benek, John A.; Luckring, James M.
2017-01-01
A NATO symposium held in Greece in 2008 identified many promising sensitivity analysis and uncertainty quantification technologies, but the maturity and suitability of these methods for realistic applications was not clear. The NATO Science and Technology Organization, Task Group AVT-191 was established to evaluate the maturity and suitability of various sensitivity analysis and uncertainty quantification methods for application to realistic vehicle development problems. The program ran from 2011 to 2015, and the work was organized into four discipline-centric teams: external aerodynamics, internal aerodynamics, aeroelasticity, and hydrodynamics. This paper summarizes findings and lessons learned from the task group.
[Detection of recombinant-DNA in foods from stacked genetically modified plants].
Sorokina, E Iu; Chernyshova, O N
2012-01-01
A quantitative real-time multiplex polymerase chain reaction method was applied to the detection and quantification of MON863 and MON810 in stacked genetically modified maize MON810×MON863. The limit of detection was approximately 0.1%. The accuracy of the quantification, measured as bias from the accepted value, and the relative repeatability standard deviation, which measures the intra-laboratory variability, were within 25% at each GM level. Method verification demonstrated that the MON863 and MON810 methods can be equally applied for quantification of the respective events in stacked MON810×MON863.
ERIC Educational Resources Information Center
Penteado, Jose C.; Angnes, Lucio; Masini, Jorge C.; Oliveira, Paulo C. C.
2005-01-01
This article describes the reaction between nitrite and safranine O. This sensitive reaction is based on the disappearance of color of the reddish-orange azo dye, allowing the determination of nitrite at the mg mL-1 level. A factorial optimization of parameters was carried out and the method was applied for the quantification of nitrite in…
Rapid Quantitative Detection of Lactobacillus sakei in Meat and Fermented Sausages by Real-Time PCR
Martín, Belén; Jofré, Anna; Garriga, Margarita; Pla, Maria; Aymerich, Teresa
2006-01-01
A quick and simple method for quantitative detection of Lactobacillus sakei in fermented sausages was successfully developed. It is based on Chelex-100-based DNA purification and real-time PCR enumeration using a TaqMan fluorescence probe. Primers and probes were designed in the L. sakei 16S-23S rRNA intergenic transcribed spacer region, and the assay was evaluated using L. sakei genomic DNA and an artificially inoculated sausage model. The detection limit of this technique was approximately 3 cells per reaction mixture using both purified DNA and the inoculated sausage model. The quantification limit was established at 30 cells per reaction mixture in both models. The assay was then applied to enumerate L. sakei in real samples, and the results were compared to the MRS agar count method followed by confirmation of the percentage of L. sakei colonies. The results obtained by real-time PCR were not statistically significantly different than those obtained by plate count on MRS agar (P > 0.05), showing a satisfactory agreement between both methods. Therefore, the real-time PCR assay developed can be considered a promising rapid alternative method for the quantification of L. sakei and evaluation of the implantation of starter strains of L. sakei in fermented sausages. PMID:16957227
Statistical analysis to assess automated level of suspicion scoring methods in breast ultrasound
NASA Astrophysics Data System (ADS)
Galperin, Michael
2003-05-01
A well-defined rule-based system has been developed for scoring the Level of Suspicion (LOS) from 0 to 5, based on a qualitative lexicon describing the ultrasound appearance of breast lesions. The purpose of the research is to assess and select one of the automated quantitative LOS scoring methods developed during preliminary studies on benign biopsy reduction. The study used a Computer Aided Imaging System (CAIS) to improve the uniformity and accuracy of applying the LOS scheme by automatically detecting, analyzing and comparing breast masses. The overall goal is to reduce biopsies on the masses with lower levels of suspicion, rather than increasing the accuracy of diagnosis of cancers (which will require biopsy anyway). On complex cyst and fibroadenoma cases, experienced radiologists were up to 50% less certain about true negatives than CAIS. Full correlation analysis was applied to determine which of the proposed LOS quantification methods serves CAIS accuracy the best. This paper presents current results of applying statistical analysis to automated LOS scoring quantification for breast masses with known biopsy results. It was found that the First Order Ranking method yielded the most accurate results. The CAIS system (Image Companion, Data Companion software) was developed by Almen Laboratories and was used to achieve the results.
NASA Astrophysics Data System (ADS)
Greer, Tyler; Lietz, Christopher B.; Xiang, Feng; Li, Lingjun
2015-01-01
Absolute quantification of protein targets using liquid chromatography-mass spectrometry (LC-MS) is a key component of candidate biomarker validation. One popular method combines multiple reaction monitoring (MRM) using a triple quadrupole instrument with stable isotope-labeled standards (SIS) for absolute quantification (AQUA). LC-MRM AQUA assays are sensitive and specific, but they are also expensive because of the cost of synthesizing stable isotope peptide standards. While the chemical modification approach using mass differential tags for relative and absolute quantification (mTRAQ) represents a more economical approach when quantifying large numbers of peptides, these reagents are costly and still suffer from lower throughput because only two concentration values per peptide can be obtained in a single LC-MS run. Here, we have developed and applied a set of five novel mass difference reagents, isotopic N,N-dimethyl leucine (iDiLeu). These labels contain an amine-reactive group (a triazine ester), are cost-effective because of their synthetic simplicity, and increase throughput compared with previous LC-MS quantification methods by allowing construction of a four-point standard curve in one run. iDiLeu-labeled peptides show remarkably similar retention time shifts, slightly lower energy thresholds for higher-energy collisional dissociation (HCD) fragmentation, and high quantification accuracy for trypsin-digested protein samples (median errors <15%). By spiking an iDiLeu-labeled neuropeptide, allatostatin, into mouse urine matrix, two quantification methods are validated. The first uses one labeled peptide as an internal standard to normalize labeled peptide peak areas across runs (<19% error), whereas the second enables standard curve creation and analyte quantification in one run (<8% error).
Venturelli, Gustavo L; Brod, Fábio C A; Rossi, Gabriela B; Zimmermann, Naíra F; Oliveira, Jaison P; Faria, Josias C; Arisi, Ana C M
2014-11-01
The Embrapa 5.1 genetically modified (GM) common bean was approved for commercialization in Brazil. Methods for the quantification of this new genetically modified organism (GMO) are necessary. The development of a suitable endogenous reference is essential for GMO quantification by real-time PCR. Based on this, a new taxon-specific endogenous reference quantification assay was developed for Phaseolus vulgaris L. Three genes encoding common bean proteins (phaseolin, arcelin, and lectin) were selected as candidates for endogenous reference. Primers targeting these candidate genes were designed and the detection was evaluated using the SYBR Green chemistry. The assay targeting lectin gene showed higher specificity than the remaining assays, and a hydrolysis probe was then designed. This assay showed high specificity for 50 common bean samples from two gene pools, Andean and Mesoamerican. For GM common bean varieties, the results were similar to those obtained for non-GM isogenic varieties with PCR efficiency values ranging from 92 to 101 %. Moreover, this assay presented a limit of detection of ten haploid genome copies. The primers and probe developed in this work are suitable to detect and quantify either GM or non-GM common bean.
Seyer, Alexandre; Fenaille, François; Féraudet-Tarisse, Cecile; Volland, Hervé; Popoff, Michel R; Tabet, Jean-Claude; Junot, Christophe; Becher, François
2012-06-05
Epsilon toxin (ETX) is one of the most lethal toxins produced by Clostridium species and is considered a potential bioterrorist weapon. Here, we present a rapid mass spectrometry-based method for ETX quantification in complex matrixes. As a prerequisite, naturally occurring prototoxin and toxin species were first structurally characterized by top-down and bottom-up experiments to identify the most pertinent peptides for quantification. Following selective ETX immunoextraction and trypsin digestion, two proteotypic peptides shared by all the toxin forms were separated by ultraperformance liquid chromatography (UPLC) and monitored by electrospray ionization mass spectrometry (ESI-MS) operating in the multiple reaction monitoring (MRM) mode with collision-induced dissociation. Thorough protocol optimization, i.e., a 15 min immunocapture, a 2 h enzymatic digestion, and UPLC-MS/MS detection, allowed the whole quantification process including the calibration curve to be performed in less than 4 h, without compromising assay robustness and sensitivity. The assay sensitivity in milk and serum was estimated at 5 ng·mL(-1) for ETX, making this approach complementary to enzyme-linked immunosorbent assay (ELISA) techniques.
Kennedy, Jacob J.; Yan, Ping; Zhao, Lei; Ivey, Richard G.; Voytovich, Uliana J.; Moore, Heather D.; Lin, Chenwei; Pogosova-Agadjanyan, Era L.; Stirewalt, Derek L.; Reding, Kerryn W.; Whiteaker, Jeffrey R.; Paulovich, Amanda G.
2016-01-01
A major goal in cell signaling research is the quantification of phosphorylation pharmacodynamics following perturbations. Traditional methods of studying cellular phospho-signaling measure one analyte at a time with poor standardization, rendering them inadequate for interrogating network biology and contributing to the irreproducibility of preclinical research. In this study, we test the feasibility of circumventing these issues by coupling immobilized metal affinity chromatography (IMAC)-based enrichment of phosphopeptides with targeted, multiple reaction monitoring (MRM) mass spectrometry to achieve precise, specific, standardized, multiplex quantification of phospho-signaling responses. A multiplex IMAC-MRM assay targeting phospho-analytes responsive to DNA damage was configured, analytically characterized, and deployed to generate phospho-pharmacodynamic curves from primary and immortalized human cells experiencing genotoxic stress. The multiplexed assays demonstrated linear ranges of ≥3 orders of magnitude, a median lower limit of quantification of 0.64 fmol on column, median intra-assay variability of 9.3%, median inter-assay variability of 12.7%, and median total CV of 16.0%. The multiplex IMAC-MRM assay enabled robust quantification of 107 DNA damage-responsive phosphosites from human cells following DNA damage. The assays have been made publicly available as a resource to the community. The approach is generally applicable, enabling wide interrogation of signaling networks. PMID:26621847
Single cell genomic quantification by non-fluorescence nonlinear microscopy
NASA Astrophysics Data System (ADS)
Kota, Divya; Liu, Jing
2017-02-01
Human epidermal growth factor receptor 2 (Her2) is a gene that plays a major role in breast cancer development. The quantification of Her2 expression in single cells is limited by several drawbacks of existing fluorescence-based single-molecule techniques, such as low signal-to-noise ratio (SNR), strong autofluorescence, and background signals from biological components. For rigorous genomic quantification, a robust method of orthogonal detection is highly desirable, and we demonstrate it here with two non-fluorescent imaging techniques: transient absorption microscopy (TAM) and second harmonic generation (SHG). In TAM, gold nanoparticles (AuNPs) are chosen as orthogonal probes for the detection of single molecules, which gives background-free quantification of single mRNA transcripts. In SHG, emission from barium titanium oxide (BTO) nanoprobes was demonstrated, which allows a stable signal beyond the autofluorescence window. Her2 mRNA was specifically labeled with nanoprobes conjugated with antibodies or oligonucleotides and quantified at single-copy sensitivity in cancer cells and tissues. Furthermore, a non-fluorescent super-resolution concept, named second harmonic super-resolution microscopy (SHaSM), was proposed to quantify individual Her2 transcripts in cancer cells beyond the diffraction limit. These non-fluorescent imaging modalities will provide new dimensions in biomarker quantification at single-molecule sensitivity in turbid biological samples, offering a strong cross-platform strategy for clinical monitoring at single-cell resolution.
Kubec, Roman; Dadáková, Eva
2009-10-09
A novel HPLC method for the determination of a wide variety of S-substituted cysteine derivatives in Allium species has been developed and validated. This method allows simultaneous separation and quantification of S-alk(en)ylcysteine S-oxides, gamma-glutamyl-S-alk(en)ylcysteines and gamma-glutamyl-S-alk(en)ylcysteine S-oxides in a single run. The procedure is based on extraction of these amino acids and dipeptides with methanol, their derivatization with dansyl chloride, and subsequent separation by reversed-phase HPLC. The main advantages of the new method are simplicity, excellent stability of the derivatives, high sensitivity, specificity, and the ability to simultaneously analyze the whole range of S-substituted cysteine derivatives. This method was critically compared with other chromatographic procedures used for quantification of S-substituted cysteine derivatives, namely two other HPLC methods (derivatization with o-phthaldialdehyde/tert-butylthiol and with fluorenylmethyl chloroformate) and determination by gas chromatography or capillary electrophoresis. The major advantages and drawbacks of these analytical procedures are discussed. Employing these various chromatographic methods, the content and relative proportions of individual S-substituted cysteine derivatives were determined in the four most frequently consumed alliaceous vegetables (garlic, onion, shallot, and leek).
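A minimal sketch of the back-calculation step common to such HPLC quantification methods is shown below: a linear calibration of peak area against standard concentration is fitted and a sample peak area is converted into a concentration. All values are hypothetical and are not taken from the paper.

    import numpy as np

    # Hypothetical external calibration for one dansylated analyte
    # (e.g. an S-alk(en)ylcysteine S-oxide); concentrations in mM,
    # peak areas in arbitrary detector units.
    conc = np.array([0.05, 0.1, 0.5, 1.0, 2.5, 5.0])
    area = np.array([1.1e4, 2.2e4, 1.1e5, 2.2e5, 5.4e5, 1.1e6])

    # Fit area = slope * conc + intercept.
    slope, intercept = np.polyfit(conc, area, 1)

    # Back-calculate the concentration for a measured sample peak area.
    sample_area = 3.3e5
    sample_conc = (sample_area - intercept) / slope
    print(f"estimated concentration: {sample_conc:.2f} mM")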
Cruz, Rebeca; Casal, Susana
2013-11-15
Vitamin E analysis in green vegetables is performed by an array of different methods, making it difficult to compare published data or to choose the appropriate method for a particular sample. Aiming at a consistent method with wide applicability, the current study reports the development and validation of a fast micro-method for quantification of vitamin E in green leafy vegetables. The methodology uses solid-liquid extraction based on the Folch method, with tocol as internal standard, and normal-phase HPLC with fluorescence detection. A wide linear working range was confirmed, and the method was highly reproducible, with inter-day precision below 5% (RSD). Method sensitivity was established (below 0.02 μg/g fresh weight), and accuracy was assessed by recovery tests (>96%). The method was tested on different green leafy vegetables, revealing diverse tocochromanol profiles with variable ratios and amounts of α- and γ-tocopherol and other minor compounds. The methodology is adequate for routine analyses, with a short chromatographic run (<7 min), reduced organic solvent consumption, and only standard chromatographic equipment available in most laboratories.
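A brief, hypothetical sketch of the recovery and inter-day precision (RSD) calculations used in validations of this kind is given below; the concentrations are invented for illustration and do not come from the study.

    import numpy as np

    # Spike-recovery check for alpha-tocopherol (all values hypothetical, ug/g
    # fresh weight).
    endogenous = 12.0          # amount found in the unspiked sample
    spike_added = 10.0         # amount spiked into the sample
    spiked_found = 21.7        # amount measured in the spiked sample
    recovery_pct = 100 * (spiked_found - endogenous) / spike_added
    print(f"recovery: {recovery_pct:.1f}%")      # validation target: >96%

    # Inter-day precision: the same sample analyzed on five different days.
    day_means = np.array([12.1, 11.8, 12.4, 12.0, 11.9])
    rsd_pct = 100 * day_means.std(ddof=1) / day_means.mean()
    print(f"inter-day RSD: {rsd_pct:.1f}%")      # validation target: <5%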
Gyawali, P
2018-02-01
Raw and partially treated wastewater is widely reused to help meet global water demand. The presence of viable helminth ova and larvae in wastewater raises significant public health concerns, especially when the water is used for agriculture or aquaculture. Depending on the prevalence of helminth infections in a community, up to 1.0 × 10³ ova/larvae can be present per litre of wastewater and per 4 g (dry weight) of sludge. Multi-barrier approaches including pathogen reduction, risk assessment, and exposure reduction have been suggested by health regulators to minimise the potential health risk. However, in the absence of a sensitive and specific method for the quantitative detection of viable helminth ova in wastewater, an accurate health risk assessment is difficult to achieve. As a result, helminth infections remain difficult to eliminate from communities despite two decades of global effort (mass drug administration). Molecular methods can be more sensitive and specific than the currently adopted culture-based and vital-stain methods. Molecular methods, however, require further and more thorough investigation of their ability to accurately quantify viable helminth ova/larvae in wastewater and sludge samples. Understanding the different cell stages and their corresponding gene copy numbers is pivotal for accurate quantification of helminth ova/larvae in wastewater samples. Identifying specific markers, including proteins, lipids, and metabolites, using a multiomics approach could support cheap, rapid, sensitive, specific, point-of-care detection tools for helminth ova and larvae in wastewater.
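To illustrate why knowing the gene copy number per ovum matters for molecular (e.g. qPCR-based) quantification, the sketch below converts a hypothetical Ct value into gene copies via a standard curve and then into an ova estimate; the standard-curve parameters and the copies-per-ovum figure are assumptions, not data from this review.

    # Hypothetical qPCR standard curve: Ct = slope * log10(copies) + intercept.
    # Slope, intercept, sample Ct, and copies per ovum are illustrative only.
    slope = -3.3          # ~100% amplification efficiency
    intercept = 38.0      # Ct expected for a single target copy
    ct_sample = 24.9      # Ct measured for the wastewater extract

    # Invert the standard curve to estimate target-gene copies in the reaction.
    log10_copies = (ct_sample - intercept) / slope
    gene_copies = 10 ** log10_copies

    # Convert gene copies to an ova estimate using an assumed per-ovum copy number.
    copies_per_ovum = 200.0
    ova_estimate = gene_copies / copies_per_ovum
    print(f"gene copies: {gene_copies:.2e}, estimated ova: {ova_estimate:.0f}")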