Sample records for enabled accurate quantification

  1. Droplet Digital™ PCR Next-Generation Sequencing Library QC Assay.

    PubMed

    Heredia, Nicholas J

    2018-01-01

    Digital PCR is a valuable tool for quantifying next-generation sequencing (NGS) libraries precisely and accurately. Accurate quantification of NGS libraries enables accurate loading of the libraries onto the sequencer and thus improves sequencing performance by reducing underloading and overloading errors. Accurate quantification also benefits users by enabling uniform loading of indexed/barcoded libraries, which in turn greatly improves sequencing uniformity of the indexed/barcoded samples. The advantages gained by employing the Droplet Digital PCR (ddPCR™) library QC assay include precise and accurate quantification as well as size quality assessment, enabling users to QC their sequencing libraries with confidence.
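
    As a technical aside, the absolute quantification that droplet digital PCR provides rests on Poisson statistics: the fraction of positive droplets is converted into a mean copy number per droplet and then into a concentration. The sketch below shows this generic calculation in Python; the 0.85 nL droplet volume and the example counts are assumed illustrative values, not parameters of the library QC assay itself.

    ```python
    import math

    def ddpcr_concentration(positive, total, droplet_volume_nl=0.85):
        """Estimate target concentration (copies/uL) from droplet counts.

        Standard digital-PCR Poisson correction: the mean number of copies per
        droplet is lambda = -ln(1 - p), where p is the fraction of positive
        droplets.  The droplet volume is an assumed nominal value.
        """
        p = positive / total
        if p >= 1.0:
            raise ValueError("All droplets positive; too concentrated to quantify.")
        lam = -math.log(1.0 - p)                   # mean copies per droplet
        return lam / droplet_volume_nl * 1000.0    # copies per microliter

    # Example: 4,500 positive droplets out of 18,000 accepted droplets
    print(f"{ddpcr_concentration(4500, 18000):.0f} copies/uL")
    ```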

  2. pyQms enables universal and accurate quantification of mass spectrometry data.

    PubMed

    Leufken, Johannes; Niehues, Anna; Sarin, L Peter; Wessel, Florian; Hippler, Michael; Leidel, Sebastian A; Fufezan, Christian

    2017-10-01

    Quantitative mass spectrometry (MS) is a key technique in many research areas (1), including proteomics, metabolomics, glycomics, and lipidomics. Because all of the corresponding molecules can be described by chemical formulas, universal quantification tools are highly desirable. Here, we present pyQms, an open-source software package for accurate quantification of all types of molecules measurable by MS. pyQms uses isotope pattern matching, which offers an accurate quality assessment of all quantifications and the ability to directly incorporate mass spectrometer accuracy. Owing to its universal design, pyQms is applicable to every research field, labeling strategy, and acquisition technique. This gives researchers full flexibility to design experiments employing innovative and hitherto unexplored labeling strategies. Importantly, pyQms performs very well in accurately quantifying partially labeled proteomes at large scale and high throughput, the most challenging task for a quantification algorithm. © 2017 by The American Society for Biochemistry and Molecular Biology, Inc.
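
    The central idea here, scoring how well a measured isotope envelope matches the theoretical pattern predicted from a chemical formula, can be illustrated with a generic similarity measure. The sketch below is only a rough stand-in for that idea; it is not the pyQms mScore implementation or API, and the intensity values are invented.

    ```python
    import numpy as np

    def isotope_match_score(measured, theoretical):
        """Cosine similarity between a measured and a theoretical isotope
        pattern (both given as intensity arrays over the same isotopologue
        peaks).  1.0 means a perfect shape match; values near 0 mean no match.
        This is a generic score for illustration, not the pyQms mScore.
        """
        m = np.asarray(measured, dtype=float)
        t = np.asarray(theoretical, dtype=float)
        m = m / np.linalg.norm(m)
        t = t / np.linalg.norm(t)
        return float(np.dot(m, t))

    # Hypothetical peptide: theoretical pattern vs. two measured spectra
    theoretical = [100.0, 55.0, 18.0, 4.0]
    good = [98.0, 57.0, 17.0, 5.0]       # clean signal
    noisy = [100.0, 20.0, 60.0, 30.0]    # interfered signal
    print(isotope_match_score(good, theoretical))   # close to 1
    print(isotope_match_score(noisy, theoretical))  # noticeably lower
    ```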

  3. Multiplex Droplet Digital PCR Protocols for Quantification of GM Maize Events.

    PubMed

    Dobnik, David; Spilsberg, Bjørn; Bogožalec Košir, Alexandra; Štebih, Dejan; Morisset, Dany; Holst-Jensen, Arne; Žel, Jana

    2018-01-01

    The standard-curve based simplex quantitative polymerase chain reaction (qPCR) has been the gold standard for DNA target quantification for more than a decade. The large and growing number of individual analyses needed to test for genetically modified organisms (GMOs) is reducing the cost-effectiveness of qPCR. Droplet digital PCR (ddPCR) enables absolute quantification without standard curves, avoids the amplification efficiency bias observed with qPCR, allows more accurate estimations at low target copy numbers and, in combination with multiplexing, significantly improves cost efficiency. Here we describe two protocols for multiplex quantification of GM maize events: (1) nondiscriminating, with multiplex quantification of targets as a group (12 GM maize lines) and (2) discriminating, with multiplex quantification of individual targets (events). The first enables the quantification of twelve European Union authorized GM maize events as a group with only two assays, but does not permit determination of the individual events present. The second protocol enables the quantification of four individual targets (three GM events and one endogene) in a single reaction. Both protocols can be modified for quantification of any other DNA target.

  4. Magnetic Particle Spectroscopy Reveals Dynamic Changes in the Magnetic Behavior of Very Small Superparamagnetic Iron Oxide Nanoparticles During Cellular Uptake and Enables Determination of Cell-Labeling Efficacy.

    PubMed

    Poller, Wolfram C; Löwa, Norbert; Wiekhorst, Frank; Taupitz, Matthias; Wagner, Susanne; Möller, Konstantin; Baumann, Gert; Stangl, Verena; Trahms, Lutz; Ludwig, Antje

    2016-02-01

    In vivo tracking of nanoparticle-labeled cells by magnetic resonance imaging (MRI) crucially depends on accurate determination of cell-labeling efficacy prior to transplantation. Here, we analyzed the feasibility and accuracy of magnetic particle spectroscopy (MPS) for estimation of cell-labeling efficacy in living THP-1 cells incubated with very small superparamagnetic iron oxide nanoparticles (VSOP). Cell viability and proliferation capacity were not affected by the MPS measurement procedure. In VSOP samples without cell contact, MPS enabled highly accurate quantification. In contrast, MPS constantly overestimated the amount of cell associated and internalized VSOP. Analyses of the MPS spectrum shape expressed as harmonic ratio A₅/A₃ revealed distinct changes in the magnetic behavior of VSOP in response to cellular uptake. These changes were proportional to the deviation between MPS and actual iron amount, therefore allowing for adjusted iron quantification. Transmission electron microscopy provided visual evidence that changes in the magnetic properties correlated with cell surface interaction of VSOP as well as with alterations of particle structure and arrangement during the phagocytic process. Altogether, A₅/A₃-adjusted MPS enables highly accurate, cell-preserving VSOP quantification and furthermore provides information on the magnetic characteristics of internalized VSOP.

  5. Understanding the transmission dynamics of Leishmania donovani to provide robust evidence for interventions to eliminate visceral leishmaniasis in Bihar, India

    USDA-ARS's Scientific Manuscript database

    Molecular tools enable the collection of accurate estimates of human blood index (HBI) in Phlebotomus argentipes. The refinement of a metacyclic-specific qPCR assay to identify L. donovani in P. argentipes would enable quantification of the entomological inoculation rate (EIR) for the first time. Li...

  6. Understanding the transmission dynamics of Leishmania donovani to provide robust evidence for interventions to eliminate visceral leishmaniasis in Bihar, India

    USDA-ARS's Scientific Manuscript database

    Molecular tools enable the collection of accurate estimates of human blood index (HBI) in P. argentipes. The refinement of a metacyclic-specific qPCR assay to identify L. donovani in P. argentipes would enable quantification of the entomological inoculation rate (EIR) for the first time. Likewise, a...

  7. Accurate quantification of fluorescent targets within turbid media based on a decoupled fluorescence Monte Carlo model.

    PubMed

    Deng, Yong; Luo, Zhaoyang; Jiang, Xu; Xie, Wenhao; Luo, Qingming

    2015-07-01

    We propose a method based on a decoupled fluorescence Monte Carlo model for constructing fluorescence Jacobians to enable accurate quantification of fluorescence targets within turbid media. The effectiveness of the proposed method is validated using two cylindrical phantoms enclosing fluorescent targets within homogeneous and heterogeneous background media. The results demonstrate that our method can recover relative concentrations of the fluorescent targets with higher accuracy than the perturbation fluorescence Monte Carlo method. This suggests that our method is suitable for quantitative fluorescence diffuse optical tomography, especially for in vivo imaging of fluorophore targets for diagnosis of different diseases and abnormalities.

  8. Identifying and quantifying secondhand smoke in multiunit homes with tobacco smoke odor complaints

    NASA Astrophysics Data System (ADS)

    Dacunto, Philip J.; Cheng, Kai-Chung; Acevedo-Bolton, Viviana; Klepeis, Neil E.; Repace, James L.; Ott, Wayne R.; Hildemann, Lynn M.

    2013-06-01

    Accurate identification and quantification of the secondhand tobacco smoke (SHS) that drifts between multiunit homes (MUHs) is essential for assessing resident exposure and health risk. We collected 24 gaseous and particle measurements over 6-9 day monitoring periods in five nonsmoking MUHs with reported SHS intrusion problems. Nicotine tracer sampling showed evidence of SHS intrusion in all five homes during the monitoring period; logistic regression and chemical mass balance (CMB) analysis enabled identification and quantification of some of the precise periods of SHS entry. Logistic regression models identified SHS in eight periods when residents complained of SHS odor, and CMB provided estimates of SHS magnitude in six of these eight periods. Both approaches properly identified or apportioned all six cooking periods used as no-SHS controls. Finally, both approaches enabled identification and/or apportionment of suspected SHS in five additional periods when residents did not report smelling smoke. The time resolution of this methodology goes beyond sampling methods involving single tracers (such as nicotine), enabling the precise identification of the magnitude and duration of SHS intrusion, which is essential for accurate assessment of human exposure.
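
    Chemical mass balance apportionment of the kind used here can be posed as a nonnegative least-squares problem: the measured species concentrations are modeled as a weighted sum of source emission profiles. The sketch below illustrates that generic formulation with made-up profiles for secondhand smoke and cooking; it is not the authors' tracer set or model.

    ```python
    import numpy as np
    from scipy.optimize import nnls

    # Columns: per-unit emission profiles of two hypothetical sources
    # (rows are measured species, e.g. particle mass, nicotine, a cooking tracer).
    profiles = np.array([
        [1.00, 1.00],   # PM2.5 (normalized)
        [0.02, 0.00],   # nicotine (only in secondhand smoke)
        [0.00, 0.05],   # cooking marker (only in cooking aerosol)
    ])

    # Ambient measurement during one monitoring period (same species order).
    observed = np.array([30.0, 0.40, 0.25])

    # Nonnegative least squares gives the source contributions (in PM2.5 units).
    contributions, residual = nnls(profiles, observed)
    print("SHS contribution:", contributions[0])
    print("Cooking contribution:", contributions[1])
    ```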

  9. Fundamentals of multiplexing with digital PCR.

    PubMed

    Whale, Alexandra S; Huggett, Jim F; Tzonev, Svilen

    2016-12-01

    Over the past decade numerous publications have demonstrated how digital PCR (dPCR) enables precise and sensitive quantification of nucleic acids in a wide range of applications in both healthcare and environmental analysis. This has occurred in parallel with the advances in partitioning fluidics that enable a reaction to be subdivided into an increasing number of partitions. As the majority of dPCR systems are based on detection in two discrete optical channels, most research to date has focused on quantification of one or two targets within a single reaction. Here we describe 'higher order multiplexing', the unique ability of dPCR to precisely measure more than two targets in the same reaction. Using examples, we describe the different types of duplex and multiplex reactions that can be achieved. We also describe essential experimental considerations to ensure accurate quantification of multiple targets.

  10. Sequencing small genomic targets with high efficiency and extreme accuracy

    PubMed Central

    Schmitt, Michael W.; Fox, Edward J.; Prindle, Marc J.; Reid-Bayliss, Kate S.; True, Lawrence D.; Radich, Jerald P.; Loeb, Lawrence A.

    2015-01-01

    The detection of minority variants in mixed samples demands methods for enrichment and accurate sequencing of small genomic intervals. We describe an efficient approach based on sequential rounds of hybridization with biotinylated oligonucleotides, enabling more than one-million-fold enrichment of genomic regions of interest. In conjunction with error-correcting double-stranded molecular tags, our approach enables the quantification of mutations in individual DNA molecules. PMID:25849638
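
    The counting principle described, grouping reads by molecular tag so each original DNA molecule is scored once and only concordant read families are trusted, can be sketched as below. The input format, thresholds, and the single-position mutant call are illustrative assumptions, not the published tag-based pipeline.

    ```python
    from collections import defaultdict, Counter

    def count_mutant_molecules(reads, min_reads=3, min_fraction=0.9):
        """Count unique DNA molecules carrying a mutation at one position.

        `reads` is an iterable of (molecular_tag, base) tuples.  Reads sharing
        a tag derive from the same starting molecule, so a consensus base is
        called per tag; a molecule is scored only if its reads agree.
        Thresholds here are illustrative, not published settings.
        """
        by_tag = defaultdict(list)
        for tag, base in reads:
            by_tag[tag].append(base)

        mutant, wildtype = 0, 0
        for tag, bases in by_tag.items():
            if len(bases) < min_reads:
                continue                      # too few reads for a consensus
            base, count = Counter(bases).most_common(1)[0]
            if count / len(bases) < min_fraction:
                continue                      # discordant family, likely error
            if base == "T":                   # hypothetical mutant allele
                mutant += 1
            else:
                wildtype += 1
        return mutant, wildtype

    reads = [("AAC", "T"), ("AAC", "T"), ("AAC", "T"),
             ("GGT", "C"), ("GGT", "C"), ("GGT", "C"), ("GGT", "C")]
    print(count_mutant_molecules(reads))      # (1, 1)
    ```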

  11. Practical quantification of necrosis in histological whole-slide images.

    PubMed

    Homeyer, André; Schenk, Andrea; Arlt, Janine; Dahmen, Uta; Dirsch, Olaf; Hahn, Horst K

    2013-06-01

    Since the histological quantification of necrosis is a common task in medical research and practice, we evaluate different image analysis methods for quantifying necrosis in whole-slide images. In a practical usage scenario, we assess the impact of different classification algorithms and feature sets on both accuracy and computation time. We show how a well-chosen combination of multiresolution features and an efficient postprocessing step enables the accurate quantification of necrosis in gigapixel images in less than a minute. The results are general enough to be applied to other areas of histological image analysis as well. Copyright © 2013 Elsevier Ltd. All rights reserved.

  12. Direct qPCR quantification using the Quantifiler(®) Trio DNA quantification kit.

    PubMed

    Liu, Jason Yingjie

    2014-11-01

    The effectiveness of a direct quantification assay is essential to the adoption of the combined direct quantification/direct STR workflow. In this paper, the feasibility of using the Quantifiler(®) Trio DNA quantification kit for the direct quantification of forensic casework samples was investigated. Both low-level touch DNA samples and blood samples were collected on PE swabs and quantified directly. The increased sensitivity of the Quantifiler(®) Trio kit enabled the detection of less than 10 pg of DNA in unprocessed touch samples and also minimized the stochastic effect experienced by different targets in the same sample. The DNA quantity information obtained from a direct quantification assay using the Quantifiler(®) Trio kit can also be used to accurately estimate the optimal input DNA quantity for a direct STR amplification reaction. The correlation between the direct quantification results (Quantifiler(®) Trio kit) and the direct STR results (GlobalFiler™ PCR amplification kit) for low-level touch DNA samples indicates that direct quantification using the Quantifiler(®) Trio DNA quantification kit is more reliable than the Quantifiler(®) Duo DNA quantification kit for predicting the STR results of unprocessed touch DNA samples containing less than 10 pg of DNA. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  13. Background Signal as an in Situ Predictor of Dopamine Oxidation Potential: Improving Interpretation of Fast-Scan Cyclic Voltammetry Data.

    PubMed

    Meunier, Carl J; Roberts, James G; McCarty, Gregory S; Sombers, Leslie A

    2017-02-15

    Background-subtracted fast-scan cyclic voltammetry (FSCV) has emerged as a powerful analytical technique for monitoring subsecond molecular fluctuations in live brain tissue. Despite increasing utilization of FSCV, efforts to improve the accuracy of quantification have been limited due to the complexity of the technique and the dynamic recording environment. It is clear that variable electrode performance renders calibration necessary for accurate quantification; however, the nature of in vivo measurements can make conventional postcalibration difficult, or even impossible. Analyte-specific voltammograms and scaling factors that are critical for quantification can shift or fluctuate in vivo. This is largely due to impedance changes, and the effects of impedance on these measurements have not been characterized. We have previously reported that the background current can be used to predict electrode-specific scaling factors in situ. In this work, we employ model circuits to investigate the impact of impedance on FSCV measurements. Additionally, we take another step toward in situ electrode calibration by using the oxidation potential of quinones on the electrode surface to accurately predict the oxidation potential for dopamine at any point in an electrochemical experiment, as both are dependent on impedance. The model, validated both in adrenal slice and live brain tissue, enables information encoded in the shape of the background voltammogram to determine electrochemical parameters that are critical for accurate quantification. This improves data interpretation and provides a significant next step toward more automated methods for in vivo data analysis.

  14. Combining Metabolic ¹⁵N Labeling with Improved Tandem MOAC for Enhanced Probing of the Phosphoproteome.

    PubMed

    Thomas, Martin; Huck, Nicola; Hoehenwarter, Wolfgang; Conrath, Uwe; Beckers, Gerold J M

    2015-01-01

    In eukaryotic cells many diverse cellular functions are regulated by reversible protein phosphorylation. In recent years, phosphoproteomics has become a powerful tool for studying protein phosphorylation because it enables unbiased localization and site-specific quantification of in vivo phosphorylation of hundreds of proteins in a single experiment. A common strategy for identifying phosphoproteins and their phosphorylation sites from complex biological samples is the enrichment of phosphopeptides from digested cellular lysates followed by mass spectrometry. However, despite the high sensitivity of modern mass spectrometers, the large dynamic range of protein abundance and the transient nature of protein phosphorylation remain major pitfalls in MS-based phosphoproteomics. This is particularly true for plants, in which the presence of secondary metabolites and endogenous compounds, the overabundance of ribulose-1,5-bisphosphate carboxylase and other components of the photosynthetic apparatus, and the concurrent difficulties in protein extraction necessitate two-step phosphoprotein/phosphopeptide enrichment strategies (Nakagami et al., Plant Cell Physiol 53:118-124, 2012). Approaches for label-free peptide quantification are advantageous due to their low cost and experimental simplicity, but they lack precision. These drawbacks can be overcome by metabolic labeling of whole plants with heavy nitrogen (¹⁵N), which allows combining two samples very early in the phosphoprotein enrichment workflow. This avoids sample-to-sample variation introduced by the analytical procedures and results in robust relative quantification values that need no further standardization. The integration of ¹⁵N metabolic labeling into tandem metal-oxide affinity chromatography (MOAC) (Hoehenwarter et al., Mol Cell Proteomics 12:369-380, 2013) presents an improved and highly selective approach for the identification and accurate site-specific quantification of low-abundance phosphoproteins that is based on the successive enrichment of light and heavy nitrogen-labeled phosphoproteins and peptides. This improved strategy combines metabolic labeling of whole plants with the stable heavy nitrogen isotope (¹⁵N), protein extraction under denaturing conditions, phosphoprotein enrichment using Al(OH)3-based MOAC, and tryptic digest of enriched phosphoproteins followed by TiO2-based MOAC of phosphopeptides and quantitative phosphopeptide measurement by liquid chromatography (LC) and high-resolution accurate mass (HR/AM) mass spectrometry (MS). Thus, tandem MOAC effectively targets the phosphate moiety of phosphoproteins and phosphopeptides and allows probing of the phosphoproteome to unprecedented depth, while ¹⁵N metabolic labeling enables accurate relative quantification of measured peptides and direct comparison between samples.

  15. Development of a new densimeter for the combined investigation of dew-point densities and sorption phenomena of fluid mixtures

    NASA Astrophysics Data System (ADS)

    Moritz, Katharina; Kleinrahm, Reiner; McLinden, Mark O.; Richter, Markus

    2017-12-01

    For the determination of dew-point densities and pressures of fluid mixtures, a new densimeter has been developed. The new apparatus is based on the well-established two-sinker density measurement principle with the additional capability of quantifying sorption effects. In the vicinity of the dew line, such effects cause a change in composition of the gas mixture under study, which can significantly distort accurate density measurements. The new experimental technique enables the accurate measurement of dew-point densities and pressures and the quantification of sorption effects at the same time.

  16. A multi-center study benchmarks software tools for label-free proteome quantification

    PubMed Central

    Gillet, Ludovic C; Bernhardt, Oliver M.; MacLean, Brendan; Röst, Hannes L.; Tate, Stephen A.; Tsou, Chih-Chiang; Reiter, Lukas; Distler, Ute; Rosenberger, George; Perez-Riverol, Yasset; Nesvizhskii, Alexey I.; Aebersold, Ruedi; Tenzer, Stefan

    2016-01-01

    The consistent and accurate quantification of proteins by mass spectrometry (MS)-based proteomics depends on the performance of instruments, acquisition methods and data analysis software. In collaboration with the software developers, we evaluated OpenSWATH, SWATH2.0, Skyline, Spectronaut and DIA-Umpire, five of the most widely used software methods for processing data from SWATH-MS (sequential window acquisition of all theoretical fragment ion spectra), a method that uses data-independent acquisition (DIA) for label-free protein quantification. We analyzed high-complexity test datasets from hybrid proteome samples of defined quantitative composition acquired on two different MS instruments using different SWATH isolation-window setups. For consistent evaluation, we developed LFQbench, an R package to calculate metrics of precision and accuracy in label-free quantitative MS, and report the identification performance, robustness and specificity of each software tool. Our reference datasets enabled developers to improve their software tools. After optimization, all tools provided highly convergent identification and reliable quantification performance, underscoring their robustness for label-free quantitative proteomics. PMID:27701404

  17. A multicenter study benchmarks software tools for label-free proteome quantification.

    PubMed

    Navarro, Pedro; Kuharev, Jörg; Gillet, Ludovic C; Bernhardt, Oliver M; MacLean, Brendan; Röst, Hannes L; Tate, Stephen A; Tsou, Chih-Chiang; Reiter, Lukas; Distler, Ute; Rosenberger, George; Perez-Riverol, Yasset; Nesvizhskii, Alexey I; Aebersold, Ruedi; Tenzer, Stefan

    2016-11-01

    Consistent and accurate quantification of proteins by mass spectrometry (MS)-based proteomics depends on the performance of instruments, acquisition methods and data analysis software. In collaboration with the software developers, we evaluated OpenSWATH, SWATH 2.0, Skyline, Spectronaut and DIA-Umpire, five of the most widely used software methods for processing data from sequential window acquisition of all theoretical fragment-ion spectra (SWATH)-MS, which uses data-independent acquisition (DIA) for label-free protein quantification. We analyzed high-complexity test data sets from hybrid proteome samples of defined quantitative composition acquired on two different MS instruments using different SWATH isolation-window setups. For consistent evaluation, we developed LFQbench, an R package, to calculate metrics of precision and accuracy in label-free quantitative MS and report the identification performance, robustness and specificity of each software tool. Our reference data sets enabled developers to improve their software tools. After optimization, all tools provided highly convergent identification and reliable quantification performance, underscoring their robustness for label-free quantitative proteomics.
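
    The benchmark's two headline metrics can be made concrete: accuracy is how far the measured log-ratio between the two hybrid samples deviates from the known mixing ratio of each spiked-in proteome, and precision is the spread of those log-ratios. The sketch below is a hedged restatement of that idea in Python, not the LFQbench R package itself; the intensities and the 2:1 ratio are invented.

    ```python
    import numpy as np

    def lfq_metrics(intensity_a, intensity_b, expected_ratio):
        """Accuracy and precision of label-free quantification for one species.

        intensity_a/intensity_b: arrays of protein intensities measured in the
        two hybrid samples; expected_ratio: the known A:B mixing ratio for that
        species.  Accuracy = median deviation of measured log2 ratios from the
        expected log2 ratio; precision = standard deviation of the log2 ratios.
        """
        log_ratios = np.log2(np.asarray(intensity_a) / np.asarray(intensity_b))
        accuracy = float(np.median(log_ratios) - np.log2(expected_ratio))
        precision = float(np.std(log_ratios))
        return accuracy, precision

    # Invented example: proteins of one species spiked at a 2:1 ratio
    a = np.array([2.1e6, 8.4e5, 3.9e6, 1.2e6])
    b = np.array([1.0e6, 4.0e5, 2.1e6, 0.7e6])
    print(lfq_metrics(a, b, expected_ratio=2.0))
    ```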

  18. Simultaneous Quantification of Multiple Alternatively Spliced mRNA Transcripts Using Droplet Digital PCR.

    PubMed

    Sun, Bing; Zheng, Yun-Ling

    2018-01-01

    Currently there is no sensitive, precise, and reproducible method to quantitate alternative splicing of mRNA transcripts. Droplet digital™ PCR (ddPCR™) analysis allows for accurate digital counting for quantification of gene expression. Human telomerase reverse transcriptase (hTERT) is one of the essential components required for telomerase activity and for the maintenance of telomeres. Several alternatively spliced forms of hTERT mRNA in human primary and tumor cells have been reported in the literature. Using one pair of primers and two probes for hTERT, four alternatively spliced forms of hTERT (α-/β+, α+/β- single deletions, α-/β- double deletion, and nondeletion α+/β+) were accurately quantified through a novel analysis method via data collected from a single ddPCR reaction. In this chapter, we describe this ddPCR method that enables direct quantitative comparison of four alternatively spliced forms of the hTERT messenger RNA without the need for internal standards or multiple pairs of primers specific for each variant, eliminating the technical variation due to differential PCR amplification efficiency for different amplicons and the challenges of quantification using standard curves. This simple and straightforward method should have general utility for quantifying alternatively spliced gene transcripts.

  19. Tau-independent Phase Analysis: A Novel Method for Accurately Determining Phase Shifts.

    PubMed

    Tackenberg, Michael C; Jones, Jeff R; Page, Terry L; Hughey, Jacob J

    2018-06-01

    Estimations of period and phase are essential in circadian biology. While many techniques exist for estimating period, comparatively few methods are available for estimating phase. Current approaches to analyzing phase often vary between studies and are sensitive to coincident changes in period and the stage of the circadian cycle at which the stimulus occurs. Here we propose a new technique, tau-independent phase analysis (TIPA), for quantifying phase shifts in multiple types of circadian time-course data. Through comprehensive simulations, we show that TIPA is both more accurate and more precise than the standard actogram approach. TIPA is computationally simple and therefore will enable accurate and reproducible quantification of phase shifts across multiple subfields of chronobiology.

  20. Rapid perfusion quantification using Welch-Satterthwaite approximation and analytical spectral filtering

    NASA Astrophysics Data System (ADS)

    Krishnan, Karthik; Reddy, Kasireddy V.; Ajani, Bhavya; Yalavarthy, Phaneendra K.

    2017-02-01

    CT and MR perfusion weighted imaging (PWI) enable quantification of perfusion parameters in stroke studies. These parameters are calculated from the residual impulse response function (IRF) based on a physiological model for tissue perfusion. The standard approach for estimating the IRF is deconvolution using oscillatory-limited singular value decomposition (oSVD) or Frequency Domain Deconvolution (FDD). FDD is widely recognized as the fastest approach currently available for deconvolution of CT Perfusion/MR PWI. In this work, three faster methods are proposed. The first is a direct (model based) crude approximation to the final perfusion quantities (Blood flow, Blood volume, Mean Transit Time and Delay) using the Welch-Satterthwaite approximation for gamma fitted concentration time curves (CTC). The second method is a fast accurate deconvolution method, we call Analytical Fourier Filtering (AFF). The third is another fast accurate deconvolution technique using Showalter's method, we call Analytical Showalter's Spectral Filtering (ASSF). Through systematic evaluation on phantom and clinical data, the proposed methods are shown to be computationally more than twice as fast as FDD. The two deconvolution based methods, AFF and ASSF, are also shown to be quantitatively accurate compared to FDD and oSVD.
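
    For orientation, perfusion deconvolution rests on modeling the tissue curve as the arterial input function convolved with a flow-scaled residue function, so dividing their spectra with a regularizing filter recovers the IRF; blood flow, blood volume, and mean transit time then follow from the IRF's peak, area, and their ratio. The sketch below uses a simple Gaussian low-pass filter chosen arbitrarily for illustration; it is not the AFF or ASSF filtering proposed in the paper.

    ```python
    import numpy as np

    def deconvolve_irf(tissue_curve, aif, dt, cutoff_hz=0.15):
        """Recover the flow-scaled impulse response function by FFT division.

        tissue_curve, aif: concentration-time curves on the same time grid;
        dt: sampling interval in seconds.  A Gaussian low-pass filter
        (cutoff chosen arbitrarily here) regularizes the division.
        """
        n = len(tissue_curve)
        freqs = np.fft.rfftfreq(n, d=dt)
        spec_tissue = np.fft.rfft(tissue_curve)
        spec_aif = np.fft.rfft(aif) * dt               # discrete convolution scale
        lowpass = np.exp(-0.5 * (freqs / cutoff_hz) ** 2)
        irf = np.fft.irfft(lowpass * spec_tissue / (spec_aif + 1e-12), n)
        cbf = irf.max()                                # blood flow ~ peak of the IRF
        cbv = irf.sum() * dt                           # blood volume ~ area of the IRF
        mtt = cbv / cbf if cbf > 0 else float("nan")   # mean transit time
        return irf, cbf, cbv, mtt

    # Synthetic example: recover a known exponential IRF (dt = 1 s)
    t = np.arange(0, 60, 1.0)
    aif = t ** 3 * np.exp(-t / 3.0)                    # gamma-variate-like AIF
    true_irf = 0.6 * np.exp(-t / 8.0)                  # flow-scaled residue function
    tissue = np.convolve(aif, true_irf)[: len(t)] * 1.0
    irf, cbf, cbv, mtt = deconvolve_irf(tissue, aif, dt=1.0)
    print(round(cbf, 2), round(mtt, 1))
    ```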

  1. Centrifuge: rapid and sensitive classification of metagenomic sequences

    PubMed Central

    Song, Li; Breitwieser, Florian P.

    2016-01-01

    Centrifuge is a novel microbial classification engine that enables rapid, accurate, and sensitive labeling of reads and quantification of species on desktop computers. The system uses an indexing scheme based on the Burrows-Wheeler transform (BWT) and the Ferragina-Manzini (FM) index, optimized specifically for the metagenomic classification problem. Centrifuge requires a relatively small index (4.2 GB for 4078 bacterial and 200 archaeal genomes) and classifies sequences at very high speed, allowing it to process the millions of reads from a typical high-throughput DNA sequencing run within a few minutes. Together, these advances enable timely and accurate analysis of large metagenomics data sets on conventional desktop computers. Because of its space-optimized indexing schemes, Centrifuge also makes it possible to index the entire NCBI nonredundant nucleotide sequence database (a total of 109 billion bases) with an index size of 69 GB, in contrast to k-mer-based indexing schemes, which require far more extensive space. PMID:27852649
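
    The BWT/FM-index machinery that Centrifuge builds on can be shown in miniature: build the Burrows-Wheeler transform of a reference, then count pattern occurrences by backward search, one character of the query at a time. The toy code below (naive suffix-array construction, no compression, occurrence counts recomputed on the fly) only illustrates the principle; it is nothing like Centrifuge's space-optimized index.

    ```python
    def bwt_index(text):
        """Naive FM-index pieces for a small reference string."""
        text += "$"                                    # unique terminator
        sa = sorted(range(len(text)), key=lambda i: text[i:])
        bwt = "".join(text[i - 1] for i in sa)         # last column of sorted rotations
        # C[c] = number of characters in the text strictly smaller than c
        c_table, total = {}, 0
        for ch in sorted(set(text)):
            c_table[ch] = total
            total += text.count(ch)
        return bwt, c_table

    def occ(bwt, ch, i):
        """Occurrences of ch in bwt[:i] (a real index precomputes this)."""
        return bwt[:i].count(ch)

    def count_matches(bwt, c_table, query):
        """Backward search: number of times query occurs in the reference."""
        lo, hi = 0, len(bwt)
        for ch in reversed(query):
            if ch not in c_table:
                return 0
            lo = c_table[ch] + occ(bwt, ch, lo)
            hi = c_table[ch] + occ(bwt, ch, hi)
            if lo >= hi:
                return 0
        return hi - lo

    bwt, c_table = bwt_index("ACGTACGTAC")
    print(count_matches(bwt, c_table, "ACGT"))   # 2
    print(count_matches(bwt, c_table, "GTA"))    # 2
    print(count_matches(bwt, c_table, "AAA"))    # 0
    ```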

  2. A portable smart phone-based plasmonic nanosensor readout platform that measures transmitted light intensities of nanosubstrates using an ambient light sensor.

    PubMed

    Fu, Qiangqiang; Wu, Ze; Xu, Fangxiang; Li, Xiuqing; Yao, Cuize; Xu, Meng; Sheng, Liangrong; Yu, Shiting; Tang, Yong

    2016-05-21

    Plasmonic nanosensors may be used as tools for diagnostic testing in the field of medicine. However, quantification of plasmonic nanosensors often requires complex and bulky readout instruments. Here, we report the development of a portable smart phone-based plasmonic nanosensor readout platform (PNRP) for accurate quantification of plasmonic nanosensors. This device operates by transmitting excitation light from a LED through a nanosubstrate and measuring the intensity of the transmitted light using the ambient light sensor of a smart phone. The device is a cylinder with a diameter of 14 mm, a length of 38 mm, and a gross weight of 3.5 g. We demonstrated the utility of this smart phone-based PNRP by measuring two well-established plasmonic nanosensors with this system. In the first experiment, the device measured the morphology changes of triangular silver nanoprisms (AgNPRs) in an immunoassay for the detection of carcinoembryonic antigen (CEA). In the second experiment, the device measured the aggregation of gold nanoparticles (AuNPs) in an aptamer-based assay for the detection of adenosine triphosphate (ATP). The results from the smart phone-based PNRP were consistent with those from commercial spectrophotometers, demonstrating that the smart phone-based PNRP enables accurate quantification of plasmonic nanosensors.
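
    In outline, the readout reduces to comparing the light transmitted through a sample well against a blank and mapping the resulting absorbance-like value to concentration through a calibration curve. The sketch below shows that generic arithmetic; the lux readings and calibration points are invented, and this is not the authors' app or firmware.

    ```python
    import numpy as np

    def absorbance(sample_lux, blank_lux):
        """Beer-Lambert style absorbance from transmitted light intensities."""
        return -np.log10(sample_lux / blank_lux)

    # Invented calibration: absorbance measured for known standards (ng/mL)
    standards_conc = np.array([0.0, 5.0, 10.0, 20.0, 40.0])
    standards_abs = np.array([0.02, 0.10, 0.19, 0.37, 0.74])
    slope, intercept = np.polyfit(standards_conc, standards_abs, 1)

    # Unknown sample read against a blank well
    a_sample = absorbance(sample_lux=412.0, blank_lux=640.0)
    print("Estimated concentration:", (a_sample - intercept) / slope, "ng/mL")
    ```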

  3. STEM VQ Method, Using Scanning Transmission Electron Microscopy (STEM) for Accurate Virus Quantification

    DTIC Science & Technology

    2017-02-02

    Accurate virus quantification is sought, but a perfect method still eludes the scientific community. Electron ... provides morphology data and counts all viral particles, including partial or noninfectious particles; however, EM methods ... consistent, reproducible virus quantification method called Scanning Transmission Electron Microscopy – Virus Quantification (STEM-VQ), which simplifies ...

  4. The Rényi divergence enables accurate and precise cluster analysis for localisation microscopy.

    PubMed

    Staszowska, Adela D; Fox-Roberts, Patrick; Hirvonen, Liisa M; Peddie, Christopher J; Collinson, Lucy M; Jones, Gareth E; Cox, Susan

    2018-06-01

    Clustering analysis is a key technique for quantitatively characterising structures in localisation microscopy images. To build up accurate information about biological structures, it is critical that the quantification is both accurate (close to the ground truth) and precise (has small scatter and is reproducible). Here we describe how the Rényi divergence can be used for cluster radius measurements in localisation microscopy data. We demonstrate that the Rényi divergence can operate with high levels of background and provides results which are more accurate than Ripley's functions, Voronoi tessellation or DBSCAN. Data supporting this research will be made accessible via a web link. Software code developed for this work can be accessed via http://coxphysics.com/Renyi_divergence_software.zip. Implemented in C++. Correspondence and requests for materials can also be addressed to the corresponding author: adela.staszowska@gmail.com or susan.cox@kcl.ac.uk. Supplementary data are available at Bioinformatics online.
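
    For reference, the Rényi divergence of order α between discrete distributions P and Q is D_α(P||Q) = (1/(α - 1)) · ln Σ p_i^α q_i^(1-α). The sketch below evaluates it for two histograms; how the paper turns this into a cluster-radius measurement is more involved and not reproduced here.

    ```python
    import numpy as np

    def renyi_divergence(p, q, alpha=2.0, eps=1e-12):
        """Renyi divergence D_alpha(P || Q) of two discrete distributions."""
        if alpha == 1.0:
            raise ValueError("alpha = 1 is the Kullback-Leibler limit; handle separately.")
        p = np.asarray(p, dtype=float) + eps
        q = np.asarray(q, dtype=float) + eps
        p /= p.sum()
        q /= q.sum()
        return float(np.log(np.sum(p ** alpha * q ** (1.0 - alpha))) / (alpha - 1.0))

    # Histograms of localisations (invented): clustered vs. uniform background
    clustered = [40, 35, 10, 5, 5, 5]
    uniform = [1, 1, 1, 1, 1, 1]
    print(renyi_divergence(clustered, uniform, alpha=2.0))
    ```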

  5. 1H NMR quantification in very dilute toxin solutions: application to anatoxin-a analysis.

    PubMed

    Dagnino, Denise; Schripsema, Jan

    2005-08-01

    A complete procedure is described for the extraction, detection and quantification of anatoxin-a in biological samples. Anatoxin-a is extracted from biomass by a routine acid-base extraction. The extract is analysed by GC-MS, without the need for derivatization, with a detection limit of 0.5 ng. A method was developed for the accurate quantification of anatoxin-a in the standard solution to be used for the calibration of the GC analysis. 1H NMR allowed the accurate quantification of microgram quantities of anatoxin-a. The accurate quantification of compounds in standard solutions is rarely discussed, but for compounds like anatoxin-a (toxins priced in the range of a million dollars per gram), of which generally only milligram quantities or less are available, this factor in the quantitative analysis is certainly not trivial. The method that was developed can easily be adapted for the accurate quantification of other toxins in very dilute solutions.
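
    The quantitative 1H NMR step rests on signal area being proportional to concentration times the number of contributing protons, so an analyte concentration follows from its integral ratio against an internal standard of known concentration. The worked sketch below uses invented integrals and proton counts; it is the generic qNMR relation, not the paper's specific standard or signals.

    ```python
    def qnmr_concentration(conc_std_mM, integral_analyte, integral_std,
                           protons_analyte, protons_std):
        """Molar concentration of an analyte from 1H NMR integrals.

        Signal area is proportional to (concentration x number of protons),
        so  c_analyte = c_std * (I_a / I_std) * (N_std / N_a).
        """
        return conc_std_mM * (integral_analyte / integral_std) * (protons_std / protons_analyte)

    # Invented example: internal standard at 1.00 mM, a 3-proton standard signal
    # integrating to 3.00, and a 1-proton analyte signal integrating to 0.12.
    print(qnmr_concentration(1.00, 0.12, 3.00, protons_analyte=1, protons_std=3))  # 0.12 mM
    ```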

  6. Simultaneous quantification of alternatively spliced transcripts in a single droplet digital PCR reaction.

    PubMed

    Sun, Bing; Tao, Lian; Zheng, Yun-Ling

    2014-06-01

    Human telomerase reverse transcriptase (hTERT) is an essential component required for telomerase activity and telomere maintenance. Several alternatively spliced forms of hTERT mRNA have been reported in human primary and tumor cells. Currently, however, there is no sensitive and accurate method for the simultaneous quantification of multiple alternatively spliced RNA transcripts, such as in the case of hTERT. Here we show droplet digital PCR (ddPCR) provides sensitive, simultaneous digital quantification in a single reaction of two alternatively spliced single deletion hTERT transcripts (α-/β+ and α+/β-) as well as the opportunity to manually quantify non-deletion (α+/β+) and double deletion (α-/β-) transcripts. Our ddPCR method enables direct comparison among four alternatively spliced mRNAs without the need for internal standards or multiple primer pairs specific for each variant as real-time PCR (qPCR) requires, thus eliminating potential variation due to differences in PCR amplification efficiency.

  7. Droplet Digital Enzyme-Linked Oligonucleotide Hybridization Assay for Absolute RNA Quantification.

    PubMed

    Guan, Weihua; Chen, Liben; Rane, Tushar D; Wang, Tza-Huei

    2015-09-03

    We present a continuous-flow droplet-based digital Enzyme-Linked Oligonucleotide Hybridization Assay (droplet digital ELOHA) for sensitive detection and absolute quantification of RNA molecules. Droplet digital ELOHA incorporates direct hybridization and single enzyme reaction via the formation of single probe-RNA-probe (enzyme) complex on magnetic beads. It enables RNA detection without reverse transcription and PCR amplification processes. The magnetic beads are subsequently encapsulated into a large number of picoliter-sized droplets with enzyme substrates in a continuous-flow device. This device is capable of generating droplets at high-throughput. It also integrates in-line enzymatic incubation and detection of fluorescent products. Our droplet digital ELOHA is able to accurately quantify (differentiate 40% difference) as few as ~600 RNA molecules in a 1 mL sample (equivalent to 1 aM or lower) without molecular replication. The absolute quantification ability of droplet digital ELOHA is demonstrated with the analysis of clinical Neisseria gonorrhoeae 16S rRNA to show its potential value in real complex samples.

  8. Droplet Digital Enzyme-Linked Oligonucleotide Hybridization Assay for Absolute RNA Quantification

    PubMed Central

    Guan, Weihua; Chen, Liben; Rane, Tushar D.; Wang, Tza-Huei

    2015-01-01

    We present a continuous-flow droplet-based digital Enzyme-Linked Oligonucleotide Hybridization Assay (droplet digital ELOHA) for sensitive detection and absolute quantification of RNA molecules. Droplet digital ELOHA incorporates direct hybridization and single enzyme reaction via the formation of single probe-RNA-probe (enzyme) complex on magnetic beads. It enables RNA detection without reverse transcription and PCR amplification processes. The magnetic beads are subsequently encapsulated into a large number of picoliter-sized droplets with enzyme substrates in a continuous-flow device. This device is capable of generating droplets at high-throughput. It also integrates in-line enzymatic incubation and detection of fluorescent products. Our droplet digital ELOHA is able to accurately quantify (differentiate 40% difference) as few as ~600 RNA molecules in a 1 mL sample (equivalent to 1 aM or lower) without molecular replication. The absolute quantification ability of droplet digital ELOHA is demonstrated with the analysis of clinical Neisseria gonorrhoeae 16S rRNA to show its potential value in real complex samples. PMID:26333806

  9. Implementation of an iPhone wireless accelerometer application for the quantification of reflex response.

    PubMed

    LeMoyne, Robert; Mastroianni, Timothy; Grundfest, Warren; Nishikawa, Kiisa

    2013-01-01

    The patellar tendon reflex represents an inherent aspect of the standard neurological evaluation. The features of the reflex response provide initial perspective regarding the status of the nervous system. An iPhone wireless accelerometer application integrated with a potential energy impact pendulum attached to a reflex hammer has been successfully developed, tested, and evaluated for quantifying the patellar tendon reflex. The iPhone functions as a wireless accelerometer platform. The wide coverage range of the iPhone enables the quantification of reflex response samples in rural and remote settings. The iPhone has the capacity to transmit the reflex response acceleration waveform by wireless transmission through email. Automated post-processing of the acceleration waveform provides feature extraction of the maximum acceleration of the reflex response ascertained after evoking the patellar tendon reflex. The iPhone wireless accelerometer application demonstrated the utility of the smartphone as a biomedical device, while providing accurate and consistent quantification of the reflex response.

  10. Droplet Digital Enzyme-Linked Oligonucleotide Hybridization Assay for Absolute RNA Quantification

    NASA Astrophysics Data System (ADS)

    Guan, Weihua; Chen, Liben; Rane, Tushar D.; Wang, Tza-Huei

    2015-09-01

    We present a continuous-flow droplet-based digital Enzyme-Linked Oligonucleotide Hybridization Assay (droplet digital ELOHA) for sensitive detection and absolute quantification of RNA molecules. Droplet digital ELOHA incorporates direct hybridization and single enzyme reaction via the formation of single probe-RNA-probe (enzyme) complex on magnetic beads. It enables RNA detection without reverse transcription and PCR amplification processes. The magnetic beads are subsequently encapsulated into a large number of picoliter-sized droplets with enzyme substrates in a continuous-flow device. This device is capable of generating droplets at high-throughput. It also integrates in-line enzymatic incubation and detection of fluorescent products. Our droplet digital ELOHA is able to accurately quantify (differentiate 40% difference) as few as ~600 RNA molecules in a 1 mL sample (equivalent to 1 aM or lower) without molecular replication. The absolute quantification ability of droplet digital ELOHA is demonstrated with the analysis of clinical Neisseria gonorrhoeae 16S rRNA to show its potential value in real complex samples.

  11. Potential for Development of an Escherichia coli—Based Biosensor for Assessing Bioavailable Methionine: A Review

    PubMed Central

    Chalova, Vesela I.; Froelich, Clifford A.; Ricke, Steven C.

    2010-01-01

    Methionine is an essential amino acid for animals and is typically considered one of the first limiting amino acids in animal feed formulations. Methionine deficiency or excess in animal diets can lead to sub-optimal animal performance and increased environmental pollution, which necessitates its accurate quantification and proper dosage in animal rations. Animal bioassays are the current industry standard to quantify methionine bioavailability. However, animal-based assays are not only time consuming, but expensive and are becoming more scrutinized by governmental regulations. In addition, a variety of artifacts can hinder the variability and time efficacy of these assays. Microbiological assays, which are based on a microbial response to external supplementation of a particular nutrient such as methionine, appear to be attractive potential alternatives to the already established standards. They are rapid and inexpensive in vitro assays characterized by relatively accurate and consistent estimation of digestible methionine in feeds and feed ingredients. The current review discusses the potential to develop Escherichia coli-based microbial biosensors for methionine bioavailability quantification. Methionine biosynthesis and regulation pathways are overviewed in relation to the genetic manipulation required for the generation of a respective methionine auxotroph that could be practical for a routine bioassay. A prospective utilization of an Escherichia coli methionine biosensor would allow for inexpensive and rapid methionine quantification and ultimately enable timely assessment of nutritional profiles of feedstuffs. PMID:22319312

  12. Uncertainty quantification and validation of 3D lattice scaffolds for computer-aided biomedical applications.

    PubMed

    Gorguluarslan, Recep M; Choi, Seung-Kyum; Saldana, Christopher J

    2017-07-01

    A methodology is proposed for uncertainty quantification and validation to accurately predict the mechanical response of lattice structures used in the design of scaffolds. Effective structural properties of the scaffolds are characterized using a developed multi-level stochastic upscaling process that propagates the quantified uncertainties at strut level to the lattice structure level. To obtain realistic simulation models for the stochastic upscaling process and minimize the experimental cost, high-resolution finite element models of individual struts were reconstructed from the micro-CT scan images of lattice structures which are fabricated by selective laser melting. The upscaling method facilitates the process of determining homogenized strut properties to reduce the computational cost of the detailed simulation model for the scaffold. Bayesian Information Criterion is utilized to quantify the uncertainties with parametric distributions based on the statistical data obtained from the reconstructed strut models. A systematic validation approach that can minimize the experimental cost is also developed to assess the predictive capability of the stochastic upscaling method used at the strut level and lattice structure level. In comparison with physical compression test results, the proposed methodology of linking the uncertainty quantification with the multi-level stochastic upscaling method enabled an accurate prediction of the elastic behavior of the lattice structure with minimal experimental cost by accounting for the uncertainties induced by the additive manufacturing process. Copyright © 2017 Elsevier Ltd. All rights reserved.
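
    The distribution-selection step can be sketched generically: fit several candidate distributions to the strut-level data by maximum likelihood and keep the one with the lowest BIC = k·ln(n) - 2·ln L. The candidate set and the simulated moduli below are assumptions for illustration, not the authors' data or choices.

    ```python
    import numpy as np
    from scipy import stats

    def best_distribution_by_bic(data, candidates=(stats.norm, stats.lognorm, stats.weibull_min)):
        """Fit candidate distributions by maximum likelihood and rank them by BIC."""
        data = np.asarray(data, dtype=float)
        n = len(data)
        results = []
        for dist in candidates:
            params = dist.fit(data)                        # MLE fit
            loglik = np.sum(dist.logpdf(data, *params))
            k = len(params)                                # number of fitted parameters
            bic = k * np.log(n) - 2.0 * loglik
            results.append((bic, dist.name, params))
        return sorted(results)                             # lowest BIC first

    # Invented strut elastic moduli (GPa) standing in for micro-CT-derived data
    rng = np.random.default_rng(0)
    moduli = rng.lognormal(mean=0.5, sigma=0.15, size=60)
    for bic, name, _ in best_distribution_by_bic(moduli):
        print(f"{name:12s}  BIC = {bic:.1f}")
    ```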

  13. Centrifuge: rapid and sensitive classification of metagenomic sequences.

    PubMed

    Kim, Daehwan; Song, Li; Breitwieser, Florian P; Salzberg, Steven L

    2016-12-01

    Centrifuge is a novel microbial classification engine that enables rapid, accurate, and sensitive labeling of reads and quantification of species on desktop computers. The system uses an indexing scheme based on the Burrows-Wheeler transform (BWT) and the Ferragina-Manzini (FM) index, optimized specifically for the metagenomic classification problem. Centrifuge requires a relatively small index (4.2 GB for 4078 bacterial and 200 archaeal genomes) and classifies sequences at very high speed, allowing it to process the millions of reads from a typical high-throughput DNA sequencing run within a few minutes. Together, these advances enable timely and accurate analysis of large metagenomics data sets on conventional desktop computers. Because of its space-optimized indexing schemes, Centrifuge also makes it possible to index the entire NCBI nonredundant nucleotide sequence database (a total of 109 billion bases) with an index size of 69 GB, in contrast to k-mer-based indexing schemes, which require far more extensive space. © 2016 Kim et al.; Published by Cold Spring Harbor Laboratory Press.

  14. In-line monitoring of cocrystallization process and quantification of carbamazepine-nicotinamide cocrystal using Raman spectroscopy and chemometric tools.

    PubMed

    Soares, Frederico L F; Carneiro, Renato L

    2017-06-05

    A cocrystallization process may involve several molecular species, which are generally solid under ambient conditions. Thus, accurate monitoring of the different components that might appear during the reaction is necessary, as well as quantification of the final product. This work reports for the first time the synthesis of carbamazepine-nicotinamide cocrystal in aqueous media with full conversion. The reactions were monitored by Raman spectroscopy coupled with Multivariate Curve Resolution - Alternating Least Squares (MCR-ALS), and the quantification of the final product among its coformers was performed using Raman spectroscopy and Partial Least Squares (PLS) regression. The slurry reactions were carried out under four different conditions: room temperature, 40°C, 60°C and 80°C. The slurry reaction at 80°C enabled full conversion of the initial substrates into the cocrystal form, using water as solvent for a greener method. MCR-ALS coupled with Raman spectroscopy made it possible to observe the main steps of the reaction, such as drug dissolution, nucleation and crystallization of the cocrystal. The PLS models gave mean errors of cross validation around 2.0 (% wt/wt) and errors of validation between 2.5 and 8.2 (% wt/wt) for all components. These are good results, since the spectra of the cocrystal and of the physical mixture of the coformers share some similar peaks. Copyright © 2017 Elsevier B.V. All rights reserved.
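
    A bare-bones version of the PLS quantification step, regressing cocrystal content on Raman spectra, is sketched below with scikit-learn as a generic illustration; the spectra are random placeholders rather than real Raman data, and the spectral preprocessing the authors applied is omitted.

    ```python
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.model_selection import cross_val_predict

    # Placeholder calibration set: 20 "spectra" (500 wavenumber points each)
    # with known cocrystal content in % wt/wt.  Real spectra would go here.
    rng = np.random.default_rng(1)
    content = np.linspace(0, 100, 20)                     # % wt/wt cocrystal
    spectra = rng.normal(size=(20, 500)) + content[:, None] * 0.01

    pls = PLSRegression(n_components=3)
    predicted = cross_val_predict(pls, spectra, content, cv=5).ravel()
    rmsecv = np.sqrt(np.mean((predicted - content) ** 2))
    print(f"RMSECV = {rmsecv:.1f} % wt/wt")

    # Fit on the full calibration set and predict an unknown spectrum
    pls.fit(spectra, content)
    unknown = rng.normal(size=(1, 500)) + 0.55            # placeholder spectrum
    print("Predicted content:", float(pls.predict(unknown)[0, 0]), "% wt/wt")
    ```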

  15. In-line monitoring of cocrystallization process and quantification of carbamazepine-nicotinamide cocrystal using Raman spectroscopy and chemometric tools

    NASA Astrophysics Data System (ADS)

    Soares, Frederico L. F.; Carneiro, Renato L.

    2017-06-01

    A cocrystallization process may involve several molecular species, which are generally solid under ambient conditions. Thus, accurate monitoring of the different components that might appear during the reaction is necessary, as well as quantification of the final product. This work reports for the first time the synthesis of carbamazepine-nicotinamide cocrystal in aqueous media with full conversion. The reactions were monitored by Raman spectroscopy coupled with Multivariate Curve Resolution - Alternating Least Squares (MCR-ALS), and the quantification of the final product among its coformers was performed using Raman spectroscopy and Partial Least Squares (PLS) regression. The slurry reactions were carried out under four different conditions: room temperature, 40 °C, 60 °C and 80 °C. The slurry reaction at 80 °C enabled full conversion of the initial substrates into the cocrystal form, using water as solvent for a greener method. MCR-ALS coupled with Raman spectroscopy made it possible to observe the main steps of the reaction, such as drug dissolution, nucleation and crystallization of the cocrystal. The PLS models gave mean errors of cross validation around 2.0 (% wt/wt) and errors of validation between 2.5 and 8.2 (% wt/wt) for all components. These are good results, since the spectra of the cocrystal and of the physical mixture of the coformers share some similar peaks.

  16. Quantification of differential gene expression by multiplexed targeted resequencing of cDNA

    PubMed Central

    Arts, Peer; van der Raadt, Jori; van Gestel, Sebastianus H.C.; Steehouwer, Marloes; Shendure, Jay; Hoischen, Alexander; Albers, Cornelis A.

    2017-01-01

    Whole-transcriptome or RNA sequencing (RNA-Seq) is a powerful and versatile tool for functional analysis of different types of RNA molecules, but sample reagent and sequencing cost can be prohibitive for hypothesis-driven studies where the aim is to quantify differential expression of a limited number of genes. Here we present an approach for quantification of differential mRNA expression by targeted resequencing of complementary DNA using single-molecule molecular inversion probes (cDNA-smMIPs) that enable highly multiplexed resequencing of cDNA target regions of ∼100 nucleotides and counting of individual molecules. We show that accurate estimates of differential expression can be obtained from molecule counts for hundreds of smMIPs per reaction and that smMIPs are also suitable for quantification of relative gene expression and allele-specific expression. Compared with low-coverage RNA-Seq and a hybridization-based targeted RNA-Seq method, cDNA-smMIPs are a cost-effective high-throughput tool for hypothesis-driven expression analysis in large numbers of genes (10 to 500) and samples (hundreds to thousands). PMID:28474677

  17. Real-time PCR for differential quantification of CVI988 vaccine virus and virulent strains of Marek’s disease virus

    PubMed Central

    Baigent, Susan J.; Nair, Venugopal K.; Le Galludec, Hervé

    2016-01-01

    CVI988/Rispens vaccine, the ‘gold standard’ vaccine against Marek’s disease in poultry, is not easily distinguishable from virulent strains of Marek’s disease herpesvirus (MDV). Accurate differential measurement of CVI988 and virulent MDV is commercially important to confirm successful vaccination, to diagnose Marek’s disease, and to investigate causes of vaccine failure. A real-time quantitative PCR assay to distinguish CVI988 and virulent MDV based on a consistent single nucleotide polymorphism in the pp38 gene, was developed, optimised and validated using common primers to amplify both viruses, but differential detection of PCR products using two short probes specific for either CVI988 or virulent MDV. Both probes showed perfect specificity for three commercial preparations of CVI988 and 12 virulent MDV strains. Validation against BAC-sequence-specific and US2-sequence-specific q-PCR, on spleen samples from experimental chickens co-infected with BAC-cloned pCVI988 and wild-type virulent MDV, demonstrated that CVI988 and virulent MDV could be quantified very accurately. The assay was then used to follow kinetics of replication of commercial CVI988 and virulent MDV in feather tips and blood of vaccinated and challenged experimental chickens. The assay is a great improvement in enabling accurate differential quantification of CVI988 and virulent MDV over a biologically relevant range of virus levels. PMID:26973285

  18. Three-dimensional Hessian matrix-based quantitative vascular imaging of rat iris with optical-resolution photoacoustic microscopy in vivo

    NASA Astrophysics Data System (ADS)

    Zhao, Huangxuan; Wang, Guangsong; Lin, Riqiang; Gong, Xiaojing; Song, Liang; Li, Tan; Wang, Wenjia; Zhang, Kunya; Qian, Xiuqing; Zhang, Haixia; Li, Lin; Liu, Zhicheng; Liu, Chengbo

    2018-04-01

    For the diagnosis and evaluation of ophthalmic diseases, imaging and quantitative characterization of the vasculature in the iris are very important. The recently developed photoacoustic imaging, which is ultrasensitive in imaging endogenous hemoglobin molecules, provides a highly efficient label-free method for imaging blood vasculature in the iris. However, advanced vascular quantification algorithms are still needed to enable accurate characterization of the underlying vasculature. We have developed a vascular information quantification algorithm based on a three-dimensional (3-D) Hessian matrix and applied it to processing iris vasculature images obtained with a custom-built optical-resolution photoacoustic imaging system (OR-PAM). For the first time, we demonstrate in vivo 3-D vascular structures of a rat iris with a label-free imaging method and also accurately extract quantitative vascular information, such as vessel diameter, vascular density, and vascular tortuosity. Our results indicate that the developed algorithm is capable of quantifying the vasculature in 3-D photoacoustic images of the iris in vivo, thus enhancing the diagnostic capability of the OR-PAM system for vascular-related ophthalmic diseases.
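
    Hessian-based vessel quantification of this kind is often built on a vesselness filter such as Frangi's, which scores each voxel from the eigenvalues of the local Hessian across scales. The sketch below applies scikit-image's frangi filter to a placeholder volume; it illustrates the general approach only, not the authors' 3-D algorithm or their diameter, density, and tortuosity measurements.

    ```python
    import numpy as np
    from skimage.filters import frangi   # needs a scikit-image version with nD support

    # Placeholder 3-D photoacoustic volume containing one bright "vessel" along x.
    volume = np.zeros((64, 64, 64), dtype=float)
    volume[:, 30:33, 30:33] = 1.0
    volume += np.random.default_rng(2).normal(scale=0.05, size=volume.shape)

    # Frangi vesselness: high response for bright tubular structures.
    vesselness = frangi(volume, sigmas=range(1, 4), black_ridges=False)

    # Crude quantification: vascular density as the fraction of "vessel" voxels.
    mask = vesselness > 0.5 * vesselness.max()
    print("Vascular density (fraction of voxels):", mask.mean())
    ```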

  19. Quantification of the progression of CMV infection as observed from retinal angiograms in patients with AIDS

    NASA Astrophysics Data System (ADS)

    Brahmi, Djamel; Cassoux, Nathalie; Serruys, Camille; Giron, Alain; Lehoang, Phuc; Fertil, Bernard

    1999-05-01

    To support ophthalmologists in their daily routine and enable quantitative assessment of the progression of Cytomegalovirus (CMV) infection as observed on series of retinal angiograms, a methodology allowing an accurate comparison of retinal borders has been developed. In order to evaluate the accuracy of borders, ophthalmologists were asked to repeatedly outline boundaries between infected and noninfected areas. Accuracy of drawing relies on local features such as contrast, quality of image, background..., all factors which make the boundaries more or less perceptible from one part of an image to another. In order to estimate the accuracy of retinal borders directly from image analysis, an artificial neural network (a succession of unsupervised and supervised neural networks) was designed to correlate accuracy of drawing (as calculated from ophthalmologists' hand-outlines) with local features of the underlying image. Our method has been applied to the quantification of CMV retinitis. It is shown that border accuracy is properly predicted and characterized by a confidence envelope that allows, after a registration phase based on fixed landmarks such as vessel forks, accurate assessment of the evolution of CMV infection.

  20. Slow Off-Rate Modified Aptamer (SOMAmer) as a Novel Reagent in Immunoassay Development for Accurate Soluble Glypican-3 Quantification in Clinical Samples.

    PubMed

    Duo, Jia; Chiriac, Camelia; Huang, Richard Y-C; Mehl, John; Chen, Guodong; Tymiak, Adrienne; Sabbatini, Peter; Pillutla, Renuka; Zhang, Yan

    2018-04-17

    Accurate quantification of soluble glypican-3 in clinical samples using immunoassays is challenging, because of the lack of appropriate antibody reagents to provide a full spectrum measurement of all potential soluble glypican-3 fragments in vivo. Glypican-3 SOMAmer (slow off-rate modified aptamer) is a novel reagent that binds, with high affinity, to a far distinct epitope of glypican-3, when compared to all available antibody reagents generated in-house. This paper describes an integrated analytical approach to rational selection of key reagents based on molecular characterization by epitope mapping, with the focus on our work using a SOMAmer as a new reagent to address development challenges with traditional antibody reagents for the soluble glypican-3 immunoassay. A qualified SOMAmer-based assay was developed and used for soluble glypican-3 quantification in hepatocellular carcinoma (HCC) patient samples. The assay demonstrated good sensitivity, accuracy, and precision. Data correlated with those obtained using the traditional antibody-based assay were used to confirm the clinically relevant soluble glypican-3 forms in vivo. This result was reinforced by a liquid chromatography tandem mass spectrometry (LC-MS/MS) assay quantifying signature peptides generated from trypsin digestion. The work presented here offers an integrated strategy for qualifying aptamers as an alternative affinity platform for immunoassay reagents that can enable speedy assay development, especially when traditional antibody reagents cannot meet assay requirements.

  1. Visualization of endothelial cell cycle dynamics in mouse using the Flt-1/eGFP-anillin system.

    PubMed

    Herz, Katia; Becker, Alexandra; Shi, Chenyue; Ema, Masatsugo; Takahashi, Satoru; Potente, Michael; Hesse, Michael; Fleischmann, Bernd K; Wenzel, Daniela

    2018-05-01

    Endothelial cell proliferation is a key process during vascular growth, but its kinetics could so far only be assessed in vitro or ex vivo. To enable the monitoring and quantification of cell cycle kinetics in vivo, we have generated transgenic mice expressing an eGFP-anillin construct under control of the endothelial-specific Flt-1 promoter. This construct labels the nuclei of endothelial cells in late G1, S and G2 phase and changes its localization during the different stages of M phase, thereby enabling the monitoring of EC proliferation and cytokinesis. In Flt-1/eGFP-anillin mice, we found eGFP+ signals specifically in Ki67+/PECAM+ endothelial cells during vascular development. Quantification using this cell cycle reporter in embryos revealed a decline in endothelial cell proliferation between E9.5 and E12.5. By time-lapse microscopy, we determined the length of different cell cycle phases in embryonic endothelial cells in vivo and found an M phase duration of about 80 min, with 2/3 covering karyokinesis and 1/3 cytokinesis. Thus, we have generated a versatile transgenic system for the accurate assessment of endothelial cell cycle dynamics in vitro and in vivo.

  2. Ultrasensitive multiplex optical quantification of bacteria in large samples of biofluids

    PubMed Central

    Pazos-Perez, Nicolas; Pazos, Elena; Catala, Carme; Mir-Simon, Bernat; Gómez-de Pedro, Sara; Sagales, Juan; Villanueva, Carlos; Vila, Jordi; Soriano, Alex; García de Abajo, F. Javier; Alvarez-Puebla, Ramon A.

    2016-01-01

    Efficient treatments in bacterial infections require the fast and accurate recognition of pathogens, with concentrations as low as one per milliliter in the case of septicemia. Detecting and quantifying bacteria in such low concentrations is challenging and typically demands cultures of large samples of blood (~1 milliliter) extending over 24–72 hours. This delay seriously compromises the health of patients. Here we demonstrate a fast microorganism optical detection system for the exhaustive identification and quantification of pathogens in volumes of biofluids with clinical relevance (~1 milliliter) in minutes. We drive each type of bacteria to accumulate antibody functionalized SERS-labelled silver nanoparticles. Particle aggregation on the bacteria membranes renders dense arrays of inter-particle gaps in which the Raman signal is exponentially amplified by several orders of magnitude relative to the dispersed particles. This enables a multiplex identification of the microorganisms through the molecule-specific spectral fingerprints. PMID:27364357

  3. Automated Protein Biomarker Analysis: on-line extraction of clinical samples by Molecularly Imprinted Polymers

    NASA Astrophysics Data System (ADS)

    Rossetti, Cecilia; Świtnicka-Plak, Magdalena A.; Grønhaug Halvorsen, Trine; Cormack, Peter A. G.; Sellergren, Börje; Reubsaet, Léon

    2017-03-01

    Robust biomarker quantification is essential for the accurate diagnosis of diseases and is of great value in cancer management. In this paper, an innovative diagnostic platform is presented which provides automated molecularly imprinted solid-phase extraction (MISPE) followed by liquid chromatography-mass spectrometry (LC-MS) for biomarker determination, using ProGastrin Releasing Peptide (ProGRP), a highly sensitive biomarker for Small Cell Lung Cancer, as a model. Molecularly imprinted polymer microspheres were synthesized by precipitation polymerization, and analytical optimization of the most promising material led to the development of an automated quantification method for ProGRP. The method enabled analysis of patient serum samples with elevated ProGRP levels. The automated extraction required particularly low sample volumes within a time-efficient workflow, demonstrating the potential of such a strategy in a clinical setting.

  4. Earned Value Management (EVM) Implementation Handbook

    NASA Technical Reports Server (NTRS)

    2013-01-01

    The purpose of this handbook is to provide Earned Value Management (EVM) guidance for the effective application, implementation, and utilization of EVM on NASA programs, projects, major contracts and subcontracts in a consolidated reference document. EVM is a project management process that effectively integrates a project's scope of work with schedule and cost elements for optimum project planning and control. The goal is to achieve timely and accurate quantification of progress that will facilitate management by exception and enable early visibility into the nature and the magnitude of technical problems as well as the intended course and success of corrective actions.
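    For orientation, the core earned-value arithmetic the handbook builds on can be sketched as below; the budget figures are invented and NASA-specific reporting rules are not represented.

    # Standard earned-value indices (illustrative numbers only; not NASA-specific rules).
    def evm_metrics(planned_value, earned_value, actual_cost, budget_at_completion):
        cost_variance = earned_value - actual_cost            # CV
        schedule_variance = earned_value - planned_value      # SV
        cpi = earned_value / actual_cost                      # cost performance index
        spi = earned_value / planned_value                    # schedule performance index
        estimate_at_completion = budget_at_completion / cpi   # simple CPI-based EAC
        return cost_variance, schedule_variance, cpi, spi, estimate_at_completion

    cv, sv, cpi, spi, eac = evm_metrics(planned_value=120.0, earned_value=100.0,
                                        actual_cost=110.0, budget_at_completion=500.0)
    print(f"CV={cv:.1f}  SV={sv:.1f}  CPI={cpi:.2f}  SPI={spi:.2f}  EAC={eac:.1f}")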

  5. Earned Value Management (EVM) Implementation Handbook

    NASA Technical Reports Server (NTRS)

    Terrell, Stefanie M.; Richards, Brad W.

    2018-01-01

    The purpose of this handbook is to provide Earned Value Management (EVM) guidance for the effective application, implementation, and utilization of EVM on NASA programs, projects, major contracts and subcontracts in a consolidated reference document. EVM is a project management process that effectively integrates a project's scope of work with schedule and cost elements for optimum project planning and control. The goal is to achieve timely and accurate quantification of progress that will facilitate management by exception and enable early visibility into the nature and the magnitude of technical problems as well as the intended course and success of corrective actions.

  6. Spectral performance of a whole-body research photon counting detector CT: quantitative accuracy in derived image sets

    NASA Astrophysics Data System (ADS)

    Leng, Shuai; Zhou, Wei; Yu, Zhicong; Halaweish, Ahmed; Krauss, Bernhard; Schmidt, Bernhard; Yu, Lifeng; Kappler, Steffen; McCollough, Cynthia

    2017-09-01

    Photon-counting computed tomography (PCCT) uses a photon counting detector to count individual photons and allocate them to specific energy bins by comparing photon energy to preset thresholds. This enables simultaneous multi-energy CT with a single source and detector. Phantom studies were performed to assess the spectral performance of a research PCCT scanner by assessing the accuracy of derived image sets. Specifically, we assessed the accuracy of iodine quantification in iodine map images and of CT numbers in virtual monoenergetic images (VMI). Vials containing iodine at five known concentrations were scanned on the PCCT scanner after being placed in phantoms representing the attenuation of different patient sizes. For comparison, the same vials and phantoms were also scanned on 2nd and 3rd generation dual-source, dual-energy scanners. After material decomposition, iodine maps were generated, from which the iodine concentration was measured for each vial and phantom size and compared with the known concentration. Additionally, VMIs were generated and CT number accuracy was compared to the reference standard, which was calculated from the known iodine concentration and attenuation coefficients at each keV obtained from the U.S. National Institute of Standards and Technology (NIST). Results showed accurate iodine quantification (root mean square error of 0.5 mgI/cc) and accurate CT numbers in VMIs (percentage error of 8.9%) with the PCCT scanner. The overall performance of the PCCT scanner, in terms of iodine quantification and VMI CT number accuracy, was comparable to that of energy-integrating-detector (EID)-based dual-source, dual-energy scanners.
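    The two accuracy figures quoted above correspond to simple error metrics that can be sketched as follows; the concentrations and CT numbers below are placeholders, not the study's measurements.

    import numpy as np

    # Root-mean-square error of measured vs. known iodine concentrations (placeholders).
    known_iodine = np.array([2.0, 5.0, 10.0, 15.0, 20.0])       # mgI/cc
    measured_iodine = np.array([2.3, 4.6, 10.4, 15.5, 19.4])    # mgI/cc
    rmse = np.sqrt(np.mean((measured_iodine - known_iodine) ** 2))

    # Percentage error of virtual monoenergetic image CT numbers against a reference.
    reference_hu = np.array([50.0, 120.0, 240.0])
    measured_hu = np.array([54.0, 112.0, 255.0])
    pct_error = 100.0 * np.mean(np.abs(measured_hu - reference_hu) / np.abs(reference_hu))

    print(f"iodine RMSE = {rmse:.2f} mgI/cc, VMI CT-number error = {pct_error:.1f}%")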

  7. The antibody-based magnetic microparticle immunoassay using p-FET sensing platform for Alzheimer's disease pathogenic factor

    NASA Astrophysics Data System (ADS)

    Kim, Chang-Beom; Kim, Kwan-Soo; Song, Ki-Bong

    2013-05-01

    The importance of early detection of Alzheimer's disease (AD) has been recognized for diagnosing people at high risk of AD. The existence of intra/extracellular beta-amyloid (Aβ) in brain neurons is regarded as the most archetypal hallmark of AD. Existing computed-image-based neuroimaging tools have limitations in the accurate quantification of nanoscale Aβ peptides due to optical diffraction during imaging. We therefore propose a new method capable of evaluating small amounts of Aβ peptides using a photo-sensitive field-effect transistor (p-FET) integrated with a magnetic force-based microbead collecting platform and a selenium (Se) layer (thickness ~700 nm) as an optical filter. This method demonstrates a facile approach to Aβ quantification using magnetic force and magnetic silica microparticles (diameter 0.2~0.3 μm). The microbead collecting platform consists mainly of the p-FET sensing array and magnets (diameter ~1 mm) placed beneath each sensing region of the p-FET; it enables the assembly of the Aβ-antibody-conjugated microbeads, captures the Aβ peptides from samples, measures the photocurrents generated by the Q-dots tagged to the Aβ peptides, and consequently yields effective Aβ quantification.

  8. Computerized image analysis for quantitative neuronal phenotyping in zebrafish.

    PubMed

    Liu, Tianming; Lu, Jianfeng; Wang, Ye; Campbell, William A; Huang, Ling; Zhu, Jinmin; Xia, Weiming; Wong, Stephen T C

    2006-06-15

    An integrated microscope image analysis pipeline is developed for automatic analysis and quantification of phenotypes in zebrafish with altered expression of Alzheimer's disease (AD)-linked genes. We hypothesize that a slight impairment of neuronal integrity in a large number of zebrafish carrying the mutant genotype can be detected through the computerized image analysis method. Key functionalities of our zebrafish image processing pipeline include quantification of neuron loss in zebrafish embryos due to knockdown of AD-linked genes, automatic detection of defective somites, and quantitative measurement of gene expression levels in zebrafish with altered expression of AD-linked genes or treatment with a chemical compound. These quantitative measurements enable the archival of analyzed results and relevant meta-data. The structured database is organized for statistical analysis and data modeling to better understand neuronal integrity and phenotypic changes of zebrafish under different perturbations. Our results show that the computerized analysis is comparable to manual counting with equivalent accuracy and improved efficacy and consistency. Development of such an automated data analysis pipeline represents a significant step forward to achieve accurate and reproducible quantification of neuronal phenotypes in large scale or high-throughput zebrafish imaging studies.

  9. Evaluation of empirical rule of linearly correlated peptide selection (ERLPS) for proteotypic peptide-based quantitative proteomics.

    PubMed

    Liu, Kehui; Zhang, Jiyang; Fu, Bin; Xie, Hongwei; Wang, Yingchun; Qian, Xiaohong

    2014-07-01

    Precise protein quantification is essential in comparative proteomics. Currently, quantification bias is inevitable when using a proteotypic peptide-based quantitative proteomics strategy because of differences in peptide measurability. To improve quantification accuracy, we proposed an "empirical rule for linearly correlated peptide selection (ERLPS)" in quantitative proteomics in our previous work. However, a systematic evaluation of the general application of ERLPS in quantitative proteomics under diverse experimental conditions needed to be conducted. In this study, the practical workflow of ERLPS is explicitly illustrated; different experimental variables, such as MS systems, sample complexities, sample preparations, elution gradients, matrix effects, loading amounts, and other factors, were comprehensively investigated to evaluate the applicability, reproducibility, and transferability of ERLPS. The results demonstrated that ERLPS was highly reproducible and transferable within appropriate loading amounts and that linearly correlated response peptides should be selected for each specific experiment. ERLPS was applied to proteome samples from yeast to mouse and human, and to quantitative methods from label-free to 18O/16O-labeled and SILAC analysis, and enabled accurate measurements for all proteotypic peptide-based quantitative proteomics over a large dynamic range. © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
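    The rule itself is not reproduced in this record, but the general selection step can be pictured with the sketch below: peptide intensity profiles (placeholder values) are compared with a consensus profile and only linearly correlated response peptides are retained for protein-level quantification. The correlation threshold is an assumption, not the published criterion.

    import numpy as np

    # Rows: proteotypic peptides of one protein; columns: intensities across samples.
    peptide_intensities = np.array([
        [1.0, 2.1, 3.9, 8.2],
        [0.9, 2.0, 4.1, 7.8],
        [1.1, 1.2, 1.0, 1.3],   # poorly responding peptide, should be rejected
    ])

    consensus = np.median(peptide_intensities, axis=0)
    keep = [i for i, profile in enumerate(peptide_intensities)
            if np.corrcoef(profile, consensus)[0, 1] > 0.95]   # assumed threshold

    protein_profile = peptide_intensities[keep].mean(axis=0)
    print("peptides kept:", keep, "protein profile:", protein_profile.round(2))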

  10. Determination of δ-[L-α-aminoadipyl]-L-cysteinyl-D-valine in cell extracts of Penicillium chrysogenum using ion pair-RP-UPLC-MS/MS.

    PubMed

    Seifar, Reza Maleki; Deshmukh, Amit T; Heijnen, Joseph J; van Gulik, Walter M

    2012-01-01

    δ-[L-α-Aminoadipyl]-L-cysteinyl-D-valine (ACV) is a key intermediate in the biosynthesis pathway of penicillins and cephalosporins. Accurate quantification of ACV is therefore relevant, e.g. for kinetic studies on the production of these β-lactam antibiotics. However, accurate quantification of ACV is a challenge, because it is an active thiol compound which, upon exposure to air, can easily react with other thiol compounds to form oxidized disulfides. We found that, during exposure to air, oxidation of ACV occurs both in aqueous standard solutions and in biological samples. Qualitative and quantitative determinations of ACV and the oxidized dimer bis-δ-[L-α-aminoadipyl]-L-cysteinyl-D-valine were carried out using ion pair reversed-phase ultra-high-performance liquid chromatography hyphenated with tandem mass spectrometry (IP-RP-UPLC-MS/MS) as the analytical platform. We show that by applying tris(2-carboxyethyl)phosphine hydrochloride (TCEP) as the reducing reagent, the total amount of ACV can be determined, while using maleimide as the derivatizing reagent enables quantification of only the free reduced form. Copyright © 2012 WILEY‐VCH Verlag GmbH & Co. KGaA, Weinheim.

  11. Coarse-Grained Descriptions of Dynamics for Networks with Both Intrinsic and Structural Heterogeneities

    PubMed Central

    Bertalan, Tom; Wu, Yan; Laing, Carlo; Gear, C. William; Kevrekidis, Ioannis G.

    2017-01-01

    Finding accurate reduced descriptions for large, complex, dynamically evolving networks is a crucial enabler to their simulation, analysis, and ultimately design. Here, we propose and illustrate a systematic and powerful approach to obtaining good collective coarse-grained observables—variables successfully summarizing the detailed state of such networks. Finding such variables can naturally lead to successful reduced dynamic models for the networks. The main premise enabling our approach is the assumption that the behavior of a node in the network depends (after a short initial transient) on the node identity: a set of descriptors that quantify the node properties, whether intrinsic (e.g., parameters in the node evolution equations) or structural (imparted to the node by its connectivity in the particular network structure). The approach creates a natural link with modeling and “computational enabling technology” developed in the context of Uncertainty Quantification. In our case, however, we will not focus on ensembles of different realizations of a problem, each with parameters randomly selected from a distribution. We will instead study many coupled heterogeneous units, each characterized by randomly assigned (heterogeneous) parameter value(s). One could then coin the term Heterogeneity Quantification for this approach, which we illustrate through a model dynamic network consisting of coupled oscillators with one intrinsic heterogeneity (oscillator individual frequency) and one structural heterogeneity (oscillator degree in the undirected network). The computational implementation of the approach, its shortcomings and possible extensions are also discussed. PMID:28659781
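    A toy version of the model network described above, with randomly assigned intrinsic frequencies (intrinsic heterogeneity) and a random coupling graph (structural heterogeneity via node degree), is sketched below; the coarse-graining procedure itself is not reproduced here, and all parameter values are assumptions.

    import numpy as np

    # Toy heterogeneous network of coupled phase oscillators (forward-Euler integration).
    rng = np.random.default_rng(1)
    n = 200
    omega = rng.normal(0.0, 0.5, n)                  # intrinsic heterogeneity: frequencies
    adjacency = np.triu(rng.random((n, n)) < 0.05, 1)
    adjacency = adjacency + adjacency.T              # structural heterogeneity: degrees
    theta = rng.uniform(0.0, 2.0 * np.pi, n)

    coupling, dt = 1.0, 0.01
    for _ in range(5000):
        interaction = (adjacency * np.sin(theta[None, :] - theta[:, None])).sum(axis=1)
        theta += dt * (omega + coupling / n * interaction)

    order_parameter = np.abs(np.exp(1j * theta).mean())
    print(f"mean degree = {adjacency.sum(axis=1).mean():.1f}, "
          f"order parameter = {order_parameter:.2f}")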

  12. Microfluidic-based mini-metagenomics enables discovery of novel microbial lineages from complex environmental samples.

    PubMed

    Yu, Feiqiao Brian; Blainey, Paul C; Schulz, Frederik; Woyke, Tanja; Horowitz, Mark A; Quake, Stephen R

    2017-07-05

    Metagenomics and single-cell genomics have enabled genome discovery from unknown branches of life. However, extracting novel genomes from complex mixtures of metagenomic data can still be challenging and represents an ill-posed problem which is generally approached with ad hoc methods. Here we present a microfluidic-based mini-metagenomic method which offers a statistically rigorous approach to extract novel microbial genomes while preserving single-cell resolution. We used this approach to analyze two hot spring samples from Yellowstone National Park and extracted 29 new genomes, including three deeply branching lineages. The single-cell resolution enabled accurate quantification of genome function and abundance, down to 1% in relative abundance. Our analyses of genome level SNP distributions also revealed low to moderate environmental selection. The scale, resolution, and statistical power of microfluidic-based mini-metagenomics make it a powerful tool to dissect the genomic structure of microbial communities while effectively preserving the fundamental unit of biology, the single cell.

  13. Optimizing total reflection X-ray fluorescence for direct trace element quantification in proteins I: Influence of sample homogeneity and reflector type

    NASA Astrophysics Data System (ADS)

    Wellenreuther, G.; Fittschen, U. E. A.; Achard, M. E. S.; Faust, A.; Kreplin, X.; Meyer-Klaucke, W.

    2008-12-01

    Total reflection X-ray fluorescence (TXRF) is a very promising method for the direct, quick and reliable multi-elemental quantification of trace elements in protein samples. With the introduction of an internal standard consisting of two reference elements, scandium and gallium, a wide range of proteins can be analyzed, regardless of their salt content, buffer composition, additives and amino acid composition. This strategy also enables quantification of matrix effects. Two potential issues associated with drying have been considered in this study: (1) Formation of heterogeneous residues of varying thickness and/or density; and (2) separation of the internal standard and protein during drying (which has to be prevented to allow accurate quantification). These issues were investigated by microbeam X-ray fluorescence (μXRF) with special emphasis on (I) the influence of sample support and (II) the protein / buffer system used. In the first part, a model protein was studied on well established sample supports used in TXRF, PIXE and XRF (Mylar, siliconized quartz, Plexiglas and silicon). In the second part we imaged proteins of different molecular weight, oligomerization state, bound metals and solubility. A partial separation of protein and internal standard was only observed with untreated silicon, suggesting it may not be an adequate support material. Siliconized quartz proved to be the least prone to heterogeneous drying of the sample and yielded the most reliable results.

  14. Accurate quantification of 5 German cockroach (GCr) allergens in complex extracts using multiple reaction monitoring mass spectrometry (MRM MS).

    PubMed

    Mindaye, S T; Spiric, J; David, N A; Rabin, R L; Slater, J E

    2017-12-01

    German cockroach (GCr) allergen extracts are complex and heterogeneous products, and methods to better assess their potency and composition are needed for adequate studies of their safety and efficacy. The objective of this study was to develop an assay based on liquid chromatography and multiple reaction monitoring mass spectrometry (LC-MRM MS) for rapid, accurate, and reproducible quantification of 5 allergens (Bla g 1, Bla g 2, Bla g 3, Bla g 4, and Bla g 5) in crude GCr allergen extracts. We first established a comprehensive peptide library of allergens from various commercial extracts as well as recombinant allergens. Peptide mapping was performed using high-resolution MS, and the peptide library was then used to identify prototypic and quantotypic peptides to proceed with MRM method development. Assay development included a systematic optimization of digestion conditions (buffer, digestion time, and trypsin concentration), chromatographic separation, and MS parameters. Robustness and suitability were assessed following ICH (Q2 [R1]) guidelines. The method is precise (RSD < 10%), linear over a wide range (r > 0.99, 0.01-1384 fmol/μL), and sensitive (LLOD and LLOQ <1 fmol/μL). Having established the parameters for LC-MRM MS, we quantified allergens from various commercial GCr extracts and showed considerable variability that may impact clinical efficacy. Our data demonstrate that the LC-MRM MS method is valuable for absolute quantification of allergens in GCr extracts and likely has broader applicability to other complex allergen extracts. Definitive quantification provides a new standard for labelling of allergen extracts, which will inform patient care, enable personalized therapy, and enhance the efficacy of immunotherapy for environmental and food allergies. © 2017 The Authors. Clinical & Experimental Allergy published by John Wiley & Sons Ltd. This article has been contributed to by US Government employees and their work is in the public domain in the USA.

  15. Dual modal ultra-bright nanodots with aggregation-induced emission and gadolinium-chelation for vascular integrity and leakage detection.

    PubMed

    Feng, Guangxue; Li, Jackson Liang Yao; Claser, Carla; Balachander, Akhila; Tan, Yingrou; Goh, Chi Ching; Kwok, Immanuel Weng Han; Rénia, Laurent; Tang, Ben Zhong; Ng, Lai Guan; Liu, Bin

    2018-01-01

    The study of blood brain barrier (BBB) functions is important for neurological disorder research. However, the lack of suitable tools and methods has hampered the progress of this field. Herein, we present a hybrid nanodot strategy, termed AIE-Gd dots, comprising a fluorogen with aggregation-induced emission (AIE) characteristics as the core to provide bright and stable fluorescence for optical imaging, and gadolinium (Gd) for accurate quantification of vascular leakage via inductively coupled plasma mass spectrometry (ICP-MS). In this report, we demonstrate that AIE-Gd dots enable direct visualization of brain vascular networks under resting conditions, and that they form localized punctate aggregates and accumulate in the brain tissue during experimental cerebral malaria, indicative of hemorrhage and BBB malfunction. With their superior detection sensitivity and multimodality, we propose that AIE-Gd dots can serve as a better alternative to Evans blue for visualization and quantification of changes in brain barrier functions. Copyright © 2017. Published by Elsevier Ltd.

  16. Assessment of Recovery of Milk Protein Allergens from Processed Food for Mass Spectrometry Quantification.

    PubMed

    Groves, Kate; Cryar, Adam; Walker, Michael; Quaglia, Milena

    2018-01-01

    Assessing the recovery of food allergens from solid processed matrixes is one of the most difficult steps that needs to be overcome to enable the accurate quantification of protein allergens by immunoassay and MS. A feasibility study is described herein applying International System of Units (SI)-traceably quantified milk protein solutions to assess recovery by an improved extraction method. Untargeted MS analysis suggests that this novel extraction method can be further developed to provide high recoveries for a broad range of food allergens. A solution of α-casein was traceably quantified to the SI for the content of α-S1 casein. Cookie dough was prepared by spiking a known amount of the SI-traceable quantified solution into a mixture of flour, sugar, and soya spread, followed by baking. A novel method for the extraction of protein food allergens from solid matrixes based on proteolytic digestion was developed, and its performance was compared with the performance of methods reported in the literature.

  17. Magnetic fingerprints of rolling cells for quantitative flow cytometry in whole blood

    NASA Astrophysics Data System (ADS)

    Reisbeck, Mathias; Helou, Michael Johannes; Richter, Lukas; Kappes, Barbara; Friedrich, Oliver; Hayden, Oliver

    2016-09-01

    Over the past 50 years, flow cytometry has had a profound impact on preclinical and clinical applications requiring single cell function information for counting, sub-typing and quantification of epitope expression. At the same time, the workflow complexity and high costs of such optical systems still limit flow cytometry applications to specialized laboratories. Here, we present a quantitative magnetic flow cytometer that incorporates in situ magnetophoretic cell focusing for highly accurate and reproducible rolling of the cellular targets over giant magnetoresistance sensing elements. Time-of-flight analysis is used to unveil quantitative single cell information contained in its magnetic fingerprint. Furthermore, we used erythrocytes as a biological model to validate our methodology with respect to precise analysis of the hydrodynamic cell diameter, quantification of binding capacity of immunomagnetic labels, and discrimination of cell morphology. The extracted time-of-flight information should enable point-of-care quantitative flow cytometry in whole blood for clinical applications, such as immunology and primary hemostasis.

  18. Decision peptide-driven: a free software tool for accurate protein quantification using gel electrophoresis and matrix assisted laser desorption ionization time of flight mass spectrometry.

    PubMed

    Santos, Hugo M; Reboiro-Jato, Miguel; Glez-Peña, Daniel; Nunes-Miranda, J D; Fdez-Riverola, Florentino; Carvallo, R; Capelo, J L

    2010-09-15

    The decision peptide-driven (DPD) tool implements a software application for assisting the user in a protocol for accurate protein quantification based on the following steps: (1) protein separation through gel electrophoresis; (2) in-gel protein digestion; (3) direct and inverse (18)O-labeling; and (4) matrix-assisted laser desorption ionization time-of-flight mass spectrometry (MALDI) analysis. The DPD software compares the MALDI results of the direct and inverse (18)O-labeling experiments and quickly identifies those peptides with paralleled losses in the different sets of a typical proteomic workflow. Those peptides are used for subsequent accurate protein quantification. The interpretation of the MALDI data from direct and inverse labeling experiments is time-consuming, requiring a significant amount of time to do all comparisons manually. The DPD software shortens and simplifies the search for the peptides that must be used for quantification from a week to just a few minutes. To do so, it takes as input several MALDI spectra and aids the researcher in an automatic mode (i) to compare data from direct and inverse (18)O-labeling experiments, calculating the corresponding ratios to determine those peptides with paralleled losses throughout different sets of experiments; and (ii) to use those peptides as internal standards for subsequent accurate protein quantification using (18)O-labeling. In this work the DPD software is presented and illustrated with the quantification of the protein carbonic anhydrase. Copyright (c) 2010 Elsevier B.V. All rights reserved.
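    The comparison logic that the software automates can be illustrated roughly as follows; the peptide names and ratios are invented, and the acceptance rule is simplified to requiring that the direct ratio match the reciprocal of the inverse ratio within a fixed tolerance.

    # Rough sketch: a peptide with "paralleled losses" in direct and inverse (18)O-labeling
    # should give direct_ratio ~= 1 / inverse_ratio. Values below are placeholders.
    peptide_ratios = {
        "PEPTIDEONE": (0.52, 1.95),
        "PEPTIDETWO": (0.50, 2.10),
        "PEPTIDETHREE": (0.80, 0.90),   # inconsistent between experiments -> rejected
    }

    tolerance = 0.15
    internal_standards = [pep for pep, (direct, inverse) in peptide_ratios.items()
                          if abs(direct - 1.0 / inverse) <= tolerance]
    print("peptides usable for quantification:", internal_standards)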

  19. Performance of different reflectance and diffuse optical imaging tomographic approaches in fluorescence molecular imaging of small animals

    NASA Astrophysics Data System (ADS)

    Dinten, Jean-Marc; Petié, Philippe; da Silva, Anabela; Boutet, Jérôme; Koenig, Anne; Hervé, Lionel; Berger, Michel; Laidevant, Aurélie; Rizo, Philippe

    2006-03-01

    Optical imaging of fluorescent probes is an essential tool for the investigation of molecular events in small animals for drug development. To obtain localization and quantification information for fluorescent labels, CEA-LETI has developed efficient approaches in classical reflectance imaging as well as in diffuse optical tomographic imaging with continuous and temporal signals. This paper presents an overview of the different approaches investigated and their performance. High-quality fluorescence reflectance imaging is obtained thanks to the development of an original "multiple wavelengths" system. The uniformity of the excitation light surface area is better than 15%. Combined with the use of adapted fluorescent probes, this system enables accurate detection of pathological tissues, such as nodules, beneath the animal's observed area. Performance for the detection of ovarian nodules in a nude mouse is shown. In order to investigate deeper inside animals and obtain 3D localization, diffuse optical tomography systems are being developed for both slab and cylindrical geometries. For these two geometries, our reconstruction algorithms are based on an analytical expression of light diffusion. Thanks to an accurate treatment of the light/matter interaction process in the algorithms, high-quality reconstructions of tumors in mice have been obtained. Reconstructions of lung tumors in mice are presented. Through temporal diffuse optical imaging, localization and quantification performance can be improved, at the price of a more sophisticated acquisition system and more elaborate information processing methods. Such a system, based on a pulsed laser diode and a time-correlated single photon counting system, has been set up. The performance of this system for localization and quantification of fluorescent probes is presented.

  20. Quantitative Imaging with a Mobile Phone Microscope

    PubMed Central

    Skandarajah, Arunan; Reber, Clay D.; Switz, Neil A.; Fletcher, Daniel A.

    2014-01-01

    Use of optical imaging for medical and scientific applications requires accurate quantification of features such as object size, color, and brightness. High pixel density cameras available on modern mobile phones have made photography simple and convenient for consumer applications; however, the camera hardware and software that enables this simplicity can present a barrier to accurate quantification of image data. This issue is exacerbated by automated settings, proprietary image processing algorithms, rapid phone evolution, and the diversity of manufacturers. If mobile phone cameras are to live up to their potential to increase access to healthcare in low-resource settings, limitations of mobile phone–based imaging must be fully understood and addressed with procedures that minimize their effects on image quantification. Here we focus on microscopic optical imaging using a custom mobile phone microscope that is compatible with phones from multiple manufacturers. We demonstrate that quantitative microscopy with micron-scale spatial resolution can be carried out with multiple phones and that image linearity, distortion, and color can be corrected as needed. Using all versions of the iPhone and a selection of Android phones released between 2007 and 2012, we show that phones with greater than 5 MP are capable of nearly diffraction-limited resolution over a broad range of magnifications, including those relevant for single cell imaging. We find that automatic focus, exposure, and color gain standard on mobile phones can degrade image resolution and reduce accuracy of color capture if uncorrected, and we devise procedures to avoid these barriers to quantitative imaging. By accommodating the differences between mobile phone cameras and the scientific cameras, mobile phone microscopes can be reliably used to increase access to quantitative imaging for a variety of medical and scientific applications. PMID:24824072

  1. A new background subtraction method for Western blot densitometry band quantification through image analysis software.

    PubMed

    Gallo-Oller, Gabriel; Ordoñez, Raquel; Dotor, Javier

    2018-06-01

    Since its first description, the Western blot has been widely used in molecular labs. It is a multistep method that allows the detection and/or quantification of proteins from simple to complex protein mixtures. The Western blot quantification step is critical for obtaining accurate and reproducible results. Because of the technical knowledge required for densitometry analysis and the limited availability of resources, standard office scanners are often used for image acquisition of developed Western blot films. Furthermore, the use of semi-quantitative software such as ImageJ (Java-based image-processing and analysis software) is clearly increasing in different scientific fields. In this work, we describe the use of an office scanner coupled with the ImageJ software, together with a new image background subtraction method, for accurate Western blot quantification. The proposed method represents an affordable, accurate, and reproducible approach that can be used when resource availability is limited. Copyright © 2018 Elsevier B.V. All rights reserved.
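    The specific background model proposed in the paper is not detailed in this record; the sketch below only illustrates the general densitometry step of subtracting a locally estimated background before integrating a band, with a synthetic lane profile and a flanking-median baseline as assumptions.

    import numpy as np

    # Synthetic lane intensity profile; the band location is assumed to be known.
    lane_profile = np.array([10, 11, 10, 12, 40, 90, 95, 60, 15, 11, 10, 12], float)
    band = slice(4, 8)

    # Estimate the local background from pixels flanking the band and subtract it.
    flanking = np.concatenate([lane_profile[:band.start], lane_profile[band.stop:]])
    background = np.median(flanking)
    band_volume = np.sum(lane_profile[band] - background)

    print(f"background = {background:.1f}, band volume = {band_volume:.1f}")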

  2. freeQuant: A Mass Spectrometry Label-Free Quantification Software Tool for Complex Proteome Analysis.

    PubMed

    Deng, Ning; Li, Zhenye; Pan, Chao; Duan, Huilong

    2015-01-01

    The study of complex proteomes places greater demands on quantification methods using mass spectrometry technology. In this paper, we present a mass spectrometry label-free quantification tool for complex proteomes, called freeQuant, which effectively integrates quantification with functional analysis. freeQuant consists of two well-integrated modules: label-free quantification and functional analysis with biomedical knowledge. freeQuant supports label-free quantitative analysis that makes full use of tandem mass spectrometry (MS/MS) spectral counts, protein sequence length, shared peptides, and ion intensity. It adopts spectral counting for quantitative analysis and introduces a new method for shared peptides to accurately evaluate the abundance of isoforms. For proteins with low abundance, MS/MS total ion count coupled with spectral count is included to ensure accurate protein quantification. Furthermore, freeQuant supports large-scale functional annotation for complex proteomes. Mitochondrial proteomes from the mouse heart, the mouse liver, and the human heart were used to evaluate the usability and performance of freeQuant. The evaluation showed that the quantitative algorithms implemented in freeQuant improve the accuracy of quantification with a better dynamic range.
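    One common, simplified way to handle the shared-peptide problem mentioned above is to apportion shared spectral counts to isoforms in proportion to their unique counts, as in the sketch below; this is a generic illustration, not freeQuant's actual algorithm, and the counts are invented.

    # Apportion shared spectral counts between two isoforms by their unique counts.
    unique_counts = {"isoform_A": 30, "isoform_B": 10}
    shared_counts = 8                      # spectra from peptides shared by both isoforms

    total_unique = sum(unique_counts.values())
    adjusted = {iso: count + shared_counts * count / total_unique
                for iso, count in unique_counts.items()}
    print(adjusted)   # {'isoform_A': 36.0, 'isoform_B': 12.0}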

  3. Medium-throughput processing of whole mount in situ hybridisation experiments into gene expression domains.

    PubMed

    Crombach, Anton; Cicin-Sain, Damjan; Wotton, Karl R; Jaeger, Johannes

    2012-01-01

    Understanding the function and evolution of developmental regulatory networks requires the characterisation and quantification of spatio-temporal gene expression patterns across a range of systems and species. However, most high-throughput methods to measure the dynamics of gene expression do not preserve the detailed spatial information needed in this context. For this reason, quantification methods based on image bioinformatics have become increasingly important over the past few years. Most available approaches in this field either focus on the detailed and accurate quantification of a small set of gene expression patterns, or attempt high-throughput analysis of spatial expression through binary pattern extraction and large-scale analysis of the resulting datasets. Here we present a robust, "medium-throughput" pipeline to process in situ hybridisation patterns from embryos of different species of flies. It bridges the gap between high-resolution, and high-throughput image processing methods, enabling us to quantify graded expression patterns along the antero-posterior axis of the embryo in an efficient and straightforward manner. Our method is based on a robust enzymatic (colorimetric) in situ hybridisation protocol and rapid data acquisition through wide-field microscopy. Data processing consists of image segmentation, profile extraction, and determination of expression domain boundary positions using a spline approximation. It results in sets of measured boundaries sorted by gene and developmental time point, which are analysed in terms of expression variability or spatio-temporal dynamics. Our method yields integrated time series of spatial gene expression, which can be used to reverse-engineer developmental gene regulatory networks across species. It is easily adaptable to other processes and species, enabling the in silico reconstitution of gene regulatory networks in a wide range of developmental contexts.
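    The boundary-determination step can be pictured with the toy profile below: a smoothing spline is fitted to an antero-posterior expression profile and the boundary is read off where the smoothed profile crosses half its maximum. The profile, smoothing factor, and half-maximum criterion are assumptions for illustration; the pipeline's own criteria may differ.

    import numpy as np
    from scipy.interpolate import UnivariateSpline

    # Synthetic graded expression profile along the antero-posterior axis (% egg length).
    position = np.linspace(0, 100, 101)
    profile = 1.0 / (1.0 + np.exp((position - 60.0) / 4.0))
    profile += np.random.default_rng(2).normal(0, 0.02, profile.size)

    spline = UnivariateSpline(position, profile, s=0.05)   # spline approximation
    smoothed = spline(position)
    half_max = 0.5 * smoothed.max()

    boundary_index = np.argmax(smoothed < half_max)        # first crossing below half-max
    print(f"expression boundary at ~{position[boundary_index]:.1f}% egg length")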

  4. Automated and Adaptable Quantification of Cellular Alignment from Microscopic Images for Tissue Engineering Applications

    PubMed Central

    Xu, Feng; Beyazoglu, Turker; Hefner, Evan; Gurkan, Umut Atakan

    2011-01-01

    Cellular alignment plays a critical role in functional, physical, and biological characteristics of many tissue types, such as muscle, tendon, nerve, and cornea. Current efforts toward regeneration of these tissues include replicating the cellular microenvironment by developing biomaterials that facilitate cellular alignment. To assess the functional effectiveness of the engineered microenvironments, one essential criterion is quantification of cellular alignment. Therefore, there is a need for rapid, accurate, and adaptable methodologies to quantify cellular alignment for tissue engineering applications. To address this need, we developed an automated method, binarization-based extraction of alignment score (BEAS), to determine cell orientation distribution in a wide variety of microscopic images. This method combines a sequenced application of median and band-pass filters, locally adaptive thresholding approaches and image processing techniques. Cellular alignment score is obtained by applying a robust scoring algorithm to the orientation distribution. We validated the BEAS method by comparing the results with the existing approaches reported in literature (i.e., manual, radial fast Fourier transform-radial sum, and gradient based approaches). Validation results indicated that the BEAS method resulted in statistically comparable alignment scores with the manual method (coefficient of determination R2=0.92). Therefore, the BEAS method introduced in this study could enable accurate, convenient, and adaptable evaluation of engineered tissue constructs and biomaterials in terms of cellular alignment and organization. PMID:21370940
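    A stripped-down version of the orientation-distribution idea is sketched below: gradient orientations from Sobel filters are summarized by a circular order parameter between 0 (random) and 1 (perfectly aligned). The median/band-pass filtering, locally adaptive thresholding, and dedicated scoring algorithm of BEAS are omitted, and the test image is synthetic.

    import numpy as np
    from scipy.ndimage import sobel

    # Synthetic image: noise plus a sinusoidal stripe pattern varying along the x axis.
    rng = np.random.default_rng(3)
    image = rng.random((128, 128)) + np.sin(np.arange(128) * 0.5)[None, :]

    gx, gy = sobel(image, axis=1), sobel(image, axis=0)
    magnitude, orientation = np.hypot(gx, gy), np.arctan2(gy, gx)

    # Use doubled angles so orientations theta and theta + pi count as the same alignment.
    weights = magnitude.ravel()
    doubled = 2.0 * orientation.ravel()
    alignment_score = np.abs(np.sum(weights * np.exp(1j * doubled)) / np.sum(weights))
    print(f"alignment score = {alignment_score:.2f}  (0 = random, 1 = aligned)")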

  5. A Versatile Cell Death Screening Assay Using Dye-Stained Cells and Multivariate Image Analysis.

    PubMed

    Collins, Tony J; Ylanko, Jarkko; Geng, Fei; Andrews, David W

    2015-11-01

    A novel dye-based method for measuring cell death in image-based screens is presented. Unlike conventional high- and medium-throughput cell death assays that measure only one form of cell death accurately, using multivariate analysis of micrographs of cells stained with the inexpensive mix, red dye nonyl acridine orange, and a nuclear stain, it was possible to quantify cell death induced by a variety of different agonists even without a positive control. Surprisingly, using a single known cytotoxic agent as a positive control for training a multivariate classifier allowed accurate quantification of cytotoxicity for mechanistically unrelated compounds enabling generation of dose-response curves. Comparison with low throughput biochemical methods suggested that cell death was accurately distinguished from cell stress induced by low concentrations of the bioactive compounds Tunicamycin and Brefeldin A. High-throughput image-based format analyses of more than 300 kinase inhibitors correctly identified 11 as cytotoxic with only 1 false positive. The simplicity and robustness of this dye-based assay makes it particularly suited to live cell screening for toxic compounds.

  6. A Versatile Cell Death Screening Assay Using Dye-Stained Cells and Multivariate Image Analysis

    PubMed Central

    Collins, Tony J.; Ylanko, Jarkko; Geng, Fei

    2015-01-01

    A novel dye-based method for measuring cell death in image-based screens is presented. Unlike conventional high- and medium-throughput cell death assays that measure only one form of cell death accurately, using multivariate analysis of micrographs of cells stained with the inexpensive mix, red dye nonyl acridine orange, and a nuclear stain, it was possible to quantify cell death induced by a variety of different agonists even without a positive control. Surprisingly, using a single known cytotoxic agent as a positive control for training a multivariate classifier allowed accurate quantification of cytotoxicity for mechanistically unrelated compounds enabling generation of dose–response curves. Comparison with low throughput biochemical methods suggested that cell death was accurately distinguished from cell stress induced by low concentrations of the bioactive compounds Tunicamycin and Brefeldin A. High-throughput image-based format analyses of more than 300 kinase inhibitors correctly identified 11 as cytotoxic with only 1 false positive. The simplicity and robustness of this dye-based assay makes it particularly suited to live cell screening for toxic compounds. PMID:26422066

  7. Microfluidic-based mini-metagenomics enables discovery of novel microbial lineages from complex environmental samples

    PubMed Central

    Yu, Feiqiao Brian; Blainey, Paul C; Schulz, Frederik; Woyke, Tanja; Horowitz, Mark A; Quake, Stephen R

    2017-01-01

    Metagenomics and single-cell genomics have enabled genome discovery from unknown branches of life. However, extracting novel genomes from complex mixtures of metagenomic data can still be challenging and represents an ill-posed problem which is generally approached with ad hoc methods. Here we present a microfluidic-based mini-metagenomic method which offers a statistically rigorous approach to extract novel microbial genomes while preserving single-cell resolution. We used this approach to analyze two hot spring samples from Yellowstone National Park and extracted 29 new genomes, including three deeply branching lineages. The single-cell resolution enabled accurate quantification of genome function and abundance, down to 1% in relative abundance. Our analyses of genome level SNP distributions also revealed low to moderate environmental selection. The scale, resolution, and statistical power of microfluidic-based mini-metagenomics make it a powerful tool to dissect the genomic structure of microbial communities while effectively preserving the fundamental unit of biology, the single cell. DOI: http://dx.doi.org/10.7554/eLife.26580.001 PMID:28678007

  8. The impact of carbon-13 and deuterium on relative quantification of proteins using stable isotope diethyl labeling.

    PubMed

    Koehler, Christian J; Arntzen, Magnus Ø; Thiede, Bernd

    2015-05-15

    Stable isotopic labeling techniques are useful for quantitative proteomics. A cost-effective and convenient method for diethylation by reductive amination was established, and the impact of using either carbon-13 or deuterium on quantification accuracy and precision was investigated. We established an effective approach for stable isotope labeling by diethylation of the amino groups of peptides. The approach was validated using matrix-assisted laser desorption/ionization time-of-flight (MALDI-TOF) and nanospray liquid chromatography/electrospray ionization (nanoLC/ESI)-ion trap/orbitrap mass spectrometric analysis, as well as MaxQuant for quantitative data analysis. Reaction conditions with low reagent costs, high yields, and minor side reactions were established for diethylation. Furthermore, we showed that diethylation can be applied to up to sixplex labeling. For duplex experiments, we compared diethylation in the analysis of the HeLa cell proteome using acetaldehyde-(13)C(2)/(12)C(2) and acetaldehyde-(2)H(4)/(1)H(4). Equal numbers of proteins could be identified and quantified; however, (13)C(4)/(12)C(4)-diethylation revealed a lower variance of quantitative peptide ratios within proteins, resulting in a higher precision of quantified proteins and fewer falsely regulated proteins. The results were compared with dimethylation, showing minor effects because of the lower number of deuteriums. The described approach for diethylation of primary amines is a cost-effective and accurate method for up to sixplex relative quantification of proteomes. (13)C(4)/(12)C(4)-diethylation enables duplex quantification based on chemical labeling without using deuterium, which reduces identification of false negatives and increases the quality of the quantification results. Copyright © 2015 John Wiley & Sons, Ltd.

  9. In vivo imaging of cancer cell size and cellularity using temporal diffusion spectroscopy.

    PubMed

    Jiang, Xiaoyu; Li, Hua; Xie, Jingping; McKinley, Eliot T; Zhao, Ping; Gore, John C; Xu, Junzhong

    2017-07-01

    A temporal diffusion MRI spectroscopy-based approach has been developed to quantify cancer cell size and density in vivo. A novel method, imaging microstructural parameters using limited spectrally edited diffusion (IMPULSED), selects a specific limited diffusion spectral window for accurate quantification of cell sizes ranging from 10 to 20 μm in common solid tumors. In practice, this is achieved by a combination of a single long-diffusion-time pulsed gradient spin echo (PGSE) acquisition and three low-frequency oscillating gradient spin echo (OGSE) acquisitions. To validate our approach, hematoxylin and eosin staining and immunostaining of cell membranes, in concert with whole-slide imaging, were used to visualize nuclei and cell boundaries and hence enabled accurate estimates of cell size and cellularity. Based on a two-compartment model (incorporating intra- and extracellular spaces), accurate estimates of cell sizes were obtained in vivo for three types of human colon cancers. The IMPULSED-derived apparent cellularities showed a stronger correlation (r = 0.81; P < 0.0001) with histology-derived cellularities than conventional ADCs (r = -0.69; P < 0.03). The IMPULSED approach samples a specific region of temporal diffusion spectra with enhanced sensitivity to length scales of 10-20 μm, and enables measurements of cell sizes and cellularities in solid tumors in vivo. © 2016 International Society for Magnetic Resonance in Medicine.

  10. 68Ga-EDTA PET/CT imaging and plasma clearance for glomerular filtration rate quantification: comparison to conventional 51Cr-EDTA.

    PubMed

    Hofman, Michael; Binns, David; Johnston, Val; Siva, Shankar; Thompson, Mick; Eu, Peter; Collins, Marnie; Hicks, Rodney J

    2015-03-01

    Glomerular filtration rate (GFR) can be accurately determined using (51)Cr-ethylenediaminetetraacetic acid (EDTA) plasma clearance counting, but this is time-consuming and requires technical skills and equipment not always available in imaging departments. (68)Ga-EDTA can be made readily available using an onsite generator, and PET/CT enables both imaging of renal function and accurate camera-based quantitation of the clearance of activity from blood and its appearance in the urine. This study aimed to assess agreement between (68)Ga-EDTA GFR ((68)Ga-GFR) and (51)Cr-EDTA GFR ((51)Cr-GFR), using serial plasma sampling and PET imaging. (68)Ga-EDTA and (51)Cr-EDTA were injected concurrently in 31 patients. Dynamic PET/CT encompassing the kidneys was acquired for 10 min, followed by 3 sequential 3-min multibed step acquisitions from kidneys to bladder. PET quantification was performed using renal activity at 1-2 min (PETinitial), renal excretion at 2-10 min (PETearly), and, subsequently, urinary excretion into the collecting system and bladder (PETlate). Plasma sampling at 2, 3, and 4 h was performed, with (68)Ga counting followed by (51)Cr counting after positron decay. The level of agreement for GFR determination was calculated using a Bland-Altman plot and the Pearson correlation coefficient (PCC). (51)Cr-GFR ranged from 10 to 220 mL/min (mean, 85 mL/min). There was good agreement between (68)Ga-GFR and (51)Cr-GFR using serial plasma sampling, with a Bland-Altman bias of -14 ± 20 mL/min and a PCC of 0.94 (95% confidence interval, 0.88-0.97). Of the 3 methods used for camera-based quantification, the strongest correlation was between plasma sampling-derived GFR and PETlate (PCC of 0.90; 95% confidence interval, 0.80-0.95). (68)Ga-GFR agreed well with (51)Cr-GFR for estimation of GFR using serial plasma counting. PET dynamic imaging provides a method to estimate GFR without plasma sampling, with the additional advantage of enabling renal imaging in a single study. Additional validation in a larger cohort is warranted to further assess utility. © 2015 by the Society of Nuclear Medicine and Molecular Imaging, Inc.
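    The agreement statistics quoted above follow the standard Bland-Altman recipe, sketched here with invented GFR values: the bias is the mean difference between methods and the limits of agreement are bias ± 1.96 standard deviations.

    import numpy as np

    # Placeholder paired GFR estimates (mL/min) from the two methods.
    gfr_cr51 = np.array([25.0, 48.0, 60.0, 85.0, 110.0, 150.0, 190.0])
    gfr_ga68 = np.array([20.0, 40.0, 52.0, 70.0, 95.0, 135.0, 170.0])

    diff = gfr_ga68 - gfr_cr51
    bias = diff.mean()
    limits = 1.96 * diff.std(ddof=1)
    pcc = np.corrcoef(gfr_cr51, gfr_ga68)[0, 1]

    print(f"bias = {bias:.1f} mL/min, limits of agreement = ±{limits:.1f} mL/min, r = {pcc:.2f}")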

  11. User-initialized active contour segmentation and golden-angle real-time cardiovascular magnetic resonance enable accurate assessment of LV function in patients with sinus rhythm and arrhythmias.

    PubMed

    Contijoch, Francisco; Witschey, Walter R T; Rogers, Kelly; Rears, Hannah; Hansen, Michael; Yushkevich, Paul; Gorman, Joseph; Gorman, Robert C; Han, Yuchi

    2015-05-21

    Data obtained during arrhythmia are retained in real-time cardiovascular magnetic resonance (rt-CMR), but there is limited and inconsistent evidence that rt-CMR can accurately assess beat-to-beat variation in left ventricular (LV) function during an arrhythmia. Multi-slice, short-axis cine and real-time golden-angle radial CMR data were collected in 22 clinical patients (18 in sinus rhythm and 4 patients with arrhythmia). User-initialized active contour segmentation (ACS) software was validated via comparison to manual segmentation on clinically accepted software. For each image in the 2D acquisitions, slice volume was calculated, and global LV volumes were estimated via summation across the LV using multiple slices. Real-time imaging data were reconstructed using different image exposure times and frame rates to evaluate the effect of temporal resolution on the function measured in each slice via ACS. Finally, global volumetric function of ectopic and non-ectopic beats was measured using ACS in patients with arrhythmias. ACS provides global LV volume measurements that are not significantly different from manual quantification of retrospectively gated cine images in sinus rhythm patients. With an exposure time of 95.2 ms and a frame rate of >89 frames per second, golden-angle real-time imaging accurately captures hemodynamic function over a range of patient heart rates. In four patients with frequent ectopic contractions, initial quantification of the impact of ectopic beats on hemodynamic function was demonstrated. User-initialized active contours and golden-angle real-time radial CMR can be used to determine time-varying LV function in patients. These methods will be very useful for the assessment of LV function in patients with frequent arrhythmias.
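    Estimating global LV volumes by summation across slices reduces to the disc-summation calculation sketched below; the per-slice areas and slice thickness are invented numbers, not patient data.

    import numpy as np

    # Disc-summation (Simpson-style) LV volume from segmented per-slice areas.
    slice_thickness_mm = 8.0
    ed_areas_mm2 = np.array([1800, 2400, 2900, 3100, 2900, 2300, 1300], float)  # end-diastole
    es_areas_mm2 = np.array([900, 1300, 1600, 1750, 1600, 1200, 650], float)    # end-systole

    edv_ml = ed_areas_mm2.sum() * slice_thickness_mm / 1000.0   # mm^3 -> mL
    esv_ml = es_areas_mm2.sum() * slice_thickness_mm / 1000.0
    ejection_fraction = 100.0 * (edv_ml - esv_ml) / edv_ml

    print(f"EDV = {edv_ml:.1f} mL, ESV = {esv_ml:.1f} mL, EF = {ejection_fraction:.1f}%")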

  12. Rapid and accurate identification by real-time PCR of biotoxin-producing dinoflagellates from the family gymnodiniaceae.

    PubMed

    Smith, Kirsty F; de Salas, Miguel; Adamson, Janet; Rhodes, Lesley L

    2014-03-07

    The identification of toxin-producing dinoflagellates for monitoring programmes and bio-compound discovery requires considerable taxonomic expertise. It can also be difficult to morphologically differentiate toxic and non-toxic species or strains. Various molecular methods have been used for dinoflagellate identification and detection, and this study describes the development of eight real-time polymerase chain reaction (PCR) assays targeting the large subunit ribosomal RNA (LSU rRNA) gene of species from the genera Gymnodinium, Karenia, Karlodinium, and Takayama. Assays proved to be highly specific and sensitive, and the assay for G. catenatum was further developed for quantification in response to a bloom in Manukau Harbour, New Zealand. The assay estimated cell densities from environmental samples as low as 0.07 cells per PCR reaction, which equated to three cells per litre. This assay not only enabled conclusive species identification but also detected the presence of cells below the limit of detection for light microscopy. This study demonstrates the usefulness of real-time PCR as a sensitive and rapid molecular technique for the detection and quantification of micro-algae from environmental samples.

  13. In situ visualization of carbonylation and its co-localization with proteins, lipids, DNA and RNA in Caenorhabditis elegans.

    PubMed

    Kuzmic, Mira; Javot, Hélène; Bonzom, Jean-Marc; Lecomte-Pradines, Catherine; Radman, Miroslav; Garnier-Laplace, Jacqueline; Frelon, Sandrine

    2016-12-01

    All key biological macromolecules are susceptible to carbonylation - an irreparable oxidative damage with deleterious biological consequences. Carbonyls in proteins, lipids and DNA from cell extracts have been used as a biomarker of oxidative stress and aging, but formation of insoluble aggregates by carbonylated proteins precludes quantification. Since carbonylated proteins correlate with and become a suspected cause of morbidity and mortality in some organisms, there is a need for their accurate quantification and localization. Using appropriate fluorescent probes, we have developed an in situ detection of total proteins, DNA, RNA, lipids and carbonyl groups at the level of the whole organism. In C. elegans, we found that after UV irradiation carbonylation co-localizes mainly with proteins and, to a lesser degree, with DNA, RNA and lipids. The method efficiency was illustrated by carbonylation induction assessment over 5 different UV doses. The procedure enables the monitoring of carbonylation in the nematode C. elegans during stress, aging and disease along its life cycle including the egg stage. Copyright © 2016 Elsevier Inc. All rights reserved.

  14. Introducing capillary electrophoresis with laser-induced fluorescence (CE-LIF) as a potential analysis and quantification tool for galactooligosaccharides extracted from complex food matrices.

    PubMed

    Albrecht, Simone; Schols, Henk A; Klarenbeek, Bert; Voragen, Alphons G J; Gruppen, Harry

    2010-03-10

    The analysis and quantification of (galacto)oligosaccharides from food matrices demands both a reproducible extraction method and a sensitive, accurate analytical method. Three typical matrices, namely infant formula, fruit juice, and a maltodextrin-rich preparation, to which a commercial galactooligosaccharide mixture was added at product concentrations ranging from 1.25 to 30%, served as model substrates. Solid-phase extraction on graphitized carbon material after an enzymatic amyloglucosidase pretreatment enabled good recovery and selective purification of the different galactooligosaccharide structures from the exceeding amounts of, in particular, lactose and maltodextrins. The implementation of capillary electrophoresis in combination with laser-induced fluorescence (CE-LIF) detection opens a new possibility for sensitive qualitative and quantitative determination of the galactooligosaccharide contents in the different food matrices. Simultaneously monitoring and quantifying prebiotic oligosaccharides embedded in food matrices is a promising and important step toward efficient monitoring of individual oligosaccharides, and is of interest for research areas dealing with small quantities of oligosaccharides embedded in complex matrices, e.g., body fluids.

  15. Group refractive index quantification using a Fourier domain short coherence Sagnac interferometer.

    PubMed

    Montonen, Risto; Kassamakov, Ivan; Lehmann, Peter; Österberg, Kenneth; Hæggström, Edward

    2018-02-15

    The group refractive index is important in length calibration of Fourier domain interferometers by transparent transfer standards. We demonstrate accurate group refractive index quantification using a Fourier domain short coherence Sagnac interferometer. Because of a justified linear length calibration function, the calibration constants cancel out in the evaluation of the group refractive index, which is then obtained accurately from two uncalibrated lengths. Measurements of two standard thickness coverslips revealed group indices of 1.5426±0.0042 and 1.5434±0.0046, with accuracies quoted at the 95% confidence level. This agreed with the dispersion data of the coverslip manufacturer and therefore validates our method. Our method provides a sample specific and accurate group refractive index quantification using the same Fourier domain interferometer that is to be calibrated for the length. This reduces significantly the requirements of the calibration transfer standard.

  16. Review of the quantification techniques for polycyclic aromatic hydrocarbons (PAHs) in food products.

    PubMed

    Bansal, Vasudha; Kumar, Pawan; Kwon, Eilhann E; Kim, Ki-Hyun

    2017-10-13

    There is a growing need for accurate detection of trace-level PAHs in food products because of the numerous detrimental effects caused by their contamination (e.g., toxicity, carcinogenicity, and teratogenicity). This review discusses up-to-date knowledge of the measurement techniques available for PAHs contained in food and related products, with the aim of helping to reduce their deleterious impacts on human health through accurate quantification. The main part of this review is dedicated to the opportunities and practical options for the treatment of various food samples and for accurate quantification of the PAHs contained in those samples. Basic information regarding all available analytical measurement techniques for PAHs in food samples is also evaluated with respect to their performance in terms of quality assurance.

  17. Characterization of 3-Dimensional PET Systems for Accurate Quantification of Myocardial Blood Flow.

    PubMed

    Renaud, Jennifer M; Yip, Kathy; Guimond, Jean; Trottier, Mikaël; Pibarot, Philippe; Turcotte, Eric; Maguire, Conor; Lalonde, Lucille; Gulenchyn, Karen; Farncombe, Troy; Wisenberg, Gerald; Moody, Jonathan; Lee, Benjamin; Port, Steven C; Turkington, Timothy G; Beanlands, Rob S; deKemp, Robert A

    2017-01-01

    Three-dimensional (3D) mode imaging is the current standard for PET/CT systems. Dynamic imaging for quantification of myocardial blood flow with short-lived tracers, such as 82Rb-chloride, requires accuracy to be maintained over a wide range of isotope activities and scanner counting rates. We proposed new performance standard measurements to characterize the dynamic range of PET systems for accurate quantitative imaging. 82Rb or 13N-ammonia (1,100-3,000 MBq) was injected into the heart wall insert of an anthropomorphic torso phantom. A decaying isotope scan was obtained over 5 half-lives on 9 different 3D PET/CT systems and 1 3D/2-dimensional PET-only system. Dynamic images (28 × 15 s) were reconstructed using iterative algorithms with all corrections enabled. Dynamic range was defined as the maximum activity in the myocardial wall with less than 10% bias, from which corresponding dead-time, counting rates, and/or injected activity limits were established for each scanner. Scatter correction residual bias was estimated as the maximum cavity blood-to-myocardium activity ratio. Image quality was assessed via the coefficient of variation measuring nonuniformity of the left ventricular myocardium activity distribution. Maximum recommended injected activity/body weight, peak dead-time correction factor, counting rates, and residual scatter bias for accurate cardiac myocardial blood flow imaging were 3-14 MBq/kg, 1.5-4.0, 22-64 Mcps singles and 4-14 Mcps prompt coincidence counting rates, and 2%-10% on the investigated scanners. Nonuniformity of the myocardial activity distribution varied from 3% to 16%. Accurate dynamic imaging is possible on the 10 3D PET systems if the maximum injected MBq/kg values are respected to limit peak dead-time losses during the bolus first-pass transit. © 2017 by the Society of Nuclear Medicine and Molecular Imaging.

  18. Quantitative measurements of electromechanical response with a combined optical beam and interferometric atomic force microscope

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Labuda, Aleksander; Proksch, Roger

    An ongoing challenge in atomic force microscope (AFM) experiments is the quantitative measurement of cantilever motion. The vast majority of AFMs use the optical beam deflection (OBD) method to infer the deflection of the cantilever. The OBD method is easy to implement, has impressive noise performance, and tends to be mechanically robust. However, it represents an indirect measurement of the cantilever displacement, since it is fundamentally an angular rather than a displacement measurement. Here, we demonstrate a metrological AFM that combines an OBD sensor with a laser Doppler vibrometer (LDV) to enable accurate measurements of the cantilever velocity and displacement. The OBD/LDV AFM allows a host of quantitative measurements to be performed, including in-situ measurements of cantilever oscillation modes in piezoresponse force microscopy. As an example application, we demonstrate how this instrument can be used for accurate quantification of piezoelectric sensitivity, a longstanding goal in the electromechanical community.

  19. Two-dimensional flow nanometry of biological nanoparticles for accurate determination of their size and emission intensity

    NASA Astrophysics Data System (ADS)

    Block, Stephan; Fast, Björn Johansson; Lundgren, Anders; Zhdanov, Vladimir P.; Höök, Fredrik

    2016-09-01

    Biological nanoparticles (BNPs) are of high interest due to their key role in various biological processes and their use as biomarkers. BNP size and composition are decisive for their functions, but simultaneous determination of both properties with high accuracy remains challenging. Optical microscopy allows precise determination of fluorescence/scattering intensity, but not the size of individual BNPs. The latter is better determined by tracking their random motion in bulk, but the limited illumination volume for tracking this motion impedes reliable intensity determination. Here, we show that attaching BNPs to a supported lipid bilayer, subjecting them to hydrodynamic flows, and tracking their motion via surface-sensitive optical imaging enables determination of their diffusion coefficients and flow-induced drifts, from which accurate quantification of both BNP size and emission intensity can be made. For vesicles, the accuracy of this approach is demonstrated by resolving the expected radius-squared dependence of their fluorescence intensity for radii down to 15 nm.
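
    The conversion from a measured diffusion coefficient to a particle size conventionally goes through the Stokes-Einstein relation. The sketch below assumes simple three-dimensional Stokes drag for a free sphere in water; the bilayer-tethered particles of the actual experiment require a corrected mobility model, so this is only an order-of-magnitude illustration with made-up inputs.

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def hydrodynamic_radius(diffusion_coeff_m2_s, temperature_k=298.15,
                        viscosity_pa_s=1.0e-3):
    """Stokes-Einstein: r = k_B*T / (6*pi*eta*D). Assumes a free sphere in
    water; tethered particles under flow need a corrected mobility model."""
    return K_B * temperature_k / (6.0 * math.pi * viscosity_pa_s * diffusion_coeff_m2_s)

# Illustrative: D = 1.4e-11 m^2/s corresponds to roughly a 15 nm radius sphere
r = hydrodynamic_radius(1.4e-11)
print(f"radius ≈ {r * 1e9:.1f} nm")
```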

  20. High-throughput quantification of the levels and labeling abundance of free amino acids by liquid chromatography tandem mass spectrometry

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cocuron, Jean-Christophe; Tsogtbaatar, Enkhtuul; Alonso, Ana P.

    Accurate assessment of mass isotopomer distributions (MIDs) of intracellular metabolites, such as free amino acids (AAs), is crucial for quantifying in vivo fluxes. To date, the majority of studies that measured AA MIDs have relied on the analysis of proteinogenic rather than free AAs by: i) GC–MS, which involves a cumbersome derivatization process, or ii) NMR, which requires large quantities of biological sample. In this work, the development and validation of a high-throughput LC–MS/MS method allowing the quantification of the levels and labeling of free AAs is described. Sensitivity on the order of femtomoles was achieved using multiple reaction monitoring (MRM) mode. The MIDs of all free AAs were assessed without the need for derivatization, and were validated (except for Trp) on a mixture of unlabeled AA standards. Finally, this method was applied to the determination of the 13C-labeling abundance in free AAs extracted from maize embryos cultured with 13C-glutamine or 13C-glucose. Although Cys was below the limit of detection in these biological samples, the MIDs of a total of 18 free AAs were successfully determined. Due to the increased application of tandem mass spectrometry for 13C-metabolic flux analysis, this novel method will enable the assessment of more complete and accurate labeling information of intracellular AAs, and therefore a better definition of the fluxes.
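
    A mass isotopomer distribution is simply the vector of isotopologue intensities normalized to unit sum, from which an average fractional 13C enrichment can be derived. The sketch below is a generic illustration of that arithmetic with invented intensities; it is not the authors' processing pipeline and ignores natural-abundance correction.

```python
def mass_isotopomer_distribution(intensities):
    """Normalize raw isotopologue intensities (M+0, M+1, ..., M+n) to unit sum."""
    total = sum(intensities)
    return [i / total for i in intensities]

def fractional_labeling(mid):
    """Average fractional 13C enrichment: sum(i * M+i) / n for n labeled positions."""
    n = len(mid) - 1
    return sum(i * frac for i, frac in enumerate(mid)) / n

# Illustrative intensities for a 3-carbon amino acid (hypothetical values)
mid = mass_isotopomer_distribution([5.2e5, 1.1e5, 0.4e5, 2.8e5])
print([round(m, 3) for m in mid], round(fractional_labeling(mid), 3))
```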

  1. High-throughput quantification of the levels and labeling abundance of free amino acids by liquid chromatography tandem mass spectrometry

    DOE PAGES

    Cocuron, Jean-Christophe; Tsogtbaatar, Enkhtuul; Alonso, Ana P.

    2017-02-16

    Accurate assessment of mass isotopomer distributions (MIDs) of intracellular metabolites, such as free amino acids (AAs), is crucial for quantifying in vivo fluxes. To date, the majority of studies that measured AA MIDs have relied on the analysis of proteinogenic rather than free AAs by: i) GC–MS, which involves a cumbersome derivatization process, or ii) NMR, which requires large quantities of biological sample. In this work, the development and validation of a high-throughput LC–MS/MS method allowing the quantification of the levels and labeling of free AAs is described. Sensitivity on the order of femtomoles was achieved using multiple reaction monitoring (MRM) mode. The MIDs of all free AAs were assessed without the need for derivatization, and were validated (except for Trp) on a mixture of unlabeled AA standards. Finally, this method was applied to the determination of the 13C-labeling abundance in free AAs extracted from maize embryos cultured with 13C-glutamine or 13C-glucose. Although Cys was below the limit of detection in these biological samples, the MIDs of a total of 18 free AAs were successfully determined. Due to the increased application of tandem mass spectrometry for 13C-metabolic flux analysis, this novel method will enable the assessment of more complete and accurate labeling information of intracellular AAs, and therefore a better definition of the fluxes.

  2. Prediction of response factors for gas chromatography with flame ionization detection: Algorithm improvement, extension to silylated compounds, and application to the quantification of metabolites

    PubMed Central

    de Saint Laumer, Jean‐Yves; Leocata, Sabine; Tissot, Emeline; Baroux, Lucie; Kampf, David M.; Merle, Philippe; Boschung, Alain; Seyfried, Markus

    2015-01-01

    We previously showed that the relative response factors of volatile compounds are predictable from either combustion enthalpies or their molecular formulae only [1]. We now extend this prediction to silylated derivatives by adding an increment in the ab initio calculation of combustion enthalpies. The accuracy of the experimental relative response factor database was also improved and its population increased to 490 values. In particular, more brominated compounds were measured, and their prediction accuracy was improved by adding a correction factor in the algorithm. The correlation coefficient between predicted and measured values increased from 0.936 to 0.972, leading to a mean prediction accuracy of ±6%. Thus, 93% of the relative response factor values were predicted with an accuracy of better than ±10%. The capabilities of the extended algorithm are exemplified by (i) the quick and accurate quantification of hydroxylated metabolites resulting from a biodegradation test after silylation, through prediction of their relative response factors without having the reference substances available; and (ii) rapid purity determinations of volatile compounds. This study confirms that gas chromatography with a flame ionization detector, using predicted relative response factors, is one of the few techniques that enable quantification of volatile compounds without calibrating the instrument with the pure reference substance. PMID:26179324
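
    Once a relative response factor has been predicted, quantification against an internal standard is a one-line correction of the area ratio. The sketch below is a generic illustration of that step with hypothetical peak areas and an invented RRF value; it does not reproduce the authors' algorithm for predicting the response factors themselves.

```python
def amount_from_predicted_rrf(area_analyte, area_istd, amount_istd_mg, rrf_predicted):
    """Internal-standard quantification: amount = (A_analyte / A_istd) / RRF * amount_istd,
    where RRF is the analyte response per unit mass relative to the internal standard."""
    return (area_analyte / area_istd) / rrf_predicted * amount_istd_mg

# Hypothetical numbers: a silylated metabolite with a predicted RRF of 0.82
amount = amount_from_predicted_rrf(area_analyte=1.35e6, area_istd=2.10e6,
                                   amount_istd_mg=0.50, rrf_predicted=0.82)
print(f"estimated amount ≈ {amount:.3f} mg")
```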

  3. Detection and Quantification of Graphene-Family Nanomaterials in the Environment.

    PubMed

    Goodwin, David G; Adeleye, Adeyemi S; Sung, Lipiin; Ho, Kay T; Burgess, Robert M; Petersen, Elijah J

    2018-04-17

    An increase in production of commercial products containing graphene-family nanomaterials (GFNs) has led to concern over their release into the environment. The fate and potential ecotoxicological effects of GFNs in the environment are currently unclear, partially due to the limited analytical methods for GFN measurements. In this review, the unique properties of GFNs that are useful for their detection and quantification are discussed. The capacity of several classes of techniques to identify and/or quantify GFNs in different environmental matrices (water, soil, sediment, and organisms), after environmental transformations, and after release from a polymer matrix of a product is evaluated. Extraction methods and strategies to combine methods for more accurate discrimination of GFNs from environmental interferences, as well as from other carbonaceous nanomaterials, are recommended. Overall, a comprehensive review of the techniques available to detect and quantify GFNs is systematically presented to inform the state of the science, guide researchers in their selection of the best technique for the system under investigation, and enable further development of GFN metrology in environmental matrices. Two case studies are described to provide practical examples of choosing which techniques to utilize for detection or quantification of GFNs in specific scenarios. Because the available quantitative techniques are somewhat limited, more research is required to distinguish GFNs from other carbonaceous materials and improve the accuracy and detection limits of GFNs at more environmentally relevant concentrations.

  4. Smooth deuterated cellulose films for the visualisation of adsorbed bio-macromolecules

    PubMed Central

    Su, Jielong; Raghuwanshi, Vikram S.; Raverty, Warwick; Garvey, Christopher J.; Holden, Peter J.; Gillon, Marie; Holt, Stephen A.; Tabor, Rico; Batchelor, Warren; Garnier, Gil

    2016-01-01

    Novel thin and smooth deuterated cellulose films were synthesised to visualise adsorbed bio-macromolecules using contrast variation neutron reflectivity (NR) measurements. Incorporation of varying degrees of deuteration into cellulose was achieved by growing Gluconacetobacter xylinus in deuterated glycerol as carbon source dissolved in growth media containing D2O. The derivative of deuterated cellulose was prepared by trimethylsilylation (TMS) in ionic liquid (1-butyl-3-methylimidazolium chloride). The TMS derivative was dissolved in toluene for thin film preparation by spin-coating. The resulting film was regenerated into deuterated cellulose by exposure to acidic vapour. A common enzyme, horseradish peroxidase (HRP), was adsorbed from solution onto the deuterated cellulose films and visualised by NR. The scattering length density contrast of the deuterated cellulose enabled accurate visualisation and quantification of the adsorbed HRP, which would have been impossible to achieve with non-deuterated cellulose. The procedure described enables the preparation of deuterated cellulose films that allow differentiation of cellulose and non-deuterated bio-macromolecules using NR. PMID:27796332

  5. Ranking Fragment Ions Based on Outlier Detection for Improved Label-Free Quantification in Data-Independent Acquisition LC-MS/MS

    PubMed Central

    Bilbao, Aivett; Zhang, Ying; Varesio, Emmanuel; Luban, Jeremy; Strambio-De-Castillia, Caterina; Lisacek, Frédérique; Hopfgartner, Gérard

    2016-01-01

    Data-independent acquisition LC-MS/MS techniques complement supervised methods for peptide quantification. However, due to the wide precursor isolation windows, these techniques are prone to interference at the fragment ion level, which in turn is detrimental to accurate quantification. The “non-outlier fragment ion” (NOFI) ranking algorithm has been developed to assign low priority to fragment ions affected by interference. By using the optimal subset of high-priority fragment ions, these interfered fragment ions are effectively excluded from quantification. NOFI represents each fragment ion as a vector of four dimensions related to chromatographic and MS fragmentation attributes and applies multivariate outlier detection techniques. Benchmarking conducted on a well-defined quantitative dataset (i.e., the SWATH Gold Standard) indicates that NOFI on average is able to accurately quantify 11-25% more peptides than the commonly used Top-N library intensity ranking method. The sum of the areas of the Top3-5 NOFIs produces coefficients of variation similar to the library intensity method but with more accurate quantification results. On a biologically relevant human dendritic cell digest dataset, NOFI properly assigns low priority ranks to 85% of annotated interferences, resulting in sensitivity values between 0.92 and 0.80 against 0.76 for the Spectronaut interference detection algorithm. PMID:26412574
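
    The idea of representing each fragment ion as a four-dimensional feature vector and down-ranking multivariate outliers can be sketched schematically as below. The Mahalanobis-distance score and all feature values used here are our own illustrative choices, not necessarily the statistic or attributes used by NOFI.

```python
import numpy as np

def rank_fragments_by_outlierness(features):
    """features: (n_fragments, 4) array of chromatographic/fragmentation attributes.
    Returns fragment indices sorted from least to most outlying (Mahalanobis distance)."""
    x = np.asarray(features, dtype=float)
    centered = x - x.mean(axis=0)
    cov_inv = np.linalg.pinv(np.cov(x, rowvar=False))
    d2 = np.einsum('ij,jk,ik->i', centered, cov_inv, centered)  # squared distances
    return np.argsort(d2)

def quantify_top_n(areas, ranking, n=3):
    """Sum the areas of the n highest-priority (least outlying) fragment ions."""
    return float(np.sum(np.asarray(areas)[ranking[:n]]))

# Illustrative data: 6 fragments, the last one distorted by interference
feats = np.array([[1.0, 0.20, 5.10, 0.90], [1.1, 0.21, 5.00, 0.88],
                  [0.95, 0.19, 5.20, 0.91], [1.05, 0.22, 4.90, 0.90],
                  [1.0, 0.20, 5.05, 0.89], [3.0, 0.80, 2.00, 0.30]])
areas = [1.0e5, 9.0e4, 1.1e5, 9.5e4, 1.0e5, 6.0e5]
order = rank_fragments_by_outlierness(feats)
print(order, quantify_top_n(areas, order))
```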

  6. Quantification of methionine and selenomethionine in biological samples using multiple reaction monitoring high performance liquid chromatography tandem mass spectrometry (MRM-HPLC-MS/MS).

    PubMed

    Vu, Dai Long; Ranglová, Karolína; Hájek, Jan; Hrouzek, Pavel

    2018-05-01

    Quantification of selenated amino acids currently relies on methods employing inductively coupled plasma mass spectrometry (ICP-MS). Although very accurate, these methods do not allow the simultaneous determination of standard amino acids, hampering the comparison of the content of selenated versus non-selenated species such as methionine (Met) and selenomethionine (SeMet). This paper reports two approaches for the simultaneous quantification of Met and SeMet. In the first approach, standard enzymatic hydrolysis employing Protease XIV was applied for the preparation of samples. The second approach utilized methanesulfonic acid (MA) for the hydrolysis of samples, either in a reflux system or in a microwave oven, followed by derivatization with diethyl ethoxymethylenemalonate. The prepared samples were then analyzed by multiple reaction monitoring high performance liquid chromatography tandem mass spectrometry (MRM-HPLC-MS/MS). Both approaches provided platforms for the accurate determination of the selenium/sulfur substitution rate in Met. Moreover, the second approach also provided accurate simultaneous quantification of Met and SeMet with a low limit of detection, a low limit of quantification, and a wide linearity range, comparable to the commonly used gas chromatography mass spectrometry (GC-MS) method or ICP-MS. The novel method was validated using certified reference material in conjunction with the GC-MS reference method. Copyright © 2018. Published by Elsevier B.V.

  7. Identification and accurate quantification of structurally related peptide impurities in synthetic human C-peptide by liquid chromatography-high resolution mass spectrometry.

    PubMed

    Li, Ming; Josephs, Ralf D; Daireaux, Adeline; Choteau, Tiphaine; Westwood, Steven; Wielgosz, Robert I; Li, Hongmei

    2018-06-04

    Peptides are an increasingly important group of biomarkers and pharmaceuticals. The accurate purity characterization of peptide calibrators is critical for the development of reference measurement systems for laboratory medicine and for the quality control of pharmaceuticals. The peptides used for these purposes are increasingly produced through peptide synthesis. Various approaches (for example mass balance, amino acid analysis, qNMR, and nitrogen determination) can be applied to accurately assign the purity value of peptide calibrators. However, all purity assessment approaches require a correction for structurally related peptide impurities in order to avoid biases. Liquid chromatography coupled to high resolution mass spectrometry (LC-hrMS) has become the key technique for the identification and accurate quantification of structurally related peptide impurities in intact peptide calibrator materials. In this study, LC-hrMS-based methods were developed and validated in-house for the identification and quantification of structurally related peptide impurities in a synthetic human C-peptide (hCP) material, which served as a study material for an international comparison looking at the competencies of laboratories in performing peptide purity mass fraction assignments. More than 65 impurities were identified, confirmed, and accurately quantified by using LC-hrMS. The total mass fraction of all structurally related peptide impurities in the hCP study material was estimated to be 83.3 mg/g with an associated expanded uncertainty of 3.0 mg/g (k = 2). The calibration hierarchy concept used for the quantification of individual impurities is described in detail.

  8. High-speed multislice T1 mapping using inversion-recovery echo-planar imaging.

    PubMed

    Ordidge, R J; Gibbs, P; Chapman, B; Stehling, M K; Mansfield, P

    1990-11-01

    Tissue contrast in MR images is a strong function of spin-lattice (T1) and spin-spin (T2) relaxation times. However, the T1 relaxation time is rarely quantified because of the long scan time required to produce an accurate T1 map of the subject. With a standard 2D FT technique, this procedure may take up to 30 min. Modifications of the echo-planar imaging (EPI) technique which incorporate the principle of inversion recovery (IR) enable multislice T1 maps to be produced in total scan times varying from a few seconds up to a minute. Rapid quantification of T1 values using IR-EPI may thus lead to better discrimination between tissue types within an acceptable scan time.

  9. Quantification of HTLV-1 Clonality and TCR Diversity

    PubMed Central

    Laydon, Daniel J.; Melamed, Anat; Sim, Aaron; Gillet, Nicolas A.; Sim, Kathleen; Darko, Sam; Kroll, J. Simon; Douek, Daniel C.; Price, David A.; Bangham, Charles R. M.; Asquith, Becca

    2014-01-01

    Estimation of immunological and microbiological diversity is vital to our understanding of infection and the immune response. For instance, what is the diversity of the T cell repertoire? These questions are partially addressed by high-throughput sequencing techniques that enable identification of immunological and microbiological “species” in a sample. Estimators of the number of unseen species are needed to estimate population diversity from sample diversity. Here we test five widely used non-parametric estimators, and develop and validate a novel method, DivE, to estimate species richness and distribution. We used three independent datasets: (i) viral populations from subjects infected with human T-lymphotropic virus type 1; (ii) T cell antigen receptor clonotype repertoires; and (iii) microbial data from infant faecal samples. When applied to datasets with rarefaction curves that did not plateau, existing estimators systematically increased with sample size. In contrast, DivE consistently and accurately estimated diversity for all datasets. We identify conditions that limit the application of DivE. We also show that DivE can be used to accurately estimate the underlying population frequency distribution. We have developed a novel method that is significantly more accurate than commonly used biodiversity estimators in microbiological and immunological populations. PMID:24945836

  10. Collagen Quantification in Tissue Specimens.

    PubMed

    Coentro, João Quintas; Capella-Monsonís, Héctor; Graceffa, Valeria; Wu, Zhuning; Mullen, Anne Maria; Raghunath, Michael; Zeugolis, Dimitrios I

    2017-01-01

    Collagen is the major extracellular protein in mammals. Accurate quantification of collagen is essential in the biomaterials (e.g., reproducible collagen scaffold fabrication), drug discovery (e.g., assessment of collagen in pathophysiologies, such as fibrosis), and tissue engineering (e.g., quantification of cell-synthesized collagen) fields. Although measuring hydroxyproline content is the most widely used method to quantify collagen in biological specimens, the process is very laborious. To this end, the Sircol™ Collagen Assay is widely used due to its inherent simplicity and convenience. However, this method leads to overestimation of collagen content due to the interaction of Sirius red with basic amino acids of non-collagenous proteins. Herein, we describe the addition of an ultrafiltration purification step in the process to accurately determine collagen content in tissues.

  11. The Zugspitze radiative closure experiment for quantifying water vapor absorption over the terrestrial and solar infrared - Part 2: Accurate calibration of high spectral-resolution infrared measurements of surface solar radiation

    NASA Astrophysics Data System (ADS)

    Reichert, Andreas; Rettinger, Markus; Sussmann, Ralf

    2016-09-01

    Quantitative knowledge of water vapor absorption is crucial for accurate climate simulations. An open science question in this context concerns the strength of the water vapor continuum in the near infrared (NIR) at atmospheric temperatures, which is still to be quantified by measurements. This issue can be addressed with radiative closure experiments using solar absorption spectra. However, the spectra used for water vapor continuum quantification have to be radiometrically calibrated. We present for the first time a method that yields sufficient calibration accuracy for NIR water vapor continuum quantification in an atmospheric closure experiment. Our method combines the Langley method with spectral radiance measurements of a high-temperature blackbody calibration source (< 2000 K). The calibration scheme is demonstrated in the spectral range 2500 to 7800 cm⁻¹, but minor modifications to the method enable calibration also throughout the remainder of the NIR spectral range. The resulting uncertainty (2σ) excluding the contribution due to inaccuracies in the extra-atmospheric solar spectrum (ESS) is below 1 % in window regions and up to 1.7 % within absorption bands. The overall radiometric accuracy of the calibration depends on the ESS uncertainty, on which at present no firm consensus has been reached in the NIR. However, as is shown in the companion publication Reichert and Sussmann (2016), ESS uncertainty is only of minor importance for the specific aim of this study, i.e., the quantification of the water vapor continuum in a closure experiment. The calibration uncertainty estimate is substantiated by the investigation of calibration self-consistency, which yields compatible results within the estimated errors for 91.1 % of the 2500 to 7800 cm⁻¹ range. Additionally, a comparison of a set of calibrated spectra to radiative transfer model calculations yields consistent results within the estimated errors for 97.7 % of the spectral range.
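
    The Langley part of such a calibration fits the logarithm of the measured solar signal against airmass and extrapolates to zero airmass. The sketch below shows this standard regression for a single spectral channel using synthetic data; it does not reproduce the authors' blackbody-based radiometric step, and the signal and optical-depth values are invented.

```python
import numpy as np

def langley_extrapolation(airmass, signal):
    """Fit ln(S) = ln(S0) - tau * m and return the extrapolated extra-atmospheric
    signal S0 and the optical depth tau for one spectral channel."""
    slope, intercept = np.polyfit(np.asarray(airmass), np.log(np.asarray(signal)), 1)
    return np.exp(intercept), -slope

# Synthetic morning series: S0 = 100 (arbitrary units), tau = 0.12, small noise
m = np.linspace(2.0, 6.0, 9)
s = 100.0 * np.exp(-0.12 * m) * (1.0 + 0.005 * np.random.default_rng(0).standard_normal(m.size))
s0, tau = langley_extrapolation(m, s)
print(f"S0 ≈ {s0:.1f}, tau ≈ {tau:.3f}")
```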

  12. Single Color Multiplexed ddPCR Copy Number Measurements and Single Nucleotide Variant Genotyping.

    PubMed

    Wood-Bouwens, Christina M; Ji, Hanlee P

    2018-01-01

    Droplet digital PCR (ddPCR) allows for accurate quantification of genetic events such as copy number variation and single nucleotide variants. Probe-based assays represent the current "gold-standard" for detection and quantification of these genetic events. Here, we introduce a cost-effective single color ddPCR assay that allows for single genome resolution quantification of copy number and single nucleotide variation.

  13. Technical skills measurement based on a cyber-physical system for endovascular surgery simulation.

    PubMed

    Tercero, Carlos; Kodama, Hirokatsu; Shi, Chaoyang; Ooe, Katsutoshi; Ikeda, Seiichi; Fukuda, Toshio; Arai, Fumihito; Negoro, Makoto; Kwon, Guiryong; Najdovski, Zoran

    2013-09-01

    Quantification of medical skills is a challenge, particularly in simulator-based training. In the case of endovascular intervention, it is desirable that a simulator accurately recreates the morphology and mechanical characteristics of the vasculature while enabling scoring. For this purpose, we propose a cyber-physical system composed of optical sensors for encoding the motion of the catheter body, a magnetic tracker for motion capture of the operator's hands, and opto-mechatronic sensors for measuring the interaction of the catheter tip with the vasculature model wall. Two pilot studies were conducted for measuring technical skills, one for distinguishing novices from experts and the other for measuring unnecessary motion. Proficiency levels were measurable between expert and novice and also between individual novice users. The results enabled scoring of the user's proficiency level, using sensitivity, reaction time, time to complete a task, and respect for tissue integrity as evaluation criteria. Additionally, unnecessary motion was also measurable. The development of cyber-physical simulators for other domains of medicine depends on the study of photoelastic materials for human tissue modelling, and enables quantitative evaluation of skills using surgical instruments and a realistic representation of human tissue. Copyright © 2012 John Wiley & Sons, Ltd.

  14. Accuracy of iodine quantification in dual-layer spectral CT: Influence of iterative reconstruction, patient habitus and tube parameters.

    PubMed

    Sauter, Andreas P; Kopp, Felix K; Münzel, Daniela; Dangelmaier, Julia; Renz, Martin; Renger, Bernhard; Braren, Rickmer; Fingerle, Alexander A; Rummeny, Ernst J; Noël, Peter B

    2018-05-01

    Evaluation of the influence of iterative reconstruction, tube settings and patient habitus on the accuracy of iodine quantification with dual-layer spectral CT (DL-CT). A CT abdomen phantom with different extension rings and four iodine inserts (1, 2, 5 and 10 mg/ml) was scanned on a DL-CT system. The phantom was scanned with tube voltages of 120 and 140 kVp and CTDIvol levels of 2.5, 5, 10 and 20 mGy. Reconstructions were performed for eight levels of iterative reconstruction (i0-i7). Diagnostic dose levels were classified depending on patient size and radiation dose. Measurements of iodine concentration showed accurate and reliable results. Taking all CTDIvol levels into account, the mean absolute percentage difference (MAPD) showed lower accuracy for low CTDIvol levels (2.5 mGy: 34.72%) than for high CTDIvol levels (20 mGy: 5.89%). At diagnostic dose levels, accurate quantification of iodine was possible (MAPD 3.38%). The level of iterative reconstruction did not significantly influence iodine measurements. Iodine quantification was more accurate at a tube voltage of 140 kVp. Phantom size had a considerable effect only at low dose levels; at diagnostic dose levels the effect of phantom size decreased (MAPD <5% for all phantom sizes). With DL-CT, even low iodine concentrations can be accurately quantified. Accuracies are higher when diagnostic radiation doses are employed. Copyright © 2018 Elsevier B.V. All rights reserved.
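
    The accuracy metric quoted here, the mean absolute percentage difference (MAPD), is a two-line calculation. The sketch below applies it to made-up measurements against the phantom's nominal iodine concentrations; the numerical values are purely illustrative.

```python
def mapd(measured, nominal):
    """Mean absolute percentage difference between measured and nominal values."""
    return 100.0 * sum(abs(m - t) / t for m, t in zip(measured, nominal)) / len(nominal)

# Hypothetical measurements for the 1, 2, 5 and 10 mg/ml inserts at a diagnostic dose
print(f"MAPD = {mapd([1.04, 1.93, 5.12, 10.2], [1.0, 2.0, 5.0, 10.0]):.2f}%")
```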

  15. Immobilized Metal Affinity Chromatography Coupled to Multiple Reaction Monitoring Enables Reproducible Quantification of Phospho-signaling*

    PubMed Central

    Kennedy, Jacob J.; Yan, Ping; Zhao, Lei; Ivey, Richard G.; Voytovich, Uliana J.; Moore, Heather D.; Lin, Chenwei; Pogosova-Agadjanyan, Era L.; Stirewalt, Derek L.; Reding, Kerryn W.; Whiteaker, Jeffrey R.; Paulovich, Amanda G.

    2016-01-01

    A major goal in cell signaling research is the quantification of phosphorylation pharmacodynamics following perturbations. Traditional methods of studying cellular phospho-signaling measure one analyte at a time with poor standardization, rendering them inadequate for interrogating network biology and contributing to the irreproducibility of preclinical research. In this study, we test the feasibility of circumventing these issues by coupling immobilized metal affinity chromatography (IMAC)-based enrichment of phosphopeptides with targeted, multiple reaction monitoring (MRM) mass spectrometry to achieve precise, specific, standardized, multiplex quantification of phospho-signaling responses. A multiplex immobilized metal affinity chromatography-multiple reaction monitoring assay targeting phospho-analytes responsive to DNA damage was configured, analytically characterized, and deployed to generate phospho-pharmacodynamic curves from primary and immortalized human cells experiencing genotoxic stress. The multiplexed assays demonstrated linear ranges of ≥3 orders of magnitude, median lower limit of quantification of 0.64 fmol on column, median intra-assay variability of 9.3%, median inter-assay variability of 12.7%, and median total CV of 16.0%. The multiplex immobilized metal affinity chromatography-multiple reaction monitoring assay enabled robust quantification of 107 DNA damage-responsive phosphosites from human cells following DNA damage. The assays have been made publicly available as a resource to the community. The approach is generally applicable, enabling wide interrogation of signaling networks. PMID:26621847

  16. Quantification of in vivo short echo-time proton magnetic resonance spectra at 14.1 T using two different approaches of modelling the macromolecule spectrum

    NASA Astrophysics Data System (ADS)

    Cudalbu, C.; Mlynárik, V.; Xin, L.; Gruetter, Rolf

    2009-10-01

    Reliable quantification of the macromolecule signals in short echo-time 1H MRS spectra is particularly important at high magnetic fields for an accurate quantification of metabolite concentrations (the neurochemical profile), due to the effectively increased spectral resolution of the macromolecule components. The purpose of the present study was to assess two approaches of quantification, which take the contribution of macromolecules into account in the quantification step. 1H spectra were acquired on a 14.1 T/26 cm horizontal scanner on five rats using the ultra-short echo-time SPECIAL (spin echo full intensity acquired localization) spectroscopy sequence. Metabolite concentrations were estimated using LCModel, combined with a simulated basis set of metabolites using published spectral parameters and either the spectrum of macromolecules measured in vivo, using an inversion recovery technique, or a baseline simulated by the built-in spline function. The fitted spline function resulted in a smooth approximation of the in vivo macromolecules but, in accordance with previous studies using Subtract-QUEST, could not completely reproduce all features of the in vivo spectrum of macromolecules at 14.1 T. As a consequence, the measured macromolecular 'baseline' led to a more accurate and reliable quantification at higher field strengths.

  17. Markers of anthropogenic contamination: A validated method for quantification of pharmaceuticals, illicit drug metabolites, perfluorinated compounds, and plasticisers in sewage treatment effluent and rain runoff.

    PubMed

    Wilkinson, John L; Swinden, Julian; Hooda, Peter S; Barker, James; Barton, Stephen

    2016-09-01

    An effective, specific and accurate method is presented for the quantification of 13 markers of anthropogenic contamination in water using solid phase extraction (SPE) followed by high performance liquid chromatography (HPLC) tandem mass spectrometry (MS/MS). Validation was conducted according to the International Conference on Harmonisation (ICH) guidelines. Method recoveries ranged from 77 to 114% and limits of quantification between 0.75 and 4.91 ng/L. A study was undertaken to quantify the concentrations and loadings of the selected contaminants in 6 sewage treatment works (STW) effluent discharges as well as concentrations in 5 rain-driven street runoffs and field drainages. Detection frequencies in STW effluent ranged from 25% (ethinylestradiol) to 100% (benzoylecgonine, bisphenol-A (BPA), bisphenol-S (BPS) and diclofenac). Average concentrations of detected compounds in STW effluents ranged from 3.62 ng/L (ethinylestradiol) to 210 ng/L (BPA). Levels of the perfluorinated compounds (PFCs) perfluorooctanoic acid (PFOA) and perfluorononanoic acid (PFNA), as well as the plasticiser BPA, were found in street runoff at maximum levels of 1160 ng/L, 647 ng/L and 2405 ng/L respectively (8.52, 3.09 and 2.7 times more concentrated than the maximum levels in STW effluents, respectively). Rain-driven street runoff may have an effect on levels of PFCs and plasticisers in receiving rivers and should be further investigated. Together with the 13 selected contaminants, this method enables the quantification of various markers of anthropogenic pollution: inter alia pharmaceuticals, illicit drugs and their metabolites from humans and improper disposal of drugs, while the plasticisers and perfluorinated compounds may also indicate contamination from industrial and transport activity (street runoff). Copyright © 2016 Elsevier Ltd. All rights reserved.

  18. Separation, identification, quantification, and method validation of anthocyanins in botanical supplement raw materials by HPLC and HPLC-MS.

    PubMed

    Chandra, A; Rana, J; Li, Y

    2001-08-01

    A method has been established and validated for the identification and quantification of individual, as well as total, anthocyanins by HPLC and LC/ES-MS in botanical raw materials used in the herbal supplement industry. The anthocyanins were separated and identified on the basis of their respective M⁺ (cation) using LC/ES-MS. Separated anthocyanins were individually calculated against one commercially available anthocyanin external standard (cyanidin-3-glucoside chloride) and expressed as its equivalents. Amounts of each anthocyanin calculated as external standard equivalents were then multiplied by a molecular-weight correction factor to afford their specific quantities. Experimental procedures and the use of molecular-weight correction factors are substantiated and validated using Balaton tart cherry and elderberry as templates. Cyanidin-3-glucoside chloride has been widely used in the botanical industry to calculate total anthocyanins. In our studies on tart cherry and elderberry, its use as external standard followed by the use of molecular-weight correction factors should provide relatively accurate results for total anthocyanins, because of the presence of cyanidin as their major anthocyanidin backbone. The method proposed here is simple and has a direct sample preparation procedure without any solid-phase extraction. It enables selection and use of commercially available anthocyanins as external standards for quantification of specific anthocyanins in the sample matrix, irrespective of their commercial availability as analytical standards. It can be used as a template and applied for similar quantification in several anthocyanin-containing raw materials for routine quality control procedures, thus providing consistency in analytical testing of botanical raw materials used for manufacturing efficacious and true-to-the-label nutritional supplements.
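
    The quantification scheme described here, expressing each anthocyanin as cyanidin-3-glucoside chloride equivalents and then applying a molecular-weight correction factor, reduces to simple arithmetic. The sketch below uses hypothetical peak areas and concentrations; the molecular weights shown are approximate literature values and should be checked for the specific compounds at hand.

```python
def anthocyanin_amount(area_analyte, area_std, conc_std_mg_l, mw_analyte, mw_std=484.8):
    """External-standard quantification as cyanidin-3-glucoside chloride (MW ~484.8 g/mol)
    equivalents, followed by a molecular-weight correction to the specific anthocyanin."""
    equivalents = area_analyte / area_std * conc_std_mg_l
    return equivalents * (mw_analyte / mw_std)

# Hypothetical: cyanidin-3-rutinoside chloride (MW ~631 g/mol) in an elderberry extract
print(f"{anthocyanin_amount(8.4e5, 6.0e5, 25.0, mw_analyte=631.0):.1f} mg/L")
```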

  19. Quantification of ligand density and stoichiometry on the surface of liposomes using single-molecule fluorescence imaging.

    PubMed

    Belfiore, Lisa; Spenkelink, Lisanne M; Ranson, Marie; van Oijen, Antoine M; Vine, Kara L

    2018-05-28

    Despite the longstanding existence of liposome technology in drug delivery applications, there have been no ligand-directed liposome formulations approved for clinical use to date. This lack of translation is due to several factors, one of which is the absence of molecular tools for the robust quantification of ligand density on the surface of liposomes. We report here for the first time the quantification of proteins attached to the surface of small unilamellar liposomes using single-molecule fluorescence imaging. Liposomes were surface-functionalized with fluorescently labeled human proteins, plasminogen activator inhibitor-2 (PAI-2) and trastuzumab (TZ, Herceptin®), previously validated to target cancer cell surface biomarkers. These protein-conjugated liposomes were visualized using a custom-built wide-field fluorescence microscope with single-molecule sensitivity. By counting the photobleaching steps of the fluorescently labeled proteins, we calculated the number of attached proteins per liposome, which was 11 ± 4 proteins for single-ligand liposomes. Imaging of dual-ligand liposomes revealed stoichiometries of the two attached proteins in accordance with the molar ratios of protein added during preparation. Preparation of PAI-2/TZ dual-ligand liposomes via two different methods revealed that the post-insertion method generated liposomes with a more equal representation of the two differently sized proteins, demonstrating the ability of this preparation method to enable better control of liposome protein densities. We conclude that the single-molecule imaging method presented here is an accurate and reliable quantification tool for determining ligand density and stoichiometry on the surface of liposomes. This method has the potential to allow for comprehensive characterization of novel ligand-directed liposomes that should facilitate the translation of these nanotherapies through to the clinic. Copyright © 2018 Elsevier B.V. All rights reserved.
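
    Counting attached proteins by photobleaching amounts to counting discrete downward steps in a single-liposome intensity trace. The sketch below uses a naive threshold on frame-to-frame intensity drops and synthetic data; rigorous analyses typically use change-point or step-fitting algorithms, so this is only a schematic stand-in with invented parameters.

```python
import numpy as np

def count_photobleaching_steps(trace, min_step):
    """Naive step counter: count frame-to-frame intensity drops larger than min_step.
    Real analyses usually apply smoothing and change-point detection instead."""
    drops = -np.diff(np.asarray(trace, dtype=float))
    return int(np.sum(drops > min_step))

# Synthetic trace: 3 fluorophores bleaching one by one (unit steps), mild noise
rng = np.random.default_rng(1)
levels = np.concatenate([np.full(30, 3.0), np.full(30, 2.0), np.full(30, 1.0), np.full(30, 0.0)])
trace = levels + 0.05 * rng.standard_normal(levels.size)
print(count_photobleaching_steps(trace, min_step=0.5))  # expect 3
```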

  20. Toward automatic segmentation and quantification of tumor and stroma in whole-slide images of H and E stained rectal carcinomas

    NASA Astrophysics Data System (ADS)

    Geessink, Oscar G. F.; Baidoshvili, Alexi; Freling, Gerard; Klaase, Joost M.; Slump, Cornelis H.; van der Heijden, Ferdinand

    2015-03-01

    Visual estimation of tumor and stroma proportions in microscopy images yields a strong, Tumor-(lymph)Node-Metastasis (TNM) classification-independent predictor for patient survival in colorectal cancer. Therefore, it is also a potent (contra)indicator for adjuvant chemotherapy. However, quantification of tumor and stroma through visual estimation is highly subject to intra- and inter-observer variability. The aim of this study is to develop and clinically validate a method for objective quantification of tumor and stroma in standard hematoxylin and eosin (H and E) stained microscopy slides of rectal carcinomas. A tissue segmentation algorithm, based on supervised machine learning and pixel classification, was developed, trained and validated using histological slides that were prepared from surgically excised rectal carcinomas in patients who had not received neoadjuvant chemotherapy and/or radiotherapy. Whole-slide scanning was performed at 20× magnification. A total of 40 images (4 million pixels each) were extracted from 20 whole-slide images at sites showing various relative proportions of tumor and stroma. Experienced pathologists provided detailed annotations for every extracted image. The performance of the algorithm was evaluated using cross-validation by testing on 1 image at a time while using the other 39 images for training. The total classification error of the algorithm was 9.4% (SD = 3.2%). Compared to visual estimation by pathologists, the algorithm was 7.3 times (P = 0.033) more accurate in quantifying tissues, also showing 60% less variability. Automatic tissue quantification was shown to be both reliable and practicable. We ultimately intend to facilitate refined prognostic stratification of (colo)rectal cancer patients and enable better personalized treatment.

  1. Picoliter Well Array Chip-Based Digital Recombinase Polymerase Amplification for Absolute Quantification of Nucleic Acids.

    PubMed

    Li, Zhao; Liu, Yong; Wei, Qingquan; Liu, Yuanjie; Liu, Wenwen; Zhang, Xuelian; Yu, Yude

    2016-01-01

    Absolute, precise quantification methods expand the scope of nucleic acids research and have many practical applications. Digital polymerase chain reaction (dPCR) is a powerful method for nucleic acid detection and absolute quantification. However, it requires thermal cycling and accurate temperature control, which are difficult in resource-limited conditions. Accordingly, isothermal methods, such as recombinase polymerase amplification (RPA), are more attractive. We developed a picoliter well array (PWA) chip with 27,000 consistently sized picoliter reactions (314 pL) for isothermal DNA quantification using digital RPA (dRPA) at 39°C. Sample loading using a scraping liquid blade was simple, fast, and required small reagent volumes (i.e., <20 μL). Passivating the chip surface using a methoxy-PEG-silane agent effectively eliminated cross-contamination during dRPA. Our creative optical design enabled wide-field fluorescence imaging in situ and both end-point and real-time analyses of picoliter wells in a 6-cm² area. It was not necessary to use scan shooting and stitch serial small images together. Using this method, we quantified serial dilutions of a Listeria monocytogenes gDNA stock solution from 9 × 10⁻¹ to 4 × 10⁻³ copies per well with an average error of less than 11% (N = 15). Overall dRPA-on-chip processing required less than 30 min, which was a 4-fold decrease compared to dPCR, requiring approximately 2 h. dRPA on the PWA chip provides a simple and highly sensitive method to quantify nucleic acids without thermal cycling or precise micropump/microvalve control. It has applications in fast field analysis and critical clinical diagnostics under resource-limited settings.
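
    Absolute quantification in a digital partition format is conventionally derived from Poisson statistics on the fraction of positive partitions. The sketch below is that generic digital-assay calculation with hypothetical well counts; the 314 pL well volume is taken from this record, but the estimator shown is not necessarily the exact one used by the authors.

```python
import math

def copies_per_well(n_positive, n_total):
    """Poisson-corrected mean occupancy: lambda = -ln(1 - p), p = positive fraction."""
    p = n_positive / n_total
    return -math.log(1.0 - p)

def concentration_copies_per_ul(n_positive, n_total, well_volume_pl=314.0):
    """Convert mean copies per well to copies per microliter of loaded sample."""
    lam = copies_per_well(n_positive, n_total)
    return lam / (well_volume_pl * 1e-6)  # 1 pL = 1e-6 uL

# Hypothetical: 2400 positive wells out of 27,000
print(f"{copies_per_well(2400, 27000):.4f} copies/well, "
      f"{concentration_copies_per_ul(2400, 27000):.0f} copies/uL")
```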

  2. Picoliter Well Array Chip-Based Digital Recombinase Polymerase Amplification for Absolute Quantification of Nucleic Acids

    PubMed Central

    Li, Zhao; Liu, Yong; Wei, Qingquan; Liu, Yuanjie; Liu, Wenwen; Zhang, Xuelian; Yu, Yude

    2016-01-01

    Absolute, precise quantification methods expand the scope of nucleic acids research and have many practical applications. Digital polymerase chain reaction (dPCR) is a powerful method for nucleic acid detection and absolute quantification. However, it requires thermal cycling and accurate temperature control, which are difficult in resource-limited conditions. Accordingly, isothermal methods, such as recombinase polymerase amplification (RPA), are more attractive. We developed a picoliter well array (PWA) chip with 27,000 consistently sized picoliter reactions (314 pL) for isothermal DNA quantification using digital RPA (dRPA) at 39°C. Sample loading using a scraping liquid blade was simple, fast, and required small reagent volumes (i.e., <20 μL). Passivating the chip surface using a methoxy-PEG-silane agent effectively eliminated cross-contamination during dRPA. Our creative optical design enabled wide-field fluorescence imaging in situ and both end-point and real-time analyses of picoliter wells in a 6-cm² area. It was not necessary to use scan shooting and stitch serial small images together. Using this method, we quantified serial dilutions of a Listeria monocytogenes gDNA stock solution from 9 × 10⁻¹ to 4 × 10⁻³ copies per well with an average error of less than 11% (N = 15). Overall dRPA-on-chip processing required less than 30 min, which was a 4-fold decrease compared to dPCR, requiring approximately 2 h. dRPA on the PWA chip provides a simple and highly sensitive method to quantify nucleic acids without thermal cycling or precise micropump/microvalve control. It has applications in fast field analysis and critical clinical diagnostics under resource-limited settings. PMID:27074005

  3. Integrating public risk perception into formal natural hazard risk assessment

    NASA Astrophysics Data System (ADS)

    Plattner, Th.; Plapp, T.; Hebel, B.

    2006-06-01

    An urgent need to take perception into account in risk assessment has been pointed out in the relevant literature; its impact on individuals' risk-related behaviour is obvious. This study represents an effort to overcome the broadly discussed question of whether risk perception is quantifiable or not by proposing a simple but applicable methodology. A novel approach is elaborated to obtain a more accurate and comprehensive quantification of risk in comparison to present formal risk evaluation practice. Consideration of the relevant factors enables an explicit quantification of individual risk perception and evaluation. The model approach integrates the effective individual risk r_eff and a weighted mean of relevant perception-affecting factors (PAF). The relevant PAF cover voluntariness of risk-taking, individual reducibility of risk, knowledge and experience, endangerment, subjective damage rating, and subjective recurrence frequency perception. The approach assigns an individual weight to each PAF to represent its impact magnitude. The quantification of these weights is target-group-dependent (e.g. experts, laypersons) and may be carried out with psychometric methods. The novel approach is subjected to a plausibility check using data from an expert workshop. A first model application is conducted using data from an empirical risk perception study in Western Germany to deduce the PAF and weight quantification, as well as to confirm and evaluate model applicability and flexibility. The main fields of application will be the quantification of risk perception by individual persons in a formal and technical way, e.g. for risk communication purposes, illustrating the differing perspectives of experts and non-experts. For decision-making processes this model will have to be applied with caution, since it is by definition not designed to quantify risk acceptance or risk evaluation. The approach may well explain how risk perception differs, but not why it differs. The formal model generates only "snapshots" and considers neither the socio-cultural nor the historical context of risk perception, since it is a highly individualistic and non-contextual approach.
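
    The record states only that the model integrates the effective individual risk with a weighted mean of the perception-affecting factors; the exact functional form is not given. The sketch below assumes, purely for illustration, a multiplicative combination, and all scores and weights are hypothetical.

```python
def weighted_mean(values, weights):
    """Weighted mean of perception-affecting factor (PAF) scores."""
    return sum(v * w for v, w in zip(values, weights)) / sum(weights)

def perceived_risk(r_eff, paf_scores, paf_weights):
    """Illustrative combination only: scale the effective risk by the PAF weighted mean.
    The actual model's functional form is not specified in this record."""
    return r_eff * weighted_mean(paf_scores, paf_weights)

# Hypothetical PAF scores (voluntariness, reducibility, knowledge/experience,
# endangerment, subjective damage, subjective recurrence), scored 0-2
scores = [1.6, 1.2, 0.8, 1.4, 1.5, 1.1]
weights = [0.25, 0.15, 0.10, 0.20, 0.20, 0.10]  # target-group-dependent weights
print(f"perceived risk ≈ {perceived_risk(0.01, scores, weights):.4f}")
```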

  4. High-throughput quantification of hydroxyproline for determination of collagen.

    PubMed

    Hofman, Kathleen; Hall, Bronwyn; Cleaver, Helen; Marshall, Susan

    2011-10-15

    An accurate and high-throughput assay for collagen is essential for collagen research and development of collagen products. Hydroxyproline is routinely assayed to provide a measurement for collagen quantification. The time required for sample preparation using acid hydrolysis and neutralization prior to assay is what limits the current method for determining hydroxyproline. This work describes the conditions of alkali hydrolysis that, when combined with the colorimetric assay defined by Woessner, provide a high-throughput, accurate method for the measurement of hydroxyproline. Copyright © 2011 Elsevier Inc. All rights reserved.
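
    Converting a hydroxyproline measurement into a collagen estimate uses a fixed mass-fraction factor. The sketch below assumes the commonly quoted figure that hydroxyproline makes up roughly 13-14% of collagen by mass; this factor is tissue- and species-dependent and is not stated in this record, and the measured value is invented.

```python
def collagen_from_hydroxyproline(hyp_ug, hyp_mass_fraction=0.135):
    """Estimate collagen mass from hydroxyproline mass, assuming hydroxyproline is
    ~13.5% of collagen by mass (tissue-dependent; adjust for the material at hand)."""
    return hyp_ug / hyp_mass_fraction

# Hypothetical: 42 ug hydroxyproline measured in a hydrolysed tissue sample
print(f"collagen ≈ {collagen_from_hydroxyproline(42.0):.0f} ug")
```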

  5. Electrochemical Branched-DNA Assay for Polymerase Chain Reaction-Free Detection and Quantification of Oncogenes in Messenger RNA

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lee, Ai Cheng; Dai, Ziyu; Chen, Baowei

    2008-12-01

    We describe a novel electrochemical branched-DNA (bDNA) assay for polymerase chain reaction (PCR)-free detection and quantification of the p185 BCR-ABL leukemia fusion transcript in the population of messenger RNA (mRNA) extracted from cell lines. The bDNA amplifier carrying a high loading of alkaline phosphatase (ALP) tracers was used to amplify the target signal. The targets were captured on microplate well surfaces through cooperative sandwich hybridization prior to the labeling of bDNA. The activity of captured ALP was monitored by square-wave voltammetric (SWV) analysis of the electroactive enzymatic product in the presence of 1-naphthyl phosphate. The specificity and sensitivity of the assay enabled direct detection of the target transcript in as little as 4.6 ng mRNA without PCR amplification. In combination with the use of a well-quantified standard, the electrochemical bDNA assay was capable of direct use for PCR-free quantitative analysis of the target transcript in a total mRNA population. The approach thus provides a simple, sensitive, accurate and quantitative tool, alternative to RQ-PCR, for early disease diagnosis.

  6. StatSTEM: An efficient approach for accurate and precise model-based quantification of atomic resolution electron microscopy images.

    PubMed

    De Backer, A; van den Bos, K H W; Van den Broek, W; Sijbers, J; Van Aert, S

    2016-12-01

    An efficient model-based estimation algorithm is introduced to quantify the atomic column positions and intensities from atomic resolution (scanning) transmission electron microscopy ((S)TEM) images. This algorithm uses the least squares estimator on image segments containing individual columns, fully accounting for overlap between neighbouring columns and enabling the analysis of a large field of view. For this algorithm, the accuracy and precision with which the atomic column positions and scattering cross-sections can be estimated from annular dark field (ADF) STEM images have been investigated. The highest attainable precision is reached even for low-dose images. Furthermore, the advantages of the model-based approach taking into account overlap between neighbouring columns are highlighted. This is done for the estimation of the distance between two neighbouring columns as a function of their separation, and for the estimation of the scattering cross-section, which is compared to the integrated intensity from a Voronoi cell. To provide end-users with this well-established quantification method, a user-friendly program, StatSTEM, has been developed, which is freely available under a GNU public license. Copyright © 2016 Elsevier B.V. All rights reserved.
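
    The model-based estimation described here amounts to least-squares fitting of peaked intensity models to image segments around each column. The sketch below fits a single isotropic 2D Gaussian to a synthetic column and integrates it as a cross-section proxy; it is a schematic stand-in for StatSTEM's full multi-column model, and all parameter values are synthetic.

```python
import numpy as np
from scipy.optimize import curve_fit

def gaussian2d(coords, x0, y0, height, width, background):
    """Isotropic 2D Gaussian on a flat background, flattened for curve_fit."""
    x, y = coords
    return (background + height * np.exp(-((x - x0)**2 + (y - y0)**2) / (2 * width**2))).ravel()

# Synthetic 15x15 pixel segment containing one atomic column plus noise
rng = np.random.default_rng(2)
x, y = np.meshgrid(np.arange(15), np.arange(15))
truth = gaussian2d((x, y), 7.3, 6.8, 120.0, 1.8, 10.0).reshape(15, 15)
image = truth + rng.normal(0, 2.0, truth.shape)

popt, _ = curve_fit(gaussian2d, (x, y), image.ravel(), p0=(7, 7, 100, 2, 0))
x0, y0, height, width, background = popt
cross_section = 2 * np.pi * height * width**2  # integrated intensity above background
print(f"position ≈ ({x0:.2f}, {y0:.2f}), integrated intensity ≈ {cross_section:.0f}")
```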

  7. Recent advances on multidimensional liquid chromatography-mass spectrometry for proteomics: from qualitative to quantitative analysis--a review.

    PubMed

    Wu, Qi; Yuan, Huiming; Zhang, Lihua; Zhang, Yukui

    2012-06-20

    With the acceleration of proteome research, increasing attention has been paid to multidimensional liquid chromatography-mass spectrometry (MDLC-MS) due to its high peak capacity and separation efficiency. Recently, many efforts have been put to improve MDLC-based strategies including "top-down" and "bottom-up" to enable highly sensitive qualitative and quantitative analysis of proteins, as well as accelerate the whole analytical procedure. Integrated platforms with combination of sample pretreatment, multidimensional separations and identification were also developed to achieve high throughput and sensitive detection of proteomes, facilitating highly accurate and reproducible quantification. This review summarized the recent advances of such techniques and their applications in qualitative and quantitative analysis of proteomes. Copyright © 2012 Elsevier B.V. All rights reserved.

  8. Quantitative carbon detector for enhanced detection of molecules in foods, pharmaceuticals, cosmetics, flavors, and fuels.

    PubMed

    Beach, Connor A; Krumm, Christoph; Spanjers, Charles S; Maduskar, Saurabh; Jones, Andrew J; Dauenhauer, Paul J

    2016-03-07

    Analysis of trace compounds, such as pesticides and other contaminants, within consumer products, fuels, and the environment requires quantification of increasingly complex mixtures of difficult-to-quantify compounds. Many compounds of interest are non-volatile and exhibit poor response in current gas chromatography and flame ionization systems. Here we show the conversion of trimethylsilylated chemical analytes to methane using a quantitative carbon detector (QCD; the Polyarc™ reactor) within a gas chromatograph (GC), thereby enabling enhanced detection (up to 10×) of highly functionalized compounds including carbohydrates, acids, drugs, flavorants, and pesticides. Analysis of a complex mixture of compounds shows that the GC-QCD method provides faster and more accurate analysis of the complex mixtures commonly encountered in everyday products and the environment.
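
    The premise of a carbon detector of this kind is that, after full conversion of eluting analytes to methane, every carbon atom contributes the same FID response, so a single per-carbon response factor from any calibrant suffices. The sketch below illustrates that per-carbon calibration with hypothetical areas and compounds; it is not the vendor's or authors' calibration procedure.

```python
def moles_analyte(peak_area, n_carbons, area_per_mole_carbon):
    """With all carbon converted to methane, response is proportional to carbon number:
    moles = area / (n_C * per-carbon response factor from any calibrant)."""
    return peak_area / (n_carbons * area_per_mole_carbon)

# Hypothetical: calibrate the per-carbon response with hexane (6 C), then
# quantify a 6-carbon sugar without its own reference standard
area_per_c = 4.2e9 / (6 * 1.0e-6)          # hexane: area 4.2e9 for 1 umol injected
print(f"{moles_analyte(2.9e9, 6, area_per_c) * 1e6:.3f} umol analyte")
```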

  9. Learning the Task Management Space of an Aircraft Approach Model

    NASA Technical Reports Server (NTRS)

    Krall, Joseph; Menzies, Tim; Davies, Misty

    2014-01-01

    Validating models of airspace operations is a particular challenge. These models are often aimed at finding and exploring safety violations, and aim to be accurate representations of real-world behavior. However, the rules governing the behavior are quite complex: nonlinear physics, operational modes, human behavior, and stochastic environmental concerns all determine the responses of the system. In this paper, we present a study on aircraft runway approaches as modeled in Georgia Tech's Work Models that Compute (WMC) simulation. We use a new learner, Genetic-Active Learning for Search-Based Software Engineering (GALE) to discover the Pareto frontiers defined by cognitive structures. These cognitive structures organize the prioritization and assignment of tasks of each pilot during approaches. We discuss the benefits of our approach, and also discuss future work necessary to enable uncertainty quantification.

  10. Robust tracking and quantification of C. elegans body shape and locomotion through coiling, entanglement, and omega bends

    PubMed Central

    Roussel, Nicolas; Sprenger, Jeff; Tappan, Susan J; Glaser, Jack R

    2014-01-01

    The behavior of the well-characterized nematode, Caenorhabditis elegans (C. elegans), is often used to study the neurologic control of sensory and motor systems in models of health and neurodegenerative disease. To advance the quantification of behaviors to match the progress made in the breakthroughs of genetics, RNA, proteins, and neuronal circuitry, analysis must be able to extract subtle changes in worm locomotion across a population. The analysis of worm crawling motion is complex due to self-overlap, coiling, and entanglement. Using current techniques, the scope of the analysis is typically restricted to worms in their non-occluded, uncoiled state, which is incomplete and fundamentally biased. Using a model describing the worm shape and crawling motion, we designed a deformable shape estimation algorithm that is robust to coiling and entanglement. This model-based shape estimation algorithm has been incorporated into a framework in which multiple worms can be automatically detected and tracked simultaneously throughout the entire video sequence, thereby increasing throughput as well as data validity. The newly developed algorithms were validated against 10 manually labeled datasets obtained from video sequences comprising various image resolutions and video frame rates. The data presented demonstrate that the tracking methods incorporated in WormLab enable stable and accurate detection of these worms through coiling and entanglement. Such challenging tracking scenarios are common occurrences during normal worm locomotion. The ability of the described approach to provide stable and accurate detection of C. elegans is critical to achieving unbiased locomotory analysis of worm motion. PMID:26435884

  11. Ultrasound guided fluorescence molecular tomography with improved quantification by an attenuation compensated born-normalization and in vivo preclinical study of cancer

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Baoqiang; Berti, Romain; Abran, Maxime

    2014-05-15

    Ultrasound imaging, having the advantages of low cost and non-invasiveness over MRI and X-ray CT, has been reported by several studies as an adequate complement to fluorescence molecular tomography, with the prospect of improving localization and quantification of fluorescent molecular targets in vivo. Building on previous work, an improved dual-modality fluorescence-ultrasound imaging system was developed and then validated in an imaging study with a preclinical tumor model. Ultrasound imaging and a profilometer were used to obtain the anatomical prior information and the 3D surface, respectively, to precisely extract the tissue boundary on both sides of the sample and thereby achieve improved fluorescence reconstruction. Furthermore, a pattern-based fluorescence reconstruction on the detection side was incorporated to enable dimensional reduction of the dataset while keeping the information useful for reconstruction. To account for the current imaging geometry and the chosen reconstruction technique, we developed an attenuation-compensated Born-normalization method to reduce attenuation effects and cancel out experimental factors when collecting quantitative fluorescence datasets over a large area. Results of both simulation and phantom studies demonstrated that fluorescent targets could be recovered accurately and quantitatively using this reconstruction mechanism. Finally, an in vivo experiment confirmed that the imaging system, together with the proposed image reconstruction approach, was able to extract both functional and anatomical information, thereby improving quantification and localization of molecular targets.

  12. Strawberry: Fast and accurate genome-guided transcript reconstruction and quantification from RNA-Seq.

    PubMed

    Liu, Ruolin; Dickerson, Julie

    2017-11-01

    We propose a novel method and software tool, Strawberry, for transcript reconstruction and quantification from RNA-Seq data under the guidance of genome alignment and independent of gene annotation. Strawberry consists of two modules: assembly and quantification. The novelty of Strawberry is that the two modules use different optimization frameworks but utilize the same data graph structure, which allows a highly efficient, expandable and accurate algorithm for dealing with large data. The assembly module parses aligned reads into splicing graphs, and uses network flow algorithms to select the most likely transcripts. The quantification module uses a latent class model to assign read counts from the nodes of splicing graphs to transcripts. Strawberry simultaneously estimates the transcript abundances and corrects for sequencing bias through an EM algorithm. Based on simulations, Strawberry outperforms Cufflinks and StringTie in terms of both assembly and quantification accuracy. On a real data set, the transcript expression estimated by Strawberry has the highest correlation with NanoString probe counts, an independent experimental measure of transcript expression. Strawberry is written in C++14, and is available as open source software at https://github.com/ruolin/strawberry under the MIT license.
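    The quantification step described here, assigning read counts that are compatible with several transcripts via a latent class model and an EM algorithm, can be illustrated with a generic read-assignment EM. The sketch below is not Strawberry's actual model (which operates on splicing-graph nodes and corrects for sequencing bias); it only shows the expectation-maximization idea with a toy compatibility matrix.

```python
import numpy as np

# Generic EM sketch for apportioning ambiguous read counts among transcripts.
# compat[i, j] = 1 if read class i is compatible with transcript j;
# counts[i] = number of reads in class i. All values are toy examples.
compat = np.array([[1, 0], [1, 1], [0, 1]], dtype=float)
counts = np.array([100.0, 300.0, 50.0])
theta = np.full(compat.shape[1], 1.0 / compat.shape[1])   # transcript abundances

for _ in range(200):
    # E-step: fractionally assign each read class to its compatible transcripts
    weights = compat * theta
    weights /= weights.sum(axis=1, keepdims=True)
    # M-step: re-estimate relative abundances from the expected assignments
    theta = (weights * counts[:, None]).sum(axis=0)
    theta /= theta.sum()

print(theta)   # relative transcript abundances
```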

  13. Development and Optimization of a Dedicated, Hybrid Dual-Modality SPECT-CmT System for Improved Breast Lesion Diagnosis

    DTIC Science & Technology

    2010-01-01

    ...throughout the entire 3D volume, which made quantification of the different tissues in the breast possible. The peaks representing glandular and fat in... coefficients. Keywords: tissue quantification, absolute attenuation coefficient, scatter correction, computed tomography, tomography... tissue types.1-4 Accurate measurements of the quantification and differentiation of numerous tissues can be useful to identify disease from...

  14. Automatic Assessment of Acquisition and Transmission Losses in Indian Remote Sensing Satellite Data

    NASA Astrophysics Data System (ADS)

    Roy, D.; Purna Kumari, B.; Manju Sarma, M.; Aparna, N.; Gopal Krishna, B.

    2016-06-01

    The quality of Remote Sensing data is an important parameter that defines the extent of its usability in various applications. The data from Remote Sensing satellites are received as raw data frames at the ground station. These data may be corrupted by losses due to interference during data transmission, data acquisition, and sensor anomalies. It is therefore important to assess the quality of the raw data before product generation, for early anomaly detection, faster corrective actions, and minimization of product rejection. Manual screening of raw images is a time-consuming process and not very accurate. In this paper, an automated process for identification and quantification of losses in raw data, such as pixel drop-out, line loss and data loss due to sensor anomalies, is discussed. Quality assessment of raw scenes based on these losses is also explained. This process is introduced in the data pre-processing stage and gives crucial data quality information to users at the time of browsing data for product ordering. It has also improved the product generation workflow by enabling faster and more accurate quality estimation.
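    The loss metrics described (pixel drop-outs, lost scan lines, and an overall quality flag) can be computed directly from a raw frame held as an array, as in the sketch below; the zero-value drop-out convention and the thresholds are assumptions for illustration, not the agency's actual screening criteria.

```python
import numpy as np

# Illustrative sketch of raw-frame loss assessment: count drop-out pixels and
# mostly-blank scan lines, then flag the frame. Thresholds are assumptions.
def assess_raw_frame(frame: np.ndarray, dropout_value: int = 0,
                     line_loss_fraction: float = 0.9) -> dict:
    dropouts = frame == dropout_value
    lost_lines = dropouts.mean(axis=1) >= line_loss_fraction  # mostly-blank scan lines
    return {
        "pixel_dropout_pct": 100.0 * dropouts.mean(),
        "lost_line_count": int(lost_lines.sum()),
        "quality_flag": "REJECT" if lost_lines.mean() > 0.05 else "ACCEPT",
    }

frame = np.random.randint(1, 255, size=(1024, 1024), dtype=np.uint8)
frame[100:103, :] = 0   # simulate three lost lines
print(assess_raw_frame(frame))
```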

  15. Automated Dispersion and Orientation Analysis for Carbon Nanotube Reinforced Polymer Composites

    PubMed Central

    Gao, Yi; Li, Zhuo; Lin, Ziyin; Zhu, Liangjia; Tannenbaum, Allen; Bouix, Sylvain; Wong, C.P.

    2012-01-01

    The properties of carbon nanotube (CNT)/polymer composites are strongly dependent on the dispersion and orientation of CNTs in the host matrix. Quantification of the dispersion and orientation of CNTs by microstructure observation and image analysis has been demonstrated as a useful way to understand the structure-property relationship of CNT/polymer composites. However, due to the varied morphologies and the large number of CNTs in one image, automatic and accurate identification of CNTs has become the bottleneck for dispersion/orientation analysis. To solve this problem, shape identification is performed for each pixel in the filler identification step, so that individual CNTs can be extracted from images automatically. The improved filler identification enables more accurate analysis of CNT dispersion and orientation. The obtained dispersion index and orientation index of both synthetic and real images from model compounds correspond well with the observations. Moreover, these indices help to explain the electrical properties of a CNT/silicone composite, which is used as a model compound. This method can also be extended to other polymer composites with high-aspect-ratio fillers. PMID:23060008

  16. Microstructural Quantification, Property Prediction, and Stochastic Reconstruction of Heterogeneous Materials Using Limited X-Ray Tomography Data

    NASA Astrophysics Data System (ADS)

    Li, Hechao

    An accurate knowledge of the complex microstructure of a heterogeneous material is crucial for establishing quantitative structure-property relations and for predicting and optimizing its performance. X-ray tomography has provided a non-destructive means for microstructure characterization in both 3D and 4D (i.e., structural evolution over time). Traditional reconstruction algorithms like the filtered-back-projection (FBP) method or algebraic reconstruction techniques (ART) require a huge number of tomographic projections and a segmentation step before microstructural quantification can be conducted, which can be quite time-consuming and computationally intensive. In this thesis, a novel procedure is first presented that allows one to directly extract key structural information, in the form of spatial correlation functions, from limited x-ray tomography data. The key component of the procedure is the computation of a "probability map", which provides the probability of an arbitrary point in the material system belonging to a specific phase. The correlation functions of interest are then readily computed from the probability map. Using effective medium theory, accurate predictions of physical properties (e.g., elastic moduli) can be obtained. Secondly, a stochastic optimization procedure is presented that enables one to accurately reconstruct material microstructure from a small number of x-ray tomographic projections (e.g., 20-40). Moreover, a stochastic procedure for multi-modal data fusion is proposed, in which both X-ray projections and correlation functions computed from limited 2D optical images are fused to accurately reconstruct complex heterogeneous materials in 3D. This multi-modal reconstruction algorithm is shown to integrate the complementary data into an effective optimization procedure, which indicates its high efficiency in using limited structural information. Finally, the accuracy of the stochastic reconstruction procedure using limited X-ray projection data is ascertained by analyzing the microstructural degeneracy and the roughness of the energy landscape associated with different numbers of projections. The ground-state degeneracy of a microstructure is found to decrease with an increasing number of projections, which indicates a higher probability that the reconstructed configurations match the actual microstructure. The roughness of the energy landscape can also provide information about the complexity and convergence behavior of the reconstruction for a given microstructure and projection number.
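    The "probability map" idea lends itself to a compact illustration: once each voxel carries a probability of belonging to the phase of interest, a two-point correlation function can be estimated by autocorrelation. The FFT-based estimator below assumes periodic boundaries and is a generic sketch, not the thesis implementation.

```python
import numpy as np

# Sketch: two-point correlation S2(r) estimated from a probability map p(x),
# i.e. the mean of p(x) * p(x + r) over all x, computed via FFT autocorrelation
# under periodic boundary conditions. The random map below is a stand-in.
def two_point_correlation(prob_map: np.ndarray) -> np.ndarray:
    f = np.fft.fftn(prob_map)
    auto = np.fft.ifftn(f * np.conj(f)).real / prob_map.size   # <p(x) p(x + r)>
    return np.fft.fftshift(auto)

p = np.random.rand(64, 64, 64)      # stand-in probability map
s2 = two_point_correlation(p)
print(s2.shape, s2[32, 32, 32])     # center value (r = 0) is the mean of p(x)**2
```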

  17. Methods to Detect Nitric Oxide and its Metabolites in Biological Samples

    PubMed Central

    Bryan, Nathan S.; Grisham, Matthew B.

    2007-01-01

    Nitric oxide (NO) methodology is a complex and often confusing science and the focus of many debates and discussions concerning NO biochemistry. NO is involved in many physiological processes including regulation of blood pressure, immune response and neural communication. Therefore its accurate detection and quantification are critical to understanding health and disease. Due to the extremely short physiological half-life of this gaseous free radical, alternative strategies have been developed for the detection of the reaction products of NO biochemistry. The quantification of NO metabolites in biological samples provides valuable information with regard to in vivo NO production, bioavailability and metabolism. Simply sampling a single compartment such as blood or plasma may not always provide an accurate assessment of whole-body NO status, particularly in tissues. Therefore, extrapolation of plasma or blood NO status to specific tissues of interest is no longer a valid approach. As a result, methods continue to be developed and validated that allow the detection and quantification of NO and NO-related products/metabolites in multiple compartments of experimental animals in vivo. The methods described in this review are not an exhaustive or comprehensive discussion of all methods available for the detection of NO, but rather a description of the most commonly used and practical methods that allow accurate and sensitive quantification of NO products/metabolites in multiple biological matrices under normal physiological conditions. PMID:17664129

  18. Developing a real-time PCR assay for direct identification and quantification of Pratylenchus penetrans in soil

    USDA-ARS?s Scientific Manuscript database

    The root-lesion nematode Pratylenchus penetrans is a major pathogen of potato world-wide. Yield losses may be exacerbated by interaction with the fungus Verticillium dahliae in the Potato early dying disease complex. Accurate identification and quantification of P. penetrans prior to planting are es...

  19. Quantification of Efficiency of Beneficiation of Lunar Regolith

    NASA Technical Reports Server (NTRS)

    Trigwell, Steve; Lane, John; Captain, James; Weis, Kyle; Quinn, Jacqueline; Watanabe, Fumiya

    2011-01-01

    Electrostatic beneficiation of lunar regolith is being researched at Kennedy Space Center to enhance the ilmenite concentration of the regolith for the production of oxygen in in-situ resource utilization on the lunar surface. Ilmenite enrichment of up to 200% was achieved using lunar simulants. For the most accurate quantification of the regolith particles, standard petrographic methods are typically followed, but in order to optimize the process, many hundreds of samples were generated in this study, which made the standard analysis methods time prohibitive. In the current studies, X-ray photoelectron spectroscopy (XPS) and scanning electron microscopy/energy dispersive spectroscopy (SEM/EDS) were used, which could automatically and quickly analyze many separated fractions of lunar simulant. In order to test the accuracy of the quantification, test mixture samples of known quantities of ilmenite (2, 5, 10, and 20 wt%) in silica (pure quartz powder) were analyzed by XPS and EDS. The results showed that quantification of low concentrations of ilmenite in silica could be accurately achieved by both XPS and EDS, within the known limitations of the techniques.

  20. The effect of applied transducer force on acoustic radiation force impulse quantification within the left lobe of the liver.

    PubMed

    Porra, Luke; Swan, Hans; Ho, Chien

    2015-08-01

    Introduction: Acoustic Radiation Force Impulse (ARFI) Quantification measures shear wave velocities (SWVs) within the liver. It is a reliable method for predicting the severity of liver fibrosis and has the potential to assess fibrosis in any part of the liver, but previous research has found ARFI quantification in the right lobe more accurate than in the left lobe. A lack of standardised applied transducer force when performing ARFI quantification in the left lobe of the liver may account for some of this inaccuracy. The research hypothesis of this present study predicted that an increase in applied transducer force would result in an increase in SWVs measured. Methods: ARFI quantification within the left lobe of the liver was performed within a group of healthy volunteers (n = 28). During each examination, each participant was subjected to ARFI quantification at six different levels of transducer force applied to the epigastric abdominal wall. Results: A repeated measures ANOVA test showed that ARFI quantification was significantly affected by applied transducer force (p = 0.002). Significant pairwise comparisons using Bonferroni correction for multiple comparisons showed that with an increase in applied transducer force, there was a decrease in SWVs. Conclusion: Applied transducer force has a significant effect on SWVs within the left lobe of the liver and it may explain some of the less accurate and less reliable results in previous studies where transducer force was not taken into consideration. Future studies in the left lobe of the liver should take this into account and control for applied transducer force.

  1. Quantification of Fibrosis and Osteosclerosis in Myeloproliferative Neoplasms: A Computer-Assisted Image Study

    PubMed Central

    Teman, Carolin J.; Wilson, Andrew R.; Perkins, Sherrie L.; Hickman, Kimberly; Prchal, Josef T.; Salama, Mohamed E.

    2010-01-01

    Evaluation of bone marrow fibrosis and osteosclerosis in myeloproliferative neoplasms (MPN) is subject to interobserver inconsistency. Performance data for currently utilized fibrosis grading systems are lacking, and classification scales for osteosclerosis do not exist. Digital imaging can serve as a quantification method for fibrosis and osteosclerosis. We used digital imaging techniques for trabecular area assessment and reticulin-fiber quantification. Patients with all Philadelphia negative MPN subtypes had higher trabecular volume than controls (p ≤0.0015). Results suggest that the degree of osteosclerosis helps differentiate primary myelofibrosis from other MPN. Numerical quantification of fibrosis highly correlated with subjective scores, and interobserver correlation was satisfactory. Digital imaging provides accurate quantification for osteosclerosis and fibrosis. PMID:20122729

  2. A Lagrangian cylindrical coordinate system for characterizing dynamic surface geometry of tubular anatomic structures.

    PubMed

    Lundh, Torbjörn; Suh, Ga-Young; DiGiacomo, Phillip; Cheng, Christopher

    2018-03-03

    Vascular morphology characterization is useful for disease diagnosis, risk stratification, treatment planning, and prediction of treatment durability. To quantify the dynamic surface geometry of tubular-shaped anatomic structures, we propose a simple, rigorous Lagrangian cylindrical coordinate system to monitor well-defined surface points. Specifically, the proposed system enables quantification of surface curvature and cross-sectional eccentricity. Using idealized software phantom examples, we validate the method's ability to accurately quantify longitudinal and circumferential surface curvature, as well as eccentricity and orientation of eccentricity. We then apply the method to several medical imaging data sets of human vascular structures to exemplify the utility of this coordinate system for analyzing morphology and dynamic geometric changes in blood vessels throughout the body. Graphical abstract: pointwise longitudinal curvature of a thoracic aortic endograft surface for systole and diastole, with their absolute difference.
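    As a hedged illustration of the curvature quantity being validated, pointwise curvature along any sampled line of tracked surface points can be approximated from consecutive point triplets (Menger curvature, the reciprocal of the circumscribed-circle radius). This illustrates the geometric quantity only; it is not the authors' Lagrangian cylindrical-coordinate implementation.

```python
import numpy as np

# Pointwise curvature of a sampled 3D curve via the Menger curvature of
# consecutive point triplets: curvature = 4 * triangle area / (|a| |b| |c|).
def menger_curvature(points: np.ndarray) -> np.ndarray:
    p0, p1, p2 = points[:-2], points[1:-1], points[2:]
    a = np.linalg.norm(p1 - p0, axis=1)
    b = np.linalg.norm(p2 - p1, axis=1)
    c = np.linalg.norm(p2 - p0, axis=1)
    area = 0.5 * np.linalg.norm(np.cross(p1 - p0, p2 - p0), axis=1)
    return 4.0 * area / (a * b * c)          # 1 / circumscribed-circle radius

t = np.linspace(0, np.pi, 50)
curve = np.c_[np.cos(t), np.sin(t), 0.1 * t]   # synthetic surface line
print(menger_curvature(curve)[:5])
```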

  3. Perfusion quantification in contrast-enhanced ultrasound (CEUS)--ready for research projects and routine clinical use.

    PubMed

    Tranquart, F; Mercier, L; Frinking, P; Gaud, E; Arditi, M

    2012-07-01

    With contrast-enhanced ultrasound (CEUS) now established as a valuable imaging modality for many applications, a more specific demand has recently emerged for quantifying perfusion and using measured parameters as objective indicators for various disease states. However, CEUS perfusion quantification remains challenging and is not well integrated in daily clinical practice. The development of VueBox™ alleviates existing limitations and enables quantification in a standardized way. VueBox™ operates as an off-line software application, after dynamic contrast-enhanced ultrasound (DCE-US) is performed. It enables linearization of DICOM clips, assessment of perfusion using patented curve-fitting models, and generation of parametric images by synthesizing perfusion information at the pixel level using color coding. VueBox™ is compatible with most of the available ultrasound platforms (nonlinear contrast-enabled), has the ability to process both bolus and disruption-replenishment kinetics loops, allows analysis results and their context to be saved, and generates analysis reports automatically. Specific features have been added to VueBox™, such as fully automatic in-plane motion compensation and an easy-to-use clip editor. Processing time has been reduced as a result of parallel programming optimized for multi-core processors. A long list of perfusion parameters is available for each of the two administration modes to address all possible demands currently reported in the literature for diagnosis or treatment monitoring. In conclusion, VueBox™ is a valid and robust quantification tool to be used for standardizing perfusion quantification and to improve the reproducibility of results across centers. © Georg Thieme Verlag KG Stuttgart · New York.

  4. Recent advances in stable isotope labeling based techniques for proteome relative quantification.

    PubMed

    Zhou, Yuan; Shan, Yichu; Zhang, Lihua; Zhang, Yukui

    2014-10-24

    The large-scale relative quantification of all proteins expressed in biological samples under different states is of great importance for discovering proteins with important biological functions, as well as for screening disease-related biomarkers and drug targets. Therefore, the accurate quantification of proteins at the proteome level has become one of the key issues in protein science. Herein, recent advances in stable isotope labeling-based techniques for proteome relative quantification are reviewed, covering metabolic labeling, chemical labeling and enzyme-catalyzed labeling. Furthermore, future research directions in this field are discussed. Copyright © 2014 Elsevier B.V. All rights reserved.

  5. Automatic detection of new tumors and tumor burden evaluation in longitudinal liver CT scan studies.

    PubMed

    Vivanti, R; Szeskin, A; Lev-Cohain, N; Sosna, J; Joskowicz, L

    2017-11-01

    Radiological longitudinal follow-up of liver tumors in CT scans is the standard of care for disease progression assessment and for liver tumor therapy. Finding new tumors in the follow-up scan is essential to determine malignancy, to evaluate the total tumor burden, and to determine treatment efficacy. Since new tumors are typically small, they may be missed by examining radiologists. We describe a new method for the automatic detection and segmentation of new tumors in longitudinal liver CT studies and for liver tumor burden quantification. Its inputs are the baseline and follow-up CT scans, the baseline tumor delineations, and a tumor appearance prior model. Its outputs are the new tumor segmentations in the follow-up scan, the tumor burden quantification in both scans, and the tumor burden change. Our method is the first comprehensive method that is explicitly designed to find new liver tumors. It integrates information from the scans, the baseline known tumor delineations, and a tumor appearance prior model in the form of a global convolutional neural network classifier. Unlike other deep learning-based methods, it does not require large tagged training sets. Our experimental results on 246 tumors, of which 97 were new, from 37 longitudinal liver CT studies with radiologist-approved ground-truth segmentations, yield a true positive new-tumor detection rate of 86% versus 72% with stand-alone detection, and a tumor burden volume overlap error of 16%. New tumor detection and tumor burden volumetry are important for diagnosis and treatment. Our new method enables a simplified, radiologist-friendly workflow that is potentially more accurate and reliable than the existing one by automatically and accurately following known tumors and detecting new tumors in the follow-up scan.

  6. Covariation of Peptide Abundances Accurately Reflects Protein Concentration Differences*

    PubMed Central

    Pirmoradian, Mohammad

    2017-01-01

    Most implementations of mass spectrometry-based proteomics involve enzymatic digestion of proteins, expanding the analysis to multiple proteolytic peptides for each protein. Currently, there is no consensus on how to summarize peptide abundances into protein concentrations, and such efforts are complicated by the fact that error control is normally applied to the identification process and does not directly control errors linking peptide abundance measures to protein concentration. Peptides resulting from suboptimal digestion or being partially modified are not representative of the protein concentration. Without a mechanism to remove such unrepresentative peptides, their abundance adversely impacts the estimation of their protein's concentration. Here, we present a relative quantification approach, Diffacto, that applies factor analysis to extract the covariation of peptide abundances. The method enables a weighted geometrical average summarization and automatic elimination of incoherent peptides. We demonstrate, based on a set of controlled label-free experiments using standard mixtures of proteins, that the covariation structure extracted by the factor analysis accurately reflects protein concentrations. In the 1% peptide-spectrum match-level FDR data set, as many as 11% of the peptides have abundance differences incoherent with the other peptides attributed to the same protein. If not controlled, such contradictory peptide abundances have a severe impact on protein quantification. When adding the quantities of each protein's three most abundant peptides, we find that as many as 14% of the proteins are estimated as having a negative correlation with their actual concentration differences between samples. Diffacto reduced the proportion of such obviously incorrectly quantified proteins to 1.6%. Furthermore, by analyzing clinical data sets from two breast cancer studies, our method revealed the persistent proteomic signatures linked to three subtypes of breast cancer. We conclude that Diffacto can facilitate the interpretation and enhance the utility of most types of proteomics data. PMID:28302922
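    The summarization idea, weighting peptides by how well they covary with the dominant shared signal and discarding incoherent ones before a weighted geometric average, can be sketched as below. This is a simplified SVD-based stand-in, not Diffacto's actual factor-analysis model; the data, threshold, and weighting scheme are assumptions.

```python
import numpy as np

# Simplified covariation-based summarization: weight peptides by agreement with
# the leading shared abundance pattern, drop incoherent peptides, and average
# the surviving log2 abundances (a weighted geometric mean on the linear scale).
def summarize_protein(log_abund: np.ndarray, min_loading: float = 0.5) -> np.ndarray:
    """log_abund: peptides x samples matrix of log2 abundances."""
    centered = log_abund - log_abund.mean(axis=1, keepdims=True)
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    shared = vt[0]                                    # leading shared pattern across samples
    corr = np.array([np.corrcoef(row, shared)[0, 1] for row in centered])
    if np.median(corr) < 0:                           # SVD sign is arbitrary; orient with majority
        corr = -corr
    keep = corr >= min_loading                        # discard incoherent peptides
    w = corr[keep] / corr[keep].sum()
    return (w[:, None] * log_abund[keep]).sum(axis=0)

rng = np.random.default_rng(1)
base = np.array([10.0, 11.5, 9.0, 12.0])              # shared log2 protein profile
peptides = base + rng.normal(0, 0.2, size=(6, 4))     # coherent peptides
peptides[5] += rng.normal(0, 2.0, size=4)             # one incoherent peptide
print(summarize_protein(peptides))
```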

  7. Multi-laboratory comparison of quantitative PCR assays for detection and quantification of Fusarium virguliforme from soybean roots and soil

    USDA-ARS?s Scientific Manuscript database

    Accurate identification and quantification of Fusarium virguliforme, the cause of sudden death syndrome (SDS) in soybean, within root tissue and soil are important tasks. Several quantitative PCR (qPCR) assays have been developed but there are no reports comparing their use in sensitive and specific...

  8. Quick, sensitive and specific detection and evaluation of quantification of minor variants by high-throughput sequencing.

    PubMed

    Leung, Ross Ka-Kit; Dong, Zhi Qiang; Sa, Fei; Chong, Cheong Meng; Lei, Si Wan; Tsui, Stephen Kwok-Wing; Lee, Simon Ming-Yuen

    2014-02-01

    Minor variants have significant implications in quasispecies evolution, early cancer detection and non-invasive fetal genotyping, but their accurate detection by next-generation sequencing (NGS) is hampered by sequencing errors. We generated sequencing data from mixtures at predetermined ratios in order to provide insight into sequencing errors and variations that can arise and for which simulation cannot be performed. The information also enables better parameterization of depth of coverage, read quality and heterogeneity, library preparation techniques, and technical repeatability for mathematical modeling, theory development and simulation experimental design. We devised minor variant authentication rules that achieved 100% accuracy in both testing and validation experiments. The rules are free from tedious inspection of alignment accuracy, sequencing read quality or errors introduced by homopolymers. The authentication process only requires minor variants to: (1) have a minimum depth of coverage larger than 30; (2) be reported by (a) four or more variant callers, or (b) DiBayes or LoFreq, plus SNVer (or BWA when no results are returned by SNVer), with the interassay coefficient of variation (CV) no larger than 0.1. Quantification accuracy undermined by sequencing errors could be overcome neither by ultra-deep sequencing nor by recruiting more variant callers to reach a consensus, such that consistent underestimation and overestimation (i.e., low CV) were observed. To accommodate stochastic error and adjust the observed ratio within a specified accuracy, we present a proof of concept for the use of a double calibration curve for quantification, which provides an important reference towards potential industrial-scale fabrication of calibrants for NGS.
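    The authentication rules quoted above are simple enough to express directly in code. The sketch below reads the CV criterion as applying to the whole rule (the abstract's phrasing is slightly ambiguous on this point), and the field names of the variant record are invented for illustration.

```python
# Sketch of the minor-variant authentication rules from the abstract: depth of
# coverage > 30; reported by (a) four or more callers, or (b) DiBayes or LoFreq
# plus SNVer (or BWA when SNVer returns nothing); inter-assay CV <= 0.1. The
# dictionary field names are hypothetical.
def authenticate(variant: dict) -> bool:
    if variant["depth"] <= 30 or variant["interassay_cv"] > 0.1:
        return False
    callers = set(variant["callers"])
    if len(callers) >= 4:
        return True
    snver_ok = "SNVer" in callers or (variant.get("snver_no_result") and "BWA" in callers)
    return bool(callers & {"DiBayes", "LoFreq"}) and snver_ok

print(authenticate({"depth": 120, "callers": ["LoFreq", "SNVer"],
                    "interassay_cv": 0.06, "snver_no_result": False}))   # True
```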

  9. Headspace-Solid Phase Microextraction Approach for Dimethylsulfoniopropionate Quantification in Solanum lycopersicum Plants Subjected to Water Stress

    PubMed Central

    Catola, Stefano; Kaidala Ganesha, Srikanta Dani; Calamai, Luca; Loreto, Francesco; Ranieri, Annamaria; Centritto, Mauro

    2016-01-01

    Dimethylsulfoniopropionate (DMSP) and dimethyl sulphide (DMS) are compounds found mainly in marine phytoplankton and in some halophytic plants. DMS is a globally important biogenic volatile involved in regulating the global sulfur cycle and planetary albedo, whereas DMSP is involved in the maintenance of plant-environment homeostasis. Plants emit minute amounts of DMS compared to marine phytoplankton, and hypersensitive analytical techniques are needed to enable its quantification in plants. Headspace solid-phase microextraction (HS-SPME) is a simple, rapid, solvent-free and cost-effective extraction mode that can easily be hyphenated with GC-MS for the analysis of volatile organic compounds. Using tomato (Solanum lycopersicum) plants subjected to water stress as a model system, we standardized a sensitive and accurate protocol for detecting and quantifying DMSP pool sizes, and potential DMS emissions, in cryoextracted leaves. The method relies on the determination of free DMS and of DMS released from DMSP pools before and after alkaline hydrolysis via headspace solid-phase microextraction gas chromatography-mass spectrometry (HS-SPME-GC-MS). We found a significant (2.5-fold) increase of DMSP content in water-stressed leaves, reflecting clear stress to the photosynthetic apparatus. We hypothesize that the increased DMSP, and in turn DMS, in water-stressed leaves are produced from carbon sources other than direct photosynthesis, and function to protect plants either osmotically or as antioxidants. Finally, our results suggest that SPME is a powerful and suitable technique for the detection and quantification of biogenic gases in trace amounts. PMID:27602039

  10. Error correction in multi-fidelity molecular dynamics simulations using functional uncertainty quantification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Reeve, Samuel Temple; Strachan, Alejandro, E-mail: strachan@purdue.edu

    We use functional, Fréchet, derivatives to quantify how thermodynamic outputs of a molecular dynamics (MD) simulation depend on the potential used to compute atomic interactions. Our approach quantifies the sensitivity of the quantities of interest with respect to the input functions, as opposed to their parameters as is done in typical uncertainty quantification methods. We show that the functional sensitivity of the average potential energy and pressure in isothermal, isochoric MD simulations using Lennard-Jones two-body interactions can be used to accurately predict those properties for other interatomic potentials (with different functional forms) without re-running the simulations. This is demonstrated under three different thermodynamic conditions, namely a crystal at room temperature, a liquid at ambient pressure, and a high-pressure liquid. The method provides accurate predictions as long as the change in potential can be reasonably described to first order and does not significantly affect the region in phase space explored by the simulation. The functional uncertainty quantification approach can be used to estimate the uncertainties associated with constitutive models used in the simulation and to correct predictions if a more accurate representation becomes available.
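    The correction step described, predicting an observable for a different potential from the functional derivative obtained with the reference potential, amounts to a first-order functional Taylor expansion. The sketch below shows that arithmetic on a discretized pair potential; the derivative, potentials, and reference value are synthetic placeholders, not data from the paper.

```python
import numpy as np

# First-order functional correction: Q[V_new] ~= Q[V_ref] + integral of
# dQ/dV(r) * (V_new(r) - V_ref(r)) dr, evaluated on a grid. All arrays here
# are synthetic stand-ins for illustration only.
r = np.linspace(0.9, 3.0, 200)                   # pair distances (reduced units)
dQ_dV = np.exp(-(r - 1.2) ** 2 / 0.05)           # stand-in functional derivative
V_ref = 4.0 * (r ** -12 - r ** -6)               # Lennard-Jones reference potential
V_new = 4.2 * (r ** -12 - r ** -6)               # perturbed potential

Q_ref = -5.37                                    # observable from the reference run (made up)
dQ = np.sum(dQ_dV * (V_new - V_ref)) * (r[1] - r[0])   # discretized integral
print("predicted Q for the new potential:", Q_ref + dQ)
```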

  11. Precise montaging and metric quantification of retinal surface area from ultra-widefield fundus photography and fluorescein angiography.

    PubMed

    Croft, Daniel E; van Hemert, Jano; Wykoff, Charles C; Clifton, David; Verhoek, Michael; Fleming, Alan; Brown, David M

    2014-01-01

    Accurate quantification of retinal surface area from ultra-widefield (UWF) images is challenging due to warping produced when the retina is projected onto a two-dimensional plane for analysis. By accounting for this, the authors sought to precisely montage and accurately quantify retinal surface area in square millimeters. Montages were created using Optos 200Tx (Optos, Dunfermline, U.K.) images taken at different gaze angles. A transformation projected the images to their correct location on a three-dimensional model. Area was quantified with spherical trigonometry. Warping, precision, and accuracy were assessed. Uncorrected, posterior pixels represented up to 79% greater surface area than peripheral pixels. Assessing precision, a standard region was quantified across 10 montages of the same eye (RSD: 0.7%; mean: 408.97 mm(2); range: 405.34-413.87 mm(2)). Assessing accuracy, 50 patients' disc areas were quantified (mean: 2.21 mm(2); SE: 0.06 mm(2)), and the results fell within the normative range. By accounting for warping inherent in UWF images, precise montaging and accurate quantification of retinal surface area in square millimeters were achieved. Copyright 2014, SLACK Incorporated.
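    The spherical-trigonometry step can be illustrated with the solid angle of a triangle whose vertices lie on the eye model: summing spherical-triangle areas over a triangulated region gives its surface area in square millimeters. The sketch below uses a generic solid-angle formula and an assumed globe radius; it is not the Optos calibration or the authors' projection pipeline.

```python
import numpy as np

# Area of a spherical triangle from three vertices on a sphere of radius R:
# solid angle (Van Oosterom-Strackee formula) times R squared.
def spherical_triangle_area(a, b, c, radius_mm):
    a, b, c = (v / np.linalg.norm(v) for v in (a, b, c))
    num = np.dot(a, np.cross(b, c))
    den = 1 + np.dot(a, b) + np.dot(b, c) + np.dot(c, a)
    omega = 2.0 * np.arctan2(num, den)             # solid angle in steradians
    return abs(omega) * radius_mm ** 2

R = 12.0  # assumed eye radius in mm (illustrative)
tri = (np.array([0, 0, 1.0]), np.array([0.1, 0, 1.0]), np.array([0, 0.1, 1.0]))
print(spherical_triangle_area(*tri, radius_mm=R), "mm^2")
```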

  12. FineSplice, enhanced splice junction detection and quantification: a novel pipeline based on the assessment of diverse RNA-Seq alignment solutions.

    PubMed

    Gatto, Alberto; Torroja-Fungairiño, Carlos; Mazzarotto, Francesco; Cook, Stuart A; Barton, Paul J R; Sánchez-Cabo, Fátima; Lara-Pezzi, Enrique

    2014-04-01

    Alternative splicing is the main mechanism governing protein diversity. The recent developments in RNA-Seq technology have enabled the study of the global impact and regulation of this biological process. However, the lack of standardized protocols constitutes a major bottleneck in the analysis of alternative splicing. This is particularly important for the identification of exon-exon junctions, which is a critical step in any analysis workflow. Here we performed a systematic benchmarking of alignment tools to dissect the impact of design and method on the mapping, detection and quantification of splice junctions from multi-exon reads. Accordingly, we devised a novel pipeline based on TopHat2 combined with a splice junction detection algorithm, which we have named FineSplice. FineSplice allows effective elimination of spurious junction hits arising from artefactual alignments, achieving up to 99% precision in both real and simulated data sets and yielding superior F1 scores under most tested conditions. The proposed strategy conjugates an efficient mapping solution with a semi-supervised anomaly detection scheme to filter out false positives and allows reliable estimation of expressed junctions from the alignment output. Ultimately this provides more accurate information to identify meaningful splicing patterns. FineSplice is freely available at https://sourceforge.net/p/finesplice/.

  13. Experimental Quantification of Pore-Scale Flow Phenomena in 2D Heterogeneous Porous Micromodels: Multiphase Flow Towards Coupled Solid-Liquid Interactions

    NASA Astrophysics Data System (ADS)

    Li, Y.; Kazemifar, F.; Blois, G.; Christensen, K. T.

    2017-12-01

    Geological sequestration of CO2 within saline aquifers is a viable technology for reducing CO2 emissions. Central to this goal is accurately predicting both the fidelity of candidate sites pre-injection of CO2 and its post-injection migration. Moreover, local fluid pressure buildup may cause activation of small pre-existing unidentified faults, leading to micro-seismic events, which could prove disastrous for societal acceptance of CCS, and possibly compromise seal integrity. Recent evidence shows that large-scale events are coupled with pore-scale phenomena, which necessitates the representation of pore-scale stress, strain, and multiphase flow processes in large-scale modeling. To this end, the pore-scale flow of water and liquid/supercritical CO2 is investigated under reservoir-relevant conditions, over a range of wettability conditions in 2D heterogeneous micromodels that reflect the complexity of a real sandstone. High-speed fluorescent microscopy, complemented by a fast differential pressure transmitter, allows for simultaneous measurement of the flow field within and the instantaneous pressure drop across the micromodels. A flexible micromodel is also designed and fabricated, to be used in conjunction with the micro-PIV technique, enabling the quantification of coupled solid-liquid interactions.

  14. Quantification of single-kidney glomerular filtration rate with electron-beam computed tomography

    NASA Astrophysics Data System (ADS)

    Lerman, Lilach O.; Ritman, Erik L.; Pelaez, Laura I.; Sheedy, Patrick F., II; Krier, James D.

    2000-04-01

    The ability to accurately and noninvasively quantify single-kidney GFR could be invaluable for assessment of renal function. We developed a model that enables this measurement with EBCT. To examine the reliability of this method, EBCT renal flow and volume studies after contrast media administration were performed in pigs with unilateral renal artery stenosis (Group 1), controls (Group 2), and simultaneously with inulin clearance (Group 3). Renal flow curves, obtained from the bilateral renal cortex and medulla, depicted transit of the contrast through the vascular and tubular compartments, and were fitted using extended gamma-variate functions. Renal blood flow was calculated as the sum of products of cortical and medullary perfusions and volumes. Normalized GFR (mL/min/cc) was calculated using the rate (maximal slope) of proximal tubular contrast accumulation, and EBCT-GFR as normalized GFR × cortical volume. In Group 1, the decreased GFR of the stenotic kidney correlated well with its decreased volume and RBF, and with the degree of stenosis (r = -0.99). In Group 3, EBCT-GFR correlated well with inulin clearance (slope 1.1, r = 0.81). This novel approach can be very useful for quantification of concurrent regional hemodynamics and function in the intact kidneys, in a manner potentially applicable to humans.
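    The quantities defined in the record (renal blood flow as perfusion times volume summed over cortex and medulla, and a GFR estimate from the maximal upslope of the proximal-tubular enhancement curve scaled by cortical volume) can be sketched as below. The normalization of the slope by peak aortic enhancement is an added assumption for illustration; the abstract does not state the exact normalization, and all numbers are synthetic.

```python
import numpy as np

# Sketch of the described quantities with synthetic curves and illustrative
# perfusion/volume values; the aortic-peak normalization is an assumption.
t = np.linspace(0, 60, 121)                                    # seconds
tubular_hu = 40 * (1 - np.exp(-np.clip(t - 8, 0, None) / 15))  # tubular enhancement curve
aortic_peak_hu = 300.0                                         # peak aortic enhancement

cortical_perf, cortical_vol = 4.0, 120.0     # mL/min/cc, cc
medullary_perf, medullary_vol = 1.5, 40.0

rbf = cortical_perf * cortical_vol + medullary_perf * medullary_vol     # mL/min
max_slope = np.max(np.gradient(tubular_hu, t)) * 60.0                   # HU/min
gfr_per_cc = max_slope / aortic_peak_hu                                 # mL/min/cc (assumed normalization)
print("RBF:", rbf, "mL/min;  EBCT-GFR:", gfr_per_cc * cortical_vol, "mL/min")
```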

  15. Development and validation of an LC-UV method for the quantification and purity determination of the novel anticancer agent C1311 and its pharmaceutical dosage form.

    PubMed

    den Brok, Monique W J; Nuijen, Bastiaan; Hillebrand, Michel J X; Grieshaber, Charles K; Harvey, Michael D; Beijnen, Jos H

    2005-09-01

    C1311 (5-[[2-(diethylamino)ethyl]amino]-8-hydroxyimidazo [4,5,1-de]-acridin-6-one-dihydrochloride trihydrate) is the lead compound from the group of imidazoacridinones, a novel group of rationally designed anticancer agents. The pharmaceutical development of C1311 necessitated the availability of an assay for the quantification and purity determination of C1311 active pharmaceutical ingredient (API) and its pharmaceutical dosage form. A reversed-phase liquid chromatographic method (RP-LC) with ultraviolet (UV) detection was developed, consisting of separation on a C18 column with phosphate buffer (60 mM; pH 3 with 1 M citric acid)-acetonitrile-triethylamine (83:17:0.05, v/v/v) as the mobile phase and UV-detection at 280 nm. The method was found to be linear over a concentration range of 2.50-100 microg/mL, precise and accurate. Accelerated stress testing showed degradation products, which were well separated from the parent compound, confirming its stability-indicating capacity. Moreover, the use of LC-MS and on-line photo diode array detection enabled us to propose structures for four degradation products. Two of these products were also found as impurities in the API and more abundantly in an impure lot of API.

  16. Novel isotopic N, N-Dimethyl Leucine (iDiLeu) Reagents Enable Absolute Quantification of Peptides and Proteins Using a Standard Curve Approach

    NASA Astrophysics Data System (ADS)

    Greer, Tyler; Lietz, Christopher B.; Xiang, Feng; Li, Lingjun

    2015-01-01

    Absolute quantification of protein targets using liquid chromatography-mass spectrometry (LC-MS) is a key component of candidate biomarker validation. One popular method combines multiple reaction monitoring (MRM) using a triple quadrupole instrument with stable isotope-labeled standards (SIS) for absolute quantification (AQUA). LC-MRM AQUA assays are sensitive and specific, but they are also expensive because of the cost of synthesizing stable isotope peptide standards. While the chemical modification approach using mass differential tags for relative and absolute quantification (mTRAQ) represents a more economical approach when quantifying large numbers of peptides, these reagents are costly and still suffer from lower throughput because only two concentration values per peptide can be obtained in a single LC-MS run. Here, we have developed and applied a set of five novel mass difference reagents, isotopic N, N-dimethyl leucine (iDiLeu). These labels contain an amine-reactive group (a triazine ester), are cost-effective because of their synthetic simplicity, and increase throughput compared with previous LC-MS quantification methods by allowing construction of a four-point standard curve in one run. iDiLeu-labeled peptides show remarkably similar retention time shifts, slightly lower energy thresholds for higher-energy collisional dissociation (HCD) fragmentation, and high quantification accuracy for trypsin-digested protein samples (median errors <15%). By spiking an iDiLeu-labeled neuropeptide, allatostatin, into a mouse urine matrix, two quantification methods are validated. The first uses one labeled peptide as an internal standard to normalize labeled peptide peak areas across runs (<19% error), whereas the second enables standard curve creation and analyte quantification in one run (<8% error).
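    The four-point, single-run standard curve translates into very little arithmetic: fit channel response versus spiked amount, then read the sample channel off the fit. The sketch below assumes a linear response and uses invented areas and spike levels.

```python
import numpy as np

# Single-run standard-curve quantification: four label channels carry known
# spiked amounts of the standard peptide, a fifth channel carries the sample.
# All numbers are illustrative assumptions.
spiked_fmol = np.array([5.0, 25.0, 100.0, 400.0])       # standard channels
std_areas = np.array([1.1e5, 5.4e5, 2.2e6, 8.6e6])      # their peak areas
sample_area = 1.3e6                                     # sample channel

slope, intercept = np.polyfit(spiked_fmol, std_areas, 1)
analyte_fmol = (sample_area - intercept) / slope
print(f"estimated analyte amount: {analyte_fmol:.1f} fmol")
```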

  17. A Java program for LRE-based real-time qPCR that enables large-scale absolute quantification.

    PubMed

    Rutledge, Robert G

    2011-03-02

    Linear regression of efficiency (LRE) introduced a new paradigm for real-time qPCR that enables large-scale absolute quantification by eliminating the need for standard curves. Developed through the application of sigmoidal mathematics to SYBR Green I-based assays, target quantity is derived directly from fluorescence readings within the central region of an amplification profile. However, a major challenge of implementing LRE quantification is the labor intensive nature of the analysis. Utilizing the extensive resources that are available for developing Java-based software, the LRE Analyzer was written using the NetBeans IDE, and is built on top of the modular architecture and windowing system provided by the NetBeans Platform. This fully featured desktop application determines the number of target molecules within a sample with little or no intervention by the user, in addition to providing extensive database capabilities. MS Excel is used to import data, allowing LRE quantification to be conducted with any real-time PCR instrument that provides access to the raw fluorescence readings. An extensive help set also provides an in-depth introduction to LRE, in addition to guidelines on how to implement LRE quantification. The LRE Analyzer provides the automated analysis and data storage capabilities required by large-scale qPCR projects wanting to exploit the many advantages of absolute quantification. Foremost is the universal perspective afforded by absolute quantification, which among other attributes, provides the ability to directly compare quantitative data produced by different assays and/or instruments. Furthermore, absolute quantification has important implications for gene expression profiling in that it provides the foundation for comparing transcript quantities produced by any gene with any other gene, within and between samples.
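    A simplified illustration of the central LRE step is given below: cycle efficiency, computed from successive fluorescence readings, declines approximately linearly with fluorescence in the central region of the profile, and the intercept of that regression estimates the maximal amplification efficiency. Converting this to target quantity (F0 and then molecule number via optical calibration) involves further steps from Rutledge's papers that are not reproduced here; the profile values are synthetic.

```python
import numpy as np

# Simplified sketch of the LRE regression: E_C = F_C / F_(C-1) - 1 plotted
# against F_C, with the intercept estimating Emax. Fluorescence values are
# synthetic central-region readings, not real instrument data.
F = np.array([1.2, 2.3, 4.4, 8.1, 14.2, 23.0, 33.8, 44.0, 51.5, 56.0])
E = F[1:] / F[:-1] - 1.0
slope, intercept = np.polyfit(F[1:], E, 1)
print("estimated Emax:", intercept, " deltaE per fluorescence unit:", slope)
```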

  18. A Java Program for LRE-Based Real-Time qPCR that Enables Large-Scale Absolute Quantification

    PubMed Central

    Rutledge, Robert G.

    2011-01-01

    Background Linear regression of efficiency (LRE) introduced a new paradigm for real-time qPCR that enables large-scale absolute quantification by eliminating the need for standard curves. Developed through the application of sigmoidal mathematics to SYBR Green I-based assays, target quantity is derived directly from fluorescence readings within the central region of an amplification profile. However, a major challenge of implementing LRE quantification is the labor intensive nature of the analysis. Findings Utilizing the extensive resources that are available for developing Java-based software, the LRE Analyzer was written using the NetBeans IDE, and is built on top of the modular architecture and windowing system provided by the NetBeans Platform. This fully featured desktop application determines the number of target molecules within a sample with little or no intervention by the user, in addition to providing extensive database capabilities. MS Excel is used to import data, allowing LRE quantification to be conducted with any real-time PCR instrument that provides access to the raw fluorescence readings. An extensive help set also provides an in-depth introduction to LRE, in addition to guidelines on how to implement LRE quantification. Conclusions The LRE Analyzer provides the automated analysis and data storage capabilities required by large-scale qPCR projects wanting to exploit the many advantages of absolute quantification. Foremost is the universal perspective afforded by absolute quantification, which among other attributes, provides the ability to directly compare quantitative data produced by different assays and/or instruments. Furthermore, absolute quantification has important implications for gene expression profiling in that it provides the foundation for comparing transcript quantities produced by any gene with any other gene, within and between samples. PMID:21407812

  19. Establishment and evaluation of a bead-based luminex assay allowing simultaneous quantification of equine IL-12 and IFN-γ.

    PubMed

    Duran, Maria Carolina; Willenbrock, Saskia; Müller, Jessika-M V; Nolte, Ingo; Feige, Karsten; Murua Escobar, Hugo

    2013-04-01

    Interleukin-12 (IL-12) and interferon gamma (IFN-γ) are key cytokines in immune-mediated equine melanoma therapy. Currently, a method for accurate simultaneous quantification of these equine cytokines is lacking. Therefore, we sought to establish an assay that allows for accurate and simultaneous quantification of equine IL-12 (eIL-12) and IFN-γ (eIFN-γ). Several antibodies were evaluated for cross-reactivity to eIL-12 and eIFN-γ and were used to establish a bead-based Luminex assay, which was subsequently applied to quantify cytokine concentrations in biological samples. Cytokine detection ranged from 31.5 to 5,000 pg/ml for eIL-12 and from 15 to 10,000 pg/ml for eIFN-γ. eIL-12 was detected in supernatants of stimulated peripheral blood mononuclear cells (PBMCs) and in supernatants/cell lysates of eIL-12 expression plasmid-transfected cells. Low or undetectable cytokine concentrations were measured in negative controls. In equine serum samples, the mean measured eIL-12 concentration was 1,374 ± 8 pg/ml. The bead-based assay and an eIFN-γ ELISA, both used to measure eIFN-γ concentrations, showed similar results. The results demonstrate, to our knowledge for the first time, that cross-reactive antibody pairs to eIL-12 and eIFN-γ combined with Luminex bead-based technology allow for accurate, simultaneous and multiplexed quantification of these key cytokines in biological samples.

  20. Monitoring microbiological changes in drinking water systems using a fast and reproducible flow cytometric method.

    PubMed

    Prest, E I; Hammes, F; Kötzsch, S; van Loosdrecht, M C M; Vrouwenvelder, J S

    2013-12-01

    Flow cytometry (FCM) is a rapid, cultivation-independent tool to assess and evaluate the bacteriological quality and biological stability of water. Here we demonstrate that a stringent, reproducible staining protocol combined with fixed FCM operational and gating settings is essential for reliable quantification of bacteria and detection of changes in aquatic bacterial communities. Triplicate measurements of diverse water samples with this protocol typically showed relative standard deviation values and 95% confidence interval values below 2.5% on all the main FCM parameters. We propose a straightforward and instrument-independent method for the characterization of water samples based on the combination of bacterial cell concentration and fluorescence distribution. Analysis of the fluorescence distribution (the so-called fluorescence fingerprint) was accomplished firstly through a direct comparison of the raw FCM data and subsequently simplified by quantifying the percentage of large, brightly fluorescent, high nucleic acid (HNA) content bacteria in each sample. Our approach enables fast differentiation of dissimilar bacterial communities (less than 15 min from sampling to final result) and allows accurate detection of even small changes in aquatic environments (above 3% change). Demonstrative studies on (a) indigenous bacterial growth in water, (b) contamination of drinking water with wastewater, (c) household drinking water stagnation and (d) mixing of two drinking water types unequivocally showed that this FCM approach enables detection and quantification of relevant bacterial water quality changes with high sensitivity. This approach has the potential to be used as a new tool in the drinking water field, e.g., for rapid screening of microbial water quality and stability during water treatment and distribution in networks and premise plumbing. Copyright © 2013 Elsevier Ltd. All rights reserved.
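    The proposed fingerprint (cell concentration plus the percentage of high nucleic acid, HNA, cells) reduces to a small amount of gating arithmetic once events are exported, as sketched below. The gate positions, the event array, and the acquired volume are illustrative assumptions; real analyses rely on fixed, instrument-specific gating templates.

```python
import numpy as np

# Sketch of a cell-concentration / %HNA fingerprint from exported green
# fluorescence intensities; gate values and data are illustrative only.
def fcm_fingerprint(green_fl: np.ndarray, acquired_volume_ml: float,
                    hna_gate: float = 2e4) -> dict:
    cells = green_fl[green_fl > 1e3]           # crude "stained cell" gate (assumption)
    hna = cells > hna_gate                     # high nucleic acid content gate
    return {
        "cells_per_ml": cells.size / acquired_volume_ml,
        "pct_hna": 100.0 * hna.mean() if cells.size else 0.0,
    }

events = np.random.lognormal(mean=9.5, sigma=0.8, size=50_000)
print(fcm_fingerprint(events, acquired_volume_ml=0.05))
```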

  1. A statistical kinematic source inversion approach based on the QUESO library for uncertainty quantification and prediction

    NASA Astrophysics Data System (ADS)

    Zielke, Olaf; McDougall, Damon; Mai, Martin; Babuska, Ivo

    2014-05-01

    Seismic data, often augmented with geodetic data, are frequently used to invert for the spatio-temporal evolution of slip along a rupture plane. The resulting images of the slip evolution for a single event, inferred by different research teams, often vary distinctly, depending on the adopted inversion approach and rupture model parameterization. This observation raises the question of which of the provided kinematic source inversion solutions is most reliable and most robust, and, more generally, how accurate fault parameterization and solution predictions are. These issues are not addressed in "standard" source inversion approaches. Here, we present a statistical inversion approach to constrain kinematic rupture parameters from teleseismic body waves. The approach is based on a) a forward-modeling scheme that computes synthetic (body-)waves for a given kinematic rupture model, and b) the QUESO (Quantification of Uncertainty for Estimation, Simulation, and Optimization) library, which uses MCMC algorithms and Bayes' theorem for sample selection. We present Bayesian inversions for rupture parameters in synthetic earthquakes (i.e., for which the exact rupture history is known) in an attempt to identify the cross-over at which further model discretization (spatial and temporal resolution of the parameter space) no longer yields a decreasing misfit. Identification of this cross-over is important because it reveals the resolution power of the studied data set (i.e., teleseismic body waves), enabling one to constrain kinematic earthquake rupture histories of real earthquakes at a resolution that is supported by the data. In addition, the Bayesian approach allows for mapping complete posterior probability density functions of the desired kinematic source parameters, thus enabling us to rigorously assess the uncertainties in earthquake source inversions.
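    The Bayesian sampling machinery can be illustrated with a toy Metropolis random walk over a single rupture parameter, conditioned on a synthetic waveform misfit. QUESO supplies far more sophisticated MCMC algorithms and parallel infrastructure; the forward model, data, prior bounds, and noise level below are placeholders.

```python
import numpy as np

# Toy Metropolis sampler for one rupture parameter given a synthetic "waveform";
# everything here is a placeholder to illustrate the MCMC/Bayes workflow only.
rng = np.random.default_rng(0)
t = np.linspace(0, 10, 200)
true_rise_time = 2.0
data = np.exp(-t / true_rise_time) + rng.normal(0, 0.05, t.size)

def log_posterior(rise_time: float) -> float:
    if not 0.1 < rise_time < 10.0:                 # uniform prior bounds (assumed)
        return -np.inf
    resid = data - np.exp(-t / rise_time)          # toy forward model
    return -0.5 * np.sum(resid ** 2) / 0.05 ** 2   # Gaussian likelihood

samples, x = [], 1.0
for _ in range(5000):
    prop = x + rng.normal(0, 0.1)                  # random-walk proposal
    if np.log(rng.random()) < log_posterior(prop) - log_posterior(x):
        x = prop
    samples.append(x)

print("posterior mean rise time:", np.mean(samples[1000:]))
```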

  2. 3-dimensional digital reconstruction of the murine coronary system for the evaluation of chronic allograft vasculopathy.

    PubMed

    Fónyad, László; Shinoda, Kazunobu; Farkash, Evan A; Groher, Martin; Sebastian, Divya P; Szász, A Marcell; Colvin, Robert B; Yagi, Yukako

    2015-03-28

    Chronic allograft vasculopathy (CAV) is a major mechanism of graft failure of transplanted organs in humans. Morphometric analysis of coronary arteries enables the quantitation of CAV in mouse models of heart transplantation. However, conventional histological procedures using single 2-dimensional sections limit the accuracy of CAV quantification. The aim of this study is to improve the accuracy of CAV quantification by reconstructing the murine coronary system in 3-dimensions (3D) and using virtual reconstruction and volumetric analysis to precisely assess neointimal thickness. Mouse tissue samples, native heart and transplanted hearts with chronic allograft vasculopathy, were collected and analyzed. Paraffin embedded samples were serially sectioned, stained and digitized using whole slide digital imaging techniques under normal and ultraviolet lighting. Sophisticated software tools were used to generate and manipulate 3D reconstructions of the major coronary arteries and branches. The 3D reconstruction provides not only accurate measurements but also exact volumetric data of vascular lesions. This virtual coronary arteriography demonstrates that the vasculopathy lesions in this model are localized to the proximal coronary segments. In addition, virtual rotation and volumetric analysis enabled more precise measurements of CAV than single, randomly oriented histologic sections, and offer an improved readout for this important experimental model. We believe 3D reconstruction of 2D histological slides will provide new insights into pathological mechanisms in which structural abnormalities play a role in the development of a disease. The techniques we describe are applicable to the analysis of arteries, veins, bronchioles and similar sized structures in a variety of tissue types and disease model systems. The virtual slide(s) for this article can be found here: http://www.diagnosticpathology.diagnomx.eu/vs/3772457541477230 .

  3. Detection of malondialdehyde in processed meat products without interference from the ingredients.

    PubMed

    Jung, Samooel; Nam, Ki Chang; Jo, Cheorun

    2016-10-15

    Our aim was to develop a method for accurate quantification of malondialdehyde (MDA) in meat products. MDA content of uncured ground pork (Control); ground pork cured with sodium nitrite (Nitrite); and ground pork cured with sodium nitrite, sodium chloride, sodium pyrophosphate, maltodextrin, and a sausage seasoning (Mix) was measured by the 2-thiobarbituric acid (TBA) assay with MDA extraction by trichloroacetic acid (method A) and two high-performance liquid chromatography (HPLC) methods: i) HPLC separation of the MDA-dinitrophenyl hydrazine adduct (method B) and ii) HPLC separation of MDA (method C) after MDA extraction with acetonitrile. Methods A and B could not quantify MDA accurately in groups Nitrite and Mix. Nevertheless, MDA in groups Control, Nitrite, and Mix was accurately quantified by method C with good recovery. Therefore, direct MDA quantification by HPLC after MDA extraction with acetonitrile (method C) is useful for accurate measurement of MDA content in processed meat products. Copyright © 2016 Elsevier Ltd. All rights reserved.

  4. Stochastic Analysis and Design of Heterogeneous Microstructural Materials System

    NASA Astrophysics Data System (ADS)

    Xu, Hongyi

    An advanced materials system refers to a new material that comprises multiple traditional constituents but exhibits complex microstructure morphologies, which lead to properties superior to those of conventional materials. To accelerate the development of new advanced materials systems, the objective of this dissertation is to develop a computational design framework and the associated techniques for design automation of microstructural materials systems, with an emphasis on addressing the uncertainties associated with the heterogeneity of microstructural materials. Five key research tasks are identified: design representation, design evaluation, design synthesis, material informatics and uncertainty quantification. Design representation of microstructure includes statistical characterization and stochastic reconstruction. This dissertation develops a new descriptor-based methodology, which characterizes 2D microstructures using descriptors of composition, dispersion and geometry. Statistics of 3D descriptors are predicted based on 2D information to enable 2D-to-3D reconstruction. An efficient sequential reconstruction algorithm is developed to reconstruct statistically equivalent random 3D digital microstructures. In design evaluation, a stochastic decomposition and reassembly strategy is developed to deal with the high computational costs and uncertainties induced by material heterogeneity: the properties of Representative Volume Elements (RVEs) are predicted by stochastically reassembling Statistical Volume Elements (SVEs) with stochastic properties into a coarse representation of the RVE. In design synthesis, a new descriptor-based design framework is developed, which integrates computational methods of microstructure characterization and reconstruction, sensitivity analysis, Design of Experiments (DOE), metamodeling and optimization to enable parametric optimization of the microstructure for achieving the desired material properties. Material informatics is studied to efficiently reduce the dimension of the microstructure design space; this dissertation develops a machine learning-based methodology to identify the key microstructure descriptors that most strongly impact the properties of interest. In uncertainty quantification, a comparative study of data-driven random process models is conducted to provide guidance for choosing the most accurate model in statistical uncertainty quantification, and two new goodness-of-fit metrics are developed to provide quantitative measurements of random process model accuracy. The benefits of the proposed methods are demonstrated by the example of designing the microstructure of polymer nanocomposites. This dissertation provides material-generic, intelligent modeling and design methodologies and techniques to accelerate the analysis and design of new microstructural materials systems.

  5. Deep-Dive Targeted Quantification for Ultrasensitive Analysis of Proteins in Nondepleted Human Blood Plasma/Serum and Tissues

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Nie, Song; Shi, Tujin; Fillmore, Thomas L.

    Mass spectrometry-based targeted proteomics (e.g., selected reaction monitoring, SRM) is emerging as an attractive alternative to immunoassays for protein quantification. Recently we have made significant progress in SRM sensitivity for enabling quantification of low ng/mL to sub-ng/mL level proteins in nondepleted human blood plasma/serum without affinity enrichment. However, precise quantification of extremely low abundant but biologically important proteins (e.g., ≤100 pg/mL in blood plasma/serum) using targeted proteomics approaches still remains challenging. To address this need, we have developed an antibody-independent Deep-Dive SRM (DD-SRM) approach that capitalizes on multidimensional high-resolution reversed-phase liquid chromatography (LC) separation for target peptide enrichment combined with precise selection of target peptide fractions of interest, significantly improving SRM sensitivity by ~5 orders of magnitude when compared to conventional LC-SRM. Application of DD-SRM to human serum and tissue has been demonstrated to enable precise quantification of endogenous proteins at ~10 pg/mL level in nondepleted serum and at <10 copies per cell level in tissue. Thus, DD-SRM holds great promise for precisely measuring extremely low abundance proteins or protein modifications, especially when high-quality antibody is not available.

  6. Quantification of Lignin and Its Structural Features in Plant Biomass Using 13C Lignin as Internal Standard for Pyrolysis-GC-SIM-MS.

    PubMed

    van Erven, Gijs; de Visser, Ries; Merkx, Donny W H; Strolenberg, Willem; de Gijsel, Peter; Gruppen, Harry; Kabel, Mirjam A

    2017-10-17

    Understanding the mechanisms underlying plant biomass recalcitrance at the molecular level can only be achieved by accurate analyses of both the content and structural features of the molecules involved. Current quantification of lignin is, however, largely based on unspecific gravimetric analysis after sulfuric acid hydrolysis. Hence, our research aimed at specific lignin quantification with concurrent characterization of its structural features. Hereto, for the first time, a polymeric 13C lignin was used as internal standard (IS) for lignin quantification via analytical pyrolysis coupled to gas chromatography with mass-spectrometric detection in selected ion monitoring mode (py-GC-SIM-MS). In addition, relative response factors (RRFs) for the various pyrolysis products obtained were determined and applied. First, 12C and 13C lignin were isolated from nonlabeled and uniformly 13C labeled wheat straw, respectively, and characterized by heteronuclear single quantum coherence (HSQC) nuclear magnetic resonance (NMR) and py-GC/MS. The two lignin isolates were found to have identical structures. Second, 13C-IS based lignin quantification by py-GC-SIM-MS was validated in reconstituted biomass model systems with known contents of the 12C lignin analogue and was shown to be extremely accurate (>99.9%, R2 > 0.999) and precise (RSD < 1.5%). Third, 13C-IS based lignin quantification was applied to four common poaceous biomass sources (wheat straw, barley straw, corn stover, and sugar cane bagasse), and lignin contents were in good agreement with the total gravimetrically determined lignin contents. Our robust method proves to be a promising alternative for the high-throughput quantification of lignin in milled biomass samples directly and simultaneously provides a direct insight into the structural features of lignin.
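
    As a rough illustration of the internal-standard arithmetic described above, the sketch below computes a lignin content from hypothetical 12C/13C marker peak-area ratios and relative response factors; the marker names, peak areas, RRFs, and spike amount are invented for illustration and are not values from the study.

      # Minimal sketch of 13C internal-standard quantification for py-GC-SIM-MS.
      # Peak areas, RRFs, and marker names are illustrative placeholders.

      # Known amount of uniformly 13C-labelled lignin spiked into the sample (ug).
      is_added_ug = 50.0

      # Summed SIM peak areas of 12C (analyte) and 13C (internal standard)
      # pyrolysis markers, with a relative response factor (RRF) per marker.
      markers = {
          # marker: (area_12C, area_13C, RRF relative to the IS analogue)
          "guaiacol":      (1.20e6, 1.05e6, 0.95),
          "4-vinylphenol": (8.40e5, 7.90e5, 1.10),
          "syringol":      (6.10e5, 5.70e5, 1.02),
      }

      # Each marker gives an estimate of the lignin amount:
      #   amount_12C = (area_12C / area_13C) / RRF * amount_13C_spiked
      estimates = [
          (a12 / a13) / rrf * is_added_ug for a12, a13, rrf in markers.values()
      ]
      lignin_ug = sum(estimates) / len(estimates)
      print(f"Estimated lignin content: {lignin_ug:.1f} ug")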

  7. Quantification of Lignin and Its Structural Features in Plant Biomass Using 13C Lignin as Internal Standard for Pyrolysis-GC-SIM-MS

    PubMed Central

    2017-01-01

    Understanding the mechanisms underlying plant biomass recalcitrance at the molecular level can only be achieved by accurate analyses of both the content and structural features of the molecules involved. Current quantification of lignin is, however, largely based on unspecific gravimetric analysis after sulfuric acid hydrolysis. Hence, our research aimed at specific lignin quantification with concurrent characterization of its structural features. Hereto, for the first time, a polymeric 13C lignin was used as internal standard (IS) for lignin quantification via analytical pyrolysis coupled to gas chromatography with mass-spectrometric detection in selected ion monitoring mode (py-GC-SIM-MS). In addition, relative response factors (RRFs) for the various pyrolysis products obtained were determined and applied. First, 12C and 13C lignin were isolated from nonlabeled and uniformly 13C labeled wheat straw, respectively, and characterized by heteronuclear single quantum coherence (HSQC) nuclear magnetic resonance (NMR) and py-GC/MS. The two lignin isolates were found to have identical structures. Second, 13C-IS based lignin quantification by py-GC-SIM-MS was validated in reconstituted biomass model systems with known contents of the 12C lignin analogue and was shown to be extremely accurate (>99.9%, R2 > 0.999) and precise (RSD < 1.5%). Third, 13C-IS based lignin quantification was applied to four common poaceous biomass sources (wheat straw, barley straw, corn stover, and sugar cane bagasse), and lignin contents were in good agreement with the total gravimetrically determined lignin contents. Our robust method proves to be a promising alternative for the high-throughput quantification of lignin in milled biomass samples directly and simultaneously provides a direct insight into the structural features of lignin. PMID:28926698

  8. A phase quantification method based on EBSD data for a continuously cooled microalloyed steel

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhao, H.; Wynne, B.P.; Palmiere, E.J.

    2017-01-15

    Mechanical properties of steels depend on the phase constitutions of the final microstructures, which can be related to the processing parameters. Therefore, accurate quantification of different phases is necessary to investigate the relationships between processing parameters, final microstructures and mechanical properties. Point counting on micrographs observed by optical or scanning electron microscopy is widely used as a phase quantification method, and different phases are discriminated according to their morphological characteristics. However, it is difficult to differentiate some of the phase constituents with similar morphology. In contrast, for EBSD-based phase quantification methods, besides morphological characteristics, other parameters derived from the orientation information can also be used for discrimination. In this research, a phase quantification method based on EBSD data in the unit of grains was proposed to identify and quantify the complex phase constitutions of a microalloyed steel subjected to accelerated coolings. Characteristics of polygonal ferrite/quasi-polygonal ferrite, acicular ferrite and bainitic ferrite on grain averaged misorientation angles, aspect ratios, high angle grain boundary fractions and grain sizes were analysed and used to develop the identification criteria for each phase. Comparing the results obtained by this EBSD-based method and point counting, it was found that this EBSD-based method can provide accurate and reliable phase quantification results for microstructures with relatively slow cooling rates. - Highlights: •A phase quantification method based on EBSD data in the unit of grains was proposed. •The critical grain area above which GAM angles are valid parameters was obtained. •Grain size and grain boundary misorientation were used to identify acicular ferrite. •High cooling rates deteriorate the accuracy of this EBSD-based method.

  9. Evaluation of the potential use of hybrid LC-MS/MS for active drug quantification applying the 'free analyte QC concept'.

    PubMed

    Jordan, Gregor; Onami, Ichio; Heinrich, Julia; Staack, Roland F

    2017-11-01

    Assessment of active drug exposure of biologics may be crucial for drug development. Typically, ligand-binding assay methods are used to provide free/active drug concentrations. To what extent hybrid LC-MS/MS procedures enable correct 'active' drug quantification is currently under consideration. Experimental & results: The relevance of appropriate extraction conditions was evaluated by a hybrid target capture immuno-affinity LC-MS/MS method using total and free/active quality controls (QCs). The rapid extraction (10 min) provided correct results, whereas overnight incubation resulted in significant overestimation of the free/active drug (monoclonal antibody) concentration. In contrast to free/active QCs, conventional total QCs were inappropriate for determining optimal method conditions. The 'free/active analyte QC concept' enables development of appropriate extraction conditions for correct active drug quantification by hybrid LC-MS/MS.

  10. A framework for optimization and quantification of uncertainty and sensitivity for developing carbon capture systems

    DOE PAGES

    Eslick, John C.; Ng, Brenda; Gao, Qianwen; ...

    2014-12-31

    Under the auspices of the U.S. Department of Energy’s Carbon Capture Simulation Initiative (CCSI), a Framework for Optimization and Quantification of Uncertainty and Sensitivity (FOQUS) has been developed. This tool enables carbon capture systems to be rapidly synthesized and rigorously optimized, in an environment that accounts for and propagates uncertainties in parameters and models. FOQUS currently enables (1) the development of surrogate algebraic models utilizing the ALAMO algorithm, which can be used for superstructure optimization to identify optimal process configurations, (2) simulation-based optimization utilizing derivative free optimization (DFO) algorithms with detailed black-box process models, and (3) rigorous uncertainty quantification through PSUADE. FOQUS utilizes another CCSI technology, the Turbine Science Gateway, to manage the thousands of simulated runs necessary for optimization and UQ. Thus, this computational framework has been demonstrated for the design and analysis of a solid sorbent based carbon capture system.

  11. Phylogenetic Quantification of Intra-tumour Heterogeneity

    PubMed Central

    Schwarz, Roland F.; Trinh, Anne; Sipos, Botond; Brenton, James D.; Goldman, Nick; Markowetz, Florian

    2014-01-01

    Intra-tumour genetic heterogeneity is the result of ongoing evolutionary change within each cancer. The expansion of genetically distinct sub-clonal populations may explain the emergence of drug resistance, and if so, would have prognostic and predictive utility. However, methods for objectively quantifying tumour heterogeneity have been missing and are particularly difficult to establish in cancers where predominant copy number variation prevents accurate phylogenetic reconstruction owing to horizontal dependencies caused by long and cascading genomic rearrangements. To address these challenges, we present MEDICC, a method for phylogenetic reconstruction and heterogeneity quantification based on a Minimum Event Distance for Intra-tumour Copy-number Comparisons. Using a transducer-based pairwise comparison function, we determine optimal phasing of major and minor alleles, as well as evolutionary distances between samples, and are able to reconstruct ancestral genomes. Rigorous simulations and an extensive clinical study show the power of our method, which outperforms state-of-the-art competitors in reconstruction accuracy, and additionally allows unbiased numerical quantification of tumour heterogeneity. Accurate quantification and evolutionary inference are essential to understand the functional consequences of tumour heterogeneity. The MEDICC algorithms are independent of the experimental techniques used and are applicable to both next-generation sequencing and array CGH data. PMID:24743184

  12. Accurate Quantification of Cardiovascular Biomarkers in Serum Using Protein Standard Absolute Quantification (PSAQ™) and Selected Reaction Monitoring*

    PubMed Central

    Huillet, Céline; Adrait, Annie; Lebert, Dorothée; Picard, Guillaume; Trauchessec, Mathieu; Louwagie, Mathilde; Dupuis, Alain; Hittinger, Luc; Ghaleh, Bijan; Le Corvoisier, Philippe; Jaquinod, Michel; Garin, Jérôme; Bruley, Christophe; Brun, Virginie

    2012-01-01

    Development of new biomarkers needs to be significantly accelerated to improve diagnostic, prognostic, and toxicity monitoring as well as therapeutic follow-up. Biomarker evaluation is the main bottleneck in this development process. Selected Reaction Monitoring (SRM) combined with stable isotope dilution has emerged as a promising option to speed this step, particularly because of its multiplexing capacities. However, analytical variability caused by upstream sample handling or incomplete trypsin digestion still needs to be resolved. In 2007, we developed the PSAQ™ method (Protein Standard Absolute Quantification), which uses full-length isotope-labeled protein standards to quantify target proteins. In the present study we used clinically validated cardiovascular biomarkers (LDH-B, CKMB, myoglobin, and troponin I) to demonstrate that the combination of PSAQ and SRM (PSAQ-SRM) allows highly accurate biomarker quantification in serum samples. A multiplex PSAQ-SRM assay was used to quantify these biomarkers in clinical samples from myocardial infarction patients. Good correlation between PSAQ-SRM and ELISA assay results was found and demonstrated the consistency between these analytical approaches. Thus, PSAQ-SRM has the capacity to improve both accuracy and reproducibility in protein analysis. This will be a major contribution to efficient biomarker development strategies. PMID:22080464

  13. Targeted Proteomic Quantification on Quadrupole-Orbitrap Mass Spectrometer*

    PubMed Central

    Gallien, Sebastien; Duriez, Elodie; Crone, Catharina; Kellmann, Markus; Moehring, Thomas; Domon, Bruno

    2012-01-01

    There is an immediate need for improved methods to systematically and precisely quantify large sets of peptides in complex biological samples. To date protein quantification in biological samples has been routinely performed on triple quadrupole instruments operated in selected reaction monitoring mode (SRM), and two major challenges remain. Firstly, the number of peptides to be included in one survey experiment needs to be increased to routinely reach several hundreds, and secondly, the degree of selectivity should be improved so as to reliably discriminate the targeted analytes from background interferences. High resolution and accurate mass (HR/AM) analysis on the recently developed Q-Exactive mass spectrometer can potentially address these issues. This instrument presents a unique configuration: it is constituted of an orbitrap mass analyzer equipped with a quadrupole mass filter as the front-end for precursor ion mass selection. This configuration enables new quantitative methods based on HR/AM measurements, including targeted analysis in MS mode (single ion monitoring) and in MS/MS mode (parallel reaction monitoring). The ability of the quadrupole to select a restricted m/z range allows one to overcome the dynamic range limitations associated with trapping devices, and the MS/MS mode provides an additional stage of selectivity. When applied to targeted protein quantification in urine samples and benchmarked with the reference SRM technique, the quadrupole-orbitrap instrument exhibits similar or better performance in terms of selectivity, dynamic range, and sensitivity. This high performance is further enhanced by leveraging the multiplexing capability of the instrument to design novel acquisition methods and apply them to large targeted proteomic studies for the first time, as demonstrated on 770 tryptic yeast peptides analyzed in one 60-min experiment. The increased quality of quadrupole-orbitrap data has the potential to improve existing protein quantification methods in complex samples and address the pressing demand of systems biology or biomarker evaluation studies. PMID:22962056

  14. Hepatic fat quantification using chemical shift MR imaging and MR spectroscopy in the presence of hepatic iron deposition: validation in phantoms and in patients with chronic liver disease.

    PubMed

    Lee, Seung Soo; Lee, Youngjoo; Kim, Namkug; Kim, Seong Who; Byun, Jae Ho; Park, Seong Ho; Lee, Moon-Gyu; Ha, Hyun Kwon

    2011-06-01

    To compare the accuracy of four chemical shift magnetic resonance imaging (MRI) (CS-MRI) analysis methods and MR spectroscopy (MRS) with and without T2-correction in fat quantification in the presence of excess iron. CS-MRI with six opposed- and in-phase acquisitions and MRS with five-echo acquisitions (TEs of 20, 30, 40, 50, 60 msec) were performed at 1.5 T on phantoms containing various fat fractions (FFs), on phantoms containing various iron concentrations, and in 18 patients with chronic liver disease. For CS-MRI, FFs were estimated with the dual-echo method, with two T2*-correction methods (triple- and multiecho), and with multiinterference methods that corrected for both T2* and spectral interference effects. For MRS, FF was estimated without T2-correction (single-echo MRS) and with T2-correction (multiecho MRS). In the phantoms, T2*- or T2-correction methods for CS-MRI and MRS provided unbiased estimations of FFs (mean bias, -1.1% to 0.5%) regardless of iron concentration, whereas the dual-echo method (-5.5% to -8.4%) and single-echo MRS (12.1% to 37.3%) resulted in large biases in FFs. In patients, the FFs estimated with triple-echo (R = 0.98), multiecho (R = 0.99), and multiinterference (R = 0.99) methods had stronger correlations with multiecho MRS FFs than with the dual-echo method (R = 0.86; P ≤ 0.011). The FFs estimated with multiinterference method showed the closest agreement with multiecho MRS FFs (the 95% limit-of-agreement, -0.2 ± 1.1). T2*- or T2-correction methods are effective in correcting the confounding effects of iron, enabling an accurate fat quantification throughout a wide range of iron concentrations. Spectral modeling of fat may further improve the accuracy of CS-MRI in fat quantification. Copyright © 2011 Wiley-Liss, Inc.
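
    For readers unfamiliar with the dual-echo calculation referred to above, the following sketch shows the uncorrected two-point fat-fraction estimate and notes where T2* correction would enter; the signal values are invented, and this is a generic two-point Dixon expression rather than code from the study.

      import numpy as np

      # Minimal sketch of the (uncorrected) dual-echo fat-fraction estimate used
      # in chemical shift MRI: FF = (S_inphase - S_opposed) / (2 * S_inphase).
      # The pixel values below are illustrative, not data from the study.
      s_in_phase = np.array([520.0, 610.0, 480.0])   # in-phase signal
      s_opposed  = np.array([430.0, 300.0, 455.0])   # opposed-phase signal

      fat_fraction = (s_in_phase - s_opposed) / (2.0 * s_in_phase)
      print(np.round(100 * fat_fraction, 1))  # fat fraction in percent per pixel

      # The T2*-corrected variants (triple-/multiecho methods) additionally fit
      # an exponential decay exp(-TE / T2*) across echoes before taking this
      # ratio, removing the bias introduced by iron-induced T2* shortening.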

  15. Automated tumor volumetry using computer-aided image segmentation.

    PubMed

    Gaonkar, Bilwaj; Macyszyn, Luke; Bilello, Michel; Sadaghiani, Mohammed Salehi; Akbari, Hamed; Atthiah, Mark A; Ali, Zarina S; Da, Xiao; Zhan, Yiqang; O'Rourke, Donald; Grady, Sean M; Davatzikos, Christos

    2015-05-01

    Accurate segmentation of brain tumors, and quantification of tumor volume, is important for diagnosis, monitoring, and planning therapeutic intervention. Manual segmentation is not widely used because of time constraints. Previous efforts have mainly produced methods that are tailored to a particular type of tumor or acquisition protocol and have mostly failed to produce a method that functions on different tumor types and is robust to changes in scanning parameters, resolution, and image quality, thereby limiting their clinical value. Herein, we present a semiautomatic method for tumor segmentation that is fast, accurate, and robust to a wide variation in image quality and resolution. A semiautomatic segmentation method based on the geodesic distance transform was developed and validated by using it to segment 54 brain tumors. Glioblastomas, meningiomas, and brain metastases were segmented. Qualitative validation was based on physician ratings provided by three clinical experts. Quantitative validation was based on comparing semiautomatic and manual segmentations. Tumor segmentations obtained using manual and automatic methods were compared quantitatively using the Dice measure of overlap. Subjective evaluation was performed by having human experts rate the computerized segmentations on a 0-5 rating scale where 5 indicated perfect segmentation. The proposed method addresses a significant, unmet need in the field of neuro-oncology. Specifically, this method enables clinicians to obtain accurate and reproducible tumor volumes without the need for manual segmentation. Copyright © 2015 AUR. Published by Elsevier Inc. All rights reserved.
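
    The Dice overlap used above to compare manual and semiautomatic segmentations can be computed in a few lines; the sketch below uses toy binary masks rather than the study's data.

      import numpy as np

      # Minimal sketch of the Dice measure of overlap between two segmentations:
      # Dice = 2 * |A intersect B| / (|A| + |B|). Masks below are toy volumes.
      manual = np.zeros((4, 4, 4), dtype=bool)
      auto   = np.zeros((4, 4, 4), dtype=bool)
      manual[1:3, 1:3, 1:3] = True
      auto[1:3, 1:4, 1:3]   = True

      intersection = np.logical_and(manual, auto).sum()
      dice = 2.0 * intersection / (manual.sum() + auto.sum())
      print(f"Dice overlap: {dice:.3f}")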

  16. Automated Tumor Volumetry Using Computer-Aided Image Segmentation

    PubMed Central

    Bilello, Michel; Sadaghiani, Mohammed Salehi; Akbari, Hamed; Atthiah, Mark A.; Ali, Zarina S.; Da, Xiao; Zhan, Yiqang; O'Rourke, Donald; Grady, Sean M.; Davatzikos, Christos

    2015-01-01

    Rationale and Objectives Accurate segmentation of brain tumors, and quantification of tumor volume, is important for diagnosis, monitoring, and planning therapeutic intervention. Manual segmentation is not widely used because of time constraints. Previous efforts have mainly produced methods that are tailored to a particular type of tumor or acquisition protocol and have mostly failed to produce a method that functions on different tumor types and is robust to changes in scanning parameters, resolution, and image quality, thereby limiting their clinical value. Herein, we present a semiautomatic method for tumor segmentation that is fast, accurate, and robust to a wide variation in image quality and resolution. Materials and Methods A semiautomatic segmentation method based on the geodesic distance transform was developed and validated by using it to segment 54 brain tumors. Glioblastomas, meningiomas, and brain metastases were segmented. Qualitative validation was based on physician ratings provided by three clinical experts. Quantitative validation was based on comparing semiautomatic and manual segmentations. Results Tumor segmentations obtained using manual and automatic methods were compared quantitatively using the Dice measure of overlap. Subjective evaluation was performed by having human experts rate the computerized segmentations on a 0–5 rating scale where 5 indicated perfect segmentation. Conclusions The proposed method addresses a significant, unmet need in the field of neuro-oncology. Specifically, this method enables clinicians to obtain accurate and reproducible tumor volumes without the need for manual segmentation. PMID:25770633

  17. Evaluation of microRNA alignment techniques

    PubMed Central

    Kaspi, Antony; El-Osta, Assam

    2016-01-01

    Genomic alignment of small RNA (smRNA) sequences such as microRNAs poses considerable challenges due to their short length (∼21 nucleotides [nt]) as well as the large size and complexity of plant and animal genomes. While several tools have been developed for high-throughput mapping of longer mRNA-seq reads (>30 nt), there are few that are specifically designed for mapping of smRNA reads including microRNAs. The accuracy of these mappers has not been systematically determined in the case of smRNA-seq. In addition, it is unknown whether these aligners accurately map smRNA reads containing sequence errors and polymorphisms. By using simulated read sets, we determine the alignment sensitivity and accuracy of 16 short-read mappers and quantify their robustness to mismatches, indels, and nontemplated nucleotide additions. These were explored in the context of a plant genome (Oryza sativa, ∼500 Mbp) and a mammalian genome (Homo sapiens, ∼3.1 Gbp). Analysis of simulated and real smRNA-seq data demonstrates that mapper selection impacts differential expression results and interpretation. These results will inform on best practice for smRNA mapping and enable more accurate smRNA detection and quantification of expression and RNA editing. PMID:27284164

  18. Techniques to improve the direct ex vivo detection of low frequency antigen-specific CD8+ T cells with peptide-major histocompatibility complex class I tetramers

    PubMed Central

    Chattopadhyay, Pratip K.; Melenhorst, J. Joseph; Ladell, Kristin; Gostick, Emma; Scheinberg, Philip; Barrett, A. John; Wooldridge, Linda; Roederer, Mario; Sewell, Andrew K.; Price, David A.

    2008-01-01

    The ability to quantify and characterize antigen-specific CD8+ T cells irrespective of functional readouts using fluorochrome-conjugated tetrameric peptide-MHC class I (pMHCI) complexes in conjunction with flow cytometry has transformed our understanding of cellular immune responses over the past decade. In the case of prevalent CD8+ T cell populations that engage cognate pMHCI tetramers with high avidities, direct ex vivo identification and subsequent data interpretation is relatively straightforward. However, the accurate identification of low frequency antigen-specific CD8+ T cell populations can be complicated, especially in situations where TCR-mediated tetramer binding occurs at low avidities. Here, we highlight a few simple techniques that can be employed to improve the visual resolution, and hence the accurate quantification, of tetramer-binding CD8+ T cell populations by flow cytometry. These methodological modifications enhance signal intensity, especially in the case of specific CD8+ T cell populations that bind cognate antigen with low avidity, minimize background noise and enable improved discrimination of true pMHCI tetramer binding events from nonspecific uptake. PMID:18836993

  19. Quantifying Motor Impairment in Movement Disorders.

    PubMed

    FitzGerald, James J; Lu, Zhongjiao; Jareonsettasin, Prem; Antoniades, Chrystalina A

    2018-01-01

    Until recently the assessment of many movement disorders has relied on clinical rating scales that despite careful design are inherently subjective and non-linear. This makes accurate and truly observer-independent quantification difficult and limits the use of sensitive parametric statistical methods. At last, devices capable of measuring neurological problems quantitatively are becoming readily available. Examples include the use of oculometers to measure eye movements and accelerometers to measure tremor. Many applications are being developed for use on smartphones. The benefits include not just more accurate disease quantification, but also consistency of data for longitudinal studies, accurate stratification of patients for entry into trials, and the possibility of automated data capture for remote follow-up. In this mini review, we will look at movement disorders with a particular focus on Parkinson's disease, describe some of the limitations of existing clinical evaluation tools, and illustrate the ways in which objective metrics have already been successful.

  20. Effect of endogenous reference genes on digital PCR assessment of genetically engineered canola events.

    PubMed

    Demeke, Tigst; Eng, Monika

    2018-05-01

    Droplet digital PCR (ddPCR) has been used for absolute quantification of genetically engineered (GE) events. Absolute quantification of GE events by duplex ddPCR requires the use of appropriate primers and probes for target and reference gene sequences in order to accurately determine the amount of GE materials. Single-copy reference genes are generally preferred for absolute quantification of GE events by ddPCR. No study has yet compared reference genes for absolute quantification of GE canola events by ddPCR. The suitability of four endogenous reference sequences (HMG-I/Y, FatA(A), CruA and Ccf) for absolute quantification of GE canola events by ddPCR was investigated. The effect of DNA extraction methods and DNA quality on the assessment of reference gene copy numbers was also investigated. ddPCR results were affected by the use of single- vs. two-copy reference genes. The single-copy FatA(A) reference gene was found to be stable and suitable for absolute quantification of GE canola events by ddPCR. For the copy numbers measured, the HMG-I/Y reference gene was less consistent than the FatA(A) reference gene. The expected ddPCR values were underestimated when CruA and Ccf (two-copy endogenous Cruciferin sequences) were used because of their higher copy number. It is important to make an adjustment if two-copy reference genes are used for ddPCR in order to obtain accurate results. On the other hand, real-time quantitative PCR results were not affected by the use of single- vs. two-copy reference genes.
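
    A minimal sketch of the copy-number adjustment discussed above: when a two-copy reference gene is used, the reference concentration is divided by its per-genome copy number before expressing the GM event relative to it. The concentrations below are invented for illustration and are not measurements from the study.

      # Illustrative copy-number adjustment for ddPCR GM quantification.
      gm_copies_per_ul  = 180.0    # ddPCR concentration of the GM event target
      ref_copies_per_ul = 7400.0   # ddPCR concentration of the reference target
      ref_copies_per_genome = 2    # e.g. a two-copy reference; a single-copy gene would be 1

      # Without this adjustment, a two-copy reference halves the apparent GM content.
      genomes_per_ul = ref_copies_per_ul / ref_copies_per_genome
      gm_percent = 100.0 * gm_copies_per_ul / genomes_per_ul
      print(f"GM content: {gm_percent:.2f} % (copy-number adjusted)")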

  1. Direct quantification of lipopeptide biosurfactants in biological samples via HPLC and UPLC-MS requires sample modification with an organic solvent.

    PubMed

    Biniarz, Piotr; Łukaszewicz, Marcin

    2017-06-01

    The rapid and accurate quantification of biosurfactants in biological samples is challenging. In contrast to the orcinol method for rhamnolipids, no simple biochemical method is available for the rapid quantification of lipopeptides. Various liquid chromatography (LC) methods are promising tools for relatively fast and exact quantification of lipopeptides. Here, we report strategies for the quantification of the lipopeptides pseudofactin and surfactin in bacterial cultures using different high- (HPLC) and ultra-performance liquid chromatography (UPLC) systems. We tested three strategies for sample pretreatment prior to LC analysis. In direct analysis (DA), bacterial cultures were injected directly and analyzed via LC. As a modification, we diluted the samples with methanol and detected an increase in lipopeptide recovery in the presence of methanol. Therefore, we suggest this simple modification as a tool for increasing the accuracy of LC methods. We also tested freeze-drying followed by solvent extraction (FDSE) as an alternative for the analysis of "heavy" samples. In FDSE, the bacterial cultures were freeze-dried, and the resulting powder was extracted with different solvents. Then, the organic extracts were analyzed via LC. Here, we determined the influence of the extracting solvent on lipopeptide recovery. HPLC methods allowed us to quantify pseudofactin and surfactin with run times of 15 and 20 min per sample, respectively, whereas UPLC quantification was as fast as 4 and 5.5 min per sample, respectively. Our methods provide highly accurate measurements and high recovery levels for lipopeptides. At the same time, UPLC-MS provides the possibility to identify lipopeptides and their structural isoforms.

  2. Systematic development of small molecules to inhibit specific microscopic steps of Aβ42 aggregation in Alzheimer's disease.

    PubMed

    Habchi, Johnny; Chia, Sean; Limbocker, Ryan; Mannini, Benedetta; Ahn, Minkoo; Perni, Michele; Hansson, Oskar; Arosio, Paolo; Kumita, Janet R; Challa, Pavan Kumar; Cohen, Samuel I A; Linse, Sara; Dobson, Christopher M; Knowles, Tuomas P J; Vendruscolo, Michele

    2017-01-10

    The aggregation of the 42-residue form of the amyloid-β peptide (Aβ42) is a pivotal event in Alzheimer's disease (AD). The use of chemical kinetics has recently enabled highly accurate quantifications of the effects of small molecules on specific microscopic steps in Aβ42 aggregation. Here, we exploit this approach to develop a rational drug discovery strategy against Aβ42 aggregation that uses as a read-out the changes in the nucleation and elongation rate constants caused by candidate small molecules. We thus identify a pool of compounds that target specific microscopic steps in Aβ42 aggregation. We then test further these small molecules in human cerebrospinal fluid and in a Caenorhabditis elegans model of AD. Our results show that this strategy represents a powerful approach to identify systematically small molecule lead compounds, thus offering an appealing opportunity to reduce the attrition problem in drug discovery.

  3. Positron emission tomography (PET) advances in neurological applications

    NASA Astrophysics Data System (ADS)

    Sossi, V.

    2003-09-01

    Positron Emission Tomography (PET) is a functional imaging modality used in brain research to map in vivo neurotransmitter and receptor activity and to investigate glucose utilization or blood flow patterns both in healthy and disease states. Such research is made possible by the wealth of radiotracers available for PET, by the fact that metabolic and kinetic parameters of particular processes can be extracted from PET data and by the continuous development of imaging techniques. In recent years great advancements have been made in the areas of PET instrumentation, data quantification and image reconstruction that allow for more detailed and accurate biological information to be extracted from PET data. It is now possible to quantitatively compare data obtained either with different tracers or with the same tracer under different scanning conditions. These sophisticated imaging approaches enable detailed investigation of disease mechanisms and system response to disease and/or therapy.

  4. A single camera roentgen stereophotogrammetry method for static displacement analysis.

    PubMed

    Gussekloo, S W; Janssen, B A; George Vosselman, M; Bout, R G

    2000-06-01

    A new method to quantify motion or deformation of bony structures has been developed, since quantification is often difficult due to overlaying tissue, and the currently used roentgen stereophotogrammetry method requires significant investment. In our method, a single stationary roentgen source is used, as opposed to the usual two, which, in combination with a fixed radiogram cassette holder, forms a camera with constant interior orientation. By rotating the experimental object, it is possible to achieve a sufficient angle between the various viewing directions, enabling photogrammetric calculations. The photogrammetric procedure was performed on digitised radiograms and involved template matching to increase accuracy. Co-ordinates of spherical markers in the head of a bird (Rhea americana), were calculated with an accuracy of 0.12mm. When these co-ordinates were used in a deformation analysis, relocations of about 0.5mm could be accurately determined.

  5. BRAIN TUMOR SEGMENTATION WITH SYMMETRIC TEXTURE AND SYMMETRIC INTENSITY-BASED DECISION FORESTS.

    PubMed

    Bianchi, Anthony; Miller, James V; Tan, Ek Tsoon; Montillo, Albert

    2013-04-01

    Accurate automated segmentation of brain tumors in MR images is challenging due to overlapping tissue intensity distributions and amorphous tumor shape. However, a clinically viable solution providing precise quantification of tumor and edema volume would enable better pre-operative planning, treatment monitoring and drug development. Our contributions are threefold. First, we design efficient gradient and LBPTOP based texture features which improve classification accuracy over standard intensity features. Second, we extend our texture and intensity features to symmetric texture and symmetric intensity which further improve the accuracy for all tissue classes. Third, we demonstrate further accuracy enhancement by extending our long range features from 100mm to a full 200mm. We assess our brain segmentation technique on 20 patients in the BraTS 2012 dataset. Impact from each contribution is measured and the combination of all the features is shown to yield state-of-the-art accuracy and speed.

  6. Determination of gamma-aminobutyric acid in food matrices by isotope dilution hydrophilic interaction chromatography coupled to mass spectrometry.

    PubMed

    Zazzeroni, Raniero; Homan, Andrew; Thain, Emma

    2009-08-01

    The estimation of the dietary intake of gamma-aminobutyric acid (GABA) is dependent upon the knowledge of its concentration values in food matrices. To this end, an isotope dilution liquid chromatography-mass spectrometry method has been developed employing the hydrophilic interaction chromatography technique for analyte separation. This approach enabled accurate quantification of GABA in apple, potato, soybeans, and orange juice without the need of a pre- or post-column derivatization reaction. A selective and precise analytical measurement has been obtained with a triple quadrupole mass spectrometer operating in multiple reaction monitoring using the method of standard additions and GABA-d(6) as an internal standard. The concentrations of GABA found in the matrices tested are 7 microg/g of apple, 342 microg/g of potatoes, 211 microg/g of soybeans, and 344 microg/mL of orange juice.
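
    As a sketch of the method of standard additions mentioned above, the following fits a line to responses from spiked aliquots and reads the unknown concentration from the magnitude of the x-intercept; the spike levels and responses are invented, not data from the paper.

      import numpy as np

      # Minimal sketch of the method of standard additions: measure the analyte
      # response for the unspiked extract and for aliquots spiked with known
      # amounts, fit a line, and take the x-intercept magnitude as the unknown.
      added_conc = np.array([0.0, 50.0, 100.0, 150.0])   # spiked GABA, ug/mL (illustrative)
      response   = np.array([0.42, 0.73, 1.05, 1.36])    # peak-area ratio vs. GABA-d6 (illustrative)

      slope, intercept = np.polyfit(added_conc, response, 1)
      original_conc = intercept / slope                  # magnitude of the x-intercept
      print(f"GABA in sample extract: {original_conc:.1f} ug/mL")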

  7. Light Water Reactor Sustainability Program FY13 Status Update for EPRI - RISMC Collaboration

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smith, Curtis

    2013-09-01

    The purpose of the Risk Informed Safety Margin Characterization (RISMC) Pathway research and development (R&D) is to support plant decisions for risk-informed margins management with the aim to improve economics, reliability, and sustain safety of current NPPs. Goals of the RISMC Pathway are twofold: (1) Develop and demonstrate a risk-assessment method coupled to safety margin quantification that can be used by NPP decision makers as part of their margin recovery strategies. (2) Create an advanced "RISMC toolkit" that enables more accurate representation of NPP safety margin. In order to carry out the R&D needed for the Pathway, the Idaho National Laboratory (INL) is collaborating with the Electric Power Research Institute (EPRI) in order to focus on applications of interest to the U.S. nuclear power industry. This report documents the collaboration activities performed between INL and EPRI during FY2013.

  8. Quantifying the density and utilization of active sites in non-precious metal oxygen electroreduction catalysts

    PubMed Central

    Sahraie, Nastaran Ranjbar; Kramm, Ulrike I.; Steinberg, Julian; Zhang, Yuanjian; Thomas, Arne; Reier, Tobias; Paraknowitsch, Jens-Peter; Strasser, Peter

    2015-01-01

    Carbon materials doped with transition metal and nitrogen are highly active, non-precious metal catalysts for the electrochemical conversion of molecular oxygen in fuel cells, metal air batteries, and electrolytic processes. However, accurate measurement of their intrinsic turn-over frequency and active-site density based on metal centres in bulk and surface has remained difficult to date, which has hampered a more rational catalyst design. Here we report a successful quantification of bulk and surface-based active-site density and associated turn-over frequency values of mono- and bimetallic Fe/N-doped carbons using a combination of chemisorption, desorption and 57Fe Mössbauer spectroscopy techniques. Our general approach yields an experimental descriptor for the intrinsic activity and the active-site utilization, aiding in the catalyst development process and enabling a previously unachieved level of understanding of reactivity trends owing to a deconvolution of site density and intrinsic activity. PMID:26486465
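
    To make the relationship between active-site density and turn-over frequency concrete, the sketch below converts an assumed mass activity and site density into a TOF; all numbers are illustrative assumptions rather than results from the study.

      # Minimal sketch of converting a mass activity and an active-site density
      # into a turn-over frequency (TOF) for an O2-reduction catalyst.
      # All numbers are illustrative assumptions.
      F  = 96485.0          # Faraday constant, C/mol
      NA = 6.022e23         # Avogadro constant, 1/mol

      mass_activity_A_per_g = 2.0      # current per catalyst mass at a chosen potential
      n_electrons           = 4        # electrons per O2 reduced
      site_density_per_g    = 5.0e19   # active sites per gram (e.g. from chemisorption)

      o2_per_s_per_g = mass_activity_A_per_g / (n_electrons * F) * NA
      tof = o2_per_s_per_g / site_density_per_g   # O2 molecules per site per second
      print(f"TOF ~ {tof:.3f} per site per second")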

  9. Morphology supporting function: attenuation correction for SPECT/CT, PET/CT, and PET/MR imaging

    PubMed Central

    Lee, Tzu C.; Alessio, Adam M.; Miyaoka, Robert M.; Kinahan, Paul E.

    2017-01-01

    Both SPECT, and in particular PET, are unique in medical imaging for their high sensitivity and direct link to a physical quantity, i.e. radiotracer concentration. This gives PET and SPECT imaging unique capabilities for accurately monitoring disease activity for the purposes of clinical management or therapy development. However, to achieve a direct quantitative connection between the underlying radiotracer concentration and the reconstructed image values several confounding physical effects have to be estimated, notably photon attenuation and scatter. With the advent of dual-modality SPECT/CT, PET/CT, and PET/MR scanners, the complementary CT or MR image data can enable these corrections, although there are unique challenges for each combination. This review covers the basic physics underlying photon attenuation and scatter and summarizes technical considerations for multimodal imaging with regard to PET and SPECT quantification and methods to address the challenges for each multimodal combination. PMID:26576737

  10. An UPLC-MS/MS method for separation and accurate quantification of tamoxifen and its metabolites isomers.

    PubMed

    Arellano, Cécile; Allal, Ben; Goubaa, Anwar; Roché, Henri; Chatelut, Etienne

    2014-11-01

    A selective and accurate analytical method is needed to quantify tamoxifen and its phase I metabolites in a prospective clinical protocol, for evaluation of pharmacokinetic parameters of tamoxifen and its metabolites in adjuvant treatment of breast cancer. The selectivity of the analytical method is a fundamental criteria to allow the quantification of the main active metabolites (Z)-isomers from (Z)'-isomers. An UPLC-MS/MS method was developed and validated for the quantification of (Z)-tamoxifen, (Z)-endoxifen, (E)-endoxifen, Z'-endoxifen, (Z)'-endoxifen, (Z)-4-hydroxytamoxifen, (Z)-4'-hydroxytamoxifen, N-desmethyl tamoxifen, and tamoxifen-N-oxide. The validation range was set between 0.5ng/mL and 125ng/mL for 4-hydroxytamoxifen and endoxifen isomers, and between 12.5ng/mL and 300ng/mL for tamoxifen, tamoxifen N-desmethyl and tamoxifen-N-oxide. The application to patient plasma samples was performed. Copyright © 2014 Elsevier B.V. All rights reserved.

  11. Lensless digital holographic microscopy and its applications in biomedicine and environmental monitoring.

    PubMed

    Wu, Yichen; Ozcan, Aydogan

    2018-03-01

    Optical compound microscope has been a major tool in biomedical imaging for centuries. Its performance relies on relatively complicated, bulky and expensive lenses and alignment mechanics. In contrast, the lensless microscope digitally reconstructs microscopic images of specimens without using any lenses, as a result of which it can be made much smaller, lighter and lower-cost. Furthermore, the limited space-bandwidth product of objective lenses in a conventional microscope can be significantly surpassed by a lensless microscope. Such lensless imaging designs have enabled high-resolution and high-throughput imaging of specimens using compact, portable and cost-effective devices to potentially address various point-of-care, global-health and telemedicine related challenges. In this review, we discuss the operation principles and the methods behind lensless digital holographic on-chip microscopy. We also go over various applications that are enabled by cost-effective and compact implementations of lensless microscopy, including some recent work on air quality monitoring, which utilized machine learning for high-throughput and accurate quantification of particulate matter in air. Finally, we conclude with a brief future outlook of this computational imaging technology. Copyright © 2017 Elsevier Inc. All rights reserved.

  12. PROCAL: A Set of 40 Peptide Standards for Retention Time Indexing, Column Performance Monitoring, and Collision Energy Calibration.

    PubMed

    Zolg, Daniel Paul; Wilhelm, Mathias; Yu, Peng; Knaute, Tobias; Zerweck, Johannes; Wenschuh, Holger; Reimer, Ulf; Schnatbaum, Karsten; Kuster, Bernhard

    2017-11-01

    Beyond specific applications, such as the relative or absolute quantification of peptides in targeted proteomic experiments, synthetic spike-in peptides are not yet systematically used as internal standards in bottom-up proteomics. A number of retention time standards have been reported that enable chromatographic aligning of multiple LC-MS/MS experiments. However, only few peptides are typically included in such sets limiting the analytical parameters that can be monitored. Here, we describe PROCAL (ProteomeTools Calibration Standard), a set of 40 synthetic peptides that span the entire hydrophobicity range of tryptic digests, enabling not only accurate determination of retention time indices but also monitoring of chromatographic separation performance over time. The fragmentation characteristics of the peptides can also be used to calibrate and compare collision energies between mass spectrometers. The sequences of all selected peptides do not occur in any natural protein, thus eliminating the need for stable isotope labeling. We anticipate that this set of peptides will be useful for multiple purposes in individual laboratories but also aiding the transfer of data acquisition and analysis methods between laboratories, notably the use of spectral libraries. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
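
    One common way to use such a standard set for retention-time indexing is to regress the observed retention times of the spiked standards against their assigned index values and map all other peptides through that fit. The sketch below illustrates this generic approach with invented numbers; it is not the PROCAL software or its official index values.

      import numpy as np

      # Minimal sketch of retention-time indexing with spiked standard peptides.
      standard_rt_min   = np.array([12.4, 25.1, 38.7, 52.0, 66.3])   # observed RTs (illustrative)
      standard_rt_index = np.array([10.0, 30.0, 50.0, 70.0, 90.0])   # assigned index values (illustrative)

      slope, intercept = np.polyfit(standard_rt_min, standard_rt_index, 1)

      def to_index(rt_min: float) -> float:
          """Map an observed retention time (min) onto the standard index scale."""
          return slope * rt_min + intercept

      print(f"Peptide eluting at 45.0 min -> index {to_index(45.0):.1f}")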

  13. PET Quantification of the Norepinephrine Transporter in Human Brain with (S,S)-18F-FMeNER-D2.

    PubMed

    Moriguchi, Sho; Kimura, Yasuyuki; Ichise, Masanori; Arakawa, Ryosuke; Takano, Harumasa; Seki, Chie; Ikoma, Yoko; Takahata, Keisuke; Nagashima, Tomohisa; Yamada, Makiko; Mimura, Masaru; Suhara, Tetsuya

    2017-07-01

    Norepinephrine transporter (NET) in the brain plays important roles in human cognition and the pathophysiology of psychiatric disorders. Two radioligands, (S,S)-11C-MRB and (S,S)-18F-FMeNER-D2, have been used for imaging NETs in the thalamus and midbrain (including locus coeruleus) using PET in humans. However, NET density in the equally important cerebral cortex has not been well quantified because of unfavorable kinetics with (S,S)-11C-MRB and defluorination with (S,S)-18F-FMeNER-D2, which can complicate NET quantification in the cerebral cortex adjacent to the skull containing defluorinated 18F radioactivity. In this study, we have established analysis methods of quantification of NET density in the brain including the cerebral cortex using (S,S)-18F-FMeNER-D2 PET. Methods: We analyzed our previous (S,S)-18F-FMeNER-D2 PET data of 10 healthy volunteers dynamically acquired for 240 min with arterial blood sampling. The effects of defluorination on the NET quantification in the superficial cerebral cortex were evaluated by establishing a time stability of NET density estimations with an arterial input 2-tissue-compartment model, which guided the less-invasive reference tissue model and area under the time-activity curve methods to accurately quantify NET density in all brain regions including the cerebral cortex. Results: Defluorination of (S,S)-18F-FMeNER-D2 became prominent toward the latter half of the 240-min scan. Total distribution volumes in the superficial cerebral cortex increased with the scan duration beyond 120 min. We verified that 90-min dynamic scans provided a sufficient amount of data for quantification of NET density unaffected by defluorination. Reference tissue model binding potential values from the 90-min scan data and area under the time-activity curve ratios of 70- to 90-min data allowed for the accurate quantification of NET density in the cerebral cortex. Conclusion: We have established methods of quantification of NET densities in the brain including the cerebral cortex unaffected by defluorination using (S,S)-18F-FMeNER-D2. These results suggest that we can accurately quantify NET density with a 90-min (S,S)-18F-FMeNER-D2 scan in broad brain areas. © 2017 by the Society of Nuclear Medicine and Molecular Imaging.
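
    As an illustration of the area-under-the-curve ratio approach mentioned in the abstract, the sketch below integrates toy target- and reference-region time-activity curves over a late window and takes their ratio; the regions, activity values, and time window are illustrative only.

      import numpy as np

      # Minimal sketch of an area-under-the-curve (AUC) ratio between a target
      # region and a reference region over a late time window (here 70-90 min).
      t_min         = np.array([70.0, 75.0, 80.0, 85.0, 90.0])
      tac_target    = np.array([14.2, 13.8, 13.1, 12.6, 12.0])   # target region activity (illustrative)
      tac_reference = np.array([ 9.8,  9.4,  9.0,  8.7,  8.3])   # reference region activity (illustrative)

      auc_ratio = np.trapz(tac_target, t_min) / np.trapz(tac_reference, t_min)
      print(f"AUC ratio (70-90 min): {auc_ratio:.2f}")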

  14. Neutron-Encoded Protein Quantification by Peptide Carbamylation

    NASA Astrophysics Data System (ADS)

    Ulbrich, Arne; Merrill, Anna E.; Hebert, Alexander S.; Westphall, Michael S.; Keller, Mark P.; Attie, Alan D.; Coon, Joshua J.

    2014-01-01

    We describe a chemical tag for duplex proteome quantification using neutron encoding (NeuCode). The method utilizes the straightforward, efficient, and inexpensive carbamylation reaction. We demonstrate the utility of NeuCode carbamylation by accurately measuring quantitative ratios from tagged yeast lysates mixed in known ratios and by applying this method to quantify differential protein expression in mice fed either a control or a high-fat diet.

  15. Impact of reconstruction parameters on quantitative I-131 SPECT

    NASA Astrophysics Data System (ADS)

    van Gils, C. A. J.; Beijst, C.; van Rooij, R.; de Jong, H. W. A. M.

    2016-07-01

    Radioiodine therapy using I-131 is widely used for treatment of thyroid disease or neuroendocrine tumors. Monitoring treatment by accurate dosimetry requires quantitative imaging. The high energy photons however render quantitative SPECT reconstruction challenging, potentially requiring accurate correction for scatter and collimator effects. The goal of this work is to assess the effectiveness of various correction methods on these effects using phantom studies. A SPECT/CT acquisition of the NEMA IEC body phantom was performed. Images were reconstructed using the following parameters: (1) without scatter correction, (2) with triple energy window (TEW) scatter correction and (3) with Monte Carlo-based scatter correction. For modelling the collimator-detector response (CDR), both (a) geometric Gaussian CDRs as well as (b) Monte Carlo simulated CDRs were compared. Quantitative accuracy, contrast to noise ratios and recovery coefficients were calculated, as well as the background variability and the residual count error in the lung insert. The Monte Carlo scatter corrected reconstruction method was shown to be intrinsically quantitative, requiring no experimentally acquired calibration factor. It resulted in a more accurate quantification of the background compartment activity density compared with TEW or no scatter correction. The quantification error relative to a dose calibrator derived measurement was found to be <1%, -26% and 33%, respectively. The adverse effects of partial volume were significantly smaller with the Monte Carlo simulated CDR correction compared with geometric Gaussian or no CDR modelling. Scatter correction showed a small effect on quantification of small volumes. When using a weighting factor, TEW correction was comparable to Monte Carlo reconstruction in all measured parameters, although this approach is clinically impractical since this factor may be patient dependent. Monte Carlo based scatter correction including accurately simulated CDR modelling is the most robust and reliable method to reconstruct accurate quantitative iodine-131 SPECT images.
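
    For reference, the triple energy window (TEW) scatter correction referred to above approximates the scatter under the photopeak from counts in two narrow flanking windows; the sketch below shows the commonly used trapezoidal estimate with invented counts and window widths, not values from the phantom study.

      # Minimal sketch of the TEW scatter estimate for one projection pixel:
      # scatter under the photopeak is approximated by a trapezoid whose sides
      # are set by the lower and upper scatter windows. Numbers are illustrative.
      c_peak, w_peak   = 950.0, 60.0    # photopeak counts and window width (keV)
      c_lower, w_lower =  60.0,  6.0    # lower scatter window
      c_upper, w_upper =  40.0,  6.0    # upper scatter window

      scatter_estimate = (c_lower / w_lower + c_upper / w_upper) * w_peak / 2.0
      primary_counts   = max(c_peak - scatter_estimate, 0.0)
      print(f"Estimated primary counts: {primary_counts:.0f}")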

  16. Fluorescent quantification of melanin.

    PubMed

    Fernandes, Bruno; Matamá, Teresa; Guimarães, Diana; Gomes, Andreia; Cavaco-Paulo, Artur

    2016-11-01

    Melanin quantification is reportedly performed by absorption spectroscopy, commonly at 405 nm. Here, we propose the implementation of fluorescence spectroscopy for melanin assessment. In a typical in vitro assay to assess melanin production in response to an external stimulus, absorption spectroscopy clearly overvalues melanin content. This method is also incapable of distinguishing non-melanotic/amelanotic control cells from those that are actually capable of performing melanogenesis. Therefore, fluorescence spectroscopy is the best method for melanin quantification as it proved to be highly specific and accurate, detecting even small variations in the synthesis of melanin. This method can also be applied to the quantification of melanin in more complex biological matrices like zebrafish embryos and human hair. © 2016 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  17. Evaluation of a mass-balance approach to determine consumptive water use in northeastern Illinois

    USGS Publications Warehouse

    Mills, Patrick C.; Duncker, James J.; Over, Thomas M.; Marian Domanski,; ,; Engel, Frank

    2014-01-01

    Under ideal conditions, accurate quantification of consumptive use at the sewershed scale by the described mass-balance approach might be possible. Under most prevailing conditions, quantification likely would be more costly and time consuming than that of the present study, given the freely contributed technical support of the host community and relatively appropriate conditions of the study area. Essentials to quantification of consumptive use are a fully cooperative community, storm and sanitary sewers that are separate, and newer sewer infrastructure and (or) a robust program for limiting infiltration, exfiltration, and inflow.

  18. Accurate proteome-wide protein quantification from high-resolution 15N mass spectra

    PubMed Central

    2011-01-01

    In quantitative mass spectrometry-based proteomics, the metabolic incorporation of a single source of 15N-labeled nitrogen has many advantages over using stable isotope-labeled amino acids. However, the lack of a robust computational framework for analyzing the resulting spectra has impeded wide use of this approach. We have addressed this challenge by introducing a new computational methodology for analyzing 15N spectra in which quantification is integrated with identification. Application of this method to an Escherichia coli growth transition reveals significant improvement in quantification accuracy over previous methods. PMID:22182234

  19. OTM 33 Geospatial Measurement of Air Pollution, Remote Emissions Quantification (GMAP-REQ) and OTM33A Geospatial Measurement of Air Pollution-Remote Emissions Quantification-Direct Assessment (GMAP-REQ-DA)

    EPA Science Inventory

    Background: Next generation air measurement (NGAM) technologies are enabling new regulatory and compliance approaches that will help EPA better understand and meet emerging challenges associated with fugitive and area source emissions from industrial and oil and gas sectors. In...

  20. Agile SE Enablers and Quantification Project: Identification, Characterization, and Evaluation Criteria for Systems Engineering Agile Enablers

    DTIC Science & Technology

    2015-01-16

    Draft Technical Report SERC-2015-049-1, January 16, 2015. Principal Investigator: Dr. Richard Turner, Stevens Institute of Technology, Hoboken, NJ 07030. Copyright © 2015 Stevens Institute of Technology. The Systems Engineering Research Center (SERC) is a federally...inappropriate enablers are not pursued. The identification criteria developed for RT-124 are based on earlier SERC work [4, 5, 6].

  1. dPCR: A Technology Review

    PubMed Central

    Quan, Phenix-Lan; Sauzade, Martin

    2018-01-01

    Digital Polymerase Chain Reaction (dPCR) is a novel method for the absolute quantification of target nucleic acids. Quantification by dPCR hinges on the fact that the random distribution of molecules in many partitions follows a Poisson distribution. Each partition acts as an individual PCR microreactor and partitions containing amplified target sequences are detected by fluorescence. The proportion of PCR-positive partitions suffices to determine the concentration of the target sequence without a need for calibration. Advances in microfluidics enabled the current revolution of digital quantification by providing efficient partitioning methods. In this review, we compare the fundamental concepts behind the quantification of nucleic acids by dPCR and quantitative real-time PCR (qPCR). We detail the underlying statistics of dPCR and explain how it defines its precision and performance metrics. We review the different microfluidic digital PCR formats, present their underlying physical principles, and analyze the technological evolution of dPCR platforms. We present the novel multiplexing strategies enabled by dPCR and examine how isothermal amplification could be an alternative to PCR in digital assays. Finally, we determine whether the theoretical advantages of dPCR over qPCR hold true by perusing studies that directly compare assays implemented with both methods. PMID:29677144
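
    The Poisson relationship at the heart of dPCR quantification is short enough to show directly: the fraction of positive partitions p gives the mean copies per partition as lambda = -ln(1 - p), and dividing by the partition volume yields the concentration. The partition count and volume below are illustrative and not tied to any particular platform.

      import math

      # Minimal sketch of Poisson-based absolute quantification in dPCR.
      n_partitions     = 20000
      n_positive       = 4200
      partition_vol_ul = 0.85e-3        # ~0.85 nL per partition, expressed in uL (assumed)

      p = n_positive / n_partitions
      lam = -math.log(1.0 - p)                  # mean target copies per partition
      conc_copies_per_ul = lam / partition_vol_ul
      print(f"{conc_copies_per_ul:.0f} copies/uL of reaction")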

  2. SU-D-218-05: Material Quantification in Spectral X-Ray Imaging: Optimization and Validation.

    PubMed

    Nik, S J; Thing, R S; Watts, R; Meyer, J

    2012-06-01

    To develop and validate a multivariate statistical method to optimize scanning parameters for material quantification in spectral x-ray imaging. An optimization metric was constructed by extensively sampling the thickness space for the expected number of counts for m (two or three) materials. This resulted in an m-dimensional confidence region of material quantities, e.g. thicknesses. Minimization of the ellipsoidal confidence region leads to the optimization of energy bins. For the given spectrum, the minimum counts required for effective material separation can be determined by predicting the signal-to-noise ratio (SNR) of the quantification. A Monte Carlo (MC) simulation framework using BEAM was developed to validate the metric. Projection data of the m materials was generated and material decomposition was performed for combinations of iodine, calcium and water by minimizing the z-score between the expected spectrum and binned measurements. The mean square error (MSE) and variance were calculated to measure the accuracy and precision of this approach, respectively. The minimum MSE corresponds to the optimal energy bins in the BEAM simulations. In the optimization metric, this is equivalent to the smallest confidence region. The SNR of the simulated images was also compared to the predictions from the metric. The MSE was dominated by the variance for the given material combinations, which demonstrates accurate material quantifications. The BEAM simulations revealed that the optimization of energy bins was accurate to within 1 keV. The SNRs predicted by the optimization metric yielded satisfactory agreement but were expectedly higher for the BEAM simulations due to the inclusion of scattered radiation. The validation showed that the multivariate statistical method provides accurate material quantification, correct location of optimal energy bins and adequate prediction of image SNR. The BEAM code system is suitable for generating spectral x-ray imaging simulations. © 2012 American Association of Physicists in Medicine.
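
    As a simplified illustration of material quantification from binned spectral measurements, the sketch below solves a two-material Beer-Lambert system for thicknesses from two energy bins; the attenuation coefficients, fluxes, and counts are assumed values, and this toy linear solve stands in for, rather than reproduces, the z-score minimization described in the abstract.

      import numpy as np

      # Two-material thickness decomposition from two spectral bins using
      # Beer-Lambert attenuation: ln(N0/N) = mu1*t1 + mu2*t2 per bin, which
      # is a small linear system in the thicknesses. All numbers are assumed.
      mu = np.array([[0.50, 0.20],      # bin 1: [mu_iodine, mu_water] in 1/cm
                     [0.25, 0.18]])     # bin 2
      n0 = np.array([1.0e6, 8.0e5])     # open-beam counts per bin
      n  = np.array([5.2e5, 5.0e5])     # transmitted counts per bin

      log_att = np.log(n0 / n)
      thickness_cm = np.linalg.solve(mu, log_att)   # [t_iodine, t_water]
      print(dict(zip(["iodine_cm", "water_cm"], np.round(thickness_cm, 3))))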

  3. Implications of Measurement Assay Type in Design of HIV Experiments.

    PubMed

    Cannon, LaMont; Jagarapu, Aditya; Vargas-Garcia, Cesar A; Piovoso, Michael J; Zurakowski, Ryan

    2017-12-01

    Time series measurements of circular viral episome (2-LTR) concentrations enable indirect quantification of persistent low-level Human Immunodeficiency Virus (HIV) replication in patients on Integrase-Inhibitor intensified Combined Antiretroviral Therapy (cART). In order to determine the magnitude of these low-level infection events, blood has to be drawn from patients at a frequency and volume that are strictly regulated by the Institutional Review Board (IRB). Once the blood is drawn, the 2-LTR concentration is determined by quantifying the amount of HIV DNA present in the sample via a PCR (Polymerase Chain Reaction) assay. Real-time quantitative Polymerase Chain Reaction (qPCR) is a widely used method of performing PCR; however, a newer droplet digital Polymerase Chain Reaction (ddPCR) method has been shown to provide more accurate quantification of DNA. Using a validated model of HIV viral replication, this paper demonstrates the importance of considering DNA quantification assay type when optimizing experiment design conditions. Experiments are optimized using a Genetic Algorithm (GA) to locate a family of suboptimal sample schedules which yield the highest fitness. Fitness is defined as the expected information gained in the experiment, measured by the Kullback-Leibler Divergence (KLD) between the prior and posterior distributions of the model parameters. We compare the information content of the optimized schedules to uniform schedules as well as two clinical schedules implemented by researchers at UCSF and the University of Melbourne. This work shows that there is a significantly greater gain in information in experiments using a ddPCR assay versus a qPCR assay and that certain experiment design considerations should be taken into account when using either assay.
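
    For readers unfamiliar with the fitness measure, the following is a minimal sketch of the Kullback-Leibler divergence in the common closed form for multivariate Gaussian approximations of the prior and posterior parameter distributions; the means and covariances below are placeholders, not values from the study.

```python
import numpy as np

def kl_gaussian(mu0, cov0, mu1, cov1):
    """KL divergence D(N0 || N1) between two multivariate Gaussians."""
    mu0, mu1 = np.asarray(mu0, float), np.asarray(mu1, float)
    cov0, cov1 = np.asarray(cov0, float), np.asarray(cov1, float)
    k = mu0.size
    inv1 = np.linalg.inv(cov1)
    diff = mu1 - mu0
    term_trace = np.trace(inv1 @ cov0)
    term_mahal = diff @ inv1 @ diff
    term_logdet = np.log(np.linalg.det(cov1) / np.linalg.det(cov0))
    return 0.5 * (term_trace + term_mahal - k + term_logdet)

# Toy example: information gained when a broad prior tightens into a posterior
prior_mu, prior_cov = [0.0, 0.0], np.eye(2) * 4.0
post_mu, post_cov = [0.3, -0.1], np.eye(2) * 0.5
print(kl_gaussian(post_mu, post_cov, prior_mu, prior_cov))  # KL(posterior || prior), in nats
```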

  4. cFinder: definition and quantification of multiple haplotypes in a mixed sample.

    PubMed

    Niklas, Norbert; Hafenscher, Julia; Barna, Agnes; Wiesinger, Karin; Pröll, Johannes; Dreiseitl, Stephan; Preuner-Stix, Sandra; Valent, Peter; Lion, Thomas; Gabriel, Christian

    2015-09-07

    Next-generation sequencing allows for determining the genetic composition of a mixed sample. For instance, when performing resistance testing for BCR-ABL1 it is necessary to identify clones and define compound mutations; together with an exact quantification this may complement diagnosis and therapy decisions with additional information. That applies not only to oncological issues but also to the determination of viral, bacterial or fungal infection. Retrieving multiple haplotypes (more than two) and proportion information from data with conventional software is difficult, cumbersome and demands multiple manual steps. Therefore, we developed a tool called cFinder that is capable of automatic detection of haplotypes and their accurate quantification within one sample. BCR-ABL1 samples containing multiple clones were used for testing, and cFinder could identify all previously found clones together with their abundance and even refine some results. Additionally, reads were simulated using GemSIM with multiple haplotypes; the detection was very close to linear (R² = 0.96). Our aim is not to infer haplotype blocks statistically, but to characterize the composition of a single sample precisely. As a result, cFinder reports the connections of variants (haplotypes) with their read count and relative occurrence (percentage). Download is available at http://sourceforge.net/projects/cfinder/. cFinder is implemented as an efficient algorithm that can be run on a low-performance desktop computer. Furthermore, it considers paired-end information (if available) and is generally open to any current next-generation sequencing technology and alignment strategy. To our knowledge, this is the first software that enables researchers without extensive bioinformatic support to determine multiple haplotypes and the proportions in which they constitute a sample.

  5. Systematic Comparison of Label-Free, Metabolic Labeling, and Isobaric Chemical Labeling for Quantitative Proteomics on LTQ Orbitrap Velos

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Zhou; Adams, Rachel M; Chourey, Karuna

    2012-01-01

    A variety of quantitative proteomics methods have been developed, including label-free, metabolic labeling, and isobaric chemical labeling using iTRAQ or TMT. Here, these methods were compared in terms of the depth of proteome coverage, quantification accuracy, precision, and reproducibility using a high-performance hybrid mass spectrometer, LTQ Orbitrap Velos. Our results show that (1) the spectral counting method provides the deepest proteome coverage for identification, but its quantification performance is worse than labeling-based approaches, especially the quantification reproducibility; (2) metabolic labeling and isobaric chemical labeling are capable of accurate, precise, and reproducible quantification and provide deep proteome coverage for quantification. Isobaric chemical labeling surpasses metabolic labeling in terms of quantification precision and reproducibility; (3) iTRAQ and TMT perform similarly in all aspects compared in the current study using a CID-HCD dual scan configuration. Based on the unique advantages of each method, we provide guidance for selection of the appropriate method for a quantitative proteomics study.

  6. QUESP and QUEST revisited - fast and accurate quantitative CEST experiments.

    PubMed

    Zaiss, Moritz; Angelovski, Goran; Demetriou, Eleni; McMahon, Michael T; Golay, Xavier; Scheffler, Klaus

    2018-03-01

    Chemical exchange saturation transfer (CEST) NMR or MRI experiments allow detection of molecules at low concentrations with enhanced sensitivity via their proton exchange with the abundant water pool. Be it endogenous metabolites or exogenous contrast agents, an exact quantification of the actual exchange rate is required to design optimal pulse sequences and/or specific sensitive agents. Refined analytical expressions allow deeper insight and improvement of accuracy for common quantification techniques. The accuracy of standard quantification methodologies, such as quantification of exchange rate using varying saturation power or varying saturation time, is improved especially for the case of nonequilibrium initial conditions and weak labeling conditions, meaning the saturation amplitude is smaller than the exchange rate (γB1 < k). The improved analytical 'quantification of exchange rate using varying saturation power/time' (QUESP/QUEST) equations allow for more accurate exchange rate determination, and provide clear insights on the general principles to execute the experiments and to perform numerical evaluation. The proposed methodology was evaluated on the large-shift regime of paramagnetic chemical-exchange-saturation-transfer agents using simulated data and data of the paramagnetic Eu(III) complex of DOTA-tetraglycineamide. The refined formulas yield improved exchange rate estimation. General convergence intervals of the methods that would apply for smaller shift agents are also discussed. Magn Reson Med 79:1708-1721, 2018. © 2017 International Society for Magnetic Resonance in Medicine.

  7. Highly sensitive quantification for human plasma-targeted metabolomics using an amine derivatization reagent.

    PubMed

    Arashida, Naoko; Nishimoto, Rumi; Harada, Masashi; Shimbo, Kazutaka; Yamada, Naoyuki

    2017-02-15

    Amino acids and their related metabolites play important roles in various physiological processes and have consequently become biomarkers for diseases. However, accurate quantification methods have only been established for major compounds, such as amino acids and a limited number of target metabolites. We previously reported a highly sensitive high-throughput method for the simultaneous quantification of amines using 3-aminopyridyl-N-succinimidyl carbamate as a derivatization reagent combined with liquid chromatography-tandem mass spectrometry (LC-MS/MS). Herein, we report the successful development of a practical and accurate LC-MS/MS method to analyze low concentrations of 40 physiological amines in 19 min. Thirty-five of these amines showed good linearity, limits of quantification, accuracy, precision, and recovery characteristics in plasma, with scheduled selected reaction monitoring acquisitions. Plasma samples from 10 healthy volunteers were evaluated using our newly developed method. The results revealed that 27 amines were detected in one of the samples, and that 24 of these compounds could be quantified. Notably, this new method successfully quantified metabolites with high accuracy across three orders of magnitude, with lowest and highest averaged concentrations of 31.7 nM (for spermine) and 18.3 μM (for α-aminobutyric acid), respectively. Copyright © 2016 Elsevier B.V. All rights reserved.

  8. Methods and techniques for measuring gas emissions from agricultural and animal feeding operations.

    PubMed

    Hu, Enzhu; Babcock, Esther L; Bialkowski, Stephen E; Jones, Scott B; Tuller, Markus

    2014-01-01

    Emissions of gases from agricultural and animal feeding operations contribute to climate change, produce odors, degrade sensitive ecosystems, and pose a threat to public health. The complexity of processes and environmental variables affecting these emissions complicates accurate and reliable quantification of gas fluxes and production rates. Although a plethora of measurement technologies exist, each method has limitations that hamper accurate quantification of gas fluxes. Despite a growing interest in gas emission measurements, only a few available technologies include real-time, continuous monitoring capabilities. Commonly applied state-of-the-art measurement frameworks and technologies are critically examined and discussed, and recommendations for future research to address real-time monitoring requirements for forthcoming regulation and management needs are provided.

  9. An international collaboration to standardize HIV-2 viral load assays: results from the 2009 ACHI(E)V(2E) quality control study.

    PubMed

    Damond, F; Benard, A; Balotta, Claudia; Böni, Jürg; Cotten, Matthew; Duque, Vitor; Ferns, Bridget; Garson, Jeremy; Gomes, Perpetua; Gonçalves, Fátima; Gottlieb, Geoffrey; Kupfer, Bernd; Ruelle, Jean; Rodes, Berta; Soriano, Vicente; Wainberg, Mark; Taieb, Audrey; Matheron, Sophie; Chene, Genevieve; Brun-Vezinet, Francoise

    2011-10-01

    Accurate HIV-2 plasma viral load quantification is crucial for adequate HIV-2 patient management and for the proper conduct of clinical trials and international cohort collaborations. This study compared the homogeneity of HIV-2 RNA quantification when using HIV-2 assays from ACHI(E)V(2E) study sites and either in-house PCR calibration standards or common viral load standards supplied to all collaborators. Each of the 12 participating laboratories quantified blinded HIV-2 samples, using its own HIV-2 viral load assay and standard as well as centrally validated and distributed common HIV-2 group A and B standards (http://www.hiv.lanl.gov/content/sequence/HelpDocs/subtypes-more.html). Aliquots of HIV-2 group A and B strains, each at 2 theoretical concentrations (2.7 and 3.7 log(10) copies/ml), were tested. Intralaboratory, interlaboratory, and overall variances of quantification results obtained with both standards were compared using F tests. For HIV-2 group A quantifications, overall and interlaboratory and/or intralaboratory variances were significantly lower when using the common standard than when using in-house standards at the concentration levels of 2.7 log(10) copies/ml and 3.7 log(10) copies/ml, respectively. For HIV-2 group B, a high heterogeneity was observed and the variances did not differ according to the type of standard used. In this international collaboration, the use of a common standard improved the homogeneity of HIV-2 group A RNA quantification only. The diversity of HIV-2 group B, particularly in PCR primer-binding regions, may explain the heterogeneity in quantification of this strain. Development of a validated HIV-2 viral load assay that accurately quantifies distinct circulating strains is needed.
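
    A minimal illustration of the kind of F test used to compare quantification variances between the common and in-house standards might look like the following; the log10 viral load values are invented for demonstration only.

```python
import numpy as np
from scipy import stats

def variance_f_test(a, b):
    """Two-sided F test for equality of variances of two samples."""
    a, b = np.asarray(a, float), np.asarray(b, float)
    f = np.var(a, ddof=1) / np.var(b, ddof=1)
    dfn, dfd = a.size - 1, b.size - 1
    p_one_sided = stats.f.sf(f, dfn, dfd) if f > 1 else stats.f.cdf(f, dfn, dfd)
    return f, min(1.0, 2 * p_one_sided)

# Hypothetical log10 copies/ml reported by 12 laboratories for one blinded sample
in_house = [2.9, 2.5, 3.1, 2.4, 2.8, 3.3, 2.6, 2.2, 3.0, 2.7, 3.2, 2.5]
common   = [2.7, 2.8, 2.6, 2.7, 2.9, 2.8, 2.6, 2.7, 2.8, 2.7, 2.9, 2.6]
f_stat, p_value = variance_f_test(in_house, common)
print(f"F = {f_stat:.2f}, p = {p_value:.4f}")
```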

  10. An International Collaboration To Standardize HIV-2 Viral Load Assays: Results from the 2009 ACHIEV2E Quality Control Study

    PubMed Central

    Damond, F.; Benard, A.; Balotta, Claudia; Böni, Jürg; Cotten, Matthew; Duque, Vitor; Ferns, Bridget; Garson, Jeremy; Gomes, Perpetua; Gonçalves, Fátima; Gottlieb, Geoffrey; Kupfer, Bernd; Ruelle, Jean; Rodes, Berta; Soriano, Vicente; Wainberg, Mark; Taieb, Audrey; Matheron, Sophie; Chene, Genevieve; Brun-Vezinet, Francoise

    2011-01-01

    Accurate HIV-2 plasma viral load quantification is crucial for adequate HIV-2 patient management and for the proper conduct of clinical trials and international cohort collaborations. This study compared the homogeneity of HIV-2 RNA quantification when using HIV-2 assays from ACHIEV2E study sites and either in-house PCR calibration standards or common viral load standards supplied to all collaborators. Each of the 12 participating laboratories quantified blinded HIV-2 samples, using its own HIV-2 viral load assay and standard as well as centrally validated and distributed common HIV-2 group A and B standards (http://www.hiv.lanl.gov/content/sequence/HelpDocs/subtypes-more.html). Aliquots of HIV-2 group A and B strains, each at 2 theoretical concentrations (2.7 and 3.7 log10 copies/ml), were tested. Intralaboratory, interlaboratory, and overall variances of quantification results obtained with both standards were compared using F tests. For HIV-2 group A quantifications, overall and interlaboratory and/or intralaboratory variances were significantly lower when using the common standard than when using in-house standards at the concentration levels of 2.7 log10 copies/ml and 3.7 log10 copies/ml, respectively. For HIV-2 group B, a high heterogeneity was observed and the variances did not differ according to the type of standard used. In this international collaboration, the use of a common standard improved the homogeneity of HIV-2 group A RNA quantification only. The diversity of HIV-2 group B, particularly in PCR primer-binding regions, may explain the heterogeneity in quantification of this strain. Development of a validated HIV-2 viral load assay that accurately quantifies distinct circulating strains is needed. PMID:21813718

  11. Novel quantitative real-time LCR for the sensitive detection of SNP frequencies in pooled DNA: method development, evaluation and application.

    PubMed

    Psifidi, Androniki; Dovas, Chrysostomos; Banos, Georgios

    2011-01-19

    Single nucleotide polymorphisms (SNP) have proven to be powerful genetic markers for genetic applications in medicine, life science and agriculture. A variety of methods exist for SNP detection but few can quantify SNP frequencies when the mutated DNA molecules correspond to a small fraction of the wild-type DNA. Furthermore, there is no generally accepted gold standard for SNP quantification, and, in general, currently applied methods give inconsistent results in selected cohorts. In the present study we sought to develop a novel method for accurate detection and quantification of SNP in DNA pooled samples. The development and evaluation of a novel Ligase Chain Reaction (LCR) protocol that uses a DNA-specific fluorescent dye to allow quantitative real-time analysis is described. Different reaction components and thermocycling parameters affecting the efficiency and specificity of LCR were examined. Several protocols, including gap-LCR modifications, were evaluated using plasmid standard and genomic DNA pools. A protocol of choice was identified and applied for the quantification of a polymorphism at codon 136 of the ovine PRNP gene that is associated with susceptibility to a transmissible spongiform encephalopathy in sheep. The real-time LCR protocol developed in the present study showed high sensitivity, accuracy, reproducibility and a wide dynamic range of SNP quantification in different DNA pools. The limits of detection and quantification of SNP frequencies were 0.085% and 0.35%, respectively. The proposed real-time LCR protocol is applicable when sensitive detection and accurate quantification of low copy number mutations in DNA pools is needed. Examples include oncogenes and tumour suppressor genes, infectious diseases, pathogenic bacteria, fungal species, viral mutants, drug resistance resulting from point mutations, and genetically modified organisms in food.

  12. Novel Quantitative Real-Time LCR for the Sensitive Detection of SNP Frequencies in Pooled DNA: Method Development, Evaluation and Application

    PubMed Central

    Psifidi, Androniki; Dovas, Chrysostomos; Banos, Georgios

    2011-01-01

    Background Single nucleotide polymorphisms (SNP) have proven to be powerful genetic markers for genetic applications in medicine, life science and agriculture. A variety of methods exist for SNP detection but few can quantify SNP frequencies when the mutated DNA molecules correspond to a small fraction of the wild-type DNA. Furthermore, there is no generally accepted gold standard for SNP quantification, and, in general, currently applied methods give inconsistent results in selected cohorts. In the present study we sought to develop a novel method for accurate detection and quantification of SNP in DNA pooled samples. Methods The development and evaluation of a novel Ligase Chain Reaction (LCR) protocol that uses a DNA-specific fluorescent dye to allow quantitative real-time analysis is described. Different reaction components and thermocycling parameters affecting the efficiency and specificity of LCR were examined. Several protocols, including gap-LCR modifications, were evaluated using plasmid standard and genomic DNA pools. A protocol of choice was identified and applied for the quantification of a polymorphism at codon 136 of the ovine PRNP gene that is associated with susceptibility to a transmissible spongiform encephalopathy in sheep. Conclusions The real-time LCR protocol developed in the present study showed high sensitivity, accuracy, reproducibility and a wide dynamic range of SNP quantification in different DNA pools. The limits of detection and quantification of SNP frequencies were 0.085% and 0.35%, respectively. Significance The proposed real-time LCR protocol is applicable when sensitive detection and accurate quantification of low copy number mutations in DNA pools is needed. Examples include oncogenes and tumour suppressor genes, infectious diseases, pathogenic bacteria, fungal species, viral mutants, drug resistance resulting from point mutations, and genetically modified organisms in food. PMID:21283808

  13. A comparison of liver fat content as determined by magnetic resonance imaging-proton density fat fraction and MRS versus liver histology in non-alcoholic fatty liver disease.

    PubMed

    Idilman, Ilkay S; Keskin, Onur; Celik, Azim; Savas, Berna; Elhan, Atilla Halil; Idilman, Ramazan; Karcaaltincaba, Musturay

    2016-03-01

    Many imaging methods have been defined for quantification of hepatic steatosis in non-alcoholic fatty liver disease (NAFLD). However, studies comparing the efficiency of magnetic resonance imaging-proton density fat fraction (MRI-PDFF), magnetic resonance spectroscopy (MRS), and liver histology for quantification of liver fat content are limited. To compare the efficiency of MRI-PDFF and MRS in the quantification of liver fat content in individuals with NAFLD, a total of 19 NAFLD patients underwent MRI-PDFF, MRS, and liver biopsy for quantification of liver fat content. The MR examinations were performed on a 1.5-T HDx MRI system. The MRI protocol included T1-independent volumetric multi-echo gradient-echo imaging with T2* correction and spectral fat modeling, and MRS with the STEAM technique. A close correlation was observed between liver MRI-PDFF- and histology-determined steatosis (r = 0.743, P < 0.001) and between liver MRS- and histology-determined steatosis (r = 0.712, P < 0.001), with no superiority between them (z = 0.19, P = 0.849). For quantification of hepatic steatosis, a high correlation was observed between the two MRI methods (r = 0.986, P < 0.001). MRI-PDFF and MRS accurately differentiated moderate/severe steatosis from mild/no hepatic steatosis (P = 0.007 and 0.013, respectively), with no superiority between them (AUC(MRI-PDFF) = 0.881 ± 0.0856 versus AUC(MRS) = 0.857 ± 0.0924, P = 0.461). Both MRI-PDFF and MRS can be used for accurate quantification of hepatic steatosis. © The Foundation Acta Radiologica 2015.

  14. SAM Gcms Chromatography Performed at Mars : Elements of Interpretation

    NASA Astrophysics Data System (ADS)

    Szopa, C.; Coll, P. J.; Buch, A.; François, P.; Cabane, M.; Coscia, D.; Teinturier, S.; Navarro-Gonzalez, R.; Glavin, D. P.; Freissinet, C.; Mahaffy, P. R.

    2013-12-01

    The characterisation of the chemical and mineralogical composition of regolith samples collected with the Curiosity rover is a primary objective of the SAM experiment. These data should provide essential clues on the past habitability of Gale crater. Interpretation of the data collected during SAM pyrolysis evolved gas analysis (EGA) and gas chromatography mass spectrometry (GC-MS) experiments on the first soil samples collected by MSL at the Rocknest Aeolian Deposit in Gale Crater has been challenging due to the concomitant presence in the ovens of an oxychlorine phase present in the samples and a derivatization agent coming from the SAM wet chemistry experiment (Glavin et al., 2013). Moreover, accurate identification and quantification, in the SAM EGA mode, of volatiles released from the heated sample, or generated by reactions occurring in the SAM pyrolysis oven, is also difficult for a few compounds due to evolution over similar temperature ranges and overlap of their MS signatures. Hence, the GC analyses, coupled with MS, enabled the separation, identification and quantification of most of the volatile compounds detected. These results were obtained through tests and calibration performed with individual GC spare components and with the SAM testbed. This paper will present the interpretation of the chromatograms obtained when analyzing the Rocknest and John Klein solid samples delivered to SAM, on sols 96 and 199 respectively, supported by laboratory calibrations.

  15. Left ventricle: fully automated segmentation based on spatiotemporal continuity and myocardium information in cine cardiac magnetic resonance imaging (LV-FAST).

    PubMed

    Wang, Lijia; Pei, Mengchao; Codella, Noel C F; Kochar, Minisha; Weinsaft, Jonathan W; Li, Jianqi; Prince, Martin R; Wang, Yi

    2015-01-01

    CMR quantification of LV chamber volumes typically requires manual definition of the basal-most LV slice, which adds processing time and user dependence. This study developed an LV segmentation method that is fully automated based on the spatiotemporal continuity of the LV (LV-FAST). An iteratively decreasing threshold region-growing approach was used first from the midventricle to the apex, until the LV area and shape became discontinuous, and then from the midventricle to the base, until less than 50% of the myocardial circumference was observable. Region growth was constrained by LV spatiotemporal continuity to improve robustness of apical and basal segmentations. The LV-FAST method was compared with manual tracing on cardiac cine MRI data of 45 consecutive patients. Of the 45 patients, LV-FAST and manual selection identified the same apical slices at both ED and ES and the same basal slices at both ED and ES in 38, 38, 38, and 41 cases, respectively, and their measurements agreed within -1.6 ± 8.7 mL, -1.4 ± 7.8 mL, and 1.0 ± 5.8% for EDV, ESV, and EF, respectively. LV-FAST allowed the LV volume-time course to be measured quantitatively within 3 seconds on a standard desktop computer, which is fast and accurate enough for processing cine volumetric cardiac MRI data and enables quantification of the LV filling course over the cardiac cycle.
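
    The slice-wise logic can be illustrated with a simplified, single-slice sketch of region growing with an iteratively decreasing threshold; the stopping rule (an abrupt jump in region area) is a simplification of the published spatiotemporal-continuity constraints, and the function names, parameters, and toy image are hypothetical.

```python
import numpy as np
from scipy import ndimage

def grow_lv_region(slice_img, seed, start_thr, step=10, jump_factor=2.0):
    """Grow a bright (blood-pool) region around a seed by lowering the threshold.

    The threshold is decreased until the connected component containing the
    seed more than doubles in area between steps, which is taken here as a
    sign that the region has leaked out of the chamber.
    """
    prev_area, prev_mask = None, None
    thr = start_thr
    while thr > 0:
        labels, _ = ndimage.label(slice_img >= thr)
        lv_label = labels[seed]
        if lv_label == 0:            # seed not yet above threshold
            thr -= step
            continue
        mask = labels == lv_label
        area = int(mask.sum())
        if prev_area is not None and area > jump_factor * prev_area:
            return prev_mask         # last mask before the discontinuity
        prev_area, prev_mask = area, mask
        thr -= step
    return prev_mask

# Toy image: a bright disc (LV blood pool) on a darker, noisy background
yy, xx = np.mgrid[0:64, 0:64]
img = np.where((yy - 32) ** 2 + (xx - 32) ** 2 < 100, 200.0, 50.0)
img += np.random.default_rng(0).normal(0, 5, img.shape)
mask = grow_lv_region(img, seed=(32, 32), start_thr=220)
print(mask.sum())  # approximate blood-pool area in pixels
```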

  16. Droplet digital PCR technology promises new applications and research areas.

    PubMed

    Manoj, P

    2016-01-01

    Digital Polymerase Chain Reaction (dPCR) is used to quantify nucleic acids, and its applications include the detection and precise quantification of low-level pathogens, rare genetic sequences, copy number variants and rare mutations, as well as relative gene expression. Here the PCR is performed in a large number of reaction chambers or partitions, and the reaction is carried out in each partition individually. This separation allows a more reliable collection and sensitive measurement of nucleic acid. Results are calculated by counting the partitions containing amplified target sequence (positive droplets) and the partitions in which there is no amplification (negative droplets). The mean number of target sequences per partition is calculated using a Poisson algorithm; the Poisson correction compensates for the presence of more than one copy of the target gene in any droplet. The method provides accurate and precise information that is highly reproducible and less susceptible to inhibitors than qPCR. It has been demonstrated in studying variations in gene sequences, such as copy number variants and point mutations, in distinguishing differences between the expression of nearly identical alleles, and in the assessment of clinically relevant genetic variations, and it is routinely used for clonal amplification of samples for NGS methods. dPCR enables more reliable prediction of tumor status and patient prognosis by absolute quantitation using reference normalizations. Rare mitochondrial DNA deletions associated with a range of diseases and disorders as well as aging can be accurately detected with droplet digital PCR.

  17. Accurate Quantification of T Cells by Measuring Loss of Germline T-Cell Receptor Loci with Generic Single Duplex Droplet Digital PCR Assays.

    PubMed

    Zoutman, Willem H; Nell, Rogier J; Versluis, Mieke; van Steenderen, Debby; Lalai, Rajshri N; Out-Luiting, Jacoba J; de Lange, Mark J; Vermeer, Maarten H; Langerak, Anton W; van der Velden, Pieter A

    2017-03-01

    Quantifying T cells accurately in a variety of tissues of benign, inflammatory, or malignant origin can be of great importance in a variety of clinical applications. Flow cytometry and immunohistochemistry are considered to be gold-standard methods for T-cell quantification. However, these methods require fresh, frozen, or fixated cells and tissue of a certain quality. In addition, conventional and droplet digital PCR (ddPCR), whether or not followed by deep sequencing techniques, have been used to elucidate T-cell content by focusing on rearranged T-cell receptor (TCR) genes. These approaches typically target the whole TCR repertoire, thereby supplying additional information about TCR use. We alternatively developed and validated two novel generic single duplex ddPCR assays to quantify T cells accurately by measuring loss of specific germline TCR loci and compared them with flow cytometry-based quantification. These assays target sequences between the Dδ2 and Dδ3 genes (TRD locus) and the Dβ1 and Jβ1.1 genes (TRB locus) that become deleted systematically early during lymphoid differentiation. Because these ddPCR assays require small amounts of DNA instead of freshly isolated, frozen, or fixated material, initially unanalyzable (scarce) specimens can be assayed from now on, supplying valuable information about T-cell content. Our ddPCR method provides a novel and sensitive way to quantify T cells that is relatively fast, accurate, and independent of the cellular context. Copyright © 2017 American Society for Investigative Pathology and the Association for Molecular Pathology. Published by Elsevier Inc. All rights reserved.
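
    The quantification principle, inferring T-cell content from loss of an intact germline TCR target relative to a stably diploid reference, can be sketched as below; the arithmetic is one plausible reading of the approach (it ignores allele-specific deletion patterns) and the input copy concentrations are invented.

```python
def t_cell_fraction(germline_copies_per_ul, reference_copies_per_ul):
    """Estimate the T-cell fraction from ddPCR copy concentrations.

    Assumes the reference amplicon is present at two copies per cell in all
    cells, while the germline TRB/TRD target is lost (on both alleles, in
    this simplified sketch) in cells that have rearranged their TCR loci.
    """
    total_cells = reference_copies_per_ul / 2.0
    non_t_cells = germline_copies_per_ul / 2.0
    fraction = 1.0 - non_t_cells / total_cells
    return max(0.0, min(1.0, fraction))

# Hypothetical ddPCR readout: 300 germline copies/uL, 1000 reference copies/uL
print(f"{t_cell_fraction(300, 1000):.0%}")  # ~70% T cells
```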

  18. Accurate joint space quantification in knee osteoarthritis: a digital x-ray tomosynthesis phantom study

    NASA Astrophysics Data System (ADS)

    Sewell, Tanzania S.; Piacsek, Kelly L.; Heckel, Beth A.; Sabol, John M.

    2011-03-01

    The current imaging standard for diagnosis and monitoring of knee osteoarthritis (OA) is projection radiography. However radiographs may be insensitive to markers of early disease such as osteophytes and joint space narrowing (JSN). Relative to standard radiography, digital X-ray tomosynthesis (DTS) may provide improved visualization of the markers of knee OA without the interference of superimposed anatomy. DTS utilizes a series of low-dose projection images over an arc of +/-20 degrees to reconstruct tomographic images parallel to the detector. We propose that DTS can increase accuracy and precision in JSN quantification. The geometric accuracy of DTS was characterized by quantifying joint space width (JSW) as a function of knee flexion and position using physical and anthropomorphic phantoms. Using a commercially available digital X-ray system, projection and DTS images were acquired for a Lucite rod phantom with known gaps at various source-object-distances, and angles of flexion. Gap width, representative of JSW, was measured using a validated algorithm. Over an object-to-detector-distance range of 5-21cm, a 3.0mm gap width was reproducibly measured in the DTS images, independent of magnification. A simulated 0.50mm (+/-0.13) JSN was quantified accurately (95% CI 0.44-0.56mm) in the DTS images. Angling the rods to represent knee flexion, the minimum gap could be precisely determined from the DTS images and was independent of flexion angle. JSN quantification using DTS was insensitive to distance from patient barrier and flexion angle. Potential exists for the optimization of DTS for accurate radiographic quantification of knee OA independent of patient positioning.

  19. Quantification of mixed chimerism by real time PCR on whole blood-impregnated FTA cards.

    PubMed

    Pezzoli, N; Silvy, M; Woronko, A; Le Treut, T; Lévy-Mozziconacci, A; Reviron, D; Gabert, J; Picard, C

    2007-09-01

    This study investigated quantification of chimerism in sex-mismatched transplantations by quantitative real-time PCR (RQ-PCR) using FTA paper for blood sampling. First, we demonstrate that the quantification of DNA from EDTA blood deposited on FTA cards is accurate and reproducible. Secondly, we show that the fraction of recipient cells detected by RQ-PCR was concordant between the FTA method and the salting-out method, the reference DNA extraction method. Furthermore, the sensitivity of detection of recipient cells is similar between the two methods. Our results show that this innovative method can be used for mixed chimerism (MC) assessment by RQ-PCR.

  20. Methods for the quantification of coarse woody debris and an examination of its spatial patterning: A study from the Tenderfoot Creek Experimental Forest, MT

    Treesearch

    Paul B. Alaback; Duncan C. Lutes

    1997-01-01

    Methods for the quantification of coarse woody debris volume and the description of spatial patterning were studied in the Tenderfoot Creek Experimental Forest, Montana. The line transect method was found to be an accurate, unbiased estimator of down debris volume (> 10cm diameter) on 1/4 hectare fixed-area plots, when perpendicular lines were used. The Fischer...

  1. Rapid, cost-effective and accurate quantification of Yucca schidigera Roezl. steroidal saponins using HPLC-ELSD method.

    PubMed

    Tenon, Mathieu; Feuillère, Nicolas; Roller, Marc; Birtić, Simona

    2017-04-15

    Yucca GRAS-labelled saponins have been and are increasingly used in the food/feed, pharmaceutical and cosmetic industries. Existing techniques presently used for Yucca steroidal saponin quantification remain either inaccurate and misleading or accurate but time consuming and cost prohibitive. The method reported here addresses all of the above challenges. The HPLC/ELSD technique is an accurate and reliable method that yields results of appropriate repeatability and reproducibility. This method does not over- or under-estimate levels of steroidal saponins. The HPLC/ELSD method does not require a pure standard for each individual saponin in order to quantify the group of steroidal saponins. The method is a time- and cost-effective technique that is suitable for routine industrial analyses. HPLC/ELSD yields a saponin fingerprint specific to the plant species. As the method is capable of distinguishing saponin profiles from taxonomically distant species, it can unravel plant adulteration issues. Copyright © 2016 The Author(s). Published by Elsevier Ltd. All rights reserved.

  2. Disease quantification in dermatology: in vivo near-infrared spectroscopy measures correlate strongly with the clinical assessment of psoriasis severity

    NASA Astrophysics Data System (ADS)

    Greve, Tanja Maria; Kamp, Søren; Jemec, Gregor B. E.

    2013-03-01

    Accurate documentation of disease severity is a prerequisite for clinical research and the practice of evidence-based medicine. The quantification of skin diseases such as psoriasis currently relies heavily on clinical scores. Although these clinical scoring methods are well established and very useful in quantifying disease severity, they require an extensive clinical experience and carry a risk of subjectivity. We explore the opportunity to use in vivo near-infrared (NIR) spectra as an objective and noninvasive method for local disease severity assessment in 31 psoriasis patients in whom selected plaques were scored clinically. A partial least squares (PLS) regression model was used to analyze and predict the severity scores on the NIR spectra of psoriatic and uninvolved skin. The correlation between predicted and clinically assigned scores was R=0.94 (RMSE=0.96), suggesting that in vivo NIR provides accurate clinical quantification of psoriatic plaques. Hence, NIR may be a practical solution to clinical severity assessment of psoriasis, providing a continuous, linear, numerical value of severity.
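
    A minimal sketch of the PLS step, predicting a clinical severity score from NIR spectra, is shown below using scikit-learn; the spectra and severity scores are simulated stand-ins, not the study's data, and the number of latent components is an arbitrary choice.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(1)
n_samples, n_wavelengths = 60, 300

# Simulated NIR spectra and a severity score loosely driven by two spectral bands
spectra = rng.normal(0, 1, (n_samples, n_wavelengths)).cumsum(axis=1)
severity = 0.02 * spectra[:, 80] - 0.015 * spectra[:, 210] + rng.normal(0, 0.2, n_samples)

pls = PLSRegression(n_components=5)
predicted = cross_val_predict(pls, spectra, severity, cv=10).ravel()
r = np.corrcoef(severity, predicted)[0, 1]
rmse = np.sqrt(np.mean((severity - predicted) ** 2))
print(f"R = {r:.2f}, RMSE = {rmse:.2f}")
```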

  3. Powder X-ray diffraction method for the quantification of cocrystals in the crystallization mixture.

    PubMed

    Padrela, Luis; de Azevedo, Edmundo Gomes; Velaga, Sitaram P

    2012-08-01

    The solid state purity of cocrystals critically affects their performance. Thus, it is important to accurately quantify the purity of cocrystals in the final crystallization product. The aim of this study was to develop a powder X-ray diffraction (PXRD) quantification method for investigating the purity of cocrystals. The method developed was employed to study the formation of indomethacin-saccharin (IND-SAC) cocrystals by mechanochemical methods. Pure IND-SAC cocrystals were geometrically mixed with 1:1 w/w mixture of indomethacin/saccharin in various proportions. An accurately measured amount (550 mg) of the mixture was used for the PXRD measurements. The most intense, non-overlapping, characteristic diffraction peak of IND-SAC was used to construct the calibration curve in the range 0-100% (w/w). This calibration model was validated and used to monitor the formation of IND-SAC cocrystals by liquid-assisted grinding (LAG). The IND-SAC cocrystal calibration curve showed excellent linearity (R(2) = 0.9996) over the entire concentration range, displaying limit of detection (LOD) and limit of quantification (LOQ) values of 1.23% (w/w) and 3.74% (w/w), respectively. Validation results showed excellent correlations between actual and predicted concentrations of IND-SAC cocrystals (R(2) = 0.9981). The accuracy and reliability of the PXRD quantification method depend on the methods of sample preparation and handling. The crystallinity of the IND-SAC cocrystals was higher when larger amounts of methanol were used in the LAG method. The PXRD quantification method is suitable and reliable for verifying the purity of cocrystals in the final crystallization product.
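
    The calibration arithmetic, a linear fit of the characteristic peak response against cocrystal content with LOD and LOQ derived from the regression, can be sketched as follows; the intensity values are fabricated, and the 3.3σ/slope and 10σ/slope definitions follow the common ICH convention, which may differ from the exact computation used in the paper.

```python
import numpy as np

# Hypothetical calibration data: % (w/w) IND-SAC cocrystal vs. diffraction peak intensity
fraction = np.array([0, 10, 25, 50, 75, 100], dtype=float)
intensity = np.array([2.1, 105.0, 252.3, 498.9, 751.2, 1003.5])

slope, intercept = np.polyfit(fraction, intensity, 1)
predicted = slope * fraction + intercept
residual_sd = np.sqrt(np.sum((intensity - predicted) ** 2) / (fraction.size - 2))

r_squared = 1 - np.sum((intensity - predicted) ** 2) / np.sum((intensity - intensity.mean()) ** 2)
lod = 3.3 * residual_sd / slope   # limit of detection, % (w/w)
loq = 10.0 * residual_sd / slope  # limit of quantification, % (w/w)
print(f"R^2 = {r_squared:.4f}, LOD = {lod:.2f}%, LOQ = {loq:.2f}%")
```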

  4. Bivariate quadratic method in quantifying the differential capacitance and energy capacity of supercapacitors under high current operation

    NASA Astrophysics Data System (ADS)

    Goh, Chin-Teng; Cruden, Andrew

    2014-11-01

    Capacitance and resistance are the fundamental electrical parameters used to evaluate the electrical characteristics of a supercapacitor, namely the dynamic voltage response, energy capacity, state of charge and health condition. The constant capacitance method of the British Standards EN62391 and EN62576 can be improved upon with a differential capacitance that more accurately describes the dynamic voltage response of supercapacitors. This paper presents a novel bivariate quadratic based method to model the dynamic voltage response of supercapacitors under high current charge-discharge cycling, and to enable the derivation of the differential capacitance and energy capacity directly from terminal measurements, i.e. voltage and current, rather than from multiple pulsed-current or excitation signal tests across different bias levels. The estimation results achieved are in close agreement with experimental measurements, within a relative error of 0.2% at various high current levels (25-200 A), which is more accurate than the constant capacitance method (4-7%). The archival value of this paper is the introduction of an improved quantification method for the electrical characteristics of supercapacitors, and the disclosure of distinct properties of supercapacitors: the nonlinear capacitance-voltage characteristic, the capacitance variation between charging and discharging, and the distribution of energy capacity across the operating voltage window.
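
    One plausible reading of the bivariate quadratic idea is a least-squares surface fit of terminal voltage against accumulated charge and current, from which a differential capacitance dQ/dV follows at any operating point; the functional form, variable names, and synthetic data below are assumptions for illustration, not the authors' exact formulation.

```python
import numpy as np

def fit_bivariate_quadratic(q, i, v):
    """Least-squares fit of v ≈ a0 + a1*q + a2*i + a3*q^2 + a4*q*i + a5*i^2."""
    A = np.column_stack([np.ones_like(q), q, i, q**2, q * i, i**2])
    coeffs, *_ = np.linalg.lstsq(A, v, rcond=None)
    return coeffs

def differential_capacitance(coeffs, q, i):
    """C_diff = dQ/dV = 1 / (dV/dQ) at an operating point (q, i)."""
    a0, a1, a2, a3, a4, a5 = coeffs
    dv_dq = a1 + 2 * a3 * q + a4 * i
    return 1.0 / dv_dq

# Synthetic charge/discharge data for a supercapacitor-like element
rng = np.random.default_rng(2)
q = rng.uniform(0, 5000, 500)          # accumulated charge, coulombs
i = rng.uniform(25, 200, 500)          # current, amperes
v = 0.5 + q / 3000.0 + 2e-8 * q**2 + 1e-3 * i + rng.normal(0, 0.01, 500)

coeffs = fit_bivariate_quadratic(q, i, v)
print(differential_capacitance(coeffs, q=2500, i=100))  # farads at this operating point
```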

  5. ShatterProof: operational detection and quantification of chromothripsis.

    PubMed

    Govind, Shaylan K; Zia, Amin; Hennings-Yeomans, Pablo H; Watson, John D; Fraser, Michael; Anghel, Catalina; Wyatt, Alexander W; van der Kwast, Theodorus; Collins, Colin C; McPherson, John D; Bristow, Robert G; Boutros, Paul C

    2014-03-19

    Chromothripsis, a newly discovered type of complex genomic rearrangement, has been implicated in the evolution of several types of cancers. To date, it has been described in bone cancer, SHH-medulloblastoma and acute myeloid leukemia, amongst others; however, there are still no formal or automated methods for detecting or annotating it in high throughput sequencing data. As such, findings of chromothripsis are difficult to compare and many cases likely escape detection altogether. We introduce ShatterProof, a software tool for detecting and quantifying chromothriptic events. ShatterProof takes structural variation calls (translocations, copy-number variations, short insertions and loss of heterozygosity) produced by any algorithm and, using an operational definition of chromothripsis, performs robust statistical tests to accurately predict the presence and location of chromothriptic events. Validation of our tool was conducted using clinical data sets, including matched normal and prostate cancer samples in addition to the colorectal cancer and SCLC data sets used in the original description of chromothripsis. ShatterProof is computationally efficient, having low memory requirements and near linear computation time. This allows it to become a standard component of sequencing analysis pipelines, enabling researchers to routinely and accurately assess samples for chromothripsis. Source code and documentation can be found at http://search.cpan.org/~sgovind/Shatterproof.

  6. Spatial mapping and quantification of developmental branching morphogenesis.

    PubMed

    Short, Kieran; Hodson, Mark; Smyth, Ian

    2013-01-15

    Branching morphogenesis is a fundamental developmental mechanism that shapes the formation of many organs. The complex three-dimensional shapes derived by this process reflect equally complex genetic interactions between branching epithelia and their surrounding mesenchyme. Despite the importance of this process to normal adult organ function, analysis of branching has been stymied by the absence of a bespoke method to quantify accurately the complex spatial datasets that describe it. As a consequence, although many developmentally important genes are proposed to influence branching morphogenesis, we have no way of objectively assessing their individual contributions to this process. We report the development of a method for accurately quantifying many aspects of branching morphogenesis and we demonstrate its application to the study of organ development. As proof of principle we have employed this approach to analyse the developing mouse lung and kidney, describing the spatial characteristics of the branching ureteric bud and pulmonary epithelia. To demonstrate further its capacity to profile unrecognised genetic contributions to organ development, we examine Tgfb2 mutant kidneys, identifying elements of both developmental delay and specific spatial dysmorphology caused by haplo-insufficiency for this gene. This technical advance provides a crucial resource that will enable rigorous characterisation of the genetic and environmental factors that regulate this essential and evolutionarily conserved developmental mechanism.

  7. An optimized method for neurotransmitters and their metabolites analysis in mouse hypothalamus by high performance liquid chromatography-Q Exactive hybrid quadrupole-orbitrap high-resolution accurate mass spectrometry.

    PubMed

    Yang, Zong-Lin; Li, Hui; Wang, Bing; Liu, Shu-Ying

    2016-02-15

    Neurotransmitters (NTs) and their metabolites are known to play an essential role in maintaining various physiological functions in the nervous system. However, there are many difficulties in the detection of NTs together with their metabolites in biological samples. A new method for the detection of NTs and their metabolites by high performance liquid chromatography coupled with Q Exactive hybrid quadrupole-orbitrap high-resolution accurate mass spectrometry (HPLC-HRMS) was established in this paper. This method is a substantial development in the application of Q Exactive MS to quantitative analysis. It enabled rapid quantification of ten compounds within 18 min. Good linearity was obtained, with correlation coefficients above 0.99. The concentration ranges at the limit of detection (LOD) and the limit of quantitation (LOQ) level were 0.0008-0.05 nmol/mL and 0.002-25.0 nmol/mL, respectively. Precision (relative standard deviation, RSD) of this method was 0.36-12.70%. Recoveries ranged between 81.83% and 118.04%. Concentrations of these compounds in mouse hypothalamus were determined with this method by Q Exactive LC-MS. Copyright © 2016 Elsevier B.V. All rights reserved.

  8. Measuring the Hydraulic Effectiveness of Low Impact Development Practices in a Heavily Urbanised Environment: A Case Study from London, UK

    NASA Astrophysics Data System (ADS)

    El Hattab, M. H.; Vernon, D.; Mijic, A.

    2017-12-01

    Low impact development (LID) practices are deemed to have a synergistic effect in mitigating urban storm water flooding. Designing and implementing effective LID practices requires reliable real-life data about their performance in different applications; however, there are limited studies providing such data. In this study an innovative micro-monitoring system to assess the performance of porous pavement and rain gardens as retrofitting technologies was developed. Three pilot streets in London, UK were selected as part of Thames Water Utilities Limited's Counters Creek scheme. The system includes a V-notch weir installed at the outlet of each LID device to provide accurate and reliable quantification over a wide range of discharges, in addition to a low-flow sensor installed downstream of the V-notch to cross-check the readings. With a flow survey time series of the pre-retrofitting conditions available from the study streets, extensive laboratory calibrations under different flow conditions replicating the exact site conditions were performed prior to installing the devices in the field. The micro-monitoring system is well suited for high-resolution temporal monitoring and enables accurate long-term evaluation of LID components' performance. Initial results from the field validated the robustness of the system in fulfilling its requirements.

  9. Metabolomics by Gas Chromatography-Mass Spectrometry: the combination of targeted and untargeted profiling

    PubMed Central

    Fiehn, Oliver

    2016-01-01

    Gas chromatography-mass spectrometry (GC-MS)-based metabolomics is ideal for identifying and quantitating small molecular metabolites (<650 daltons), including small acids, alcohols, hydroxyl acids, amino acids, sugars, fatty acids, sterols, catecholamines, drugs, and toxins, often using chemical derivatization to make these compounds volatile enough for gas chromatography. This unit shows that GC-MS-based metabolomics easily allows integrating targeted assays for absolute quantification of specific metabolites with untargeted metabolomics to discover novel compounds. Complemented by database annotations using large spectral libraries and validated, standardized standard operating procedures, GC-MS can identify and semi-quantify over 200 compounds per study in human body fluid samples (e.g., plasma, urine or stool). Deconvolution software enables detection of more than 300 additional unidentified signals that can be annotated through accurate mass instruments with appropriate data processing workflows, similar to liquid chromatography-MS (LC-MS) untargeted profiling. Hence, GC-MS is a mature technology that uses not only classic detectors (‘quadrupole’) but also target mass spectrometers (‘triple quadrupole’) and accurate mass instruments (‘quadrupole-time of flight’). This unit covers the following aspects of GC-MS-based metabolomics: (i) sample preparation from mammalian samples, (ii) acquisition of data, (iii) quality control, and (iv) data processing. PMID:27038389

  10. A liquid chromatography-tandem mass spectrometry-based targeted proteomics assay for monitoring P-glycoprotein levels in human breast tissue.

    PubMed

    Yang, Ting; Chen, Fei; Xu, Feifei; Wang, Fengliang; Xu, Qingqing; Chen, Yun

    2014-09-25

    P-glycoprotein (P-gp) can efflux drugs from cancer cells, and its overexpression is commonly associated with multi-drug resistance (MDR). Thus, accurate quantification of P-gp would help predict the response to chemotherapy and inform the prognosis of breast cancer patients. An advanced liquid chromatography-tandem mass spectrometry (LC/MS/MS)-based targeted proteomics assay was developed and validated for monitoring P-gp levels in breast tissue. The tryptic peptide IIDNKPSIDSYSK (residues 368-380) was selected as a surrogate analyte for quantification, and immuno-depleted tissue extract was used as a surrogate matrix. Matched pairs of breast tissue samples from 60 patients who were suspected to have drug resistance were subjected to analysis, and the levels of P-gp were quantified. Using data from normal tissue, we suggested a P-gp reference interval. The experimental values of tumor tissue samples were compared with those obtained from Western blotting and immunohistochemistry (IHC). The results indicated that the targeted proteomics approach was comparable to IHC but provided a lower limit of quantification (LOQ) and could afford more reliable results at low concentrations than the other two methods. LC/MS/MS-based targeted proteomics may allow the quantification of P-gp in breast tissue in a more accurate manner. Copyright © 2014 Elsevier B.V. All rights reserved.

  11. miR-MaGiC improves quantification accuracy for small RNA-seq.

    PubMed

    Russell, Pamela H; Vestal, Brian; Shi, Wen; Rudra, Pratyaydipta D; Dowell, Robin; Radcliffe, Richard; Saba, Laura; Kechris, Katerina

    2018-05-15

    Many tools have been developed to profile microRNA (miRNA) expression from small RNA-seq data. These tools must contend with several issues: the small size of miRNAs, the small number of unique miRNAs, the fact that similar miRNAs can be transcribed from multiple loci, and the presence of miRNA isoforms known as isomiRs. Methods failing to address these issues can return misleading information. We propose a novel quantification method designed to address these concerns. We present miR-MaGiC, a novel miRNA quantification method, implemented as a cross-platform tool in Java. miR-MaGiC performs stringent mapping to a core region of each miRNA and defines a meaningful set of target miRNA sequences by collapsing the miRNA space into "functional groups". We hypothesize that these two features, mapping stringency and collapsing, provide more accurate quantification at a more meaningful unit (i.e., the miRNA family). We test miR-MaGiC and several published methods on 210 small RNA-seq libraries, evaluating each method's ability to accurately reflect global miRNA expression profiles. We define accuracy as total counts close to the total number of input reads originating from miRNAs. We find that miR-MaGiC, which incorporates both stringency and collapsing, provides the most accurate counts.
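
    The collapsing step, aggregating per-sequence counts into miRNA functional groups, can be sketched as a simple grouping of mapped counts; the group-naming rule below is only a stand-in for miR-MaGiC's actual group definitions.

```python
import re
from collections import Counter

def family_key(name):
    """Map a mature miRNA name to an illustrative 'functional group' key.

    Drops the -5p/-3p arm label, a locus index such as -1/-2 when it follows
    a family letter (miR-29b-1 -> miR-29b), and finally the family letter
    itself (miR-29b -> miR-29). This rule is a stand-in, not miR-MaGiC's
    actual grouping table.
    """
    name = re.sub(r"-[35]p$", "", name)
    if re.search(r"[a-z]-\d+$", name):
        name = re.sub(r"-\d+$", "", name)
    return re.sub(r"[a-z]$", "", name)

def collapse_to_groups(counts):
    """Sum mapped read counts over miRNA functional groups."""
    grouped = Counter()
    for name, n in counts.items():
        grouped[family_key(name)] += n
    return grouped

reads = {"hsa-miR-29a-3p": 1200, "hsa-miR-29b-1-5p": 300,
         "hsa-miR-29c-3p": 450, "hsa-let-7a-5p": 900}
print(collapse_to_groups(reads))  # e.g. {'hsa-miR-29': 1950, 'hsa-let-7': 900}
```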

  12. Resolution and quantification accuracy enhancement of functional delay and sum beamforming for three-dimensional acoustic source identification with solid spherical arrays

    NASA Astrophysics Data System (ADS)

    Chu, Zhigang; Yang, Yang; Shen, Linbang

    2017-05-01

    Functional delay and sum (FDAS) is a novel beamforming algorithm introduced for the three-dimensional (3D) acoustic source identification with solid spherical microphone arrays. Being capable of offering significantly attenuated sidelobes with a fast speed, the algorithm promises to play an important role in interior acoustic source identification. However, it presents some intrinsic imperfections, specifically poor spatial resolution and low quantification accuracy. This paper focuses on conquering these imperfections by ridge detection (RD) and deconvolution approach for the mapping of acoustic sources (DAMAS). The suggested methods are referred to as FDAS+RD and FDAS+RD+DAMAS. Both computer simulations and experiments are utilized to validate their effects. Several interesting conclusions have emerged: (1) FDAS+RD and FDAS+RD+DAMAS both can dramatically ameliorate FDAS's spatial resolution and at the same time inherit its advantages. (2) Compared to the conventional DAMAS, FDAS+RD+DAMAS enjoys the same super spatial resolution, stronger sidelobe attenuation capability and more than two hundred times faster speed. (3) FDAS+RD+DAMAS can effectively conquer FDAS's low quantification accuracy. Whether the focus distance is equal to the distance from the source to the array center or not, it can quantify the source average pressure contribution accurately. This study will be of great significance to the accurate and quick localization and quantification of acoustic sources in cabin environments.

  13. Targeted Data Extraction of the MS/MS Spectra Generated by Data-independent Acquisition: A New Concept for Consistent and Accurate Proteome Analysis*

    PubMed Central

    Gillet, Ludovic C.; Navarro, Pedro; Tate, Stephen; Röst, Hannes; Selevsek, Nathalie; Reiter, Lukas; Bonner, Ron; Aebersold, Ruedi

    2012-01-01

    Most proteomic studies use liquid chromatography coupled to tandem mass spectrometry to identify and quantify the peptides generated by the proteolysis of a biological sample. However, with the current methods it remains challenging to rapidly, consistently, reproducibly, accurately, and sensitively detect and quantify large fractions of proteomes across multiple samples. Here we present a new strategy that systematically queries sample sets for the presence and quantity of essentially any protein of interest. It consists of using the information available in fragment ion spectral libraries to mine the complete fragment ion maps generated using a data-independent acquisition method. For this study, the data were acquired on a fast, high resolution quadrupole-quadrupole time-of-flight (TOF) instrument by repeatedly cycling through 32 consecutive 25-Da precursor isolation windows (swaths). This SWATH MS acquisition setup generates, in a single sample injection, time-resolved fragment ion spectra for all the analytes detectable within the 400–1200 m/z precursor range and the user-defined retention time window. We show that suitable combinations of fragment ions extracted from these data sets are sufficiently specific to confidently identify query peptides over a dynamic range of 4 orders of magnitude, even if the precursors of the queried peptides are not detectable in the survey scans. We also show that queried peptides are quantified with a consistency and accuracy comparable with that of selected reaction monitoring, the gold standard proteomic quantification method. Moreover, targeted data extraction enables ad libitum quantification refinement and dynamic extension of protein probing by iterative re-mining of the once-and-forever acquired data sets. This combination of unbiased, broad range precursor ion fragmentation and targeted data extraction alleviates most constraints of present proteomic methods and should be equally applicable to the comprehensive analysis of other classes of analytes, beyond proteomics. PMID:22261725

  14. A quantification model for the structure of clay materials.

    PubMed

    Tang, Liansheng; Sang, Haitao; Chen, Haokun; Sun, Yinlei; Zhang, Longjian

    2016-07-04

    In this paper, the quantification of clay structure is explicitly explained, and the approach and goals of quantification are also discussed. The authors consider that the purpose of quantifying clay structure is to determine parameters that can be used to characterize quantitatively the impact of clay structure on macro-mechanical behaviour. According to system theory and the law of energy conservation, a quantification model for the structure characteristics of clay materials is established and three quantitative parameters (i.e., deformation structure potential, strength structure potential and comprehensive structure potential) are proposed, and the corresponding tests were conducted. The experimental results show that these quantitative parameters can accurately reflect the influence of clay structure on deformation behaviour, on strength behaviour, and the relative magnitude of the structural influence on these two behaviours, respectively. These quantitative parameters have explicit mechanical meanings and can be used to characterize the structural influences of clay on its mechanical behaviour.

  15. [Building Mass Spectrometry Spectral Libraries of Human Cancer Cell Lines].

    PubMed

    Faktor, J; Bouchal, P

    Cancer research often focuses on protein quantification in model cancer cell lines and cancer tissues. SWATH (sequential windowed acquisition of all theoretical fragment ion spectra), the state of the art method, enables the quantification of all proteins included in a spectral library. A spectral library contains the fragmentation patterns of each detectable protein in a sample. Thorough spectral library preparation will improve quantitation of low-abundance proteins, which usually play an important role in cancer. Our research is focused on the optimization of spectral library preparation aimed at maximizing the number of identified proteins in the MCF-7 breast cancer cell line. First, we optimized the sample preparation prior to entering the mass spectrometer. We examined the effects of lysis buffer composition, the peptide dissolution protocol and the material of the sample vial on the number of proteins identified in the spectral library. Next, we optimized the mass spectrometry (MS) method for spectral library data acquisition. Our thoroughly optimized protocol for spectral library building enabled the identification of 1,653 proteins (FDR < 1%) in 1 µg of MCF-7 lysate. This work contributed to the enhancement of protein coverage in SWATH digital biobanks, which enable quantification of an arbitrary protein from physically unavailable samples. In future, high quality spectral libraries could play a key role in preparing digital fingerprints of patient proteomes. Key words: biomarker - mass spectrometry - proteomics - digital biobanking - SWATH - protein quantification. This work was supported by the project MEYS - NPS I - LO1413. The authors declare they have no potential conflicts of interest concerning drugs, products, or services used in the study. The Editorial Board declares that the manuscript met the ICMJE recommendation for biomedical papers. Submitted: 7. 5. 2016. Accepted: 9. 6. 2016.

  16. Rapid quantification of vesicle concentration for DOPG/DOPC and Cardiolipin/DOPC mixed lipid systems of variable composition.

    PubMed

    Elmer-Dixon, Margaret M; Bowler, Bruce E

    2018-05-19

    A novel approach to quantify mixed lipid systems is described. Traditional approaches to lipid vesicle quantification are time-consuming, require large amounts of material and are destructive. We extend our recently described method for quantification of pure lipid systems to mixed lipid systems. The method only requires a UV-Vis spectrometer and does not destroy the sample. Mie scattering data from absorbance measurements are used as input to a Matlab program that calculates the total vesicle concentration and the concentration of each lipid in the mixed lipid system. The technique is fast and accurate, which is essential for analytical lipid binding experiments. Copyright © 2018. Published by Elsevier Inc.

  17. Lesion Quantification in Dual-Modality Mammotomography

    NASA Astrophysics Data System (ADS)

    Li, Heng; Zheng, Yibin; More, Mitali J.; Goodale, Patricia J.; Williams, Mark B.

    2007-02-01

    This paper describes a novel x-ray/SPECT dual-modality breast imaging system that provides 3D structural and functional information. While only a limited number of views on one side of the breast can be acquired due to mechanical and time constraints, we developed a technique to compensate for limited-angle artifacts in reconstructed images and to accurately estimate both lesion size and radioactivity concentration. Various angular sampling strategies were evaluated using both simulated and experimental data. It was demonstrated that quantification of lesion size to an accuracy of 10% and quantification of radioactivity to an accuracy of 20% are feasible from limited-angle data acquired with clinically practical dosage and acquisition time.

  18. Technical Note: Deep learning based MRAC using rapid ultra-short echo time imaging.

    PubMed

    Jang, Hyungseok; Liu, Fang; Zhao, Gengyan; Bradshaw, Tyler; McMillan, Alan B

    2018-05-15

    In this study, we explore the feasibility of a novel framework for MR-based attenuation correction (MRAC) for PET/MR imaging based on deep learning via convolutional neural networks, which enables fully automated and robust estimation of a pseudo CT image based on ultrashort echo time (UTE), fat, and water images obtained by a rapid MR acquisition. MR images for MRAC are acquired using dual echo ramped hybrid encoding (dRHE), where both UTE and out-of-phase echo images are obtained within a short single acquisition (35 s). Tissue labeling of air, soft tissue, and bone in the UTE image is accomplished via a deep learning network that was pre-trained with T1-weighted MR images. UTE images are used as input to the network, which was trained using labels derived from co-registered CT images. The tissue labels estimated by deep learning are refined by a conditional random field based correction. The soft tissue labels are further separated into fat and water components using the two-point Dixon method. The estimated bone, air, fat, and water images are then assigned appropriate Hounsfield units, resulting in a pseudo CT image for PET attenuation correction. To evaluate the proposed MRAC method, PET/MR imaging of the head was performed on 8 human subjects, where Dice similarity coefficients of the estimated tissue labels and relative PET errors were evaluated through comparison to a registered CT image. Dice coefficients for air (within the head), soft tissue, and bone labels were 0.76±0.03, 0.96±0.006, and 0.88±0.01. In PET quantification, the proposed MRAC method produced relative PET errors less than 1% within most brain regions. The proposed MRAC method, utilizing deep learning with transfer learning and an efficient dRHE acquisition, enables reliable PET quantification with accurate and rapid pseudo CT generation. This article is protected by copyright. All rights reserved.
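
    As a rough sketch of the final pseudo CT step, tissue labels can be mapped to bulk Hounsfield units to form a pseudo CT volume. The label codes and HU values below are illustrative assumptions, not the values used in the study.

```python
import numpy as np

# hypothetical label codes: 0 = air, 1 = fat, 2 = water/soft tissue, 3 = bone
HU_LOOKUP = {0: -1000.0, 1: -90.0, 2: 30.0, 3: 800.0}  # assumed bulk HU values

def labels_to_pseudo_ct(label_volume):
    """Map a voxel-wise tissue label volume to a pseudo CT in Hounsfield units."""
    pseudo_ct = np.zeros(label_volume.shape, dtype=np.float32)
    for label, hu in HU_LOOKUP.items():
        pseudo_ct[label_volume == label] = hu
    return pseudo_ct

labels = np.array([[0, 2, 3],
                   [1, 2, 3]])
print(labels_to_pseudo_ct(labels))
```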

  19. Development of Droplet Microfluidics Enabling High-Throughput Single-Cell Analysis.

    PubMed

    Wen, Na; Zhao, Zhan; Fan, Beiyuan; Chen, Deyong; Men, Dong; Wang, Junbo; Chen, Jian

    2016-07-05

    This article reviews recent developments in droplet microfluidics enabling high-throughput single-cell analysis. Five key aspects in this field are included in this review: (1) prototype demonstration of single-cell encapsulation in microfluidic droplets; (2) technical improvements of single-cell encapsulation in microfluidic droplets; (3) microfluidic droplets enabling single-cell proteomic analysis; (4) microfluidic droplets enabling single-cell genomic analysis; and (5) integrated microfluidic droplet systems enabling single-cell screening. We examine the advantages and limitations of each technique and discuss future research opportunities by focusing on key performances of throughput, multifunctionality, and absolute quantification.

  20. Systematic development of small molecules to inhibit specific microscopic steps of Aβ42 aggregation in Alzheimer’s disease

    PubMed Central

    Habchi, Johnny; Chia, Sean; Limbocker, Ryan; Mannini, Benedetta; Ahn, Minkoo; Perni, Michele; Hansson, Oskar; Arosio, Paolo; Kumita, Janet R.; Challa, Pavan Kumar; Cohen, Samuel I. A.; Dobson, Christopher M.; Knowles, Tuomas P. J.; Vendruscolo, Michele

    2017-01-01

    The aggregation of the 42-residue form of the amyloid-β peptide (Aβ42) is a pivotal event in Alzheimer’s disease (AD). The use of chemical kinetics has recently enabled highly accurate quantifications of the effects of small molecules on specific microscopic steps in Aβ42 aggregation. Here, we exploit this approach to develop a rational drug discovery strategy against Aβ42 aggregation that uses as a read-out the changes in the nucleation and elongation rate constants caused by candidate small molecules. We thus identify a pool of compounds that target specific microscopic steps in Aβ42 aggregation. We then test further these small molecules in human cerebrospinal fluid and in a Caenorhabditis elegans model of AD. Our results show that this strategy represents a powerful approach to identify systematically small molecule lead compounds, thus offering an appealing opportunity to reduce the attrition problem in drug discovery. PMID:28011763

  1. Atlas-based liver segmentation and hepatic fat-fraction assessment for clinical trials.

    PubMed

    Yan, Zhennan; Zhang, Shaoting; Tan, Chaowei; Qin, Hongxing; Belaroussi, Boubakeur; Yu, Hui Jing; Miller, Colin; Metaxas, Dimitris N

    2015-04-01

    Automated assessment of hepatic fat-fraction is clinically important. A robust and precise segmentation would enable accurate, objective and consistent measurement of hepatic fat-fraction for disease quantification, therapy monitoring and drug development. However, segmenting the liver in clinical trials is a challenging task due to the variability of liver anatomy as well as the diverse sources from which the images were acquired. In this paper, we propose an automated and robust framework for liver segmentation and assessment. It uses single statistical atlas registration to initialize a robust deformable model to obtain a fine segmentation. A fat-fraction map is then computed using a chemical-shift-based method in the delineated liver region. The proposed method was validated on 14 abdominal magnetic resonance (MR) volumetric scans. Qualitative and quantitative comparisons show that our proposed method achieves better segmentation accuracy with less variance compared with two other atlas-based methods. Experimental results demonstrate the promise of our assessment framework. Copyright © 2014 Elsevier Ltd. All rights reserved.

  2. microRNA Expression Profiling: Technologies, Insights, and Prospects.

    PubMed

    Roden, Christine; Mastriano, Stephen; Wang, Nayi; Lu, Jun

    2015-01-01

    Since the early days of microRNA (miRNA) research, miRNA expression profiling technologies have provided important tools toward both better understanding of the biological functions of miRNAs and using miRNA expression as potential diagnostics. Multiple technologies, such as microarrays, next-generation sequencing, bead-based detection systems, single-molecule measurements, and quantitative RT-PCR, have enabled accurate quantification of miRNAs and the subsequent derivation of key insights into diverse biological processes. As a class of ~22 nt long small noncoding RNAs, miRNAs present unique challenges in expression profiling that require careful experimental design and data analyses. We particularly discuss how normalization and the presence of miRNA isoforms can impact data interpretation. We present one example in which careful data normalization provided insights that helped to establish global miRNA expression as a tumor suppressor. Finally, we discuss two future prospects of using miRNA profiling technologies to understand single-cell variability and to derive new rules for the functions of miRNA isoforms.

  3. Incorporation of unique molecular identifiers in TruSeq adapters improves the accuracy of quantitative sequencing.

    PubMed

    Hong, Jungeui; Gresham, David

    2017-11-01

    Quantitative analysis of next-generation sequencing (NGS) data requires discriminating duplicate reads generated by PCR from identical molecules that are of unique origin. Typically, PCR duplicates are identified as sequence reads that align to the same genomic coordinates using reference-based alignment. However, identical molecules can be independently generated during library preparation. Misidentification of these molecules as PCR duplicates can introduce unforeseen biases during analyses. Here, we developed a cost-effective sequencing adapter design by modifying Illumina TruSeq adapters to incorporate a unique molecular identifier (UMI) while maintaining the capacity to undertake multiplexed, single-index sequencing. Incorporation of UMIs into TruSeq adapters (TrUMIseq adapters) enables identification of bona fide PCR duplicates as identically mapped reads with identical UMIs. Using TrUMIseq adapters, we show that accurate removal of PCR duplicates results in improved accuracy of both allele frequency (AF) estimation in heterogeneous populations using DNA sequencing and gene expression quantification using RNA-Seq.
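
    A minimal sketch of the deduplication rule described here: reads are treated as PCR duplicates only if they share both mapping coordinates and UMI. The read tuple layout below is an assumption for illustration, not the authors' pipeline.

```python
def dedup_by_umi(reads):
    """Keep one read per (chrom, pos, strand, UMI); reads with identical
    coordinates but different UMIs are retained as independent molecules.
    Each read is assumed to be (read_id, chrom, pos, strand, umi)."""
    seen = set()
    kept = []
    for read_id, chrom, pos, strand, umi in reads:
        key = (chrom, pos, strand, umi)
        if key not in seen:
            seen.add(key)
            kept.append(read_id)
    return kept

reads = [
    ("r1", "chr1", 1000, "+", "ACGT"),
    ("r2", "chr1", 1000, "+", "ACGT"),  # PCR duplicate of r1
    ("r3", "chr1", 1000, "+", "TTGA"),  # same position, distinct molecule
]
print(dedup_by_umi(reads))  # ['r1', 'r3']
```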

  4. Use of satellite imagery for wildland resource evaluation

    NASA Technical Reports Server (NTRS)

    Tueller, P. T. (Principal Investigator)

    1972-01-01

    The author has identified the following significant results. Accurate identification and delineation of crested wheatgrass seedlings have enabled a broad inventory of this resource. The entire state of Nevada is being inventoried for crested wheatgrass seedlings. Irrigated fields and pastures are easily visible from ERTS-1 imagery and were quantified in total acres on 12,500 square miles of the state. Recent fire scars may be monitored and inventoried from satellite-borne imagery. Inventory and quantification of large native meadows of Nevada have been accomplished on one frame of ERTS-1 data. This inventory would not have been economically feasible with any known ground inventory method. The U-2 sequential data taken in the spring revealed several resource management oriented phenological changes in the vegetation. The green-up of grasses and shrubs was detected on the imagery and supplied a good indicator for livestock turn-out dates. Water level manipulations in the Ruby Marsh were readily detected by noting changes in vegetation growth and reflectance.

  5. Application of dietary fiber method AOAC 2011.25 in fruit and comparison with AOAC 991.43 method.

    PubMed

    Tobaruela, Eric de C; Santos, Aline de O; Almeida-Muradian, Ligia B de; Araujo, Elias da S; Lajolo, Franco M; Menezes, Elizabete W

    2018-01-01

    The AOAC 2011.25 method enables the quantification of most of the dietary fiber (DF) components according to the definition proposed by the Codex Alimentarius. This study aimed to compare the DF content of fruits analyzed by the AOAC 2011.25 and AOAC 991.43 methods. Plums (Prunus salicina), atemoyas (Annona x atemoya), jackfruits (Artocarpus heterophyllus), and mature coconuts (Cocos nucifera) from different Brazilian regions (3 lots/fruit) were analyzed for DF, resistant starch, and fructan contents. The AOAC 2011.25 method was evaluated for precision, accuracy, and linearity in different food matrices and carbohydrate standards. The DF contents of plums, atemoyas, and jackfruits obtained by AOAC 2011.25 were higher than those obtained by AOAC 991.43 due to the presence of fructans. The DF content of mature coconuts obtained by the two methods did not differ significantly. The AOAC 2011.25 method is recommended for fruits with considerable fructan content because it yields more accurate values. Copyright © 2016 Elsevier Ltd. All rights reserved.

  6. Development and Validation of an HPLC Method for Karanjin in Pongamia pinnata linn. Leaves.

    PubMed

    Katekhaye, S; Kale, M S; Laddha, K S

    2012-01-01

    A rapid, simple and specific reversed-phase HPLC method has been developed for the analysis of karanjin in Pongamia pinnata Linn. leaves. HPLC analysis was performed on a C18 column using an 85:13.5:1.5 (v/v) mixture of methanol, water and acetic acid as isocratic mobile phase at a flow rate of 1 ml/min. UV detection was at 300 nm. The method was validated for accuracy, precision, linearity and specificity. Validation revealed the method is specific, accurate, precise, reliable and reproducible. Good linear correlation coefficients (r²>0.997) were obtained for calibration plots in the ranges tested. The limit of detection was 4.35 μg and the limit of quantification was 16.56 μg. Intra- and inter-day RSD of retention times and peak areas was less than 1.24%, and recovery was between 95.05 and 101.05%. The established HPLC method is thus suitable for efficient quantitative analysis of karanjin in Pongamia pinnata leaves.

  7. Development and Validation of an HPLC Method for Karanjin in Pongamia pinnata linn. Leaves

    PubMed Central

    Katekhaye, S; Kale, M. S.; Laddha, K. S.

    2012-01-01

    A rapid, simple and specific reversed-phase HPLC method has been developed for the analysis of karanjin in Pongamia pinnata Linn. leaves. HPLC analysis was performed on a C18 column using an 85:13.5:1.5 (v/v) mixture of methanol, water and acetic acid as isocratic mobile phase at a flow rate of 1 ml/min. UV detection was at 300 nm. The method was validated for accuracy, precision, linearity and specificity. Validation revealed the method is specific, accurate, precise, reliable and reproducible. Good linear correlation coefficients (r²>0.997) were obtained for calibration plots in the ranges tested. The limit of detection was 4.35 μg and the limit of quantification was 16.56 μg. Intra- and inter-day RSD of retention times and peak areas was less than 1.24%, and recovery was between 95.05 and 101.05%. The established HPLC method is thus suitable for efficient quantitative analysis of karanjin in Pongamia pinnata leaves. PMID:23204626

  8. Efficient visualization of high-throughput targeted proteomics experiments: TAPIR.

    PubMed

    Röst, Hannes L; Rosenberger, George; Aebersold, Ruedi; Malmström, Lars

    2015-07-15

    Targeted mass spectrometry comprises a set of powerful methods to obtain accurate and consistent protein quantification in complex samples. To fully exploit these techniques, a cross-platform and open-source software stack based on standardized data exchange formats is required. We present TAPIR, a fast and efficient Python visualization software for chromatograms and peaks identified in targeted proteomics experiments. The input formats are open, community-driven standardized data formats (mzML for raw data storage and TraML encoding the hierarchical relationships between transitions, peptides and proteins). TAPIR is scalable to proteome-wide targeted proteomics studies (as enabled by SWATH-MS), allowing researchers to visualize high-throughput datasets. The framework integrates well with existing automated analysis pipelines and can be extended beyond targeted proteomics to other types of analyses. TAPIR is available for all computing platforms under the 3-clause BSD license at https://github.com/msproteomicstools/msproteomicstools. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  9. Assessment of metal ion concentration in water with structured feature selection.

    PubMed

    Naula, Pekka; Airola, Antti; Pihlasalo, Sari; Montoya Perez, Ileana; Salakoski, Tapio; Pahikkala, Tapio

    2017-10-01

    We propose a cost-effective system for the determination of metal ion concentration in water, addressing a central issue in water resources management. The system combines novel luminometric label array technology with a machine learning algorithm that selects a minimal number of array reagents (modulators) and liquid sample dilutions that enable accurate quantification. The algorithm is able to identify the optimal modulators and sample dilutions, leading to cost reductions since less manual labour and fewer resources are needed. Inferring the ion detector involves a unique type of structured feature selection problem, which we formalize in this paper. We propose a novel Cartesian greedy forward feature selection algorithm for solving the problem. The novel algorithm was evaluated on the concentration assessment of five metal ions and its performance was compared with two known feature selection approaches. The results demonstrate that the proposed system can assist in lowering costs with minimal loss in accuracy. Copyright © 2017 Elsevier Ltd. All rights reserved.
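
    A plain greedy forward selection loop conveys the general idea (the paper's Cartesian variant additionally couples modulators with dilutions, which is not reproduced here). The validation split and ordinary least-squares model below are simplifying assumptions used only for illustration.

```python
import numpy as np

def greedy_forward_selection(X_train, y_train, X_val, y_val, max_features=3):
    """Iteratively add the feature that most reduces validation error of an
    ordinary least-squares model (a stand-in for the concentration estimator)."""
    selected, remaining = [], list(range(X_train.shape[1]))

    def val_error(cols):
        coef, *_ = np.linalg.lstsq(X_train[:, cols], y_train, rcond=None)
        return np.mean((X_val[:, cols] @ coef - y_val) ** 2)

    while remaining and len(selected) < max_features:
        errors = {f: val_error(selected + [f]) for f in remaining}
        best = min(errors, key=errors.get)
        selected.append(best)
        remaining.remove(best)
    return selected

rng = np.random.default_rng(0)
X = rng.normal(size=(60, 8))
y = 2.0 * X[:, 1] - 1.5 * X[:, 4] + rng.normal(scale=0.1, size=60)
print(greedy_forward_selection(X[:40], y[:40], X[40:], y[40:]))  # picks features 1 and 4 first
```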

  10. Morphometric and molecular identification of individual barnacle cyprids from wild plankton: an approach to detecting fouling and invasive barnacle species.

    PubMed

    Chen, Hsi-Nien; Høeg, Jens T; Chan, Benny K K

    2013-01-01

    The present study used DNA barcodes to identify individual cyprids to species. This enables accurate quantification of larvae of potential fouling species in the plankton. In addition, it explains the settlement patterns of barnacles and serves as an early warning system of unwanted immigrant species. Sequences from a total of 540 individual cypris larvae from Taiwanese waters formed 36 monophyletic clades (species) in a phylogenetic tree. Of these clades, 26 were identified to species, but 10 unknown monophyletic clades represented non-native species. Cyprids of the invasive barnacle, Megabalanus cocopoma, were identified. Multivariate analysis of antennular morphometric characters revealed three significant clusters in a nMDS plot, viz. a bell-shaped attachment organ (most species), a shoe-shaped attachment organ (some species), and a spear-shaped attachment organ (coral barnacles only). These differences in attachment organ structure indicate that antennular structures interact directly with the diverse substrata involved in cirripede settlement.

  11. Systematic analysis of protein turnover in primary cells.

    PubMed

    Mathieson, Toby; Franken, Holger; Kosinski, Jan; Kurzawa, Nils; Zinn, Nico; Sweetman, Gavain; Poeckel, Daniel; Ratnu, Vikram S; Schramm, Maike; Becher, Isabelle; Steidel, Michael; Noh, Kyung-Min; Bergamini, Giovanna; Beck, Martin; Bantscheff, Marcus; Savitski, Mikhail M

    2018-02-15

    A better understanding of proteostasis in health and disease requires robust methods to determine protein half-lives. Here we improve the precision and accuracy of peptide ion intensity-based quantification, enabling more accurate protein turnover determination in non-dividing cells by dynamic SILAC-based proteomics. This approach allows exact determination of protein half-lives ranging from 10 to >1000 h. We identified 4000-6000 proteins in several non-dividing cell types, corresponding to 9699 unique protein identifications over the entire data set. We observed similar protein half-lives in B-cells, natural killer cells and monocytes, whereas hepatocytes and mouse embryonic neurons show substantial differences. Our data set extends and statistically validates the previous observation that subunits of protein complexes tend to have coherent turnover. Moreover, analysis of different proteasome and nuclear pore complex assemblies suggests that their turnover rate is architecture dependent. These results illustrate that our approach allows investigating protein turnover and its implications in various cell types.
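
    In dynamic SILAC, the unlabeled (pre-existing) fraction of a protein decays roughly exponentially after the medium switch, so a half-life can be read off a log-linear fit. The sketch below is a minimal illustration under that first-order assumption, not the authors' full quantification pipeline.

```python
import numpy as np

def half_life_from_light_fraction(times_h, light_fraction):
    """Fit ln(light fraction) = -k * t and return t1/2 = ln(2) / k.
    Assumes first-order loss of pre-existing (light) protein in non-dividing cells."""
    t = np.asarray(times_h, dtype=float)
    y = np.log(np.asarray(light_fraction, dtype=float))
    k = -np.polyfit(t, y, 1)[0]          # slope of the log-linear decay
    return np.log(2) / k

# synthetic measurements for a protein with a ~48 h half-life
times = [0, 24, 48, 96]
light = [1.00, 0.71, 0.50, 0.25]
print(round(half_life_from_light_fraction(times, light), 1))  # ~48 h
```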

  12. Whispering gallery mode resonators for rapid label-free biosensing in small volume droplets.

    PubMed

    Wildgen, Sarah M; Dunn, Robert C

    2015-03-23

    Rapid biosensing requires fast mass transport of the analyte to the surface of the sensing element. To optimize analysis times, both mass transport in solution and the geometry and size of the sensing element need to be considered. Small dielectric spheres, tens of microns in diameter, can act as label-free biosensors using whispering gallery mode (WGM) resonances. WGM resonances are sensitive to the effective refractive index, which changes upon analyte binding to recognition sites on functionalized resonators. The spherical geometry and tens of microns diameter of these resonators provides an efficient target for sensing while their compact size enables detection in limited volumes. Here, we explore conditions leading to rapid analyte detection using WGM resonators as label-free sensors in 10 μL sample droplets. Droplet evaporation leads to potentially useful convective mixing, but also limits the time over which analysis can be completed. We show that active droplet mixing combined with initial binding rate measurements is required for accurate nanomolar protein quantification within the first minute following injection.

  13. Sigma: Strain-level inference of genomes from metagenomic analysis for biosurveillance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ahn, Tae-Hyuk; Chai, Juanjuan; Pan, Chongle

    Motivation: Metagenomic sequencing of clinical samples provides a promising technique for direct pathogen detection and characterization in biosurveillance. Taxonomic analysis at the strain level can be used to resolve serotypes of a pathogen in biosurveillance. Sigma was developed for strain-level identification and quantification of pathogens using their reference genomes based on metagenomic analysis. Results: Sigma provides not only accurate strain-level inferences, but also three unique capabilities: (i) Sigma quantifies the statistical uncertainty of its inferences, which includes hypothesis testing of identified genomes and confidence interval estimation of their relative abundances; (ii) Sigma enables strain variant calling by assigning metagenomic reads to their most likely reference genomes; and (iii) Sigma supports parallel computing for fast analysis of large datasets. The algorithm performance was evaluated using simulated mock communities and fecal samples with spike-in pathogen strains. Availability and Implementation: Sigma was implemented in C++ with source code and binaries freely available at http://sigma.omicsbio.org.

  14. Sigma: Strain-level inference of genomes from metagenomic analysis for biosurveillance

    DOE PAGES

    Ahn, Tae-Hyuk; Chai, Juanjuan; Pan, Chongle

    2014-09-29

    Motivation: Metagenomic sequencing of clinical samples provides a promising technique for direct pathogen detection and characterization in biosurveillance. Taxonomic analysis at the strain level can be used to resolve serotypes of a pathogen in biosurveillance. Sigma was developed for strain-level identification and quantification of pathogens using their reference genomes based on metagenomic analysis. Results: Sigma provides not only accurate strain-level inferences, but also three unique capabilities: (i) Sigma quantifies the statistical uncertainty of its inferences, which includes hypothesis testing of identified genomes and confidence interval estimation of their relative abundances; (ii) Sigma enables strain variant calling by assigning metagenomic reads to their most likely reference genomes; and (iii) Sigma supports parallel computing for fast analysis of large datasets. The algorithm performance was evaluated using simulated mock communities and fecal samples with spike-in pathogen strains. Availability and Implementation: Sigma was implemented in C++ with source code and binaries freely available at http://sigma.omicsbio.org.

  15. Rapid quantification and sex determination of forensic evidence materials.

    PubMed

    Andréasson, Hanna; Allen, Marie

    2003-11-01

    DNA quantification of forensic evidence is very valuable for an optimal use of the available biological material. Moreover, sex determination is of great importance as additional information in criminal investigations as well as in identification of missing persons, no suspect cases, and ancient DNA studies. While routine forensic DNA analysis based on short tandem repeat markers includes a marker for sex determination, analysis of samples containing scarce amounts of DNA is often based on mitochondrial DNA, and sex determination is not performed. In order to allow quantification and simultaneous sex determination on minute amounts of DNA, an assay based on real-time PCR analysis of a marker within the human amelogenin gene has been developed. The sex determination is based on melting curve analysis, while an externally standardized kinetic analysis allows quantification of the nuclear DNA copy number in the sample. This real-time DNA quantification assay has proven to be highly sensitive, enabling quantification of single DNA copies. Although certain limitations were apparent, the system is a rapid, cost-effective, and flexible assay for analysis of forensic casework samples.
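
    Quantification against an external standard typically relates Ct values to log copy number through a linear standard curve. The sketch below assumes such a curve with made-up dilution-series values; it is not the exact calibration used in this assay.

```python
import numpy as np

def fit_standard_curve(copies, ct_values):
    """Fit Ct = slope * log10(copies) + intercept from external standards."""
    slope, intercept = np.polyfit(np.log10(copies), ct_values, 1)
    return slope, intercept

def copies_from_ct(ct, slope, intercept):
    """Invert the standard curve to estimate copy number for a sample Ct."""
    return 10 ** ((ct - intercept) / slope)

# hypothetical dilution series of an amelogenin standard
standard_copies = [1e1, 1e2, 1e3, 1e4]
standard_ct = [35.1, 31.7, 28.4, 25.0]
slope, intercept = fit_standard_curve(standard_copies, standard_ct)
print(round(copies_from_ct(30.0, slope, intercept)))  # estimated copies in the sample
```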

  16. Infectious helminth ova in wastewater and sludge: A review on public health issues and current quantification practices.

    PubMed

    Gyawali, P

    2018-02-01

    Raw and partially treated wastewater has been widely used to meet global water demand. The presence of viable helminth ova and larvae in wastewater raises significant public health concern, especially when the water is used for agriculture and aquaculture. Depending on the prevalence of helminth infections in communities, up to 1.0 × 10³ ova/larvae can be present per litre of wastewater and per 4 g (dry weight) of sludge. Multi-barrier approaches including pathogen reduction, risk assessment, and exposure reduction have been suggested by health regulators to minimise the potential health risk. However, without a sensitive and specific method for the quantitative detection of viable helminth ova in wastewater, an accurate health risk assessment is difficult to achieve. As a result, helminth infections remain difficult to control in communities despite two decades of global effort (mass drug administration). Molecular methods can be more sensitive and specific than the currently adopted culture-based and vital stain methods. Molecular methods, however, require more thorough investigation of their ability to accurately quantify viable helminth ova/larvae in wastewater and sludge samples. Understanding the different cell stages and corresponding gene copy numbers is pivotal for accurate quantification of helminth ova/larvae in wastewater samples. Identifying specific markers, including proteins, lipids, and metabolites, using a multiomics approach could be utilized to develop cheap, rapid, sensitive, specific, point-of-care detection tools for helminth ova and larvae in wastewater.

  17. A sensitive and accurate quantification method for the detection of hepatitis B virus covalently closed circular DNA by the application of a droplet digital polymerase chain reaction amplification system.

    PubMed

    Mu, Di; Yan, Liang; Tang, Hui; Liao, Yong

    2015-10-01

    The aim was to develop a sensitive and accurate assay system for the quantification of covalently closed circular HBV DNA (cccDNA) for future clinical monitoring of cccDNA fluctuation during antiviral therapy in the liver of infected patients. A droplet digital PCR (ddPCR)-based assay detected template DNA input at the single-copy level (~10⁻⁵ pg of plasmid HBV DNA) using serially diluted plasmid HBV DNA samples. Compared with the conventional quantitative PCR assay for the detection of cccDNA, which required at least 50 ng of template DNA input, a parallel experiment applying the ddPCR system demonstrates that the lowest detection limit of cccDNA from HepG2.215 cellular DNA samples is around 1 ng, which is equivalent to 0.54 ± 0.94 copies of cccDNA. In addition, we demonstrated that the addition of a cccDNA-safe exonuclease and the utilization of cccDNA-specific primers in the ddPCR assay significantly improved the detection accuracy of HBV cccDNA from HepG2.215 cellular DNA samples. The ddPCR-based cccDNA detection system is a sensitive and accurate assay for the quantification of cccDNA in HBV-transfected HepG2.215 cellular DNA samples and may represent an important method for future application in monitoring cccDNA fluctuation during antiviral therapy.
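
    Absolute ddPCR quantification rests on Poisson statistics over the droplet partition: the target concentration follows from the fraction of negative droplets and the droplet volume. A minimal sketch of that arithmetic is shown below; the droplet volume is an assumed nominal value, and the counts are hypothetical.

```python
import math

def ddpcr_copies_per_ul(positive, total, droplet_volume_nl=0.85):
    """Estimate target copies per microliter from droplet counts.
    lambda = -ln(fraction of negative droplets) is the mean copies per droplet."""
    frac_negative = (total - positive) / total
    lam = -math.log(frac_negative)            # copies per droplet (Poisson correction)
    return lam / (droplet_volume_nl * 1e-3)   # convert nl -> ul

print(round(ddpcr_copies_per_ul(positive=1200, total=15000), 1))  # ~98 copies/uL
```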

  18. Recommendations for Accurate Resolution of Gene and Isoform Allele-Specific Expression in RNA-Seq Data

    PubMed Central

    Wood, David L. A.; Nones, Katia; Steptoe, Anita; Christ, Angelika; Harliwong, Ivon; Newell, Felicity; Bruxner, Timothy J. C.; Miller, David; Cloonan, Nicole; Grimmond, Sean M.

    2015-01-01

    Genetic variation modulates gene expression transcriptionally or post-transcriptionally, and can profoundly alter an individual’s phenotype. Measuring allelic differential expression at heterozygous loci within an individual, a phenomenon called allele-specific expression (ASE), can assist in identifying such factors. Massively parallel DNA and RNA sequencing and advances in bioinformatic methodologies provide an outstanding opportunity to measure ASE genome-wide. In this study, matched DNA and RNA sequencing, genotyping arrays and computationally phased haplotypes were integrated to comprehensively and conservatively quantify ASE in a single human brain and liver tissue sample. We describe a methodological evaluation and assessment of common bioinformatic steps for ASE quantification, and recommend a robust approach to accurately measure SNP, gene and isoform ASE through the use of personalized haplotype genome alignment, strict alignment quality control and intragenic SNP aggregation. Our results indicate that accurate ASE quantification requires careful bioinformatic analyses and is adversely affected by sample specific alignment confounders and random sampling even at moderate sequence depths. We identified multiple known and several novel ASE genes in liver, including WDR72, DSP and UBD, as well as genes that contained ASE SNPs with imbalance direction discordant with haplotype phase, explainable by annotated transcript structure, suggesting isoform derived ASE. The methods evaluated in this study will be of use to researchers performing highly conservative quantification of ASE, and the genes and isoforms identified as ASE of interest to researchers studying those loci. PMID:25965996
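
    At a single heterozygous SNP, allelic imbalance is commonly assessed by testing the reference-allele read count against a balanced binomial. The sketch below shows only this generic test (using scipy.stats.binomtest, SciPy >= 1.7), not the haplotype-aware alignment, quality control, and intragenic aggregation steps recommended here; the read counts are made up.

```python
from scipy.stats import binomtest

def ase_test(ref_reads, alt_reads, expected_ref_fraction=0.5):
    """Two-sided binomial test for allele-specific expression at one SNP."""
    result = binomtest(ref_reads, ref_reads + alt_reads, expected_ref_fraction)
    allelic_ratio = ref_reads / (ref_reads + alt_reads)
    return allelic_ratio, result.pvalue

ratio, p = ase_test(ref_reads=78, alt_reads=22)
print(f"reference allele fraction = {ratio:.2f}, p = {p:.2g}")
```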

  19. Feasibility and accuracy of dual-layer spectral detector computed tomography for quantification of gadolinium: a phantom study.

    PubMed

    van Hamersvelt, Robbert W; Willemink, Martin J; de Jong, Pim A; Milles, Julien; Vlassenbroek, Alain; Schilham, Arnold M R; Leiner, Tim

    2017-09-01

    The aim of this study was to evaluate the feasibility and accuracy of dual-layer spectral detector CT (SDCT) for the quantification of clinically encountered gadolinium concentrations. The cardiac chamber of an anthropomorphic thoracic phantom was equipped with 14 tubular inserts containing different gadolinium concentrations, ranging from 0 to 26.3 mg/mL (0.0, 0.1, 0.2, 0.4, 0.5, 1.0, 2.0, 3.0, 4.0, 5.1, 10.6, 15.7, 20.7 and 26.3 mg/mL). Images were acquired using a novel 64-detector row SDCT system at 120 and 140 kVp. Acquisitions were repeated five times to assess reproducibility. Regions of interest (ROIs) were drawn on three slices per insert. A spectral plot was extracted for every ROI and mean attenuation profiles were fitted to known attenuation profiles of water and pure gadolinium using in-house-developed software to calculate gadolinium concentrations. At both 120 and 140 kVp, excellent correlations between scan repetitions and true and measured gadolinium concentrations were found (R > 0.99, P < 0.001; ICCs > 0.99, CI 0.99-1.00). Relative mean measurement errors stayed below 10% down to 2.0 mg/mL true gadolinium concentration at 120 kVp and below 5% down to 1.0 mg/mL true gadolinium concentration at 140 kVp. SDCT allows for accurate quantification of gadolinium at both 120 and 140 kVp. Lowest measurement errors were found for 140 kVp acquisitions. • Gadolinium quantification may be useful in patients with contraindication to iodine. • Dual-layer spectral detector CT allows for overall accurate quantification of gadolinium. • Interscan variability of gadolinium quantification using SDCT material decomposition is excellent.
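
    The fitting step described here, expressing each measured spectral attenuation profile as a combination of water and gadolinium basis profiles, can be sketched as a non-negative least-squares problem. The basis vectors and sampling below are made-up numbers for illustration only, not the instrument's calibration data.

```python
import numpy as np
from scipy.optimize import nnls

# hypothetical attenuation profiles sampled at a few energy bins (arbitrary units)
water_basis = np.array([0.25, 0.22, 0.20, 0.19])
gadolinium_basis = np.array([1.80, 2.60, 1.10, 0.90])  # crude stand-in including a K-edge jump

def decompose(measured_profile):
    """Solve measured ~= c_water * water + c_gd * gadolinium with c >= 0."""
    A = np.column_stack([water_basis, gadolinium_basis])
    coeffs, _ = nnls(A, measured_profile)
    return {"water": coeffs[0], "gadolinium": coeffs[1]}

measured = 1.0 * water_basis + 0.004 * gadolinium_basis  # dilute gadolinium contribution
print(decompose(measured))
```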

  20. A rapid and accurate quantification method for real-time dynamic analysis of cellular lipids during microalgal fermentation processes in Chlorella protothecoides with low field nuclear magnetic resonance.

    PubMed

    Wang, Tao; Liu, Tingting; Wang, Zejian; Tian, Xiwei; Yang, Yi; Guo, Meijin; Chu, Ju; Zhuang, Yingping

    2016-05-01

    The rapid and real-time determination of lipids can provide valuable information for process regulation and optimization in algal lipid mass production. In this study, a rapid, accurate and precise quantification method for in vivo cellular lipids of Chlorella protothecoides using low field nuclear magnetic resonance (LF-NMR) was developed. LF-NMR was extremely sensitive to the algal lipids, with limits of detection (LOD) of 0.0026 g and 0.32 g/L in dry lipid samples and algal broth, respectively, and limits of quantification (LOQ) of 0.0093 g and 1.18 g/L. Moreover, the LF-NMR signal was specifically proportional to the cellular lipids of C. protothecoides; excellent regression curves were therefore obtained over a wide detection range, from 0.02 to 0.42 g for dry lipids and from 1.12 to 8.97 g/L of lipid concentration for in vivo quantification, with all R² higher than 0.99, irrespective of variations in lipid content and fatty acid profile. The accuracy of this novel method was further verified by comparing the lipid quantification results with those obtained by GC-MS, and the relative standard deviation (RSD) of the LF-NMR results was smaller than 2%, demonstrating the precision of the method. Finally, this method was successfully used for on-line lipid monitoring during algal lipid fermentation processes, enabling a better understanding of the lipid accumulation mechanism and dynamic bioprocess control. Copyright © 2016 Elsevier B.V. All rights reserved.

  1. Quantification of doping compounds in faecal samples from racing pigeons, by liquid chromatography-tandem mass spectrometry.

    PubMed

    Moreira, Fernando X; Silva, Renata; André, Maria B; de Pinho, Paula G; Bastos, Maria L; Ruivo, João; Ruivo, Patrícia; Carmo, Helena

    2018-07-01

    The use of performance-enhancing drugs is common not only in humans, but also in animal sports, including racing of horses, greyhounds and pigeons. The development of accurate analytical procedures to detect doping agents in sports is crucial in order to protect the fair play of the game, avoid financial fraud in the attribution of awards and, even more important, to protect the animals from harmful drugs and/or dangerous dosage regimens. The present study aimed to develop and validate a method enabling the screening and confirmation of the presence of a beta-agonist (clenbuterol) and three corticosteroids (betamethasone, prednisolone and budesonide) in faeces from pigeons. The extraction procedure combined liquid-liquid extraction with solid-phase extraction, and the analysis was performed by liquid chromatography coupled to tandem mass spectrometry, with a single 15-minute chromatographic run time. The method was validated for selectivity, linearity (with coefficients of determination always >0.99), accuracy (87.5-114.9%), inter-day and intra-day precision, limits of detection (0.14-1.81 ng/g), limits of quantification (0.49-6.08 ng/g), stability and extraction recovery (71.0%-99.3%). The method was successfully applied to the analysis of samples from two pigeons that had been orally administered betamethasone, demonstrating its suitability for doping control purposes. Copyright © 2018. Published by Elsevier B.V.

  2. Variable Streamflow Contributions in Nested Subwatersheds of a US Midwestern Urban Watershed

    DOE PAGES

    Wei, Liang; Hubbart, Jason A.; Zhou, Hang

    2017-09-09

    Quantification of runoff is critical to estimate and control water pollution in urban regions, but variation in impervious area and land-use type can complicate the quantification of runoff. We quantified the streamflow contributions of subwatersheds and the historical changes in streamflow in a flood-prone urbanizing watershed in the US Midwest to guide the establishment of a future pollution-control plan. Streamflow data from five nested hydrological stations enabled accurate estimation of streamflow contributions from five subwatersheds with variable impervious areas (from 0.5% to 26.6%). We corrected for the impact of Missouri River backwatering at the most downstream station by comparing its streamflow with an upstream station using double-mass analysis combined with the Bernaola-Galvan heuristic segmentation approach. We also compared the streamflow of the urbanizing watershed with that of seven surrounding rural watersheds to estimate the cumulative impact of urbanization on the streamflow regime. The two most urbanized subwatersheds contributed >365 mm of streamflow in 2012 with 657 mm of precipitation, more than fourfold greater than the two least urbanized subwatersheds. Runoff occurred almost exclusively over the most urbanized subwatersheds during the dry period. Floods occurred more frequently, and the same amount of precipitation produced ~100 mm more streamflow in 2008-2014 than in 1967-1980 in the urbanizing watershed; such phenomena did not occur in the surrounding rural watersheds. Our approaches provide comprehensive information for planning runoff control and pollutant reduction in urban watersheds.
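
    Double-mass analysis compares cumulative streamflow at two stations; a persistent change in the slope of the cumulative-cumulative curve indicates a changed relationship between them (here, backwater influence). The sketch below builds the curve and compares slopes around a candidate breakpoint on synthetic data; the segmentation method used in the study is more formal than this illustration.

```python
import numpy as np

def double_mass_curve(flow_upstream, flow_downstream):
    """Return the cumulative sums used for a double-mass plot of two stations."""
    return np.cumsum(flow_upstream), np.cumsum(flow_downstream)

def slope_change(cum_up, cum_down, breakpoint):
    """Compare double-mass slopes before and after a candidate breakpoint index."""
    s1 = np.polyfit(cum_up[:breakpoint], cum_down[:breakpoint], 1)[0]
    s2 = np.polyfit(cum_up[breakpoint:], cum_down[breakpoint:], 1)[0]
    return s1, s2

rng = np.random.default_rng(1)
up = rng.gamma(2.0, 1.0, 200)                                   # synthetic daily flows
down = np.concatenate([1.2 * up[:100], 1.8 * up[100:]]) + rng.normal(0, 0.05, 200)
cu, cd = double_mass_curve(up, down)
print([round(s, 2) for s in slope_change(cu, cd, breakpoint=100)])  # ~1.2 vs ~1.8
```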

  3. Variable Streamflow Contributions in Nested Subwatersheds of a US Midwestern Urban Watershed

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wei, Liang; Hubbart, Jason A.; Zhou, Hang

    Quantification of runoff is critical to estimate and control water pollution in urban regions, but variation in impervious area and land-use type can complicate the quantification of runoff. We quantified the streamflow contributions of subwatersheds and the historical changes in streamflow in a flood-prone urbanizing watershed in the US Midwest to guide the establishment of a future pollution-control plan. Streamflow data from five nested hydrological stations enabled accurate estimation of streamflow contributions from five subwatersheds with variable impervious areas (from 0.5% to 26.6%). We corrected for the impact of Missouri River backwatering at the most downstream station by comparing its streamflow with an upstream station using double-mass analysis combined with the Bernaola-Galvan heuristic segmentation approach. We also compared the streamflow of the urbanizing watershed with that of seven surrounding rural watersheds to estimate the cumulative impact of urbanization on the streamflow regime. The two most urbanized subwatersheds contributed >365 mm of streamflow in 2012 with 657 mm of precipitation, more than fourfold greater than the two least urbanized subwatersheds. Runoff occurred almost exclusively over the most urbanized subwatersheds during the dry period. Floods occurred more frequently, and the same amount of precipitation produced ~100 mm more streamflow in 2008-2014 than in 1967-1980 in the urbanizing watershed; such phenomena did not occur in the surrounding rural watersheds. Our approaches provide comprehensive information for planning runoff control and pollutant reduction in urban watersheds.

  4. Frequency domain near-infrared multiwavelength imager design using high-speed, direct analog-to-digital conversion

    NASA Astrophysics Data System (ADS)

    Zimmermann, Bernhard B.; Fang, Qianqian; Boas, David A.; Carp, Stefan A.

    2016-01-01

    Frequency domain near-infrared spectroscopy (FD-NIRS) has proven to be a reliable method for quantification of tissue absolute optical properties. We present a full-sampling direct analog-to-digital conversion FD-NIR imager. While we developed this instrument with a focus on high-speed optical breast tomographic imaging, the proposed design is suitable for a wide range of biophotonic applications where fast, accurate quantification of absolute optical properties is needed. Simultaneous dual wavelength operation at 685 and 830 nm is achieved by concurrent 67.5 and 75 MHz frequency modulation of each laser source, respectively, followed by digitization using a high-speed (180 MS/s) 16-bit A/D converter and hybrid FPGA-assisted demodulation. The instrument supports 25 source locations and features 20 concurrently operating detectors. The noise floor of the instrument was measured at <1.4 pW/√Hz, and a dynamic range of 115+ dB, corresponding to nearly six orders of magnitude, has been demonstrated. Titration experiments consisting of 200 different absorption and scattering values were conducted to demonstrate accurate optical property quantification over the entire range of physiologically expected values.
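
    Amplitude and phase at a known modulation frequency can be recovered from the digitized detector signal by digital lock-in demodulation (multiplying by reference sine/cosine terms and averaging). This generic sketch on a synthetic 67.5 MHz tone sampled at 180 MS/s is only an illustration of the principle, not the FPGA-assisted implementation described here.

```python
import numpy as np

def lockin_demodulate(signal, fs_hz, f_mod_hz):
    """Return amplitude and phase of the component of `signal` at f_mod_hz."""
    t = np.arange(len(signal)) / fs_hz
    i = 2.0 * np.mean(signal * np.cos(2 * np.pi * f_mod_hz * t))
    q = 2.0 * np.mean(signal * np.sin(2 * np.pi * f_mod_hz * t))
    return np.hypot(i, q), np.arctan2(q, i)

fs = 180e6                       # 180 MS/s digitizer
t = np.arange(18000) / fs        # 100 us of data
sig = 0.5 * np.cos(2 * np.pi * 67.5e6 * t + 0.3) + 0.01 * np.random.randn(t.size)
amp, phase = lockin_demodulate(sig, fs, 67.5e6)
print(round(amp, 3), round(phase, 3))  # amplitude ~0.5, phase ~-0.3 rad (sign depends on convention)
```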

  5. Frequency domain near-infrared multiwavelength imager design using high-speed, direct analog-to-digital conversion

    PubMed Central

    Zimmermann, Bernhard B.; Fang, Qianqian; Boas, David A.; Carp, Stefan A.

    2016-01-01

    Frequency domain near-infrared spectroscopy (FD-NIRS) has proven to be a reliable method for quantification of tissue absolute optical properties. We present a full-sampling direct analog-to-digital conversion FD-NIR imager. While we developed this instrument with a focus on high-speed optical breast tomographic imaging, the proposed design is suitable for a wide range of biophotonic applications where fast, accurate quantification of absolute optical properties is needed. Simultaneous dual wavelength operation at 685 and 830 nm is achieved by concurrent 67.5 and 75 MHz frequency modulation of each laser source, respectively, followed by digitization using a high-speed (180 MS/s) 16-bit A/D converter and hybrid FPGA-assisted demodulation. The instrument supports 25 source locations and features 20 concurrently operating detectors. The noise floor of the instrument was measured at <1.4 pW/√Hz, and a dynamic range of 115+ dB, corresponding to nearly six orders of magnitude, has been demonstrated. Titration experiments consisting of 200 different absorption and scattering values were conducted to demonstrate accurate optical property quantification over the entire range of physiologically expected values. PMID:26813081

  6. Quantification of Global DNA Methylation Levels by Mass Spectrometry.

    PubMed

    Fernandez, Agustin F; Valledor, Luis; Vallejo, Fernando; Cañal, Maria Jesús; Fraga, Mario F

    2018-01-01

    Global DNA methylation was classically considered the relative percentage of 5-methylcytosine (5mC) with respect to total cytosine (C). Early approaches were based on the use of high-performance separation technologies and UV detection. However, the recent development of protocols using mass spectrometry for detection has increased sensitivity and permitted the precise identification of peak compounds based on their molecular masses. This allows work to be conducted with much less genomic DNA starting material and also allows quantification of 5-hydroxymethylcytosine (5hmC), a recently identified form of methylated cytosine that could play an important role in active DNA demethylation. Here, we describe the protocol that we currently use in our laboratory to analyze 5mC and 5hmC by mass spectrometry. The protocol, based on the method originally developed by Le and colleagues using Ultra Performance Liquid Chromatography (UPLC) and triple quadrupole (QqQ) mass spectrometry detection, allows for the rapid and accurate quantification of relative global 5mC and 5hmC levels starting from just 1 μg of genomic DNA.

  7. Stable isotope dilution HILIC-MS/MS method for accurate quantification of glutamic acid, glutamine, pyroglutamic acid, GABA and theanine in mouse brain tissues.

    PubMed

    Inoue, Koichi; Miyazaki, Yasuto; Unno, Keiko; Min, Jun Zhe; Todoroki, Kenichiro; Toyo'oka, Toshimasa

    2016-01-01

    In this study, we developed a stable isotope dilution hydrophilic interaction liquid chromatography with tandem mass spectrometry (HILIC-MS/MS) technique for the accurate and simultaneous quantification of glutamic acid (Glu), glutamine (Gln), pyroglutamic acid (pGlu), γ-aminobutyric acid (GABA) and theanine in mouse brain tissues. The quantification of these analytes was accomplished using stable isotope internal standards and the HILIC separation mode to fully correct for intramolecular cyclization during electrospray ionization. Linear calibrations were obtained with high coefficients of correlation (r² > 0.999, range from 10 pmol/mL to 50 mol/mL). To study the effect of theanine intake, Glu, Gln, pGlu, GABA and theanine were determined in hippocampus and central cortex tissues using the developed method. In the hippocampus, the concentrations of Glu and pGlu were significantly reduced during realistic theanine intake, whereas the concentration of GABA increased. This result shows that theanine taken up into the brain affects the metabolic balance of Glu analogs in the hippocampus. Copyright © 2015 John Wiley & Sons, Ltd.

  8. Antibiotic Resistome: Improving Detection and Quantification Accuracy for Comparative Metagenomics.

    PubMed

    Elbehery, Ali H A; Aziz, Ramy K; Siam, Rania

    2016-04-01

    The unprecedented rise of life-threatening antibiotic resistance (AR), combined with the unparalleled advances in DNA sequencing of genomes and metagenomes, has pushed the need for in silico detection of the resistance potential of clinical and environmental metagenomic samples through the quantification of AR genes (i.e., genes conferring antibiotic resistance). Therefore, determining an optimal methodology to quantitatively and accurately assess AR genes in a given environment is pivotal. Here, we optimized and improved existing AR detection methodologies from metagenomic datasets to properly consider AR-generating mutations in antibiotic target genes. Through comparative metagenomic analysis of previously published AR gene abundance in three publicly available metagenomes, we illustrate how mutation-generated resistance genes are either falsely assigned or neglected, which alters the detection and quantitation of the antibiotic resistome. In addition, we inspected factors influencing the outcome of AR gene quantification using metagenome simulation experiments, and identified that genome size, AR gene length, total number of metagenomics reads and selected sequencing platforms had pronounced effects on the level of detected AR. In conclusion, our proposed improvements in the current methodologies for accurate AR detection and resistome assessment show reliable results when tested on real and simulated metagenomic datasets.
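
    Resistome comparisons across metagenomes typically normalize hit counts by both AR gene length and sequencing depth, since both factors are noted above as influencing detected abundance. The RPKM-style sketch below is a generic illustration of such normalization with hypothetical numbers, not the exact pipeline evaluated in this study.

```python
def ar_gene_abundance(read_hits, gene_length_bp, total_reads):
    """Reads per kilobase of AR gene per million metagenomic reads (RPKM-like)."""
    return read_hits / (gene_length_bp / 1_000) / (total_reads / 1_000_000)

# hypothetical numbers: 420 reads hit a 1.2 kb beta-lactamase gene in a 25 M read metagenome
print(round(ar_gene_abundance(read_hits=420, gene_length_bp=1200, total_reads=25_000_000), 2))  # 14.0
```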

  9. In-Gel Stable-Isotope Labeling (ISIL): a strategy for mass spectrometry-based relative quantification.

    PubMed

    Asara, John M; Zhang, Xiang; Zheng, Bin; Christofk, Heather H; Wu, Ning; Cantley, Lewis C

    2006-01-01

    Most proteomics approaches for relative quantification of protein expression use a combination of stable-isotope labeling and mass spectrometry. Traditionally, researchers have used difference gel electrophoresis (DIGE) from stained 1D and 2D gels for relative quantification. While differences in protein staining intensity can often be visualized, abundant proteins can obscure less abundant proteins, and quantification of post-translational modifications is difficult. A method is presented for quantifying changes in the abundance of a specific protein or changes in specific modifications of a protein using In-gel Stable-Isotope Labeling (ISIL). Proteins extracted from any source (tissue, cell line, immunoprecipitate, etc.), treated under two experimental conditions, are resolved in separate lanes by gel electrophoresis. The regions of interest (visualized by staining) are reacted separately with light versus heavy isotope-labeled reagents, and the gel slices are then mixed and digested with proteases. The resulting peptides are then analyzed by LC-MS to determine relative abundance of light/heavy isotope pairs and analyzed by LC-MS/MS for identification of sequence and modifications. The strategy compares well with other relative quantification strategies, and in silico calculations reveal its effectiveness as a global relative quantification strategy. An advantage of ISIL is that visualization of gel differences can be used as a first quantification step followed by accurate and sensitive protein level stable-isotope labeling and mass spectrometry-based relative quantification.
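
    Relative quantification in such light/heavy labeling schemes ultimately reduces to a ratio of extracted ion intensities for each isotope pair, often summarized per protein by the median over its peptides. The sketch below uses assumed intensity values and a simple median summary; it is not the ISIL workflow itself.

```python
import statistics

def protein_ratio(peptide_pairs):
    """Median heavy/light intensity ratio across the peptide pairs of one protein.
    Each pair is (light_intensity, heavy_intensity)."""
    ratios = [heavy / light for light, heavy in peptide_pairs if light > 0]
    return statistics.median(ratios)

pairs = [(1.2e6, 2.5e6), (8.0e5, 1.5e6), (2.1e6, 4.6e6)]
print(round(protein_ratio(pairs), 2))  # ~2-fold higher in the heavy-labeled condition
```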

  10. CASL Dakota Capabilities Summary

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Adams, Brian M.; Simmons, Chris; Williams, Brian J.

    2017-10-10

    The Dakota software project serves the mission of Sandia National Laboratories and supports a worldwide user community by delivering state-of-the-art research and robust, usable software for optimization and uncertainty quantification. These capabilities enable advanced exploration and risk-informed prediction with a wide range of computational science and engineering models. Dakota is the verification and validation (V&V) / uncertainty quantification (UQ) software delivery vehicle for CASL, allowing analysts across focus areas to apply these capabilities to myriad nuclear engineering analyses.

  11. An accurate proteomic quantification method: fluorescence labeling absolute quantification (FLAQ) using multidimensional liquid chromatography and tandem mass spectrometry.

    PubMed

    Liu, Junyan; Liu, Yang; Gao, Mingxia; Zhang, Xiangmin

    2012-08-01

    A facile proteomic quantification method, fluorescent labeling absolute quantification (FLAQ), was developed. Instead of using MS for quantification, FLAQ is a chromatography-based quantification combined with MS for identification. Multidimensional liquid chromatography (MDLC) with high-accuracy laser-induced fluorescence (LIF) detection and a tandem MS system were employed for FLAQ. Several requirements must be met by the fluorescent label for MS identification: labeling completeness, minimal side reactions, simple MS spectra, and no additional tandem MS fragmentation that complicates structure elucidation. A fluorescent dye, 5-iodoacetamidofluorescein, was chosen to label proteins on all cysteine residues. The dye was compatible with trypsin digestion and MALDI MS identification. Quantitative labeling was achieved by optimizing the reaction conditions. A synthesized peptide and model proteins, BSA (35 cysteines) and OVA (five cysteines), were used to verify the completeness of labeling. Proteins were separated by MDLC and quantified based on fluorescence intensities, followed by MS identification. High accuracy (RSD% < 1.58) and wide linearity of quantification (1-10⁵) were achieved by LIF detection. The limit of quantitation for the model protein was as low as 0.34 amol. As a demonstration, a subset of proteins in the human liver proteome was quantified using FLAQ. © 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  12. Accurately tracking single-cell movement trajectories in microfluidic cell sorting devices.

    PubMed

    Jeong, Jenny; Frohberg, Nicholas J; Zhou, Enlu; Sulchek, Todd; Qiu, Peng

    2018-01-01

    Microfluidics are routinely used to study cellular properties, including the efficient quantification of single-cell biomechanics and label-free cell sorting based on the biomechanical properties, such as elasticity, viscosity, stiffness, and adhesion. Both quantification and sorting applications require optimal design of the microfluidic devices and mathematical modeling of the interactions between cells, fluid, and the channel of the device. As a first step toward building such a mathematical model, we collected video recordings of cells moving through a ridged microfluidic channel designed to compress and redirect cells according to cell biomechanics. We developed an efficient algorithm that automatically and accurately tracked the cell trajectories in the recordings. We tested the algorithm on recordings of cells with different stiffness, and showed the correlation between cell stiffness and the tracked trajectories. Moreover, the tracking algorithm successfully picked up subtle differences of cell motion when passing through consecutive ridges. The algorithm for accurately tracking cell trajectories paves the way for future efforts of modeling the flow, forces, and dynamics of cell properties in microfluidics applications.
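
    Frame-to-frame linking of detected cell centroids by nearest-neighbor assignment with a maximum displacement gate illustrates the core of such a tracker. The real algorithm handles occlusion, ridge crossings, and detection noise, so treat the following as a toy sketch with made-up coordinates.

```python
import numpy as np

def link_frames(prev_points, next_points, max_dist=15.0):
    """Greedy nearest-neighbor linking of centroids between consecutive frames.
    Returns a list of (prev_index, next_index) matches within max_dist pixels."""
    links, used = [], set()
    for i, p in enumerate(prev_points):
        dists = [np.hypot(*(p - q)) if j not in used else np.inf
                 for j, q in enumerate(next_points)]
        j = int(np.argmin(dists))
        if dists[j] <= max_dist:
            links.append((i, j))
            used.add(j)
    return links

frame_a = np.array([[10.0, 12.0], [40.0, 55.0]])
frame_b = np.array([[13.0, 14.0], [44.0, 53.0], [90.0, 90.0]])
print(link_frames(frame_a, frame_b))  # [(0, 0), (1, 1)]; the third detection starts a new track
```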

  13. Development and validation of a fast and simple multi-analyte procedure for quantification of 40 drugs relevant to emergency toxicology using GC-MS and one-point calibration.

    PubMed

    Meyer, Golo M J; Weber, Armin A; Maurer, Hans H

    2014-05-01

    Diagnosis and prognosis of poisonings should be confirmed by comprehensive screening and reliable quantification of xenobiotics, for example by gas chromatography-mass spectrometry (GC-MS) or liquid chromatography-mass spectrometry (LC-MS). The turnaround time should be short enough to have an impact on clinical decisions. In emergency toxicology, quantification using full-scan acquisition is preferable because this allows screening and quantification of expected and unexpected drugs in one run. Therefore, a multi-analyte full-scan GC-MS approach was developed and validated with liquid-liquid extraction and one-point calibration for quantification of 40 drugs relevant to emergency toxicology. Validation showed that 36 drugs could be determined quickly, accurately, and reliably in the range of upper therapeutic to toxic concentrations. Daily one-point calibration with calibrators stored for up to four weeks reduced workload and turn-around time to less than 1 h. In summary, the multi-analyte approach with simple liquid-liquid extraction, GC-MS identification, and quantification over fast one-point calibration could successfully be applied to proficiency tests and real case samples. Copyright © 2013 John Wiley & Sons, Ltd.
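
    One-point calibration computes a response factor from a single calibrator and applies it to the sample response. The sketch below shows only that arithmetic with assumed peak areas, ignoring the internal-standard correction and acceptance criteria a validated method would include.

```python
def one_point_quantify(sample_response, calibrator_response, calibrator_conc_mg_l):
    """Estimate concentration from a single-point calibration:
    conc_sample = response_sample / response_calibrator * conc_calibrator."""
    return sample_response / calibrator_response * calibrator_conc_mg_l

# hypothetical peak areas for a drug quantified in serum
print(one_point_quantify(sample_response=54_000,
                         calibrator_response=36_000,
                         calibrator_conc_mg_l=2.0))  # 3.0 mg/L
```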

  14. Colorimetric protein determination in microalgae (Chlorophyta): association of milling and SDS treatment for total protein extraction.

    PubMed

    Mota, Maria Fernanda S; Souza, Marcella F; Bon, Elba P S; Rodrigues, Marcoaurelio A; Freitas, Suely Pereira

    2018-05-24

    The use of colorimetric methods for protein quantification in microalgae is hindered by their elevated amounts of membrane-embedded intracellular proteins. In this work, the protein content of three species of microalgae was determined by the Lowry method after the cells were dried, ball-milled, and treated with the detergent sodium dodecyl sulfate (SDS). Results demonstrated that the association of milling and SDS treatment resulted in a 3- to 7-fold increase in protein quantification. Milling promoted microalgal disaggregation and cell wall disruption enabling access of the SDS detergent to the microalgal intracellular membrane proteins and their efficient solubilization and quantification. © 2018 Phycological Society of America.

  15. A liquid chromatography-tandem mass spectrometry assay for the detection and quantification of trehalose in biological samples.

    PubMed

    Kretschmer, Philip M; Bannister, Austin M; O'Brien, Molly K; MacManus-Spencer, Laura A; Paulick, Margot G

    2016-10-15

    Trehalose is an important disaccharide that is used as a cellular protectant by many different organisms, helping these organisms better survive extreme conditions, such as dehydration, oxidative stress, and freezing temperatures. Methods to detect and accurately measure trehalose from different organisms will help us gain a better understanding of the mechanisms behind trehalose's ability to act as a cellular protectant. A liquid chromatography-tandem mass spectrometry (LC-MS/MS) assay using selected reaction monitoring mode for the detection and quantification of trehalose using maltose as an internal standard has been developed. This assay uses a commercially available LC column for trehalose separation and a standard triple quadrupole mass spectrometer, thus allowing many scientists to take advantage of this simple assay. The calibration curve from 3 to 100μM trehalose was fit best by a single polynomial. This LC-MS/MS assay directly detects and accurately quantifies trehalose, with an instrument limit of detection (LOD) that is 2-1000 times more sensitive than the most commonly-used assays for trehalose detection and quantification. Furthermore, this assay was used to detect and quantify endogenous trehalose produced by Escherichia coli (E. coli) cells, which were found to have an intracellular concentration of 8.5±0.9mM trehalose. This method thus shows promise for the reliable detection and quantification of trehalose from different biological sources. Copyright © 2016 Elsevier B.V. All rights reserved.

  16. Development of a targeted method for twenty-three metabolites related to polyphenol gut microbial metabolism in biological samples, using SPE and UHPLC-ESI-MS/MS.

    PubMed

    Gasperotti, Mattia; Masuero, Domenico; Guella, Graziano; Mattivi, Fulvio; Vrhovsek, Urska

    2014-10-01

    An increasing number of studies have concerned the profiling of polyphenol microbial metabolites, especially in urine or plasma, but only a few have regarded their accurate quantification. This study reports on a new ultra-performance liquid chromatography tandem mass spectrometry method with electrospray ionisation (UHPLC-ESI-MS/MS) using a simple clean-up step with solid phase extraction (SPE) and validation on different biological matrices. The method was tested with spiked samples of liver, heart, kidneys, brain, blood and urine. The purification procedure, after the evaluation of three different cartridges, makes it possible to obtain cleaner samples and better quantification of putative trace metabolites, especially related to dietary studies, with concentrations below ng/g in tissue and for urine and blood, starting from ng/ml. Limits of detection and linear range were also assessed using mixed polyphenol metabolite standards. Short chromatographic separation was carried out for 23 target compounds related to the polyphenol microbial metabolism, coupled with a triple quadrupole mass spectrometer for their accurate quantification. By analysing different spiked biological samples we were able to test metabolite detection in the matrix and validate the overall recovery of the method, from purification to quantification. The method developed can be successfully applied and is suitable for high-throughput targeted metabolomics analysis related to nutritional intervention, or the study of the metabolic mechanism in response to a polyphenol-rich diet. Copyright © 2014 Elsevier B.V. All rights reserved.

  17. Critical assessment of digital PCR for the detection and quantification of genetically modified organisms.

    PubMed

    Demeke, Tigst; Dobnik, David

    2018-07-01

    The number of genetically modified organisms (GMOs) on the market is steadily increasing. Because of regulation of cultivation and trade of GMOs in several countries, there is pressure for their accurate detection and quantification. Today, DNA-based approaches are more popular for this purpose than protein-based methods, and real-time quantitative PCR (qPCR) is still the gold standard in GMO analytics. However, digital PCR (dPCR) offers several advantages over qPCR, making this new technique appealing also for GMO analysis. This critical review focuses on the use of dPCR for the purpose of GMO quantification and addresses parameters which are important for achieving accurate and reliable results, such as the quality and purity of DNA and reaction optimization. Three critical factors are explored and discussed in more depth: correct classification of partitions as positive, correctly determined partition volume, and dilution factor. This review could serve as a guide for all laboratories implementing dPCR. Most of the parameters discussed are applicable to fields other than purely GMO testing. Graphical abstract There are generally three different options for absolute quantification of genetically modified organisms (GMOs) using digital PCR: droplet- or chamber-based and droplets in chambers. All have in common the distribution of reaction mixture into several partitions, which are all subjected to PCR and scored at the end-point as positive or negative. Based on these results GMO content can be calculated.
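
    The three critical factors discussed in the review (partition classification, partition volume, and dilution factor) all enter the standard digital PCR calculation, in which the fraction of positive partitions is converted to a concentration through Poisson statistics. A minimal sketch, with illustrative droplet counts and volume:

```python
import math

def dpcr_concentration(n_positive, n_total, partition_volume_nl, dilution_factor=1.0):
    """Absolute target concentration (copies/uL) from digital PCR partition counts.

    Poisson correction: the mean number of copies per partition is
    lambda = -ln(1 - p), where p is the fraction of positive partitions.
    The positive/total classification, the partition volume and the sample
    dilution factor all enter the result directly.
    """
    p = n_positive / n_total
    if p >= 1.0:
        raise ValueError("all partitions positive - sample too concentrated")
    lam = -math.log(1.0 - p)                             # copies per partition
    copies_per_ul = lam / (partition_volume_nl * 1e-3)   # nL -> uL
    return copies_per_ul * dilution_factor

# Illustrative example: 8000 of 17000 droplets positive, 0.85 nL droplets, 10x dilution
print(dpcr_concentration(8000, 17000, 0.85, dilution_factor=10))
```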

  18. Multiscale recurrence quantification analysis of order recurrence plots

    NASA Astrophysics Data System (ADS)

    Xu, Mengjia; Shang, Pengjian; Lin, Aijing

    2017-03-01

    In this paper, we propose a new method, multiscale recurrence quantification analysis (MSRQA), to analyze the structure of order recurrence plots. MSRQA is based on order patterns over a range of time scales. Compared with conventional recurrence quantification analysis (RQA), MSRQA reveals richer and more recognizable information on the local characteristics of diverse systems and thus better describes their recurrence properties. Both synthetic series and stock market indexes exhibit recurrence properties at large time scales that differ considerably from those at a single time scale, and some systems present more distinct recurrence patterns at large time scales. These results demonstrate that the new approach is effective for distinguishing three similar stock market systems and revealing their inherent differences.
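
    As a rough illustration of the kind of computation involved, the sketch below coarse-grains a series over several time scales and evaluates the recurrence rate of an order recurrence plot (two windows recur when their rank orderings coincide). This is only an assumed, simplified reading of MSRQA; the published method may define its measures differently.

```python
import numpy as np

def coarse_grain(x, scale):
    """Non-overlapping window averages, as in standard multiscale analyses."""
    n = len(x) // scale
    return x[:n * scale].reshape(n, scale).mean(axis=1)

def order_recurrence_rate(x, dim=3):
    """Recurrence rate of an order recurrence plot: times i and j recur when the
    rank ordering of (x_i, ..., x_{i+dim-1}) equals that of (x_j, ..., x_{j+dim-1})."""
    n = len(x) - dim + 1
    patterns = np.array([np.argsort(x[i:i + dim]) for i in range(n)])
    recur = (patterns[:, None, :] == patterns[None, :, :]).all(axis=2)
    return float(recur.mean())

def msrqa_recurrence_rates(x, scales=(1, 2, 4, 8), dim=3):
    """Recurrence rate of the order recurrence plot at each coarse-graining scale."""
    return {s: order_recurrence_rate(coarse_grain(np.asarray(x, float), s), dim)
            for s in scales}

rng = np.random.default_rng(0)
print(msrqa_recurrence_rates(rng.standard_normal(512)))
```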

  19. Automation of a Nile red staining assay enables high throughput quantification of microalgal lipid production.

    PubMed

    Morschett, Holger; Wiechert, Wolfgang; Oldiges, Marco

    2016-02-09

    Within the context of microalgal lipid production for biofuels and bulk chemical applications, specialized higher throughput devices for small scale parallelized cultivation are expected to boost the time efficiency of phototrophic bioprocess development. However, the increasing number of possible experiments is directly coupled to the demand for lipid quantification protocols that enable reliable measurement of large sample sets within a short time and that can deal with the reduced sample volume typically generated at screening scale. To meet these demands, a dye based assay was established using a liquid handling robot to provide reproducible high throughput quantification of lipids with minimized hands-on time. Lipid production was monitored using the fluorescent dye Nile red with dimethyl sulfoxide as solvent facilitating dye permeation. The staining kinetics of cells at different concentrations and physiological states were investigated to successfully down-scale the assay to 96 well microtiter plates. Gravimetric calibration against a well-established extractive protocol enabled absolute quantification of intracellular lipids improving precision from ±8 to ±2 % on average. Implementation into an automated liquid handling platform allows for measuring up to 48 samples within 6.5 h, reducing hands-on time to a third compared to manual operation. Moreover, it was shown that automation enhances accuracy and precision compared to manual preparation. It was revealed that established protocols relying on optical density or cell number for biomass adjustment prior to staining may suffer from errors due to significant changes of the cells' optical and physiological properties during cultivation. Alternatively, the biovolume was used as a measure for biomass concentration so that errors from morphological changes can be excluded. The newly established assay proved to be applicable for absolute quantification of algal lipids avoiding limitations of currently established protocols, namely biomass adjustment and limited throughput. Automation was shown to improve data reliability as well as experimental throughput, while reducing the required hands-on time to a third. The presented protocol thus meets the demands for the analysis of samples generated by the upcoming generation of devices for higher throughput phototrophic cultivation and thereby contributes to boosting the time efficiency for setting up algae lipid production processes.
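
    The gravimetric calibration step described above amounts to a linear regression of biovolume-normalized Nile red fluorescence against lipid content measured by the extractive reference protocol. A minimal sketch with invented calibration values:

```python
import numpy as np

# Hypothetical calibration data: Nile red fluorescence per unit biovolume versus
# lipid content determined gravimetrically for the same samples (values invented).
fluorescence_per_biovolume = np.array([0.8, 2.1, 3.9, 6.2, 8.0])
gravimetric_lipid_pct      = np.array([5.0, 12.0, 22.0, 35.0, 45.0])   # % of dry weight

slope, intercept = np.polyfit(fluorescence_per_biovolume, gravimetric_lipid_pct, 1)

def lipid_content(fluorescence, biovolume):
    """Convert a Nile red reading to absolute lipid content (% dry weight),
    normalising to biovolume rather than optical density or cell number."""
    return slope * (fluorescence / biovolume) + intercept

print(lipid_content(fluorescence=1500.0, biovolume=300.0))   # ~28 % with these invented numbers
```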

  20. 3D Geometric Analysis of the Pediatric Aorta in 3D MRA Follow-Up Images with Application to Aortic Coarctation.

    PubMed

    Wörz, Stefan; Schenk, Jens-Peter; Alrajab, Abdulsattar; von Tengg-Kobligk, Hendrik; Rohr, Karl; Arnold, Raoul

    2016-10-17

    Coarctation of the aorta is one of the most common congenital heart diseases. Despite different treatment opportunities, long-term outcome after surgical or interventional therapy is diverse. Serial morphologic follow-up of vessel growth is necessary, because vessel growth cannot be predicted from the primary morphology or the therapeutic option. For the analysis of the long-term outcome after therapy of congenital diseases such as aortic coarctation, accurate 3D geometric analysis of the aorta from follow-up 3D medical image data such as magnetic resonance angiography (MRA) is important. However, for an objective, fast, and accurate 3D geometric analysis, an automatic approach for 3D segmentation and quantification of the aorta from pediatric images is required. We introduce a new model-based approach for the segmentation of the thoracic aorta and its main branches from follow-up pediatric 3D MRA image data. For robust segmentation of vessels even in difficult cases (e.g., neighboring structures), we propose a new extended parametric cylinder model that requires only relatively few model parameters. Moreover, we include a novel adaptive background-masking scheme used for least-squares model fitting, we use a spatial normalization scheme to align the segmentation results from follow-up examinations, and we determine relevant 3D geometric parameters of the aortic arch. We have evaluated our proposed approach using different 3D synthetic images. Moreover, we have successfully applied the approach to follow-up pediatric 3D MRA image data, we have normalized the 3D segmentation results of follow-up images of individual patients, and we have combined the results of all patients. We also present a quantitative evaluation of our approach for four follow-up 3D MRA images of a patient, which confirms that our approach yields accurate 3D segmentation results. An experimental comparison with two previous approaches demonstrates that our approach yields superior results. From the results, we found that our approach is well suited for the quantification of the 3D geometry of the aortic arch from follow-up pediatric 3D MRA image data. In future work, this will enable investigation of the long-term outcome of different surgical and interventional therapies for aortic coarctation.

  1. A sun-tracking environmental chamber for the outdoor quantification of CPV modules

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Faiman, David, E-mail: faiman@bgu.ac.il; Melnichak, Vladimir, E-mail: faiman@bgu.ac.il; Bokobza, Dov, E-mail: faiman@bgu.ac.il

    2014-09-26

    The paper describes a sun-tracking environmental chamber and its associated fast electronics, devised for the accurate outdoor characterization of CPV cells, receivers, mono-modules, and modules. Some typical measurement results are presented.

  2. Quantification of Liver Fat in the Presence of Iron Overload

    PubMed Central

    Horng, Debra E.; Hernando, Diego; Reeder, Scott B.

    2017-01-01

    Purpose To evaluate the accuracy of R2* models (1/T2* = R2*) for chemical shift-encoded magnetic resonance imaging (CSE-MRI)-based proton density fat-fraction (PDFF) quantification in patients with fatty liver and iron overload, using MR spectroscopy (MRS) as the reference standard. Materials and Methods Two Monte Carlo simulations were implemented to compare the root-mean-squared-error (RMSE) performance of single-R2* and dual-R2* correction in a theoretical liver environment with high iron. Fatty liver was defined as hepatic PDFF >5.6% based on MRS; only subjects with fatty liver were considered for analyses involving fat. From a group of 40 patients with known/suspected iron overload, nine patients were identified at 1.5T, and 13 at 3.0T with fatty liver. MRS linewidth measurements were used to estimate R2* values for water and fat peaks. PDFF was measured from CSE-MRI data using single-R2* and dual-R2* correction with magnitude and complex fitting. Results Spectroscopy-based R2* analysis demonstrated that the R2* of water and fat remain close in value, both increasing as iron overload increases: linear regression between R2*W and R2*F resulted in slope = 0.95 [0.79–1.12] (95% limits of agreement) at 1.5T and slope = 0.76 [0.49–1.03] at 3.0T. MRI-PDFF using dual-R2* correction had severe artifacts. MRI-PDFF using single-R2* correction had good agreement with MRS-PDFF: Bland–Altman analysis resulted in −0.7% (bias) ± 2.9% (95% limits of agreement) for magnitude-fit and −1.3% ± 4.3% for complex-fit at 1.5T, and −1.5% ± 8.4% for magnitude-fit and −2.2% ± 9.6% for complex-fit at 3.0T. Conclusion Single-R2* modeling enables accurate PDFF quantification, even in patients with iron overload. PMID:27405703
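
    The agreement figures quoted above (bias ± 95% limits of agreement) are standard Bland-Altman statistics for paired MRI-PDFF and MRS-PDFF measurements. A minimal sketch of that calculation, using invented paired values:

```python
import numpy as np

def bland_altman(method_a, method_b):
    """Bias and 95% limits of agreement between two paired measurement series,
    e.g. MRI-PDFF versus the MRS-PDFF reference."""
    a, b = np.asarray(method_a, float), np.asarray(method_b, float)
    diff = a - b
    bias = diff.mean()
    half_width = 1.96 * diff.std(ddof=1)     # half-width of the 95% limits of agreement
    return bias, bias - half_width, bias + half_width

# Invented paired PDFF values (%) for illustration only
mri_pdff = [6.1, 10.4, 15.2, 22.8, 30.1]
mrs_pdff = [6.8, 11.0, 16.5, 23.4, 31.6]
print(bland_altman(mri_pdff, mrs_pdff))
```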

  3. Detection and Quantification of Gluten during the Brewing and Fermentation of Beer Using Antibody-Based Technologies.

    PubMed

    Panda, Rakhi; Zoerb, Hans F; Cho, Chung Y; Jackson, Lauren S; Garber, Eric A E

    2015-06-01

    In 2013 the U.S. Food and Drug Administration (FDA) defined the term ''gluten-free'' and identified a gap in the analytical methodology for detection and quantification of gluten in foods subjected to fermentation and hydrolysis. To ascertain the ability of current enzyme-linked immunosorbent assays (ELISAs) to detect and quantify gluten in fermented and hydrolyzed products, sorghum beer was spiked in the initial phases of production with 0, 20, and 200 μg/ml wheat gluten, and samples were collected throughout the beer production process. The samples were analyzed using five sandwich ELISAs and two competitive ELISAs and by sodium dodecyl sulfate-polyacrylamide gel electrophoresis with Western analysis employing four antibodies (MIoBS, R5, G12, and Skerritt). The sensitivity of the MIoBS ELISA (0.25 ppm) enabled the reliable detection of gluten throughout the manufacturing process, including fermentation, when the initial concentration of 20 μg/ml dropped to 2 μg/ml. The R5 antibody-based and G12 antibody-based sandwich ELISAs were unable to reliably detect gluten, initially at 20 μg/ml, after the onset of production. The Skerritt antibody-based sandwich ELISA overestimated the gluten concentration in all samples. The R5 antibody-based and G12 antibody-based competitive ELISAs were less sensitive than the sandwich ELISAs and did not provide accurate results for quantifying gluten concentration. The Western analyses were able to detect gluten at less than 5 μg/ml in the samples and confirmed the results of the ELISAs. Although further research is necessary before all problems associated with detection and quantification of hydrolyzed and fermented gluten are resolved, the analytical methods recommended by the FDA for regulatory samples can detect ≥ 20 μg/ml gluten that has undergone brewing and fermentation processes associated with the manufacture of beer.

  4. Simultaneous Detection of Human C-Terminal p53 Isoforms by Single Template Molecularly Imprinted Polymers (MIPs) Coupled with Liquid Chromatography-Tandem Mass Spectrometry (LC-MS/MS)-Based Targeted Proteomics.

    PubMed

    Jiang, Wenting; Liu, Liang; Chen, Yun

    2018-03-06

    Abnormal expression of C-terminal p53 isoforms α, β, and γ can cause the development of cancers including breast cancer. To date, much evidence has demonstrated that these isoforms can differentially regulate target genes and modulate their expression. Thus, quantification of individual isoforms may help to link clinical outcome to p53 status and to improve cancer patient treatment. However, there are few studies on accurate determination of p53 isoforms, probably due to sequence homology of these isoforms and also their low abundance. In this study, a targeted proteomics assay combining molecularly imprinted polymers (MIPs) and liquid chromatography-tandem mass spectrometry (LC-MS/MS) was developed for simultaneous quantification of C-terminal p53 isoforms. Isoform-specific surrogate peptides (i.e., KPLDGEYFTLQIR (peptide-α) for isoform α, KPLDGEYFTLQDQTSFQK (peptide-β) for isoform β, and KPLDGEYFTLQMLLDLR (peptide-γ) for isoform γ) were first selected and used in both MIPs enrichment and mass spectrometric detection. The common sequence KPLDGEYFTLQ of these three surrogate peptides was used as single template in MIPs. In addition to optimization of imprinting conditions and characterization of the prepared MIPs, binding affinity and cross-reactivity of the MIPs for each surrogate peptide were also evaluated. As a result, a LOQ of 5 nM was achieved, which was >15-fold more sensitive than that without MIPs. Finally, the assay was validated and applied to simultaneous quantitative analysis of C-terminal p53 isoforms α, β, and γ in several human breast cell lines (i.e., MCF-10A normal cells, MCF-7 and MDA-MB-231 cancer cells, and drug-resistant MCF-7/ADR cancer cells). This study is among the first to employ single template MIPs and cross-reactivity phenomenon to select isoform-specific surrogate peptides and enable simultaneous quantification of protein isoforms in LC-MS/MS-based targeted proteomics.

  5. Quantification of meat proportions by measuring DNA contents in raw and boiled sausages using matrix-adapted calibrators and multiplex real-time PCR.

    PubMed

    Köppel, René; Eugster, Albert; Ruf, Jürg; Rentsch, Jürg

    2012-01-01

    The quantification of meat proportions in raw and boiled sausage according to the recipe was evaluated using three different calibrators. To measure the DNA contents from beef, pork, sheep (mutton), and horse, a tetraplex real-time PCR method was applied. Nineteen laboratories analyzed four meat products each made of different proportions of beef, pork, sheep, and horse meat. Three kinds of calibrators were used: raw and boiled sausages of known proportions ranging from 1 to 55% of meat, and a dilution series of DNA from muscle tissue. In general, results generated using calibration sausages were more accurate than those resulting from the use of DNA from muscle tissue, and exhibited smaller measurement uncertainties. Although differences between uses of raw and boiled calibration sausages were small, the most precise and accurate results were obtained by calibration with fine-textured boiled reference sausages.

  6. Motion-aware stroke volume quantification in 4D PC-MRI data of the human aorta.

    PubMed

    Köhler, Benjamin; Preim, Uta; Grothoff, Matthias; Gutberlet, Matthias; Fischbach, Katharina; Preim, Bernhard

    2016-02-01

    4D PC-MRI enables the noninvasive measurement of time-resolved, three-dimensional blood flow data that allow quantification of the hemodynamics. Stroke volumes are essential to assess the cardiac function and evolution of different cardiovascular diseases. The calculation depends on the wall position and vessel orientation, which both change during the cardiac cycle due to the heart muscle contraction and the pumped blood. However, current systems for the quantitative 4D PC-MRI data analysis neglect the dynamic character and instead employ a static 3D vessel approximation. We quantify differences between stroke volumes in the aorta obtained with and without consideration of its dynamics. We describe a method that uses the approximating 3D segmentation to automatically initialize segmentation algorithms that require regions inside and outside the vessel for each temporal position. This enables the use of graph cuts to obtain 4D segmentations, extract vessel surfaces including centerlines for each temporal position and derive motion information. The stroke volume quantification is compared using measuring planes in static (3D) vessels, planes with fixed angulation inside dynamic vessels (this corresponds to the common 2D PC-MRI) and moving planes inside dynamic vessels. Seven datasets with different pathologies such as aneurysms and coarctations were evaluated in close collaboration with radiologists. Compared to the experts' manual stroke volume estimations, motion-aware quantification performs, on average, 1.57% better than calculations without motion consideration. The mean difference between stroke volumes obtained with the different methods is 7.82%. Automatically obtained 4D segmentations overlap by 85.75% with manually generated ones. Incorporating motion information in the stroke volume quantification yields slight but not statistically significant improvements. The presented method is feasible for the clinical routine, since computation times are low and essential parts run fully automatically. The 4D segmentations can be used for other algorithms as well. The simultaneous visualization and quantification may support the understanding and interpretation of cardiac blood flow.
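
    Independent of how the segmentation is obtained, the stroke volume itself follows from integrating the through-plane velocity over the lumen cross-section and over the cardiac cycle. The sketch below assumes velocity maps already resampled onto the (static or moving) measuring plane; array shapes and units are illustrative, not the authors' pipeline.

```python
import numpy as np

def stroke_volume_ml(velocity_thw, lumen_mask_thw, pixel_area_cm2, dt_s):
    """Stroke volume (mL) from through-plane velocity maps on a measuring plane.

    velocity_thw   : array (T, H, W) of through-plane velocity in cm/s, already
                     resampled onto the measuring plane at each temporal position
    lumen_mask_thw : boolean array (T, H, W), True inside the vessel lumen at that time
    pixel_area_cm2 : in-plane pixel area in cm^2
    dt_s           : temporal resolution in s
    """
    flow_ml_per_s = (velocity_thw * lumen_mask_thw).sum(axis=(1, 2)) * pixel_area_cm2
    return float(flow_ml_per_s.sum() * dt_s)      # integrate flow over the cardiac cycle

# Toy check: a constant 50 cm/s across a 4 cm^2 lumen for 1 s should give 200 mL
v = np.full((20, 10, 10), 50.0)
m = np.zeros((20, 10, 10), dtype=bool)
m[:, 2:6, 2:7] = True                             # 20 pixels x 0.2 cm^2 = 4 cm^2
print(stroke_volume_ml(v, m, pixel_area_cm2=0.2, dt_s=0.05))   # -> 200.0
```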

  7. ATAQS: A computational software tool for high throughput transition optimization and validation for selected reaction monitoring mass spectrometry

    PubMed Central

    2011-01-01

    Background Since its inception, proteomics has essentially operated in a discovery mode with the goal of identifying and quantifying the maximal number of proteins in a sample. Increasingly, proteomic measurements are also supporting hypothesis-driven studies, in which a predetermined set of proteins is consistently detected and quantified in multiple samples. Selected reaction monitoring (SRM) is a targeted mass spectrometric technique that supports the detection and quantification of specific proteins in complex samples at high sensitivity and reproducibility. Here, we describe ATAQS, an integrated software platform that supports all stages of targeted, SRM-based proteomics experiments including target selection, transition optimization and post acquisition data analysis. This software will significantly facilitate the use of targeted proteomic techniques and contribute to the generation of highly sensitive, reproducible and complete datasets that are particularly critical for the discovery and validation of targets in hypothesis-driven studies in systems biology. Result We introduce a new open source software pipeline, ATAQS (Automated and Targeted Analysis with Quantitative SRM), which consists of a number of modules that collectively support the SRM assay development workflow for targeted proteomic experiments (project management and generation of protein, peptide and transitions and the validation of peptide detection by SRM). ATAQS provides a flexible pipeline for end-users by allowing the workflow to start or end at any point of the pipeline, and for computational biologists, by enabling the easy extension of java algorithm classes for their own algorithm plug-in or connection via an external web site. This integrated system supports all steps in a SRM-based experiment and provides a user-friendly GUI that can be run by any operating system that allows the installation of the Mozilla Firefox web browser. Conclusions Targeted proteomics via SRM is a powerful new technique that enables the reproducible and accurate identification and quantification of sets of proteins of interest. ATAQS is the first open-source software that supports all steps of the targeted proteomics workflow. ATAQS also provides software API (Application Program Interface) documentation that enables the addition of new algorithms to each of the workflow steps. The software, installation guide and sample dataset can be found in http://tools.proteomecenter.org/ATAQS/ATAQS.html PMID:21414234

  8. Quantitative Method for Simultaneous Analysis of Acetaminophen and 6 Metabolites.

    PubMed

    Lammers, Laureen A; Achterbergh, Roos; Pistorius, Marcel C M; Romijn, Johannes A; Mathôt, Ron A A

    2017-04-01

    Hepatotoxicity after ingestion of high-dose acetaminophen [N-acetyl-para-aminophenol (APAP)] is caused by the metabolites of the drug. To gain more insight into factors influencing susceptibility to APAP hepatotoxicity, quantification of APAP and metabolites is important. A few methods have been developed to simultaneously quantify APAP and its most important metabolites. However, these methods require a comprehensive sample preparation and long run times. The aim of this study was to develop and validate a simplified, but sensitive method for the simultaneous quantification of acetaminophen, the main metabolites acetaminophen glucuronide and acetaminophen sulfate, and 4 Cytochrome P450-mediated metabolites by using liquid chromatography with mass spectrometric (LC-MS) detection. The method was developed and validated for the human plasma, and it entailed a single method for sample preparation, enabling quick processing of the samples followed by an LC-MS method with a chromatographic run time of 9 minutes. The method was validated for selectivity, linearity, accuracy, imprecision, dilution integrity, recovery, process efficiency, ionization efficiency, and carryover effect. The method showed good selectivity without matrix interferences. For all analytes, the mean process efficiency was >86%, and the mean ionization efficiency was >94%. Furthermore, the accuracy was between 90.3% and 112% for all analytes, and the within- and between-run imprecision were <20% for the lower limit of quantification and <14.3% for the middle level and upper limit of quantification. The method presented here enables the simultaneous quantification of APAP and 6 of its metabolites. It is less time consuming than previously reported methods because it requires only a single and simple method for the sample preparation followed by an LC-MS method with a short run time. Therefore, this analytical method provides a useful method for both clinical and research purposes.

  9. Quantification of lycopene in the processed tomato-based products by means of the light-emitting diode (LED) and compact photoacoustic (PA) detector

    NASA Astrophysics Data System (ADS)

    Bicanic, D.; Skenderović, H.; Marković, K.; Dóka, O.; Pichler, L.; Pichler, G.; Luterotti, S.

    2010-03-01

    The combined use of a high power light emitting diode (LED) and the compact photoacoustic (PA) detector offers the possibility for a rapid (no extraction needed), accurate (precision 1.5%) and inexpensive quantification of lycopene in different products derived from the thermally processed tomatoes. The concentration of lycopene in selected products ranges from a few mg to several tens mg per 100 g fresh weight. The HPLC was used as the well established reference method.

  10. Investigating the limits of PET/CT imaging at very low true count rates and high random fractions in ion-beam therapy monitoring.

    PubMed

    Kurz, Christopher; Bauer, Julia; Conti, Maurizio; Guérin, Laura; Eriksson, Lars; Parodi, Katia

    2015-07-01

    External beam radiotherapy with protons and heavier ions enables a tighter conformation of the applied dose to arbitrarily shaped tumor volumes with respect to photons, but is more sensitive to uncertainties in the radiotherapeutic treatment chain. Consequently, an independent verification of the applied treatment is highly desirable. For this purpose, the irradiation-induced β(+)-emitter distribution within the patient is detected shortly after irradiation by a commercial full-ring positron emission tomography/x-ray computed tomography (PET/CT) scanner installed next to the treatment rooms at the Heidelberg Ion-Beam Therapy Center (HIT). A major challenge to this approach is posed by the small number of detected coincidences. This contribution aims at characterizing the performance of the used PET/CT device and identifying the best-performing reconstruction algorithm under the particular statistical conditions of PET-based treatment monitoring. Moreover, this study addresses the impact of radiation background from the intrinsically radioactive lutetium-oxyorthosilicate (LSO)-based detectors at low counts. The authors have acquired 30 subsequent PET scans of a cylindrical phantom emulating a patientlike activity pattern and spanning the entire patient counting regime in terms of true coincidences and random fractions (RFs). Accuracy and precision of activity quantification, image noise, and geometrical fidelity of the scanner have been investigated for various reconstruction algorithms and settings in order to identify a practical, well-suited reconstruction scheme for PET-based treatment verification. Truncated listmode data have been utilized for separating the effects of small true count numbers and high RFs on the reconstructed images. A corresponding simulation study enabled extending the results to an even wider range of counting statistics and to additionally investigate the impact of scatter coincidences. Eventually, the recommended reconstruction scheme has been applied to exemplary postirradiation patient data-sets. Among the investigated reconstruction options, the overall best results in terms of image noise, activity quantification, and accurate geometrical recovery were achieved using the ordered subset expectation maximization reconstruction algorithm with time-of-flight (TOF) and point-spread function (PSF) information. For this algorithm, reasonably accurate (better than 5%) and precise (uncertainty of the mean activity below 10%) imaging can be provided down to 80,000 true coincidences at 96% RF. Image noise and geometrical fidelity are generally improved for fewer iterations. The main limitation for PET-based treatment monitoring has been identified in the small number of true coincidences, rather than the high intrinsic random background. Application of the optimized reconstruction scheme to patient data-sets results in a 25% - 50% reduced image noise at a comparable activity quantification accuracy and an improved geometrical performance with respect to the formerly used reconstruction scheme at HIT, adopted from nuclear medicine applications. Under the poor statistical conditions in PET-based treatment monitoring, improved results can be achieved by considering PSF and TOF information during image reconstruction and by applying less iterations than in conventional nuclear medicine imaging. Geometrical fidelity and image noise are mainly limited by the low number of true coincidences, not the high LSO-related random background. 
The retrieved results might also impact other emerging PET applications at low counting statistics.

  11. Simultaneous Estimation of Withaferin A and Z-Guggulsterone in Marketed Formulation by RP-HPLC.

    PubMed

    Agrawal, Poonam; Vegda, Rashmi; Laddha, Kirti

    2015-07-01

    A simple, rapid, precise and accurate high-performance liquid chromatography (HPLC) method was developed for simultaneous estimation of withaferin A and Z-guggulsterone in a polyherbal formulation containing Withania somnifera and Commiphora wightii. The chromatographic separation was achieved on a Purosphere RP-18 column (particle size 5 µm) with a mobile phase consisting of Solvent A (acetonitrile) and Solvent B (water) with the following gradients: 0-7 min, 50% A in B; 7-9 min, 50-80% A in B; 9-20 min, 80% A in B at a flow rate of 1 mL/min and detection at 235 nm. The marker compounds were well separated on the chromatogram within 20 min. The results obtained indicate accuracy and reliability of the developed simultaneous HPLC method for the quantification of withaferin A and Z-guggulsterone. The proposed method was found to be reproducible, specific, precise and accurate for simultaneous estimation of these marker compounds in a combined dosage form. The HPLC method was appropriate and the two markers are well resolved, enabling efficient quantitative analysis of withaferin A and Z-guggulsterone. The method can be successively used for quantitative analysis of these two marker constituents in combination of marketed polyherbal formulation. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  12. Quantitative contrast-enhanced optical coherence tomography

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Winetraub, Yonatan; SoRelle, Elliott D.; Bio-X Program, Stanford University, 299 Campus Drive, Stanford, California 94305

    2016-01-11

    We have developed a model to accurately quantify the signals produced by exogenous scattering agents used for contrast-enhanced Optical Coherence Tomography (OCT). This model predicts distinct concentration-dependent signal trends that arise from the underlying physics of OCT detection. Accordingly, we show that real scattering particles can be described as simplified ideal scatterers with modified scattering intensity and concentration. The relation between OCT signal and particle concentration is approximately linear at concentrations lower than 0.8 particle per imaging voxel. However, at higher concentrations, interference effects cause signal to increase with a square root dependence on the number of particles within a voxel. Finally, high particle concentrations cause enough light attenuation to saturate the detected signal. Predictions were validated by comparison with measured OCT signals from gold nanorods (GNRs) prepared in water at concentrations ranging over five orders of magnitude (50 fM to 5 nM). In addition, we validated that our model accurately predicts the signal responses of GNRs in highly heterogeneous scattering environments including whole blood and living animals. By enabling particle quantification, this work provides a valuable tool for current and future contrast-enhanced in vivo OCT studies. More generally, the model described herein may inform the interpretation of detected signals in modalities that rely on coherence-based detection or are susceptible to interference effects.
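
    The concentration dependence described above (linear below roughly 0.8 particles per voxel, square-root growth at higher concentrations, and eventual saturation from attenuation) can be summarized as a simple piecewise relation. The sketch below is qualitative only; the parameter values are illustrative and not the authors' fitted ones.

```python
import numpy as np

def oct_signal(n_per_voxel, s1=1.0, n_cross=0.8, saturation=40.0):
    """Qualitative OCT signal versus scatterer concentration (particles per voxel).

    Below n_cross the signal grows linearly with concentration; above it,
    interference effects give square-root growth; attenuation finally caps the
    detected signal at `saturation`. All parameter values are illustrative.
    """
    n = np.asarray(n_per_voxel, dtype=float)
    linear = s1 * n
    sqrt_regime = s1 * n_cross * np.sqrt(n / n_cross)   # continuous at n = n_cross
    signal = np.where(n < n_cross, linear, sqrt_regime)
    return np.minimum(signal, saturation)

concentrations = np.logspace(-2, 4, 7)      # particles per voxel, spanning six decades
print(dict(zip(concentrations.round(3), oct_signal(concentrations).round(2))))
```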

  13. Accurate quantification of hydration number for polyethylene glycol molecules

    NASA Astrophysics Data System (ADS)

    Guo, Wei; Zhao, Lishan; Gao, Xin; Cao, Zexian; Wang, Qiang

    2018-05-01

    Abstract not available. Project supported by the National Natural Science Foundation of China (Grant Nos. 11474325 and 11290161) and the Knowledge Innovation Project of the Chinese Academy of Sciences on Water Science Research (Grant No. KJZD-EW-M03).

  14. Quantitative proteomics in cardiovascular research: global and targeted strategies

    PubMed Central

    Shen, Xiaomeng; Young, Rebeccah; Canty, John M.; Qu, Jun

    2014-01-01

    Extensive technical advances in the past decade have substantially expanded quantitative proteomics in cardiovascular research. This has great promise for elucidating the mechanisms of cardiovascular diseases (CVD) and the discovery of cardiac biomarkers used for diagnosis and treatment evaluation. Global and targeted proteomics are the two major avenues of quantitative proteomics. While global approaches enable unbiased discovery of altered proteins via relative quantification at the proteome level, targeted techniques provide higher sensitivity and accuracy, and are capable of multiplexed absolute quantification in numerous clinical/biological samples. While promising, technical challenges need to be overcome to enable full utilization of these techniques in cardiovascular medicine. Here we discuss recent advances in quantitative proteomics and summarize applications in cardiovascular research with an emphasis on biomarker discovery and elucidating molecular mechanisms of disease. We propose the integration of global and targeted strategies as a high-throughput pipeline for cardiovascular proteomics. Targeted approaches enable rapid, extensive validation of biomarker candidates discovered by global proteomics. These approaches provide a promising alternative to immunoassays and other low-throughput means currently used for limited validation. PMID:24920501

  15. Quantification of atherosclerotic plaque activity and vascular inflammation using [18-F] fluorodeoxyglucose positron emission tomography/computed tomography (FDG-PET/CT).

    PubMed

    Mehta, Nehal N; Torigian, Drew A; Gelfand, Joel M; Saboury, Babak; Alavi, Abass

    2012-05-02

    Conventional non-invasive imaging modalities of atherosclerosis such as coronary artery calcium (CAC) and carotid intimal medial thickness (C-IMT) provide information about the burden of disease. However, despite multiple validation studies of CAC and C-IMT, these modalities do not accurately assess plaque characteristics, and the composition and inflammatory state of the plaque determine its stability and, therefore, the risk of clinical events. [(18)F]-2-fluoro-2-deoxy-D-glucose (FDG) imaging using positron-emission tomography (PET)/computed tomography (CT) has been extensively studied in oncologic metabolism. Studies using animal models and immunohistochemistry in humans show that FDG-PET/CT is exquisitely sensitive for detecting macrophage activity, an important source of cellular inflammation in vessel walls. More recently, we and others have shown that FDG-PET/CT enables highly precise, novel measurements of the inflammatory activity of atherosclerotic plaques in large and medium-sized arteries. FDG-PET/CT studies have many advantages over other imaging modalities: 1) high contrast resolution; 2) quantification of plaque volume and metabolic activity allowing for multi-modal atherosclerotic plaque quantification; 3) dynamic, real-time, in vivo imaging; 4) minimal operator dependence. Finally, vascular inflammation detected by FDG-PET/CT has been shown to predict cardiovascular (CV) events independent of traditional risk factors and is also highly associated with overall burden of atherosclerosis. Plaque activity by FDG-PET/CT is modulated by known beneficial CV interventions such as short term (12 week) statin therapy as well as longer term therapeutic lifestyle changes (16 months). The current methodology for quantification of FDG uptake in atherosclerotic plaque involves measurement of the standardized uptake value (SUV) of an artery of interest and of the venous blood pool in order to calculate a target-to-background ratio (TBR), calculated by dividing the arterial SUV by the venous blood pool SUV. This method has been shown to represent a stable, reproducible phenotype over time, has a high sensitivity for detection of vascular inflammation, and also has high inter- and intra-reader reliability. Here we present our methodology for patient preparation, image acquisition, and quantification of atherosclerotic plaque activity and vascular inflammation using SUV, TBR, and a global parameter called the metabolic volumetric product (MVP). These approaches may be applied to assess vascular inflammation in various study samples of interest in a consistent fashion as we have shown in several prior publications.
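
    The SUV and TBR quantities described in the protocol are simple ratios, sketched below with invented example values (a tissue density of about 1 g/mL is assumed for the body-weight-normalized SUV).

```python
def suv(activity_conc_bq_per_ml, injected_dose_bq, body_weight_g):
    """Standardized uptake value: tissue activity concentration normalised to
    injected dose per gram of body weight (assuming ~1 g/mL tissue density)."""
    return activity_conc_bq_per_ml / (injected_dose_bq / body_weight_g)

def target_to_background_ratio(arterial_suv, venous_blood_pool_suv):
    """TBR as described in the protocol: arterial SUV divided by the SUV of
    the venous blood pool."""
    return arterial_suv / venous_blood_pool_suv

# Invented example values
dose, weight = 370e6, 80_000.0                # 370 MBq injected, 80 kg patient
aorta_suv = suv(9_500.0, dose, weight)        # ~2.05
blood_suv = suv(5_600.0, dose, weight)        # ~1.21
print(round(target_to_background_ratio(aorta_suv, blood_suv), 2))   # ~1.70
```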

  16. Self-optimized construction of transition rate matrices from accelerated atomistic simulations with Bayesian uncertainty quantification

    NASA Astrophysics Data System (ADS)

    Swinburne, Thomas D.; Perez, Danny

    2018-05-01

    A massively parallel method to build large transition rate matrices from temperature-accelerated molecular dynamics trajectories is presented. Bayesian Markov model analysis is used to estimate the expected residence time in the known state space, providing crucial uncertainty quantification for higher-scale simulation schemes such as kinetic Monte Carlo or cluster dynamics. The estimators are additionally used to optimize where exploration is performed and the degree of temperature acceleration on the fly, giving an autonomous, optimal procedure to explore the state space of complex systems. The method is tested against exactly solvable models and used to explore the dynamics of C15 interstitial defects in iron. Our uncertainty quantification scheme allows for accurate modeling of the evolution of these defects over timescales of several seconds.

  17. Targeted Quantification of Isoforms of a Thylakoid-Bound Protein: MRM Method Development.

    PubMed

    Bru-Martínez, Roque; Martínez-Márquez, Ascensión; Morante-Carriel, Jaime; Sellés-Marchart, Susana; Martínez-Esteso, María José; Pineda-Lucas, José Luis; Luque, Ignacio

    2018-01-01

    Targeted mass spectrometric methods such as selected/multiple reaction monitoring (SRM/MRM) have found intense application in protein detection and quantification which competes with classical immunoaffinity techniques. It provides a universal procedure to develop a fast, highly specific, sensitive, accurate, and cheap methodology for targeted detection and quantification of proteins based on the direct analysis of their surrogate peptides typically generated by tryptic digestion. This methodology can be advantageously applied in the field of plant proteomics and particularly for non-model species since immunoreagents are scarcely available. Here, we describe the issues to take into consideration in order to develop a MRM method to detect and quantify isoforms of the thylakoid-bound protein polyphenol oxidase from the non-model and database underrepresented species Eriobotrya japonica Lindl.

  18. The Synergy Between Total Scattering and Advanced Simulation Techniques: Quantifying Geopolymer Gel Evolution

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    White, Claire; Bloomer, Breaunnah E.; Provis, John L.

    2012-05-16

    With the ever increasing demands for technologically advanced structural materials, together with emerging environmental consciousness due to climate change, geopolymer cement is fast becoming a viable alternative to traditional cements due to proven mechanical engineering characteristics and the reduction in CO2 emitted (approximately 80% less CO2 emitted compared to ordinary Portland cement). Nevertheless, much remains unknown regarding the kinetics of the molecular changes responsible for nanostructural evolution during the geopolymerization process. Here, in-situ total scattering measurements in the form of X-ray pair distribution function (PDF) analysis are used to quantify the extent of reaction of metakaolin/slag alkali-activated geopolymer binders, including the effects of various activators (alkali hydroxide/silicate) on the kinetics of the geopolymerization reaction. Restricting quantification of the kinetics to the initial ten hours of reaction does not enable elucidation of the true extent of the reaction, but using X-ray PDF data obtained after 128 days of reaction enables more accurate determination of the initial extent of reaction. The synergies between the in-situ X-ray PDF data and simulations conducted by multiscale density functional theory-based coarse-grained Monte Carlo analysis are outlined, particularly with regard to the potential for the X-ray data to provide a time scale for kinetic analysis of the extent of reaction obtained from the multiscale simulation methodology.

  19. A model based bayesian solution for characterization of complex damage scenarios in aerospace composite structures.

    PubMed

    Reed, H; Leckey, Cara A C; Dick, A; Harvey, G; Dobson, J

    2018-01-01

    Ultrasonic damage detection and characterization is commonly used in nondestructive evaluation (NDE) of aerospace composite components. In recent years there has been an increased development of guided wave based methods. In real materials and structures, these dispersive waves result in complicated behavior in the presence of complex damage scenarios. Model-based characterization methods utilize accurate three dimensional finite element models (FEMs) of guided wave interaction with realistic damage scenarios to aid in defect identification and classification. This work describes an inverse solution for realistic composite damage characterization by comparing the wavenumber-frequency spectra of experimental and simulated ultrasonic inspections. The composite laminate material properties are first verified through a Bayesian solution (Markov chain Monte Carlo), enabling uncertainty quantification surrounding the characterization. A study is undertaken to assess the efficacy of the proposed damage model and comparative metrics between the experimental and simulated output. The FEM is then parameterized with a damage model capable of describing the typical complex damage created by impact events in composites. The damage is characterized through a transdimensional Markov chain Monte Carlo solution, enabling a flexible damage model capable of adapting to the complex damage geometry investigated here. The posterior probability distributions of the individual delamination petals as well as the overall envelope of the damage site are determined. Copyright © 2017 Elsevier B.V. All rights reserved.

  20. Predicting the disinfection efficiency range in chlorine contact tanks through a CFD-based approach.

    PubMed

    Angeloudis, Athanasios; Stoesser, Thorsten; Falconer, Roger A

    2014-09-01

    In this study three-dimensional computational fluid dynamics (CFD) models, incorporating appropriately selected kinetic models, were developed to simulate the processes of chlorine decay, pathogen inactivation and the formation of potentially carcinogenic by-products in disinfection contact tanks (CTs). Currently, the performance of CT facilities largely relies on Hydraulic Efficiency Indicators (HEIs), extracted from experimentally derived Residence Time Distribution (RTD) curves. This approach has more recently been aided with the application of CFD models, which can be calibrated to accurately predict RTDs, enabling the assessment of disinfection facilities prior to their construction. However, as long as it depends on HEIs, the CT design process does not directly take into consideration the disinfection biochemistry which needs to be optimized. The main objective of this study is to address this issue by refining the modelling practices to simulate some reactive processes of interest, while acknowledging the uneven contact time stemming from the RTD curves. Initially, the hydraulic performances of seven CT design variations were reviewed through available experimental and computational data. In turn, the same design configurations were tested using numerical modelling techniques, featuring kinetic models that enable the quantification of disinfection operational parameters. Results highlight that the optimization of the hydrodynamic conditions facilitates a more uniform disinfectant contact time, which corresponds to greater levels of pathogen inactivation and a more controlled by-product accumulation. Copyright © 2014 Elsevier Ltd. All rights reserved.
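
    As background, hydraulic efficiency indicators are typically read off the cumulative residence time distribution; one commonly used indicator is t10 (the time for the first 10% of tracer to leave the tank) relative to the theoretical residence time V/Q. The sketch below assumes a uniformly sampled exit-age distribution and uses a synthetic RTD; it is not necessarily the specific set of HEIs used in the study.

```python
import numpy as np

def t10_baffling_factor(times_s, exit_age_e, volume_m3, flow_m3_s):
    """t10 / T: time for the first 10 % of tracer to leave, over the theoretical
    residence time T = V/Q. Assumes a uniformly sampled exit-age distribution E(t)."""
    t = np.asarray(times_s, float)
    e = np.asarray(exit_age_e, float)
    dt = t[1] - t[0]
    cdf = np.cumsum(e) * dt                        # cumulative tracer fraction F(t)
    t10 = float(np.interp(0.10, cdf, t))
    return t10 / (volume_m3 / flow_m3_s)

# Synthetic RTD: a broad pulse centred near the theoretical residence time of 500 s
t = np.linspace(0.0, 2000.0, 401)
e = np.exp(-0.5 * ((t - 500.0) / 150.0) ** 2)
e /= e.sum() * (t[1] - t[0])                       # normalise so the RTD integrates to 1
print(round(t10_baffling_factor(t, e, volume_m3=50.0, flow_m3_s=0.1), 2))   # ~0.62
```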

  1. Configuration of a high-content imaging platform for hit identification and pharmacological assessment of JMJD3 demethylase enzyme inhibitors.

    PubMed

    Mulji, Alpa; Haslam, Carl; Brown, Fiona; Randle, Rebecca; Karamshi, Bhumika; Smith, Julia; Eagle, Robert; Munoz-Muriedas, Jordi; Taylor, Joanna; Sheikh, Arshad; Bridges, Angela; Gill, Kirsty; Jepras, Rob; Smee, Penny; Barker, Mike; Woodrow, Mike; Liddle, John; Thomas, Pamela; Jones, Emma; Gordon, Laurie; Tanner, Rob; Leveridge, Melanie; Hutchinson, Sue; Martin, Margaret; Brown, Murray; Kruidenier, Laurens; Katso, Roy

    2012-01-01

    The biological complexity associated with the regulation of histone demethylases makes it desirable to configure a cellular mechanistic assay format that simultaneously encompasses as many of the relevant cellular processes as possible. In this report, the authors describe the configuration of a JMJD3 high-content cellular mechanistic imaging assay that uses single-cell multiparameter measurements to accurately assess cellular viability and the enzyme-dependent demethylation of the H3K27(Me)3 mark by exogenously expressed JMJD3. This approach couples robust statistical analyses with the spatial resolving power of cellular imaging. This enables segregation of expressing and nonexpressing cells into discrete subpopulations and consequently pharmacological quantification of compounds of interest in the expressing population at varying JMJD3 expression levels. Moreover, the authors demonstrate the utility of this hit identification strategy through the successful prosecution of a medium-throughput focused campaign of an 87 500-compound file, which has enabled the identification of JMJD3 cellular-active chemotypes. This study represents the first report of a demethylase high-content imaging assay with the ability to capture a repertoire of pharmacological tools, which are likely both to inform our mechanistic understanding of how JMJD3 is modulated and, more important, to contribute to the identification of novel therapeutic modalities for this demethylase enzyme.

  2. GMO quantification: valuable experience and insights for the future.

    PubMed

    Milavec, Mojca; Dobnik, David; Yang, Litao; Zhang, Dabing; Gruden, Kristina; Zel, Jana

    2014-10-01

    Cultivation and marketing of genetically modified organisms (GMOs) have been unevenly adopted worldwide. To facilitate international trade and to provide information to consumers, labelling requirements have been set up in many countries. Quantitative real-time polymerase chain reaction (qPCR) is currently the method of choice for detection, identification and quantification of GMOs. This has been critically assessed and the requirements for the method performance have been set. Nevertheless, there are challenges that should still be highlighted, such as measuring the quantity and quality of DNA, and determining the qPCR efficiency, possible sequence mismatches, characteristics of taxon-specific genes and appropriate units of measurement, as these remain potential sources of measurement uncertainty. To overcome these problems and to cope with the continuous increase in the number and variety of GMOs, new approaches are needed. Statistical strategies of quantification have already been proposed and expanded with the development of digital PCR. The first attempts have been made to use new generation sequencing also for quantitative purposes, although accurate quantification of the contents of GMOs using this technology is still a challenge for the future, and especially for mixed samples. New approaches are needed also for the quantification of stacks, and for potential quantification of organisms produced by new plant breeding techniques.

  3. MEERCAT: Multiplexed Efficient Cell Free Expression of Recombinant QconCATs For Large Scale Absolute Proteome Quantification*

    PubMed Central

    Takemori, Nobuaki; Takemori, Ayako; Tanaka, Yuki; Endo, Yaeta; Hurst, Jane L.; Gómez-Baena, Guadalupe; Harman, Victoria M.; Beynon, Robert J.

    2017-01-01

    A major challenge in proteomics is the absolute accurate quantification of large numbers of proteins. QconCATs, artificial proteins that are concatenations of multiple standard peptides, are well established as an efficient means to generate standards for proteome quantification. Previously, QconCATs have been expressed in bacteria, but we now describe QconCAT expression in a robust, cell-free system. The new expression approach rescues QconCATs that previously were unable to be expressed in bacteria and can reduce the incidence of proteolytic damage to QconCATs. Moreover, it is possible to cosynthesize QconCATs in a highly-multiplexed translation reaction, coexpressing tens or hundreds of QconCATs simultaneously. By obviating bacterial culture and through the gain of high level multiplexing, it is now possible to generate tens of thousands of standard peptides in a matter of weeks, rendering absolute quantification of a complex proteome highly achievable in a reproducible, broadly deployable system. PMID:29055021

  4. A deterministic model of electron transport for electron probe microanalysis

    NASA Astrophysics Data System (ADS)

    Bünger, J.; Richter, S.; Torrilhon, M.

    2018-01-01

    Within the last decades significant improvements in the spatial resolution of electron probe microanalysis (EPMA) were obtained by instrumental enhancements. In contrast, the quantification procedures essentially remained unchanged. As the classical procedures assume either homogeneity or a multi-layered structure of the material, they limit the spatial resolution of EPMA. The possibilities of improving the spatial resolution through more sophisticated quantification procedures are therefore almost untouched. We investigate a new analytical model (M1-model) for the quantification procedure based on fast and accurate modelling of electron-X-ray-matter interactions in complex materials using a deterministic approach to solve the electron transport equations. We outline the derivation of the model from the Boltzmann equation for electron transport using the method of moments with a minimum entropy closure and present first numerical results for three different test cases (homogeneous, thin film and interface). Taking Monte Carlo as a reference, the results for the three test cases show that the M1-model is able to reproduce the electron dynamics in EPMA applications very well. Compared to classical analytical models like XPP and PAP, the M1-model is more accurate and far more flexible, which indicates the potential of deterministic models of electron transport to further increase the spatial resolution of EPMA.

  5. Evaluation of the impact of matrix effect on quantification of pesticides in foods by gas chromatography-mass spectrometry using isotope-labeled internal standards.

    PubMed

    Yarita, Takashi; Aoyagi, Yoshie; Otake, Takamitsu

    2015-05-29

    The impact of the matrix effect in GC-MS quantification of pesticides in food using the corresponding isotope-labeled internal standards was evaluated. A spike-and-recovery study of nine target pesticides was first conducted using paste samples of corn, green soybean, carrot, and pumpkin. The observed analytical values using isotope-labeled internal standards were more accurate for most target pesticides than that obtained using the external calibration method, but were still biased from the spiked concentrations when a matrix-free calibration solution was used for calibration. The respective calibration curves for each target pesticide were also prepared using matrix-free calibration solutions and matrix-matched calibration solutions with blank soybean extract. The intensity ratio of the peaks of most target pesticides to that of the corresponding isotope-labeled internal standards was influenced by the presence of the matrix in the calibration solution; therefore, the observed slope varied. The ratio was also influenced by the type of injection method (splitless or on-column). These results indicated that matrix-matching of the calibration solution is required for very accurate quantification, even if isotope-labeled internal standards were used for calibration. Copyright © 2015 Elsevier B.V. All rights reserved.

  6. Accuracy of iodine quantification using dual energy CT in latest generation dual source and dual layer CT.

    PubMed

    Pelgrim, Gert Jan; van Hamersvelt, Robbert W; Willemink, Martin J; Schmidt, Bernhard T; Flohr, Thomas; Schilham, Arnold; Milles, Julien; Oudkerk, Matthijs; Leiner, Tim; Vliegenthart, Rozemarijn

    2017-09-01

    To determine the accuracy of iodine quantification with dual energy computed tomography (DECT) in two high-end CT systems with different spectral imaging techniques. Five tubes with different iodine concentrations (0, 5, 10, 15, 20 mg/ml) were analysed in an anthropomorphic thoracic phantom. Adding two phantom rings simulated increased patient size. For third-generation dual source CT (DSCT), tube voltage combinations of 150Sn and 70, 80, 90, 100 kVp were analysed. For dual layer CT (DLCT), 120 and 140 kVp were used. Scans were repeated three times. Median normalized values and interquartile ranges (IQRs) were calculated for all kVp settings and phantom sizes. Correlation between measured and known iodine concentrations was excellent for both systems (R = 0.999-1.000, p < 0.0001). For DSCT, median measurement errors ranged from -0.5% (IQR -2.0, 2.0%) at 150Sn/70 kVp and -2.3% (IQR -4.0, -0.1%) at 150Sn/80 kVp to -4.0% (IQR -6.0, -2.8%) at 150Sn/90 kVp. For DLCT, median measurement errors ranged from -3.3% (IQR -4.9, -1.5%) at 140 kVp to -4.6% (IQR -6.0, -3.6%) at 120 kVp. Larger phantom sizes increased variability of iodine measurements (p < 0.05). Iodine concentration can be accurately quantified with state-of-the-art DECT systems from two vendors. The lowest absolute errors were found for DSCT using the 150Sn/70 kVp or 150Sn/80 kVp combinations, which was slightly more accurate than 140 kVp in DLCT. • High-end CT scanners allow accurate iodine quantification using different DECT techniques. • Lowest measurement error was found in scans with largest photon energy separation. • Dual-source CT quantified iodine slightly more accurately than dual layer CT.
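
    The accuracy figures above are medians and interquartile ranges of normalized measurement errors pooled over repeats and concentrations (the 0 mg/ml tube is excluded, since a relative error is undefined there). A minimal sketch with invented measurements:

```python
import numpy as np

def median_error_and_iqr(measured_mg_ml, true_mg_ml):
    """Median normalised measurement error (%) and its interquartile range, as used
    to summarise iodine quantification accuracy across repeats and concentrations."""
    measured = np.asarray(measured_mg_ml, float)
    true = np.asarray(true_mg_ml, float)
    err_pct = 100.0 * (measured - true) / true
    q1, med, q3 = np.percentile(err_pct, [25, 50, 75])
    return med, (q1, q3)

# Invented repeated measurements of the 5/10/15/20 mg/mL tubes
true = [5, 5, 5, 10, 10, 10, 15, 15, 15, 20, 20, 20]
meas = [4.9, 5.1, 4.8, 9.7, 9.9, 10.2, 14.6, 14.9, 14.4, 19.5, 19.2, 19.9]
print(median_error_and_iqr(meas, true))
```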

  7. Improved Quantification of the Beta Cell Mass after Pancreas Visualization with 99mTc-demobesin-4 and Beta Cell Imaging with 111In-exendin-3 in Rodents.

    PubMed

    van der Kroon, Inge; Joosten, Lieke; Nock, Berthold A; Maina, Theodosia; Boerman, Otto C; Brom, Maarten; Gotthardt, Martin

    2016-10-03

    Accurate assessment of the 111In-exendin-3 uptake within the pancreas requires exact delineation of the pancreas, which is highly challenging by MRI and CT in rodents. In this study, the pancreatic tracer 99mTc-demobesin-4 was evaluated for accurate delineation of the pancreas to be able to accurately quantify 111In-exendin-3 uptake within the pancreas. Healthy and alloxan-induced diabetic Brown Norway rats were injected with the pancreatic tracer 99mTc-demobesin-4 ([99mTc-N4-Pro1,Tyr4,Nle14]bombesin) and the beta cell tracer 111In-exendin-3 ([111In-DTPA-Lys40]exendin-3). After dual isotope acquisition of SPECT images, 99mTc-demobesin-4 was used to define a volume of interest for the pancreas in the SPECT images; subsequently, the 111In-exendin-3 uptake within this region was quantified. Furthermore, biodistribution and autoradiography were performed in order to gain insight into the distribution of both tracers in the animals. 99mTc-demobesin-4 showed high accumulation in the pancreas. The uptake was highly homogeneous throughout the pancreas, independent of diabetic status, as demonstrated by autoradiography, whereas 111In-exendin-3 only accumulates in the islets of Langerhans. Quantification of both ex vivo and in vivo SPECT images resulted in an excellent linear correlation between the pancreatic uptake determined by ex vivo counting and the 111In-exendin-3 uptake determined from the quantitative analysis of the SPECT images (Pearson r = 0.97 and r = 0.92). 99mTc-demobesin-4 shows high accumulation in the pancreas of rats. It is a suitable tracer for accurate delineation of the pancreas and can be conveniently used for simultaneous acquisition with 111In-labeled exendin-3. This method provides a straightforward, reliable, and objective method for preclinical beta cell mass (BCM) quantification with 111In-exendin-3.

  8. Simple and accurate quantification of BTEX in ambient air by SPME and GC-MS.

    PubMed

    Baimatova, Nassiba; Kenessov, Bulat; Koziel, Jacek A; Carlsen, Lars; Bektassov, Marat; Demyanenko, Olga P

    2016-07-01

    Benzene, toluene, ethylbenzene and xylenes (BTEX) comprise one of the most ubiquitous and hazardous groups of ambient air pollutants of concern. Application of standard analytical methods for quantification of BTEX is limited by the complexity of sampling and sample preparation equipment, and budget requirements. Methods based on SPME represent a simpler alternative, but still require complex calibration procedures. The objective of this research was to develop a simpler, low-budget, and accurate method for quantification of BTEX in ambient air based on SPME and GC-MS. Standard 20-mL headspace vials were used for field air sampling and calibration. To avoid challenges with obtaining and working with 'zero' air, slope factors of external standard calibration were determined using standard addition and inherently polluted lab air. For polydimethylsiloxane (PDMS) fiber, differences between the slope factors of calibration plots obtained using lab and outdoor air were below 14%. PDMS fiber provided higher precision during calibration, while the use of Carboxen/PDMS fiber resulted in lower detection limits for benzene and toluene. To provide sufficient accuracy, the use of 20-mL vials requires triplicate sampling and analysis. The method was successfully applied for analysis of 108 ambient air samples from Almaty, Kazakhstan. Average concentrations of benzene, toluene, ethylbenzene and o-xylene were 53, 57, 11 and 14 µg m⁻³, respectively. The developed method can be modified for further quantification of a wider range of volatile organic compounds in air. In addition, the new method is amenable to automation. Copyright © 2016 Elsevier B.V. All rights reserved.
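
    The calibration trick described above (determining slope factors by standard addition in inherently polluted lab air) can be sketched in a few lines; the added concentrations, peak areas, and the analyte are assumptions for illustration only.

```python
# Minimal sketch of slope-factor determination by standard addition in
# air that already contains the analyte. Values are illustrative.
import numpy as np

added = np.array([0.0, 20.0, 40.0, 80.0])            # µg/m3 of benzene added to the vial
response = np.array([1.5e5, 3.4e5, 5.6e5, 9.9e5])    # corresponding GC-MS peak areas

slope, intercept = np.polyfit(added, response, 1)    # slope factor of the calibration plot
background = intercept / slope                       # estimate of the lab-air level

def concentration(peak_area):
    """External-standard quantification of a field sample using the slope factor."""
    return peak_area / slope

print(f"slope factor {slope:.3g} area units per µg/m3, lab-air background ~{background:.0f} µg/m3")
print(f"field sample with area 4.2e5 -> {concentration(4.2e5):.0f} µg/m3")
```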

  9. Multiplex Real-Time qPCR Assay for Simultaneous and Sensitive Detection of Phytoplasmas in Sesame Plants and Insect Vectors

    PubMed Central

    Ikten, Cengiz; Ustun, Rustem; Catal, Mursel; Yol, Engin; Uzun, Bulent

    2016-01-01

    Phyllody, a destructive and economically important disease worldwide caused by phytoplasma infections, is characterized by the abnormal development of floral structures into stunted leafy parts and contributes to serious losses in crop plants, including sesame (Sesamum indicum L.). Accurate identification, differentiation, and quantification of phyllody-causing phytoplasmas are essential for effective management of this plant disease and for selection of resistant sesame varieties. In this study, a diagnostic multiplex qPCR assay was developed using TaqMan® chemistry based on detection of the 16S ribosomal RNA gene of phytoplasmas and the 18S ribosomal gene of sesame. Phytoplasma and sesame specific primers and probes labeled with different fluorescent dyes were used for simultaneous amplification of 16SrII and 16SrIX phytoplasmas in a single tube. The multiplex real-time qPCR assay allowed accurate detection, differentiation, and quantification of 16SrII and 16SrIX groups in 109 sesame plant and 92 insect vector samples tested. The assay was found to have a detection sensitivity of 1.8 × 10² and 1.6 × 10² DNA copies for absolute quantification of 16SrII and 16SrIX group phytoplasmas, respectively. Relative quantification was effective and reliable for determination of phyllody phytoplasma DNA amounts normalized to sesame DNA in infected plant tissues. The development of this qPCR assay provides a method for the rapid measurement of infection loads to identify resistance levels of sesame genotypes against phyllody phytoplasma disease. PMID:27195795
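
    For readers unfamiliar with how absolute copy numbers are obtained from a qPCR standard curve, a compact sketch is given below; the dilution series and Cq values are invented for illustration and do not come from the assay described above.

```python
# Minimal sketch of standard-curve based absolute quantification in qPCR:
# fit Cq versus log10(copies), then invert the fit for unknown samples.
import numpy as np

log_copies = np.array([6.0, 5.0, 4.0, 3.0, 2.0])   # 10-fold plasmid dilution series
cq = np.array([16.1, 19.5, 22.9, 26.3, 29.8])      # measured Cq values (illustrative)

slope, intercept = np.polyfit(log_copies, cq, 1)
efficiency = 10 ** (-1.0 / slope) - 1.0            # amplification efficiency

def copies_from_cq(sample_cq):
    return 10 ** ((sample_cq - intercept) / slope)

print(f"PCR efficiency = {efficiency * 100:.0f}%")
print(f"sample Cq 24.5 -> {copies_from_cq(24.5):.0f} copies per reaction")
```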

  10. Estimating phosphorus loss in runoff from manure and fertilizer for a phosphorus loss quantification tool.

    PubMed

    Vadas, P A; Good, L W; Moore, P A; Widman, N

    2009-01-01

    Nonpoint-source pollution of fresh waters by P is a concern because it contributes to accelerated eutrophication. Given the state of the science concerning agricultural P transport, a simple tool to quantify annual, field-scale P loss is a realistic goal. We developed new methods to predict annual dissolved P loss in runoff from surface-applied manures and fertilizers and validated the methods with data from 21 published field studies. We incorporated these manure and fertilizer P runoff loss methods into an annual, field-scale P loss quantification tool that estimates dissolved and particulate P loss in runoff from soil, manure, fertilizer, and eroded sediment. We validated the P loss tool using independent data from 28 studies that monitored P loss in runoff from a variety of agricultural land uses for at least 1 yr. Results demonstrated (i) that our new methods to estimate P loss from surface manure and fertilizer are an improvement over methods used in existing Indexes, and (ii) that it was possible to reliably quantify annual dissolved, sediment, and total P loss in runoff using relatively simple methods and readily available inputs. Thus, a P loss quantification tool that does not require greater degrees of complexity or input data than existing P Indexes could accurately predict P loss across a variety of management and fertilization practices, soil types, climates, and geographic locations. However, estimates of runoff and erosion are still needed that are accurate to a level appropriate for the intended use of the quantification tool.

  11. Implement Method for Automated Testing of Markov Chain Convergence into INVERSE for ORNL12-RS-108J: Advanced Multi-Dimensional Forward and Inverse Modeling

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bledsoe, Keith C.

    2015-04-01

    The DiffeRential Evolution Adaptive Metropolis (DREAM) method is a powerful optimization/uncertainty quantification tool used to solve inverse transport problems in Los Alamos National Laboratory’s INVERSE code system. The DREAM method has been shown to be adept at accurate uncertainty quantification, but it can be very computationally demanding. Previously, the DREAM method in INVERSE performed a user-defined number of particle transport calculations. This placed a burden on the user to guess the number of calculations that would be required to accurately solve any given problem. This report discusses a new approach that has been implemented into INVERSE, the Gelman-Rubin convergence metric. This metric automatically detects when an appropriate number of transport calculations have been completed and the uncertainty in the inverse problem has been accurately calculated. In a test problem with a spherical geometry, this method was found to decrease the number of transport calculations (and thus time required) to solve a problem by an average of over 90%. In a cylindrical test geometry, a 75% decrease was obtained.
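
    The Gelman-Rubin statistic mentioned in the report is a standard convergence diagnostic for multiple Markov chains; a generic sketch of how it is computed is shown below (this is the textbook formula applied to synthetic chains, not code from INVERSE).

```python
# Minimal sketch of the Gelman-Rubin potential scale reduction factor (R-hat)
# for m parallel chains of length n; values near 1 indicate convergence.
import numpy as np

def gelman_rubin(chains):
    chains = np.asarray(chains, dtype=float)       # shape (m, n)
    m, n = chains.shape
    chain_means = chains.mean(axis=1)
    B = n * chain_means.var(ddof=1)                # between-chain variance
    W = chains.var(axis=1, ddof=1).mean()          # mean within-chain variance
    var_hat = (n - 1) / n * W + B / n              # pooled variance estimate
    return np.sqrt(var_hat / W)

rng = np.random.default_rng(0)
chains = rng.normal(size=(4, 500))                 # four well-mixed synthetic chains
print(f"R-hat = {gelman_rubin(chains):.3f}")       # ~1.0 signals convergence
```

    In the setting described above, sampling would stop once R-hat for the parameters of interest drops below a chosen threshold (commonly around 1.1), rather than after a user-guessed number of calculations.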

  12. High-performance Thin-layer Chromatographic-densitometric Quantification and Recovery of Bioactive Compounds for Identification of Elite Chemotypes of Gloriosa superba L. Collected from Sikkim Himalayas (India)

    PubMed Central

    Misra, Ankita; Shukla, Pushpendra Kumar; Kumar, Bhanu; Chand, Jai; Kushwaha, Poonam; Khalid, Md.; Singh Rawat, Ajay Kumar; Srivastava, Sharad

    2017-01-01

    Background: Gloriosa superba L. (Colchicaceae) is used as adjuvant therapy in gout for its potential antimitotic activity due to high colchicine(s) alkaloids. Objective: This study aimed to develop an easy, cheap, precise, and accurate high-performance thin-layer chromatographic (HPTLC) validated method for simultaneous quantification of bioactive alkaloids (colchicine and gloriosine) in G. superba L. and to identify its elite chemotype(s) from Sikkim Himalayas (India). Methods: The HPTLC chromatographic method was developed using mobile phase of chloroform: acetone: diethyl amine (5:4:1) at λmax of 350 nm. Results: Five germplasms were collected from the targeted region, and on morpho-anatomical inspection, no significant variation was observed among them. Quantification data reveal that the content of colchicine (Rf: 0.72) varies from 0.035% to 0.150% and that of gloriosine (Rf: 0.61) from 0.006% to 0.032% (dry wt. basis). Linearity of the method was obtained in the concentration range of 100–400 ng/spot of marker(s), exhibiting regression coefficients of 0.9987 (colchicine) and 0.9983 (gloriosine) with optimum recovery of 97.79 ± 3.86 and 100.023% ± 0.01%, respectively. Limits of detection and quantification were 6.245 and 18.926 ng for colchicine and 8.024 and 24.316 ng for gloriosine, respectively. Two germplasms, namely NBG-27 and NBG-26, were found to be elite chemotypes for both markers. Conclusion: The developed method is validated in terms of accuracy, recovery, and precision studies as per the ICH guidelines (2005) and can be adopted for the simultaneous quantification of colchicine and gloriosine in phytopharmaceuticals. In addition, this study is relevant to explore the chemotypic variability in metabolite content for commercial and medicinal purposes. SUMMARY: An easy, cheap, precise, and accurate high-performance thin-layer chromatographic (HPTLC) validated method was developed for simultaneous quantification of bioactive alkaloids (colchicine and gloriosine) in G. superba L. Five germplasms were collected from the targeted region, and on morpho-anatomical inspection, no significant variation was observed among them. Quantification data reveal that the content of colchicine (Rf: 0.72) varies from 0.035% to 0.150% and that of gloriosine (Rf: 0.61) from 0.006% to 0.032% (dry wt. basis). Two germplasms, namely NBG-27 and NBG-26, were found to be elite chemotypes for both markers. PMID:29142436

  13. High-performance Thin-layer Chromatographic-densitometric Quantification and Recovery of Bioactive Compounds for Identification of Elite Chemotypes of Gloriosa superba L. Collected from Sikkim Himalayas (India).

    PubMed

    Misra, Ankita; Shukla, Pushpendra Kumar; Kumar, Bhanu; Chand, Jai; Kushwaha, Poonam; Khalid, Md; Singh Rawat, Ajay Kumar; Srivastava, Sharad

    2017-10-01

    Gloriosa superba L. (Colchicaceae) is used as adjuvant therapy in gout for its potential antimitotic activity due to high colchicine(s) alkaloids. This study aimed to develop an easy, cheap, precise, and accurate high-performance thin-layer chromatographic (HPTLC) validated method for simultaneous quantification of bioactive alkaloids (colchicine and gloriosine) in G. superba L. and to identify its elite chemotype(s) from Sikkim Himalayas (India). The HPTLC chromatographic method was developed using mobile phase of chloroform: acetone: diethyl amine (5:4:1) at λmax of 350 nm. Five germplasms were collected from the targeted region, and on morpho-anatomical inspection, no significant variation was observed among them. Quantification data reveal that the content of colchicine (Rf: 0.72) varies from 0.035% to 0.150% and that of gloriosine (Rf: 0.61) from 0.006% to 0.032% (dry wt. basis). Linearity of the method was obtained in the concentration range of 100-400 ng/spot of marker(s), exhibiting regression coefficients of 0.9987 (colchicine) and 0.9983 (gloriosine) with optimum recovery of 97.79 ± 3.86 and 100.023% ± 0.01%, respectively. Limits of detection and quantification were 6.245 and 18.926 ng for colchicine and 8.024 and 24.316 ng for gloriosine, respectively. Two germplasms, namely NBG-27 and NBG-26, were found to be elite chemotypes for both markers. The developed method is validated in terms of accuracy, recovery, and precision studies as per the ICH guidelines (2005) and can be adopted for the simultaneous quantification of colchicine and gloriosine in phytopharmaceuticals. In addition, this study is relevant to explore the chemotypic variability in metabolite content for commercial and medicinal purposes.
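
    The limits of detection and quantification reported in both versions of this record follow the usual ICH definitions (3.3 and 10 times the residual standard deviation of the calibration line divided by its slope); the sketch below applies those formulas to an invented calibration data set.

```python
# Minimal sketch of ICH-style LOD/LOQ from a linear calibration.
# Amounts and peak areas are illustrative, not the published data.
import numpy as np

amount = np.array([100.0, 150.0, 200.0, 250.0, 300.0, 400.0])   # ng per spot
area = np.array([1020.0, 1540.0, 2010.0, 2530.0, 3050.0, 4040.0])

slope, intercept = np.polyfit(amount, area, 1)
residuals = area - (slope * amount + intercept)
sigma = residuals.std(ddof=2)          # residual SD of the regression (n - 2 dof)

lod = 3.3 * sigma / slope              # limit of detection
loq = 10.0 * sigma / slope             # limit of quantification
print(f"LOD = {lod:.2f} ng, LOQ = {loq:.2f} ng")
```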

  14. A comparison of two colorimetric assays, based upon Lowry and Bradford techniques, to estimate total protein in soil extracts.

    PubMed

    Redmile-Gordon, M A; Armenise, E; White, R P; Hirsch, P R; Goulding, K W T

    2013-12-01

    Soil extracts usually contain large quantities of dissolved humified organic material, typically reflected by high polyphenolic content. Since polyphenols seriously confound quantification of extracted protein, minimising this interference is important to ensure measurements are representative. Although the Bradford colorimetric assay is used routinely in soil science for rapid quantification of protein in soil extracts, it has several limitations. We therefore investigated an alternative colorimetric technique based on the Lowry assay (frequently used to measure protein and humic substances as distinct pools in microbial biofilms). The accuracies of both the Bradford assay and a modified Lowry microplate method were compared in factorial combination. Protein was quantified in soil extracts (extracted with citrate), including standard additions of model protein (BSA) and polyphenol (Sigma H1675-2). Using the Lowry microplate assay described, no interfering effects of citrate were detected even with concentrations up to 5 times greater than are typically used to extract soil protein. Moreover, the Bradford assay was found to be highly susceptible to two simultaneous and confounding artefacts: 1) the colour development due to added protein was greatly inhibited by polyphenol concentration, and 2) substantial colour development was caused directly by the polyphenol addition. In contrast, the Lowry method enabled distinction between colour development from protein and non-protein origin, providing a more accurate quantitative analysis. These results suggest that the modified-Lowry method is a more suitable measure of extract protein (defined by standard equivalents) because it is less confounded by the high polyphenolic content which is so typical of soil extracts.

  15. Development and validation of a highly sensitive liquid chromatography/mass spectrometry method for simultaneous quantification of lenalidomide and flavopiridol in human plasma.

    PubMed

    Liu, Qing; Farley, Katherine L; Johnson, Amy J; Muthusamy, Natarajan; Hofmeister, Craig C; Blum, Kristie A; Schaaf, Larry J; Grever, Michael R; Byrd, John C; Dalton, James T; Phelps, Mitch A

    2008-10-01

    Lenalidomide, an immunomodulatory agent, and flavopiridol, a broad cyclin-dependent kinase inhibitor, are active therapies for clinical use in genomic high-risk chronic lymphocytic leukemia. A high-performance liquid chromatographic assay with tandem mass spectrometric detection has been developed to simultaneously quantify lenalidomide and flavopiridol in human and mouse plasma to facilitate their combined clinical development. Samples were prepared by liquid-liquid extraction with acetonitrile (ACN)-containing internal standard, genistein, followed by evaporation of solvent and reconstitution in 95/5 H2O/ACN. Lenalidomide and internal standard were separated by reversed-phase liquid chromatography on a C-18 column using a gradient of H2O and ACN, each with 0.1% formic acid. Atmospheric pressure chemical ionization in positive ion mode with single reaction monitoring on a triple quadrupole mass spectrometer was applied to detect transitions of lenalidomide (260.06 > 149.10) and flavopiridol (402.09 > 341.02). Lower limits of quantification of lenalidomide and flavopiridol were 1 and 0.3 nM, respectively. Recoveries of lenalidomide and flavopiridol from human plasma ranged from 99% to 116% throughout their linear ranges. Within- and between-run precision and accuracy of replicate samples were all less than 15%. This is the most sensitive analytical method reported to date for both lenalidomide and flavopiridol. This sensitivity will enable late terminal phase concentration measurements and accurate pharmacokinetic parameter estimation in a planned clinical trial with lenalidomide and flavopiridol in patients with chronic lymphocytic leukemia.

  16. The need for non- or minimally-invasive biomonitoring strategies and the development of pharmacokinetic/pharmacodynamic models for quantification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Timchalk, Charles; Weber, Thomas J.; Smith, Jordan N.

    Advancements in Exposure Science involving the development and deployment of biomarkers of exposure and biological response are anticipated to significantly (and positively) influence health outcomes associated with occupational, environmental and clinical exposure to chemicals/drugs. To achieve this vision, innovative strategies are needed to develop multiplex sensor platforms capable of quantifying individual and mixed exposures (i.e. systemic dose) by measuring biomarkers of dose and biological response in readily obtainable (non-invasive) biofluids. Secondly, the use of saliva (an alternative to blood) for biomonitoring coupled with the ability to rapidly analyze multiple samples in real-time offers an innovative opportunity to revolutionize biomonitoring assessments. In this regard, the timing and number of samples taken for biomonitoring will not be limited as is currently the case. In addition, real-time analysis will facilitate identification of work practices or conditions that are contributing to increased exposures and will make possible a more rapid and successful intervention strategy. The initial development and application of computational models for evaluation of saliva/blood analyte concentration at anticipated exposure levels represents an important opportunity to establish the limits of quantification and robustness of multiplex sensor systems by exploiting a unique computational modeling framework. The use of these pharmacokinetic models will also enable prediction of an exposure dose based on the saliva/blood measurement. This novel strategy will result in a more accurate prediction of exposures and, once validated, can be employed to assess dosimetry to a broad range of chemicals in support of biomonitoring and epidemiology studies.
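
    The pharmacokinetic idea sketched in this record (predicting saliva concentrations from plasma kinetics and, conversely, back-estimating the absorbed dose from a saliva measurement) can be illustrated with a one-compartment model; the dose, volume of distribution, elimination rate, and saliva/plasma ratio below are assumed values for illustration, not parameters from the report.

```python
# Minimal sketch of a one-compartment model linking saliva measurements
# to plasma concentration and absorbed dose. Parameters are illustrative.
import numpy as np

dose_ug = 50.0                 # absorbed dose (µg)
vd_l = 40.0                    # volume of distribution (L)
ke_per_h = 0.2                 # first-order elimination rate constant (1/h)
saliva_plasma_ratio = 0.8      # assumed saliva/plasma partition coefficient

def plasma_conc(t_h):
    return dose_ug / vd_l * np.exp(-ke_per_h * t_h)

def dose_from_saliva(c_saliva, t_h):
    """Back-estimate the absorbed dose from a saliva sample taken at time t."""
    c_plasma = c_saliva / saliva_plasma_ratio
    return c_plasma * vd_l * np.exp(ke_per_h * t_h)

t = 4.0
c_sal = saliva_plasma_ratio * plasma_conc(t)
print(f"predicted saliva concentration at {t:.0f} h: {c_sal:.3f} µg/L")
print(f"dose back-estimated from that sample: {dose_from_saliva(c_sal, t):.1f} µg")
```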

  17. Global Sensitivity Analysis and Estimation of Model Error, Toward Uncertainty Quantification in Scramjet Computations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Huan, Xun; Safta, Cosmin; Sargsyan, Khachik

    The development of scramjet engines is an important research area for advancing hypersonic and orbital flights. Progress toward optimal engine designs requires accurate flow simulations together with uncertainty quantification. However, performing uncertainty quantification for scramjet simulations is challenging due to the large number of uncertain parameters involved and the high computational cost of flow simulations. These difficulties are addressed in this paper by developing practical uncertainty quantification algorithms and computational methods, and deploying them in the current study to large-eddy simulations of a jet in crossflow inside a simplified HIFiRE Direct Connect Rig scramjet combustor. First, global sensitivity analysis is conducted to identify influential uncertain input parameters, which can help reduce the system’s stochastic dimension. Second, because models of different fidelity are used in the overall uncertainty quantification assessment, a framework for quantifying and propagating the uncertainty due to model error is presented. In conclusion, these methods are demonstrated on a nonreacting jet-in-crossflow test problem in a simplified scramjet geometry, with parameter space up to 24 dimensions, using static and dynamic treatments of the turbulence subgrid model, and with two-dimensional and three-dimensional geometries.

  18. Global Sensitivity Analysis and Estimation of Model Error, Toward Uncertainty Quantification in Scramjet Computations

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Huan, Xun; Safta, Cosmin; Sargsyan, Khachik

    The development of scramjet engines is an important research area for advancing hypersonic and orbital flights. Progress toward optimal engine designs requires accurate flow simulations together with uncertainty quantification. However, performing uncertainty quantification for scramjet simulations is challenging due to the large number of uncertain parameters involved and the high computational cost of flow simulations. These difficulties are addressed in this paper by developing practical uncertainty quantification algorithms and computational methods, and deploying them in the current study to large-eddy simulations of a jet in crossflow inside a simplified HIFiRE Direct Connect Rig scramjet combustor. First, global sensitivity analysis is conducted to identify influential uncertain input parameters, which can help reduce the system’s stochastic dimension. Second, because models of different fidelity are used in the overall uncertainty quantification assessment, a framework for quantifying and propagating the uncertainty due to model error is presented. Finally, these methods are demonstrated on a nonreacting jet-in-crossflow test problem in a simplified scramjet geometry, with parameter space up to 24 dimensions, using static and dynamic treatments of the turbulence subgrid model, and with two-dimensional and three-dimensional geometries.

  19. Global Sensitivity Analysis and Estimation of Model Error, Toward Uncertainty Quantification in Scramjet Computations

    NASA Astrophysics Data System (ADS)

    Huan, Xun; Safta, Cosmin; Sargsyan, Khachik; Geraci, Gianluca; Eldred, Michael S.; Vane, Zachary P.; Lacaze, Guilhem; Oefelein, Joseph C.; Najm, Habib N.

    2018-03-01

    The development of scramjet engines is an important research area for advancing hypersonic and orbital flights. Progress toward optimal engine designs requires accurate flow simulations together with uncertainty quantification. However, performing uncertainty quantification for scramjet simulations is challenging due to the large number of uncertain parameters involved and the high computational cost of flow simulations. These difficulties are addressed in this paper by developing practical uncertainty quantification algorithms and computational methods, and deploying them in the current study to large-eddy simulations of a jet in crossflow inside a simplified HIFiRE Direct Connect Rig scramjet combustor. First, global sensitivity analysis is conducted to identify influential uncertain input parameters, which can help reduce the system's stochastic dimension. Second, because models of different fidelity are used in the overall uncertainty quantification assessment, a framework for quantifying and propagating the uncertainty due to model error is presented. These methods are demonstrated on a nonreacting jet-in-crossflow test problem in a simplified scramjet geometry, with parameter space up to 24 dimensions, using static and dynamic treatments of the turbulence subgrid model, and with two-dimensional and three-dimensional geometries.

  20. Global Sensitivity Analysis and Estimation of Model Error, Toward Uncertainty Quantification in Scramjet Computations

    DOE PAGES

    Huan, Xun; Safta, Cosmin; Sargsyan, Khachik; ...

    2018-02-09

    The development of scramjet engines is an important research area for advancing hypersonic and orbital flights. Progress toward optimal engine designs requires accurate flow simulations together with uncertainty quantification. However, performing uncertainty quantification for scramjet simulations is challenging due to the large number of uncertain parameters involved and the high computational cost of flow simulations. These difficulties are addressed in this paper by developing practical uncertainty quantification algorithms and computational methods, and deploying them in the current study to large-eddy simulations of a jet in crossflow inside a simplified HIFiRE Direct Connect Rig scramjet combustor. First, global sensitivity analysis is conducted to identify influential uncertain input parameters, which can help reduce the system’s stochastic dimension. Second, because models of different fidelity are used in the overall uncertainty quantification assessment, a framework for quantifying and propagating the uncertainty due to model error is presented. In conclusion, these methods are demonstrated on a nonreacting jet-in-crossflow test problem in a simplified scramjet geometry, with parameter space up to 24 dimensions, using static and dynamic treatments of the turbulence subgrid model, and with two-dimensional and three-dimensional geometries.
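
    The sensitivity-screening step named in the four records above is typically done with variance-based (Sobol) indices; the sketch below estimates first-order indices with a standard pick-freeze Monte Carlo estimator on a toy three-parameter model, purely to illustrate the idea rather than the scramjet computations themselves.

```python
# Minimal sketch of first-order Sobol sensitivity indices via a
# Saltelli/Jansen-style pick-freeze estimator on a toy model.
import numpy as np

def model(x):
    # Toy response: dominated by x0, weakly dependent on x1, insensitive to x2.
    return 4.0 * x[:, 0] + 0.5 * x[:, 1] ** 2 + 0.01 * x[:, 2]

rng = np.random.default_rng(1)
n, d = 20000, 3
A = rng.uniform(size=(n, d))
B = rng.uniform(size=(n, d))
yA, yB = model(A), model(B)
var_y = np.var(np.concatenate([yA, yB]))

for i in range(d):
    ABi = A.copy()
    ABi[:, i] = B[:, i]                             # vary only input x_i
    S_i = np.mean(yB * (model(ABi) - yA)) / var_y   # first-order index estimate
    print(f"S_{i} ~ {S_i:.2f}")
```

    Inputs with small sensitivity indices are candidates for fixing at nominal values, which is how such an analysis reduces the stochastic dimension before the more expensive uncertainty propagation.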

  1. Oxygen speciation in upgraded fast pyrolysis bio-oils by comprehensive two-dimensional gas chromatography.

    PubMed

    Omais, Badaoui; Crepier, Julien; Charon, Nadège; Courtiade, Marion; Quignard, Alain; Thiébaut, Didier

    2013-04-21

    Biomass fast pyrolysis is considered a promising route to produce liquid for the transportation field from a renewable resource. However, the derived bio-oils are mainly oxygenated (45-50% w/w O on a wet basis) and contain almost no hydrocarbons. Therefore, upgrading is necessary to obtain a liquid with lower oxygen content, and characterization of oxygenated compounds in these products is essential to assist conversion reactions. For this purpose, comprehensive two-dimensional gas chromatography (GC × GC) can be investigated. Oxygen speciation in such matrices is hampered by the large diversity of oxygenated families and the complexity of the hydrocarbon matrix. Moreover, response factors must be taken into account for oxygenate quantification as the Flame Ionisation Detector (FID) response varies when a molecule contains heteroatoms. Finally, no distillation cuts were accessible and the analysis had to cover a large range of boiling points (30-630 °C). To take up this analytical challenge, a thorough optimization approach was developed. In fact, four GC × GC column sets were investigated to separate oxygenated compounds from the hydrocarbon matrix. Both model mixtures and the upgraded biomass flash pyrolysis oil were injected using GC × GC-FID to reach a suitable chromatographic separation. The advantages and drawbacks of each column combination for oxygen speciation in upgraded bio-oils are highlighted in this study. Among the four sets, an original polar × semi-polar column combination was selected and enabled the identification by GC × GC-ToF/MS of more than 40 compounds belonging to eight chemical families: ketones, furans, alcohols, phenols, carboxylic acids, guaiacols, anisols, and esters. For quantification purposes, the GC × GC-FID chromatogram was divided into more than 60 blobs corresponding to the previously identified analyte and hydrocarbon zones. A database associating each blob with a molecule and its specific response factor (determined by standards injection at different concentrations) was created. A detailed molecular quantification by GC × GC-FID was therefore accessible after integration of the corrected normalized areas. This paper presents a level of detail in the characterization of oxygenated compounds in upgraded bio-oils that, to our knowledge, has not been reached before. It is based on an original column set selection and an extremely accurate quantification procedure.
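
    The final quantification step (converting integrated blob areas into corrected, normalized amounts using compound-specific response factors) can be sketched as follows; the blob table and response factors are invented placeholders, not the database built in the study.

```python
# Minimal sketch of response-factor correction and normalization of
# GC×GC-FID blob areas. Areas and factors are illustrative only.
blobs = {
    # name: (integrated blob area, relative FID response factor)
    "phenol":        (1.2e6, 0.78),
    "acetic acid":   (4.0e5, 0.65),
    "2-methylfuran": (8.5e5, 0.87),
    "hydrocarbons":  (6.5e6, 1.00),
}

corrected = {name: area / rf for name, (area, rf) in blobs.items()}
total = sum(corrected.values())

for name, ca in corrected.items():
    print(f"{name:>14s}: {100.0 * ca / total:5.1f} % of corrected, normalized area")
```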

  2. High-sensitivity MALDI-TOF MS quantification of anthrax lethal toxin for diagnostics and evaluation of medical countermeasures.

    PubMed

    Boyer, Anne E; Gallegos-Candela, Maribel; Quinn, Conrad P; Woolfitt, Adrian R; Brumlow, Judith O; Isbell, Katherine; Hoffmaster, Alex R; Lins, Renato C; Barr, John R

    2015-04-01

    Inhalation anthrax has a rapid progression and high fatality rate. Pathology and death from inhalation of Bacillus anthracis spores are attributed to the actions of secreted protein toxins. Protective antigen (PA) binds and imports the catalytic component lethal factor (LF), a zinc endoprotease, and edema factor (EF), an adenylyl cyclase, into susceptible cells. PA-LF is termed lethal toxin (LTx) and PA-EF, edema toxin. As the universal transporter for both toxins, PA is an important target for vaccination and immunotherapeutic intervention. However, its quantification has been limited to methods of relatively low analytic sensitivity. Quantification of LTx may be more clinically relevant than LF or PA alone because LTx is the toxic form that acts on cells. A method was developed for LTx-specific quantification in plasma using anti-PA IgG magnetic immunoprecipitation of PA and quantification of LF activity that co-purified with PA. The method was fast (<4 h total time to detection), sensitive at 0.033 ng/mL LTx in plasma for the fast analysis (0.0075 ng/mL LTx in plasma for an 18 h reaction), precise (6.3-9.9% coefficient of variation), and accurate (0.1-12.7% error; n ≥ 25). Diagnostic sensitivity was 100% (n = 27 animal/clinical cases). Diagnostic specificity was 100% (n = 141). LTx was detected post-antibiotic treatment in 6/6 treated rhesus macaques and 3/3 clinical cases of inhalation anthrax and as long as 8 days post-treatment. Over the course of infection in two rhesus macaques, LTx was first detected at 0.101 and 0.237 ng/mL at 36 h post-exposure and increased to 1147 and 12,107 ng/mL in late-stage anthrax. This demonstrated the importance of LTx as a diagnostic and therapeutic target. This method provides a sensitive, accurate tool for anthrax toxin detection and evaluation of PA-directed therapeutics.

  3. Molecular methods for pathogen detection and quantification

    USDA-ARS?s Scientific Manuscript database

    Ongoing interest in convenient, inexpensive, fast, sensitive and accurate techniques for detecting and/or quantifying the presence of soybean pathogens has resulted in increased usage of molecular tools. The method of extracting a molecular target (usually DNA or RNA) for detection depends wholly up...

  4. Metal Stable Isotope Tagging: Renaissance of Radioimmunoassay for Multiplex and Absolute Quantification of Biomolecules.

    PubMed

    Liu, Rui; Zhang, Shixi; Wei, Chao; Xing, Zhi; Zhang, Sichun; Zhang, Xinrong

    2016-05-17

    The unambiguous quantification of biomolecules is of great significance in fundamental biological research as well as practical clinical diagnosis. Due to the lack of a detectable moiety, the direct and highly sensitive quantification of biomolecules is often a "mission impossible". Consequently, tagging strategies to introduce detectable moieties for labeling target biomolecules were invented, which had a long and significant impact on studies of biomolecules in the past decades. For instance, immunoassays have been developed with radioisotope tagging by Yalow and Berson in the late 1950s. The later languishment of this technology can be almost exclusively ascribed to the use of radioactive isotopes, which led to the development of nonradioactive tagging strategy-based assays such as enzyme-linked immunosorbent assay, fluorescent immunoassay, and chemiluminescent and electrochemiluminescent immunoassay. Despite great success, these strategies suffered from drawbacks such as limited spectral window capacity for multiplex detection and inability to provide absolute quantification of biomolecules. After recalling the sequences of tagging strategies, an apparent question is why not use stable isotopes from the start? A reasonable explanation is the lack of reliable means for accurate and precise quantification of stable isotopes at that time. The situation has changed greatly at present, since several atomic mass spectrometric measures for metal stable isotopes have been developed. Among the newly developed techniques, inductively coupled plasma mass spectrometry is an ideal technique to determine metal stable isotope-tagged biomolecules, for its high sensitivity, wide dynamic linear range, and more importantly multiplex and absolute quantification ability. Since the first published report by our group, metal stable isotope tagging has become a revolutionary technique and gained great success in biomolecule quantification. An exciting research highlight in this area is the development and application of the mass cytometer, which fully exploited the multiplexing potential of metal stable isotope tagging. It realized the simultaneous detection of dozens of parameters in single cells, accurate immunophenotyping in cell populations, through modeling of intracellular signaling network and undoubted discrimination of function and connection of cell subsets. Metal stable isotope tagging has great potential applications in hematopoiesis, immunology, stem cells, cancer, and drug screening related research and opened a post-fluorescence era of cytometry. Herein, we review the development of biomolecule quantification using metal stable isotope tagging. Particularly, the power of multiplex and absolute quantification is demonstrated. We address the advantages, applicable situations, and limitations of metal stable isotope tagging strategies and propose suggestions for future developments. The transfer of enzymatic or fluorescent tagging to metal stable isotope tagging may occur in many aspects of biological and clinical practices in the near future, just as the revolution from radioactive isotope tagging to fluorescent tagging happened in the past.

  5. User Guidelines and Best Practices for CASL VUQ Analysis Using Dakota

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Adams, Brian M.; Coleman, Kayla; Gilkey, Lindsay N.

    Sandia’s Dakota software (available at http://dakota.sandia.gov) supports science and engineering transformation through advanced exploration of simulations. Specifically, it manages and analyzes ensembles of simulations to provide broader and deeper perspective for analysts and decision makers. This enables them to enhance understanding of risk, improve products, and assess simulation credibility. In its simplest mode, Dakota can automate typical parameter variation studies through a generic interface to a physics-based computational model. This can lend efficiency and rigor to manual parameter perturbation studies already being conducted by analysts. However, Dakota also delivers advanced parametric analysis techniques enabling design exploration, optimization, model calibration, risk analysis, and quantification of margins and uncertainty with such models. It directly supports verification and validation activities. Dakota algorithms enrich complex science and engineering models, enabling an analyst to answer crucial questions of: Sensitivity (Which are the most important input factors or parameters entering the simulation, and how do they influence key outputs?); Uncertainty (What is the uncertainty or variability in simulation output, given uncertainties in input parameters? How safe, reliable, robust, or variable is my system? This is quantification of margins and uncertainty, QMU); Optimization (What parameter values yield the best-performing design or operating condition, given constraints?); and Calibration (What models and/or parameters best match experimental data?). In general, Dakota is the Consortium for Advanced Simulation of Light Water Reactors (CASL) delivery vehicle for verification, validation, and uncertainty quantification (VUQ) algorithms. It permits ready application of the VUQ methods described above to simulation codes by CASL researchers, code developers, and application engineers.

  6. Quantification of fibre polymerization through Fourier space image analysis

    PubMed Central

    Nekouzadeh, Ali; Genin, Guy M.

    2011-01-01

    Quantification of changes in the total length of randomly oriented and possibly curved lines appearing in an image is a necessity in a wide variety of biological applications. Here, we present an automated approach based upon Fourier space analysis. Scaled, band-pass filtered power spectral densities of greyscale images are integrated to provide a quantitative measurement of the total length of lines of a particular range of thicknesses appearing in an image. A procedure is presented to correct for changes in image intensity. The method is most accurate for two-dimensional processes with fibres that do not occlude one another. PMID:24959096
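
    A bare-bones version of the Fourier-space measurement described above (integrating a band-pass filtered power spectral density of a greyscale image) might look like the following; the band limits and the random test image are placeholders, and the scaling is only schematic.

```python
# Minimal sketch: integrate the band-pass filtered power spectral density
# of a greyscale image as a proxy for total fibre length at a given scale.
import numpy as np

def bandpass_psd_integral(image, f_low, f_high):
    img = image - image.mean()                          # remove the DC component
    psd = np.abs(np.fft.fftshift(np.fft.fft2(img))) ** 2
    ny, nx = img.shape
    fy = np.fft.fftshift(np.fft.fftfreq(ny))
    fx = np.fft.fftshift(np.fft.fftfreq(nx))
    fr = np.sqrt(fx[None, :] ** 2 + fy[:, None] ** 2)   # radial spatial frequency
    band = (fr >= f_low) & (fr <= f_high)               # thickness-selective band
    return psd[band].sum() / img.size

rng = np.random.default_rng(0)
img = rng.random((256, 256))                            # placeholder test image
print(f"band-pass PSD integral: {bandpass_psd_integral(img, 0.05, 0.20):.3e}")
```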

  7. Landsat phenological metrics and their relation to aboveground carbon in the Brazilian Savanna.

    PubMed

    Schwieder, M; Leitão, P J; Pinto, J R R; Teixeira, A M C; Pedroni, F; Sanchez, M; Bustamante, M M; Hostert, P

    2018-05-15

    The quantification and spatially explicit mapping of carbon stocks in terrestrial ecosystems is important to better understand the global carbon cycle and to monitor and report change processes, especially in the context of international policy mechanisms such as REDD+ or the implementation of Nationally Determined Contributions (NDCs) and the UN Sustainable Development Goals (SDGs). Accurate carbon quantification is still lacking, especially in heterogeneous ecosystems such as savannas, where highly variable vegetation densities occur and strong seasonality hinders consistent data acquisition. To account for these challenges, we analyzed the potential of land surface phenological metrics derived from gap-filled 8-day Landsat time series for carbon mapping. We selected three areas located in different subregions in the central Brazil region, which is a prominent example of a savanna with significant carbon stocks that has been undergoing extensive land cover conversions. Here, phenological metrics from the 2014/2015 season were combined with aboveground carbon field samples of cerrado sensu stricto vegetation using Random Forest regression models to map the regional carbon distribution and to analyze the relation between phenological metrics and aboveground carbon. The gap-filling approach enabled accurate approximation of the original Landsat ETM+ and OLI EVI values and the subsequent derivation of annual phenological metrics. Random Forest model performances varied between the three study areas with RMSE values of 1.64 t/ha (mean relative RMSE 30%), 2.35 t/ha (46%) and 2.18 t/ha (45%). Comparable relationships between remote-sensing-based land surface phenological metrics and aboveground carbon were observed in all study areas. Aboveground carbon distributions could be mapped and revealed comprehensible spatial patterns. Phenological metrics were derived from 8-day Landsat time series with a spatial resolution that is sufficient to capture gradual changes in carbon stocks of heterogeneous savanna ecosystems. These metrics revealed the relationship between aboveground carbon and the phenology of the observed vegetation. Our results suggest that metrics relating to the seasonal minimum and maximum values were the most influential variables and bear potential to improve spatially explicit mapping approaches in heterogeneous ecosystems, where both spatial and temporal resolutions are critical.
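
    The regression step at the core of this mapping (a Random Forest relating phenological metrics to field-sampled aboveground carbon, scored by RMSE and variable importance) is sketched below on synthetic data; the features, sample sizes, and noise level are assumptions for illustration only.

```python
# Minimal sketch of Random Forest regression of aboveground carbon on
# phenological metrics, evaluated by RMSE. Data are synthetic.
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.metrics import mean_squared_error
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 200
X = rng.uniform(size=(n, 5))        # e.g. seasonal minimum, maximum, amplitude, ...
y = 10 * X[:, 0] + 5 * X[:, 1] + rng.normal(scale=1.5, size=n)   # carbon (t/ha)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)
rf = RandomForestRegressor(n_estimators=500, random_state=0).fit(X_tr, y_tr)

rmse = mean_squared_error(y_te, rf.predict(X_te)) ** 0.5
print(f"RMSE = {rmse:.2f} t/ha")
print("variable importances:", np.round(rf.feature_importances_, 2))
```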

  8. Targeted RNA-Sequencing with Competitive Multiplex-PCR Amplicon Libraries

    PubMed Central

    Blomquist, Thomas M.; Crawford, Erin L.; Lovett, Jennie L.; Yeo, Jiyoun; Stanoszek, Lauren M.; Levin, Albert; Li, Jia; Lu, Mei; Shi, Leming; Muldrew, Kenneth; Willey, James C.

    2013-01-01

    Whole transcriptome RNA-sequencing is a powerful tool, but is costly and yields complex data sets that limit its utility in molecular diagnostic testing. A targeted quantitative RNA-sequencing method that is reproducible and reduces the number of sequencing reads required to measure transcripts over the full range of expression would be better suited to diagnostic testing. Toward this goal, we developed a competitive multiplex PCR-based amplicon sequencing library preparation method that a) targets only the sequences of interest and b) controls for inter-target variation in PCR amplification during library preparation by measuring each transcript native template relative to a known number of synthetic competitive template internal standard copies. To determine the utility of this method, we intentionally selected PCR conditions that would cause transcript amplification products (amplicons) to converge toward equimolar concentrations (normalization) during library preparation. We then tested whether this approach would enable accurate and reproducible quantification of each transcript across multiple library preparations, and at the same time reduce (through normalization) total sequencing reads required for quantification of transcript targets across a large range of expression. We demonstrate excellent reproducibility (R2 = 0.997) with 97% accuracy to detect 2-fold change using External RNA Controls Consortium (ERCC) reference materials; high inter-day, inter-site and inter-library concordance (R2 = 0.97–0.99) using FDA Sequencing Quality Control (SEQC) reference materials; and cross-platform concordance with both TaqMan qPCR (R2 = 0.96) and whole transcriptome RNA-sequencing following “traditional” library preparation using Illumina NGS kits (R2 = 0.94). Using this method, the sequencing reads required to accurately quantify more than 100 targeted transcripts expressed over a 10⁷-fold range were reduced more than 10,000-fold, from 2.3×10⁹ to 1.4×10⁵ sequencing reads. These studies demonstrate that the competitive multiplex-PCR amplicon library preparation method presented here provides the quality control, reproducibility, and reduced sequencing reads necessary for development and implementation of targeted quantitative RNA-sequencing biomarkers in molecular diagnostic testing. PMID:24236095
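
    The key normalization idea (interpreting native-template reads relative to reads from a known number of spiked synthetic competitor copies) reduces to a simple ratio per target; the counts in the sketch below are invented and the gene names are placeholders.

```python
# Minimal sketch of competitive internal-standard quantification:
# native copies = known competitor copies * (native reads / competitor reads).
known_competitor_copies = 1000            # synthetic IS copies spiked per target

targets = {
    # name: (native-template reads, competitor-template reads), illustrative
    "GAPDH": (52000, 4800),
    "TP53":  (900,   5100),
    "MYC":   (15,    4950),
}

for name, (native, competitor) in targets.items():
    copies = known_competitor_copies * native / competitor
    print(f"{name:>6s}: {copies:10.1f} native copies")
```

    Because each target is read against its own internal standard, differences in amplification efficiency between targets largely cancel, which is what allows the deliberate read-depth normalization described above without losing quantitative accuracy.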

  9. Profiling ABA metabolites in Nicotiana tabacum L. leaves by ultra-performance liquid chromatography-electrospray tandem mass spectrometry.

    PubMed

    Turecková, Veronika; Novák, Ondrej; Strnad, Miroslav

    2009-11-15

    We have developed a simple method for extracting and purifying (+)-abscisic acid (ABA) and eight ABA metabolites--phaseic acid (PA), dihydrophaseic acid (DPA), neophaseic acid (neoPA), ABA-glucose ester (ABAGE), 7'-hydroxy-ABA (7'-OH-ABA), 9'-hydroxy-ABA (9'-OH-ABA), ABAaldehyde, and ABAalcohol--before analysis by a novel technique for these substances, ultra-performance liquid chromatography-electrospray ionisation tandem mass spectrometry (UPLC-ESI-MS/MS). The procedure includes addition of deuterium-labelled standards, extraction with methanol-water-acetic acid (10:89:1, v/v), simple purification by Oasis® HLB cartridges, rapid chromatographic separation by UPLC, and sensitive, accurate quantification by MS/MS in multiple reaction monitoring modes. The detection limits of the technique ranged between 0.1 and 1 pmol for ABAGE and ABA acids in negative ion mode, and 0.01-0.50 pmol for ABAGE, ABAaldehyde, ABAalcohol and the methylated acids in positive ion mode. The fast liquid chromatographic separation and analysis of ABA and its eight measured derivatives by UPLC-ESI-MS/MS provide rapid, accurate and robust quantification of most of the substances, and the low detection limits allow small amounts of tissue (1-5 mg) to be used in quantitative analysis. To demonstrate the potential of the technique, we isolated ABA and its metabolites from control and water-stressed tobacco leaf tissues then analysed them by UPLC-ESI-MS/MS. Only ABA, PA, DPA, neoPA, and ABAGE were detected in the samples. PA was the most abundant analyte (ca. 1000 pmol/g f.w.) in both the control and water-stressed tissues, followed by ABAGE and DPA, which were both present at levels ca. 5-fold lower. ABA levels were at least 100-fold lower than PA concentrations, but they increased following the water stress treatment, while ABAGE, PA, and DPA levels decreased. Overall, the technique offers substantial improvements over previously described methods, enabling the detailed, direct study of diverse ABA metabolites in small amounts of plant tissue.

  10. Future research needs associated with the assessment of potential human health risks from exposure to toxic ambient air pollutants.

    PubMed Central

    Möller, L; Schuetzle, D; Autrup, H

    1994-01-01

    This paper presents key conclusions and future research needs from a Workshop on the Risk Assessment of Urban Air, Emissions, Exposure, Risk Identification, and Quantification, which was held in Stockholm in June 1992 and attended by 41 participants from 13 countries. Research is recommended in the areas of identification and quantification of toxics in source emissions and ambient air, atmospheric transport and chemistry, exposure level assessment, the development of improved in vitro bioassays, biomarker development, the development of more accurate epidemiological methodologies, and risk quantification techniques. Studies are described that will be necessary to assess and reduce the level of uncertainties associated with each step of the risk assessment process. International collaborative research efforts between industry and government organizations are recommended as the most effective way to carry out this research. PMID:7529703

  11. Mathematical simulations for bioanalytical assay development: the (un-)necessity and (im-)possibility of free drug quantification.

    PubMed

    Staack, Roland F; Jordan, Gregor; Heinrich, Julia

    2012-02-01

    For every drug development program it needs to be discussed whether discrimination between free and total drug concentrations is required to accurately describe its pharmacokinetic behavior. This perspective describes the application of mathematical simulation approaches to guide this initial decision based on available knowledge about target biology, binding kinetics and expected drug concentrations. We provide generic calculations that can be used to estimate the necessity of free drug quantification for different drug molecules. In addition, mathematical approaches are used to simulate various assay conditions in bioanalytical ligand-binding assays: it is demonstrated that due to the noncovalent interaction between the binding partners and typical assay-related interferences in the equilibrium, a correct quantification of the free drug concentration is highly challenging and requires careful design of different assay procedure steps.
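
    A worked example of the kind of equilibrium calculation behind such simulations is given below: for a 1:1 drug-target interaction, the free drug concentration follows from total drug, total target, and the dissociation constant by solving a quadratic. The concentrations and Kd are arbitrary illustrative values, not parameters from the article.

```python
# Minimal sketch: free drug concentration for 1:1 binding at equilibrium.
# From mass balance and Kd = D_free * T_free / DT one obtains
# D_free^2 + (T_tot + Kd - D_tot) * D_free - Kd * D_tot = 0.
import numpy as np

def free_drug(total_drug, total_target, kd):
    b = total_target + kd - total_drug
    c = -kd * total_drug
    return (-b + np.sqrt(b * b - 4.0 * c)) / 2.0

for d_tot in (1.0, 10.0, 100.0):                     # nM, illustrative
    f = free_drug(d_tot, total_target=5.0, kd=0.1)   # nM target and Kd, illustrative
    print(f"D_tot = {d_tot:6.1f} nM -> free = {f:7.3f} nM ({100 * f / d_tot:5.1f}% free)")
```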

  12. Development of a screening method for genetically modified soybean by plasmid-based quantitative competitive polymerase chain reaction.

    PubMed

    Shimizu, Eri; Kato, Hisashi; Nakagawa, Yuki; Kodama, Takashi; Futo, Satoshi; Minegishi, Yasutaka; Watanabe, Takahiro; Akiyama, Hiroshi; Teshima, Reiko; Furui, Satoshi; Hino, Akihiro; Kitta, Kazumi

    2008-07-23

    A novel type of quantitative competitive polymerase chain reaction (QC-PCR) system for the detection and quantification of the Roundup Ready soybean (RRS) was developed. This system was designed to take advantage of a fully validated real-time PCR method used for the quantification of RRS in Japan. A plasmid was constructed as a competitor plasmid for the detection and quantification of genetically modified soy, RRS. The plasmid contained the construct-specific sequence of RRS and the taxon-specific sequence of lectin1 (Le1), and both carried a 21 bp oligonucleotide insertion. The plasmid DNA was used as a reference molecule instead of ground seeds, which enabled us to precisely and stably adjust the copy number of targets. The present study demonstrated that the novel plasmid-based QC-PCR method could be a simple and feasible alternative to the real-time PCR method used for the quantification of genetically modified organism contents.

  13. Digital Quantification of Proteins and mRNA in Single Mammalian Cells.

    PubMed

    Albayrak, Cem; Jordi, Christian A; Zechner, Christoph; Lin, Jing; Bichsel, Colette A; Khammash, Mustafa; Tay, Savaş

    2016-03-17

    Absolute quantification of macromolecules in single cells is critical for understanding and modeling biological systems that feature cellular heterogeneity. Here we show extremely sensitive and absolute quantification of both proteins and mRNA in single mammalian cells by a very practical workflow that combines proximity ligation assay (PLA) and digital PCR. This digital PLA method has femtomolar sensitivity, which enables the quantification of very small protein concentration changes over its entire 3-log dynamic range, a quality necessary for accounting for single-cell heterogeneity. We counted both endogenous (CD147) and exogenously expressed (GFP-p65) proteins from hundreds of single cells and determined the correlation between CD147 mRNA and the protein it encodes. Using our data, a stochastic two-state model of the central dogma was constructed and verified using joint mRNA/protein distributions, allowing us to estimate transcription burst sizes and extrinsic noise strength and calculate the transcription and translation rate constants in single mammalian cells. Copyright © 2016 Elsevier Inc. All rights reserved.
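
    Digital assays of this kind ultimately rest on Poisson statistics: the fraction of positive partitions is converted into a mean occupancy and then into an absolute concentration. The sketch below shows that conversion with invented partition counts and an assumed partition volume.

```python
# Minimal sketch of Poisson-corrected absolute quantification in a
# digital assay (digital PCR / digital PLA). Counts are illustrative.
import numpy as np

n_total = 20000                 # analyzed partitions (droplets or chambers)
n_positive = 3500               # partitions showing signal
partition_volume_nl = 0.85      # assumed partition volume in nanolitres

p = n_positive / n_total
lam = -np.log(1.0 - p)                            # mean copies per partition
conc_per_ul = lam / (partition_volume_nl * 1e-3)  # copies per microlitre

print(f"lambda = {lam:.3f} copies/partition")
print(f"concentration ~ {conc_per_ul:.0f} copies/µL")
```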

  14. Validation metrics for turbulent plasma transport

    DOE PAGES

    Holland, C.

    2016-06-22

    Developing accurate models of plasma dynamics is essential for confident predictive modeling of current and future fusion devices. In modern computer science and engineering, formal verification and validation processes are used to assess model accuracy and establish confidence in the predictive capabilities of a given model. This paper provides an overview of the key guiding principles and best practices for the development of validation metrics, illustrated using examples from investigations of turbulent transport in magnetically confined plasmas. Particular emphasis is given to the importance of uncertainty quantification and its inclusion within the metrics, and the need for utilizing synthetic diagnostics to enable quantitatively meaningful comparisons between simulation and experiment. As a starting point, the structure of commonly used global transport model metrics and their limitations is reviewed. An alternate approach is then presented, which focuses upon comparisons of predicted local fluxes, fluctuations, and equilibrium gradients against observation. Furthermore, the utility of metrics based upon these comparisons is demonstrated by applying them to gyrokinetic predictions of turbulent transport in a variety of discharges performed on the DIII-D tokamak, as part of a multi-year transport model validation activity.

  15. Image registration and analysis for quantitative myocardial perfusion: application to dynamic circular cardiac CT.

    PubMed

    Isola, A A; Schmitt, H; van Stevendaal, U; Begemann, P G; Coulon, P; Boussel, L; Grass, M

    2011-09-21

    Large area detector computed tomography systems with fast rotating gantries enable volumetric dynamic cardiac perfusion studies. Prospectively ECG-triggered acquisitions limit the data acquisition to a predefined cardiac phase and thereby reduce x-ray dose and limit motion artefacts. Even in the case of highly accurate prospective triggering and stable heart rate, spatial misalignment of the cardiac volumes acquired and reconstructed per cardiac cycle may occur due to small motion pattern variations from cycle to cycle. These misalignments reduce the accuracy of the quantitative analysis of myocardial perfusion parameters on a per voxel basis. An image-based solution to this problem is elastic 3D image registration of dynamic volume sequences with variable contrast, as it is introduced in this contribution. After circular cone-beam CT reconstruction of cardiac volumes covering large areas of the myocardial tissue, the complete series is aligned with respect to a chosen reference volume. The results of the registration process and the perfusion analysis with and without registration are evaluated quantitatively in this paper. The spatial alignment leads to improved quantification of myocardial perfusion for three different pig data sets.

  16. Whispering Gallery Mode Resonators for Rapid Label-Free Biosensing in Small Volume Droplets

    PubMed Central

    Wildgen, Sarah M.; Dunn, Robert C.

    2015-01-01

    Rapid biosensing requires fast mass transport of the analyte to the surface of the sensing element. To optimize analysis times, both mass transport in solution and the geometry and size of the sensing element need to be considered. Small dielectric spheres, tens of microns in diameter, can act as label-free biosensors using whispering gallery mode (WGM) resonances. WGM resonances are sensitive to the effective refractive index, which changes upon analyte binding to recognition sites on functionalized resonators. The spherical geometry and tens of microns diameter of these resonators provides an efficient target for sensing while their compact size enables detection in limited volumes. Here, we explore conditions leading to rapid analyte detection using WGM resonators as label-free sensors in 10 μL sample droplets. Droplet evaporation leads to potentially useful convective mixing, but also limits the time over which analysis can be completed. We show that active droplet mixing combined with initial binding rate measurements is required for accurate nanomolar protein quantification within the first minute following injection. PMID:25806835

  17. Near infrared fluorescence-based bacteriophage particles for ratiometric pH imaging.

    PubMed

    Hilderbrand, Scott A; Kelly, Kimberly A; Niedre, Mark; Weissleder, Ralph

    2008-08-01

    Fluorogenic imaging agents emitting in the near-infrared are becoming important research tools for disease investigation in vivo. Often pathophysiological states such as cancer and cystic fibrosis are associated with disruptions in acid/base homeostasis. The development of optical sensors for pH imaging would facilitate the investigation of these diseased conditions. In this report, the design and synthesis of a ratiometric near-infrared emitting probe for pH quantification is detailed. The pH-responsive probe is prepared by covalent attachment of pH-sensitive and pH-insensitive fluorophores to a bacteriophage particle scaffold. The pH-responsive cyanine dye, HCyC-646, used to construct the probe, has a fluorogenic pKa of 6.2, which is optimized for visualization of acidic pH often associated with tumor hypoxia and other diseased states. Incorporation of pH-insensitive reference dyes enables the ratiometric determination of pH independent of the probe concentration. With the pH-responsive construct, measurement of intracellular pH and accurate determination of pH through optically diffuse biological tissue is demonstrated.
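
    The ratiometric read-out described above can be inverted analytically: if the ratio R of pH-sensitive to pH-insensitive emission varies sigmoidally between Rmin and Rmax around the fluorogenic pKa of 6.2, pH is recovered with a Henderson-Hasselbalch-type expression. In the sketch below, Rmin, Rmax, and the assumption that the probe is brighter at acidic pH are illustrative calibration choices, not published values.

```python
# Minimal sketch of ratiometric pH quantification around pKa = 6.2.
# Rmin/Rmax and the direction of the response are assumed calibration values.
import numpy as np

PKA, R_MIN, R_MAX = 6.2, 0.10, 1.00

def ratio_from_ph(ph):
    frac_bright = 1.0 / (1.0 + 10 ** (ph - PKA))     # protonated (acidic) fraction
    return R_MIN + (R_MAX - R_MIN) * frac_bright

def ph_from_ratio(r):
    frac = (r - R_MIN) / (R_MAX - R_MIN)
    return PKA + np.log10((1.0 - frac) / frac)       # invert the sigmoid

for ph in (5.5, 6.2, 7.0):
    r = ratio_from_ph(ph)
    print(f"pH {ph:.1f} -> ratio {r:.3f} -> recovered pH {ph_from_ratio(r):.2f}")
```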

  18. Validation metrics for turbulent plasma transport

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Holland, C.

    Developing accurate models of plasma dynamics is essential for confident predictive modeling of current and future fusion devices. In modern computer science and engineering, formal verification and validation processes are used to assess model accuracy and establish confidence in the predictive capabilities of a given model. This paper provides an overview of the key guiding principles and best practices for the development of validation metrics, illustrated using examples from investigations of turbulent transport in magnetically confined plasmas. Particular emphasis is given to the importance of uncertainty quantification and its inclusion within the metrics, and the need for utilizing synthetic diagnostics to enable quantitatively meaningful comparisons between simulation and experiment. As a starting point, the structure of commonly used global transport model metrics and their limitations are reviewed. An alternate approach is then presented, which focuses upon comparisons of predicted local fluxes, fluctuations, and equilibrium gradients against observation. Furthermore, the utility of metrics based upon these comparisons is demonstrated by applying them to gyrokinetic predictions of turbulent transport in a variety of discharges performed on the DIII-D tokamak, as part of a multi-year transport model validation activity.

  19. Computation of Calcium Score with Dual Energy CT: A Phantom Study

    PubMed Central

    Kumar, Vidhya; Min, James K.; He, Xin; Raman, Subha V.

    2016-01-01

    Dual energy computed tomography (DECT) improves material and tissue characterization compared to single energy CT (SECT); we sought to validate coronary calcium quantification in advancing cardiovascular DECT. In an anthropomorphic phantom, agreement between measurements was excellent, and Bland-Altman analysis demonstrated minimal bias. Compared to the known calcium mass for each phantom, calcium mass by DECT was highly accurate. Noncontrast DECT yields accurate calcium measures, and warrants consideration in cardiac protocols for additional tissue characterizations. PMID:27680414

  20. Quantification of observed flare parameters in relation to a shear-index and verification of MHD models for flare prediction

    NASA Technical Reports Server (NTRS)

    Wu, S. T.

    1987-01-01

    The goal for the SAMEX magnetograph's optical system is to accurately measure the polarization state of sunlight in a narrow spectral bandwidth over the field of view of an active region to make an accurate determination of the magnetic field in that region. The instrumental polarization is characterized. The optics and coatings were designed to minimize this spurious polarization introduced by foreoptics. The method developed to calculate the instrumental polarization of the SAMEX optics is described.

  1. PaCeQuant: A Tool for High-Throughput Quantification of Pavement Cell Shape Characteristics

    PubMed Central

    Poeschl, Yvonne; Plötner, Romina

    2017-01-01

    Pavement cells (PCs) are the most frequently occurring cell type in the leaf epidermis and play important roles in leaf growth and function. In many plant species, PCs form highly complex jigsaw-puzzle-shaped cells with interlocking lobes. Understanding their development is of high interest for plant science research because of their importance for leaf growth and hence for plant fitness and crop yield. Studies of PC development, however, are limited because robust methods that enable automatic segmentation and quantification of shape parameters reflecting this cellular complexity are lacking. Here, we present our new ImageJ-based tool, PaCeQuant, which provides a fully automatic image analysis workflow for PC shape quantification. PaCeQuant automatically detects cell boundaries of PCs from confocal input images and enables manual correction of automatic segmentation results or direct import of manually segmented cells. PaCeQuant simultaneously extracts 27 shape features that include global, contour-based, skeleton-based, and PC-specific object descriptors. In addition, we included a method for the classification and analysis of lobes at two-cell and three-cell junctions. We provide an R script for graphical visualization and statistical analysis. We validated PaCeQuant by extensive comparative analysis to manual segmentation and existing quantification tools and demonstrated its usability to analyze PC shape characteristics during development and between different genotypes. PaCeQuant thus provides a platform for robust, efficient, and reproducible quantitative analysis of PC shape characteristics that can easily be applied to study PC development in large data sets. PMID:28931626

  2. USEPA Approach for the Detection and Quantification of Enterococcus by qPCR

    EPA Science Inventory

    The Beach Act 2000 specified that EPA should develop: appropriate and effective indicators for improving detection in a timely manner of pathogens in coastal waters; and appropriate, accurate, expeditious and cost-effective methods for the timely detection of pathogens in coas...

  3. Ranking filter methods for concentrating pathogens in lake water

    USDA-ARS?s Scientific Manuscript database

    Accurately comparing filtration methods for concentrating waterborne pathogens is difficult because of two important water matrix effects on recovery measurements, the effect on PCR quantification and the effect on filter performance. Regarding the first effect, we show how to create a control water...

  4. INTERLABORATORY METHODS COMPARISON FOR THE TOTAL ORGANIC CARBON ANALYSIS OF AQUIFER MATERIALS

    EPA Science Inventory

    The total organic carbon (TOC) content of aquifer materials has been found to have significant effects on the movement of pollutants in the subsurface environment. Accurate quantification of TOC is therefore of great importance to research in groundwater contamination. However,...

  5. Empirical Evidence for Childhood Depression.

    ERIC Educational Resources Information Center

    Lachar, David

    Although several theoretical positions deal with the concept of childhood depression, accurate measurement of depression can only occur if valid and reliable measures are available. Current efforts emphasize direct questioning of the child and quantification of parents' observations. One scale used to study childhood depression, the Personality…

  6. Screening and identification of per- and polyfluoroalkyl substances in microwave popcorn bags.

    PubMed

    Zabaleta, Itsaso; Negreira, Noelia; Bizkarguenaga, Ekhine; Prieto, Ailette; Covaci, Adrian; Zuloaga, Olatz

    2017-09-01

    Liquid chromatography coupled to quadrupole time-of-flight mass spectrometry (LC-QToF-MS) was used for the accurate identification (<10 ppm) of different polyfluoroalkylphosphates (PAPs) and their intermediate and end degradation products in popcorn bags. Up to 46 per- and polyfluoroalkyl substances (PFASs) and precursors were identified. Moreover, an accurate method based on focused ultrasonic solid-liquid extraction (FUSLE) and a clean-up step with Envi-Carb sorbent was validated and applied to the quantification of 24 PFASs in popcorn bags from over twelve European countries, three American countries and two Asian countries. To the best of our knowledge, this is the first time that identification and quantification of some intermediates of PFAS precursors (fluorotelomer saturated acids (FTCAs) and fluorotelomer unsaturated acids (FTUCAs) of different chain lengths) have been reported. Moreover, different patterns in microwave popcorn bag composition were observed among the countries: while short-chain PFASs were detected in European and American countries, Asian countries still use long-chain PFASs. Copyright © 2017 Elsevier Ltd. All rights reserved.

  7. Properties of targeted preamplification in DNA and cDNA quantification.

    PubMed

    Andersson, Daniel; Akrap, Nina; Svec, David; Godfrey, Tony E; Kubista, Mikael; Landberg, Göran; Ståhlberg, Anders

    2015-01-01

    Quantification of small numbers of molecules often requires preamplification to generate enough copies for accurate downstream enumeration. Here, we studied experimental parameters in targeted preamplification and their effects on downstream quantitative real-time PCR (qPCR). To evaluate different strategies, we monitored the preamplification reaction in real time using SYBR Green detection chemistry followed by melting curve analysis. Furthermore, individual targets were evaluated by qPCR. The preamplification reaction performed best when a large number of primer pairs was included in the primer pool. In addition, preamplification efficiency, reproducibility and specificity were found to depend on the number of template molecules present, primer concentration, annealing time and annealing temperature. The amount of nonspecific PCR products could also be reduced about 1000-fold by including bovine serum albumin, glycerol and formamide in the preamplification. On the basis of our findings, we provide recommendations on how to perform robust and highly accurate targeted preamplification in combination with qPCR or next-generation sequencing.

  8. Accurate LC peak boundary detection for ¹⁶O/¹⁸O labeled LC-MS data.

    PubMed

    Cui, Jian; Petritis, Konstantinos; Tegeler, Tony; Petritis, Brianne; Ma, Xuepo; Jin, Yufang; Gao, Shou-Jiang S J; Zhang, Jianqiu Michelle

    2013-01-01

    In liquid chromatography-mass spectrometry (LC-MS), parts of LC peaks are often corrupted by their co-eluting peptides, which results in increased quantification variance. In this paper, we propose to apply accurate LC peak boundary detection to remove the corrupted part of LC peaks. Accurate LC peak boundary detection is achieved by checking the consistency of intensity patterns within peptide elution time ranges. In addition, we remove peptides with erroneous mass assignment through model fitness check, which compares observed intensity patterns to theoretically constructed ones. The proposed algorithm can significantly improve the accuracy and precision of peptide ratio measurements.

  9. Accurate LC Peak Boundary Detection for 16 O/ 18 O Labeled LC-MS Data

    PubMed Central

    Cui, Jian; Petritis, Konstantinos; Tegeler, Tony; Petritis, Brianne; Ma, Xuepo; Jin, Yufang; Gao, Shou-Jiang (SJ); Zhang, Jianqiu (Michelle)

    2013-01-01

    In liquid chromatography-mass spectrometry (LC-MS), parts of LC peaks are often corrupted by their co-eluting peptides, which results in increased quantification variance. In this paper, we propose to apply accurate LC peak boundary detection to remove the corrupted part of LC peaks. Accurate LC peak boundary detection is achieved by checking the consistency of intensity patterns within peptide elution time ranges. In addition, we remove peptides with erroneous mass assignment through model fitness check, which compares observed intensity patterns to theoretically constructed ones. The proposed algorithm can significantly improve the accuracy and precision of peptide ratio measurements. PMID:24115998

  10. Accurate quantification of magnetic particle properties by intra-pair magnetophoresis for nanobiotechnology

    NASA Astrophysics Data System (ADS)

    van Reenen, Alexander; Gao, Yang; Bos, Arjen H.; de Jong, Arthur M.; Hulsen, Martien A.; den Toonder, Jaap M. J.; Prins, Menno W. J.

    2013-07-01

    The application of magnetic particles in biomedical research and in-vitro diagnostics requires accurate characterization of their magnetic properties, with single-particle resolution and good statistics. Here, we report intra-pair magnetophoresis as a method to accurately quantify the field-dependent magnetic moments of magnetic particles and to rapidly generate histograms of the magnetic moments with good statistics. We demonstrate our method with particles of different sizes and from different sources, with a measurement precision of a few percent. We expect that intra-pair magnetophoresis will be a powerful tool for the characterization and improvement of particles for the upcoming field of particle-based nanobiotechnology.

  11. Quantification of intestinal bacterial populations by real-time PCR with a universal primer set and minor groove binder probes: a global approach to the enteric flora.

    PubMed

    Ott, Stephan J; Musfeldt, Meike; Ullmann, Uwe; Hampe, Jochen; Schreiber, Stefan

    2004-06-01

    The composition of the human intestinal flora is important for the health status of the host. The global composition and the presence of specific pathogens are relevant to the effects of the flora. Therefore, accurate quantification of all major bacterial populations of the enteric flora is needed. A TaqMan real-time PCR-based method for the quantification of 20 dominant bacterial species and groups of the intestinal flora has been established on the basis of 16S ribosomal DNA taxonomy. A PCR with conserved primers was used for all reactions. In each real-time PCR, a universal probe for quantification of total bacteria and a specific probe for the species in question were included. PCR with conserved primers and the universal probe for total bacteria allowed relative and absolute quantification. Minor groove binder probes increased the sensitivity of the assays 10- to 100-fold. The method was evaluated by cross-reaction experiments and quantification of bacteria in complex clinical samples from healthy patients. A sensitivity of 10¹ to 10³ bacterial cells per sample was achieved. No significant cross-reaction was observed. The real-time PCR assays presented may facilitate understanding of the intestinal bacterial flora through a normalized global estimation of the major contributing species.
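
    As a sketch of the relative-quantification step described above (not code from the study): with a universal probe reporting total bacteria and a specific probe reporting one group in the same PCR setup, the fraction of the total attributable to that group can be estimated from the difference in quantification cycles, assuming a known amplification efficiency. The function name and all numbers below are hypothetical.

      def relative_abundance(cq_specific, cq_universal, efficiency=1.0):
          """Fraction of total 16S rDNA attributable to one species or group,
          estimated from paired Cq values; efficiency=1.0 assumes 100% PCR
          efficiency for both probes (a simplification)."""
          return (1.0 + efficiency) ** (cq_universal - cq_specific)

      # Hypothetical example: the specific probe crosses the threshold 4 cycles
      # after the universal probe, i.e. roughly 2**-4 = 6% of the population.
      print(f"{relative_abundance(cq_specific=26.0, cq_universal=22.0):.3f}")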

  12. Instrument for Real-Time Digital Nucleic Acid Amplification on Custom Microfluidic Devices

    PubMed Central

    Selck, David A.

    2016-01-01

    Nucleic acid amplification tests that are coupled with a digital readout enable the absolute quantification of single molecules, even at ultralow concentrations. Digital methods are robust, versatile and compatible with many amplification chemistries including isothermal amplification, making them particularly invaluable to assays that require sensitive detection, such as the quantification of viral load in occult infections or detection of sparse amounts of DNA from forensic samples. A number of microfluidic platforms are being developed for carrying out digital amplification. However, the mechanistic investigation and optimization of digital assays has been limited by the lack of real-time kinetic information about which factors affect the digital efficiency and analytical sensitivity of a reaction. Commercially available instruments that are capable of tracking digital reactions in real-time are restricted to only a small number of device types and sample-preparation strategies. Thus, most researchers who wish to develop, study, or optimize digital assays rely on the rate of the amplification reaction when performed in a bulk experiment, which is now recognized as an unreliable predictor of digital efficiency. To expand our ability to study how digital reactions proceed in real-time and enable us to optimize both the digital efficiency and analytical sensitivity of digital assays, we built a custom large-format digital real-time amplification instrument that can accommodate a wide variety of devices, amplification chemistries and sample-handling conditions. Herein, we validate this instrument, we provide detailed schematics that will enable others to build their own custom instruments, and we include a complete custom software suite to collect and analyze the data retrieved from the instrument. We believe assay optimizations enabled by this instrument will improve the current limits of nucleic acid detection and quantification, improving our fundamental understanding of single-molecule reactions and providing advancements in practical applications such as medical diagnostics, forensics and environmental sampling. PMID:27760148

  13. Quantification of Cannabinoid Content in Cannabis

    NASA Astrophysics Data System (ADS)

    Tian, Y.; Zhang, F.; Jia, K.; Wen, M.; Yuan, Ch.

    2015-09-01

    Cannabis is an economically important plant that is used in many fields, in addition to being the most commonly consumed illicit drug worldwide. Monitoring the spatial distribution of cannabis cultivation and judging whether it is drug- or fiber-type cannabis is critical for governments and international communities to understand the scale of the illegal drug trade. The aim of this study was to investigate whether the cannabinoid content in cannabis could be spectrally quantified using a spectrometer and to identify the optimal wavebands for quantifying the cannabinoid content. Spectral reflectance data of dried cannabis leaf samples and the cannabis canopy were measured in the laboratory and in the field, respectively. Correlation analysis and the stepwise multivariate regression method were used to select the optimal wavebands for cannabinoid content quantification based on the laboratory-measured spectral data. The results indicated that the delta-9-tetrahydrocannabinol (THC) content in cannabis leaves could be quantified using laboratory-measured spectral reflectance data and that the 695 nm band is the optimal band for THC content quantification. This study provides prerequisite information for designing spectral equipment to enable immediate quantification of THC content in cannabis and to discriminate drug- from fiber-type cannabis based on THC content quantification in the field.
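
    A minimal sketch of the kind of single-band calibration implied above: an ordinary least-squares fit of THC content against reflectance at 695 nm. The reflectance/THC pairs are invented for illustration and are not data from the study, which used stepwise multivariate regression over measured spectra.

      import numpy as np

      # Hypothetical calibration pairs: reflectance at 695 nm vs. measured THC (%)
      reflectance_695 = np.array([0.32, 0.35, 0.38, 0.41, 0.45, 0.48])
      thc_percent     = np.array([0.4, 0.9, 1.6, 2.3, 3.1, 3.8])

      # Fit THC = a * R695 + b and use the line to predict new samples
      a, b = np.polyfit(reflectance_695, thc_percent, deg=1)
      predict_thc = lambda r695: a * r695 + b
      print(f"slope={a:.2f}, intercept={b:.2f}, THC at R=0.40: {predict_thc(0.40):.2f}%")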

  14. A Statistics-based Platform for Quantitative N-terminome Analysis and Identification of Protease Cleavage Products*

    PubMed Central

    auf dem Keller, Ulrich; Prudova, Anna; Gioia, Magda; Butler, Georgina S.; Overall, Christopher M.

    2010-01-01

    Terminal amine isotopic labeling of substrates (TAILS), our recently introduced platform for quantitative N-terminome analysis, enables wide dynamic range identification of original mature protein N-termini and protease cleavage products. Modifying TAILS by use of isobaric tag for relative and absolute quantification (iTRAQ)-like labels for quantification together with a robust statistical classifier derived from experimental protease cleavage data, we report reliable and statistically valid identification of proteolytic events in complex biological systems in MS2 mode. The statistical classifier is supported by a novel parameter evaluating ion intensity-dependent quantification confidences of single peptide quantifications, the quantification confidence factor (QCF). Furthermore, the isoform assignment score (IAS) is introduced, a new scoring system for the evaluation of single peptide-to-protein assignments based on high confidence protein identifications in the same sample prior to negative selection enrichment of N-terminal peptides. By these approaches, we identified and validated, in addition to known substrates, low abundance novel bioactive MMP-2 targets including the plasminogen receptor S100A10 (p11) and the proinflammatory cytokine proEMAP/p43 that were previously undescribed. PMID:20305283

  15. Simultaneous Quantification of Dexpanthenol and Resorcinol from Hair Care Formulation Using Liquid Chromatography: Method Development and Validation.

    PubMed

    De, Amit Kumar; Chowdhury, Partha Pratim; Chattapadhyay, Shyamaprasad

    2016-01-01

    The current study presents the simultaneous quantification of dexpanthenol and resorcinol from a marketed hair care formulation. Dexpanthenol is often present as an active ingredient in personal care products for its beautifying, invigorating, restorative, and smoothing properties. On the other hand, resorcinol is mainly prescribed for the treatment of seborrheic dermatitis of the scalp. The toxic side effects of resorcinol limit its use in dermatological preparations. Therefore, an accurate technique for the simultaneous estimation of these two components can help formulation industries analyze their product quality accurately. In the current study, a high-performance liquid chromatographic technique was developed using a C18 column and a mobile phase consisting of phosphate buffer (pH 2.8) with gradient elution. The mobile phase flow rate was 0.6 mL per minute and the detection wavelength was 210 nm for dexpanthenol and 280 nm for resorcinol. The linearity study was carried out using five solutions with concentrations ranging between 10.34 μg·mL⁻¹ and 82.69 μg·mL⁻¹ (r² = 0.999) for resorcinol and 10.44 μg·mL⁻¹ and 83.50 μg·mL⁻¹ (r² = 0.998) for dexpanthenol. The method has been validated as per ICH Q2(R1) guidelines. The single-step sample preparation together with the accuracy and precision (intraday and interday) results makes the method suitable for the simultaneous quantification of dexpanthenol and resorcinol from any personal care product or dermatological preparation containing these two ingredients.
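
    The linearity figures above come from external calibration; the sketch below shows that step in Python (fit peak area against concentration, then invert the fit for unknowns). The peak areas and function names are placeholders, not values or code from the published method.

      import numpy as np

      def calibrate(concentrations, peak_areas):
          """Fit area = slope * conc + intercept and return a function mapping a
          measured peak area back to concentration (same units as the standards)."""
          slope, intercept = np.polyfit(concentrations, peak_areas, deg=1)
          return lambda area: (area - intercept) / slope

      # Hypothetical five-point calibration for resorcinol over 10.34-82.69 ug/mL
      conc  = np.array([10.34, 20.67, 41.35, 62.02, 82.69])
      areas = np.array([152.0, 305.0, 612.0, 915.0, 1221.0])   # invented areas

      to_conc = calibrate(conc, areas)
      print(f"Sample with peak area 480 -> {to_conc(480.0):.1f} ug/mL resorcinol")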

  16. Simultaneous Quantification of Dexpanthenol and Resorcinol from Hair Care Formulation Using Liquid Chromatography: Method Development and Validation

    PubMed Central

    De, Amit Kumar; Chowdhury, Partha Pratim; Chattapadhyay, Shyamaprasad

    2016-01-01

    The current study presents the simultaneous quantification of dexpanthenol and resorcinol from a marketed hair care formulation. Dexpanthenol is often present as an active ingredient in personal care products for its beautifying, invigorating, restorative, and smoothing properties. On the other hand, resorcinol is mainly prescribed for the treatment of seborrheic dermatitis of the scalp. The toxic side effects of resorcinol limit its use in dermatological preparations. Therefore, an accurate technique for the simultaneous estimation of these two components can help formulation industries analyze their product quality accurately. In the current study, a high-performance liquid chromatographic technique was developed using a C18 column and a mobile phase consisting of phosphate buffer (pH 2.8) with gradient elution. The mobile phase flow rate was 0.6 mL per minute and the detection wavelength was 210 nm for dexpanthenol and 280 nm for resorcinol. The linearity study was carried out using five solutions with concentrations ranging between 10.34 μg·mL⁻¹ and 82.69 μg·mL⁻¹ (r² = 0.999) for resorcinol and 10.44 μg·mL⁻¹ and 83.50 μg·mL⁻¹ (r² = 0.998) for dexpanthenol. The method has been validated as per ICH Q2(R1) guidelines. The single-step sample preparation together with the accuracy and precision (intraday and interday) results makes the method suitable for the simultaneous quantification of dexpanthenol and resorcinol from any personal care product or dermatological preparation containing these two ingredients. PMID:27042377

  17. Quantification of strontium in human serum by ICP-MS using alternate analyte-free matrix and its application to a pilot bioequivalence study of two strontium ranelate oral formulations in healthy Chinese subjects.

    PubMed

    Zhang, Dan; Wang, Xiaolin; Liu, Man; Zhang, Lina; Deng, Ming; Liu, Huichen

    2015-01-01

    A rapid, sensitive and accurate ICP-MS method, using an alternate analyte-free matrix for calibration standards preparation and a rapid direct dilution procedure for sample preparation, was developed and validated for the quantification of exogenous strontium (Sr) from the drug in human serum. Serum was prepared by direct dilution (1:29, v/v) in an acidic solution consisting of nitric acid (0.1%) and germanium (Ge) added as internal standard (IS), to obtain a simple, high-throughput preparation procedure with minimized matrix effect and good repeatability. ICP-MS analysis was performed using collision cell technology (CCT) mode. An alternate matrix method, using distilled water as the analyte-free matrix for the preparation of calibration standards (CS), was used to avoid the influence of endogenous Sr in serum on the quantification. The method was validated in terms of selectivity, carry-over, matrix effects, lower limit of quantification (LLOQ), linearity, precision and accuracy, and stability. Instrumental linearity was verified in the range of 1.00-500 ng/mL, corresponding to a concentration range of 0.0300-15.0 μg/mL in a 50 μL sample of serum matrix or alternate matrix. Intra- and inter-day precision as relative standard deviation (RSD) were less than 8.0% and accuracy as relative error (RE) was within ±3.0%. The method allowed a high sample throughput, and was sensitive and accurate enough for a pilot bioequivalence study in healthy male Chinese subjects following single oral administration of two formulations containing 2 g of strontium ranelate. Copyright © 2014 Elsevier GmbH. All rights reserved.
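
    A hedged sketch of the calculation implied by the record above: calibrate the Sr/Ge intensity ratio against standards prepared in the analyte-free alternate matrix, then back-calculate the serum concentration through the 1:29 (i.e., 30-fold) dilution. The signal intensities and function names are hypothetical; only the dilution factor and concentration ranges come from the record.

      import numpy as np

      DILUTION_FACTOR = 30          # serum diluted 1:29 (v/v) before measurement

      def build_curve(std_conc_ng_ml, sr_counts, ge_counts):
          """Calibrate the Sr/Ge intensity ratio against standards prepared in the
          analyte-free alternate matrix (distilled water)."""
          ratios = np.asarray(sr_counts) / np.asarray(ge_counts)
          slope, intercept = np.polyfit(std_conc_ng_ml, ratios, deg=1)
          return slope, intercept

      def serum_sr_ug_ml(sr_counts, ge_counts, slope, intercept):
          """Back-calculate the serum Sr concentration from one diluted sample."""
          ratio = sr_counts / ge_counts
          diluted_ng_ml = (ratio - intercept) / slope
          return diluted_ng_ml * DILUTION_FACTOR / 1000.0   # ng/mL -> ug/mL

      # Hypothetical calibration (1-500 ng/mL instrumental range) and one sample
      slope, intercept = build_curve([1, 10, 50, 200, 500],
                                     [210, 2050, 10100, 40300, 100800],
                                     [10000, 10050, 9980, 10020, 10010])
      print(f"{serum_sr_ug_ml(25000, 10000, slope, intercept):.2f} ug/mL Sr in serum")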

  18. Development and validation of high-performance liquid chromatography and high-performance thin-layer chromatography methods for the quantification of khellin in Ammi visnaga seed

    PubMed Central

    Kamal, Abid; Khan, Washim; Ahmad, Sayeed; Ahmad, F. J.; Saleem, Kishwar

    2015-01-01

    Objective: The present study was designed to develop simple, accurate and sensitive reversed-phase high-performance liquid chromatography (RP-HPLC) and high-performance thin-layer chromatography (HPTLC) methods for the quantification of khellin present in the seeds of Ammi visnaga. Materials and Methods: RP-HPLC analysis was performed on a C18 column with methanol:water (75:25, v/v) as the mobile phase. The HPTLC method involved densitometric evaluation of khellin after resolving it on a silica gel plate using ethyl acetate:toluene:formic acid (5.5:4.0:0.5, v/v/v) as the mobile phase. Results: The developed HPLC and HPTLC methods were validated for precision (interday, intraday and intersystem), robustness, accuracy, limit of detection and limit of quantification. The relationship between the concentration of standard solutions and the peak response was linear in both methods, with concentration ranges of 10–80 μg/mL in HPLC and 25–1,000 ng/spot in HPTLC for khellin. The relative standard deviation values for method precision were 0.63–1.97% in HPLC and 0.62–2.05% in HPTLC for khellin. Accuracy was checked by recovery studies conducted at three different concentration levels, and the average percentage recovery was found to be 100.53% in HPLC and 100.08% in HPTLC for khellin. Conclusions: The developed HPLC and HPTLC methods for the quantification of khellin were found to be simple, precise, specific, sensitive and accurate, and can be used for routine analysis and quality control of A. visnaga and several formulations containing it as an ingredient. PMID:26681890

  19. Uncertainty Quantification for Polynomial Systems via Bernstein Expansions

    NASA Technical Reports Server (NTRS)

    Crespo, Luis G.; Kenny, Sean P.; Giesy, Daniel P.

    2012-01-01

    This paper presents a unifying framework to uncertainty quantification for systems having polynomial response metrics that depend on both aleatory and epistemic uncertainties. The approach proposed, which is based on the Bernstein expansions of polynomials, enables bounding the range of moments and failure probabilities of response metrics as well as finding supersets of the extreme epistemic realizations where the limits of such ranges occur. These bounds and supersets, whose analytical structure renders them free of approximation error, can be made arbitrarily tight with additional computational effort. Furthermore, this framework enables determining the importance of particular uncertain parameters according to the extent to which they affect the first two moments of response metrics and failure probabilities. This analysis enables determining the parameters that should be considered uncertain as well as those that can be assumed to be constants without incurring significant error. The analytical nature of the approach eliminates the numerical error that characterizes the sampling-based techniques commonly used to propagate aleatory uncertainties as well as the possibility of underpredicting the range of the statistic of interest that may result from searching for the best- and worst-case epistemic values via nonlinear optimization or sampling.
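
    The range-enclosure property underlying this framework can be illustrated with a univariate example (the paper itself treats multivariate polynomials with aleatory and epistemic parameters): the Bernstein coefficients of a polynomial on [0,1] bound its range, and the bounds tighten under degree elevation or subdivision. The polynomial below is an arbitrary illustration, not one from the paper.

      from math import comb

      def bernstein_coefficients(a):
          """Bernstein coefficients on [0,1] of p(x) = sum_k a[k] * x**k.
          The min/max of these coefficients enclose the range of p on [0,1]."""
          n = len(a) - 1
          return [sum(comb(j, k) / comb(n, k) * a[k] for k in range(j + 1))
                  for j in range(n + 1)]

      # Example: p(x) = x - x^2, whose true range on [0,1] is [0, 0.25]
      b = bernstein_coefficients([0.0, 1.0, -1.0])
      print(b, "-> guaranteed bounds:", min(b), max(b))   # conservative enclosure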

  20. Addressing Phase Errors in Fat-Water Imaging Using a Mixed Magnitude/Complex Fitting Method

    PubMed Central

    Hernando, D.; Hines, C. D. G.; Yu, H.; Reeder, S.B.

    2012-01-01

    Accurate, noninvasive measurements of liver fat content are needed for the early diagnosis and quantitative staging of nonalcoholic fatty liver disease. Chemical shift-based fat quantification methods acquire images at multiple echo times using a multiecho spoiled gradient echo sequence, and provide fat fraction measurements through postprocessing. However, phase errors, such as those caused by eddy currents, can adversely affect fat quantification. These phase errors are typically most significant at the first echo of the echo train, and introduce bias in complex-based fat quantification techniques. These errors can be overcome using a magnitude-based technique (where the phase of all echoes is discarded), but at the cost of significantly degraded signal-to-noise ratio, particularly for certain choices of echo time combinations. In this work, we develop a reconstruction method that overcomes these phase errors without the signal-to-noise ratio penalty incurred by magnitude fitting. This method discards the phase of the first echo (which is often corrupted) while maintaining the phase of the remaining echoes (where phase is unaltered). We test the proposed method on 104 patient liver datasets (from 52 patients, each scanned twice), where the fat fraction measurements are compared to coregistered spectroscopy measurements. We demonstrate that mixed fitting is able to provide accurate fat fraction measurements with high signal-to-noise ratio and low bias over a wide choice of echo combinations. PMID:21713978

  1. Vitamin D in foods: an evolution of knowledge (chapter 60)

    USDA-ARS?s Scientific Manuscript database

    Accurate data for vitamin D in foods are essential to support epidemiological and clinical studies seeking to identify associations between total vitamin D “exposure” and health outcomes that require quantification of dietary intake, and also to inform health professionals about wise food choices fo...

  2. Application of an energy balance method for estimating evapotranspiration in cropping systems

    USDA-ARS?s Scientific Manuscript database

    Accurate quantification of evapotranspiration (ET, consumptive water use) from planting through harvest is critical for managing the limited water resources for crop irrigation. Our objective was to develop and apply an improved land-crop surface residual energy balance (EB) method for quantifying E...

  3. RECOVERY OF SEMI-VOLATILE ORGANIC COMPOUNDS DURING SAMPLE PREPARATION: IMPLICATIONS FOR CHARACTERIZATION OF AIRBORNE PARTICULATE MATTER

    EPA Science Inventory

    Semi-volatile compounds present special analytical challenges not met by conventional methods for analysis of ambient particulate matter (PM). Accurate quantification of PM-associated organic compounds requires validation of the laboratory procedures for recovery over a wide v...

  4. Quantitative PCR for Detection and Enumeration of Genetic Markers of Bovine Fecal Pollution

    EPA Science Inventory

    Accurate assessment of health risks associated with bovine (cattle) fecal pollution requires a reliable host-specific genetic marker and a rapid quantification method. We report the development of quantitative PCR assays for the detection of two recently described cow feces-spec...

  5. Lamb Wave Damage Quantification Using GA-Based LS-SVM.

    PubMed

    Sun, Fuqiang; Wang, Ning; He, Jingjing; Guan, Xuefei; Yang, Jinsong

    2017-06-12

    Lamb waves have been reported to be an efficient tool for non-destructive evaluations (NDE) for various application scenarios. However, accurate and reliable damage quantification using the Lamb wave method is still a practical challenge, due to the complex underlying mechanism of Lamb wave propagation and damage detection. This paper presents a Lamb wave damage quantification method using a least square support vector machine (LS-SVM) and a genetic algorithm (GA). Three damage sensitive features, namely, normalized amplitude, phase change, and correlation coefficient, were proposed to describe changes of Lamb wave characteristics caused by damage. In view of commonly used data-driven methods, the GA-based LS-SVM model using the proposed three damage sensitive features was implemented to evaluate the crack size. The GA method was adopted to optimize the model parameters. The results of GA-based LS-SVM were validated using coupon test data and lap joint component test data with naturally developed fatigue cracks. Cases of different loading and manufacturer were also included to further verify the robustness of the proposed method for crack quantification.
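
    A minimal sketch of least-squares SVM regression of crack size on the three damage-sensitive features named above, assuming an RBF kernel and fixed hyperparameters (the paper tunes these with a genetic algorithm, which is omitted here). The training values, hyperparameters, and function names are invented for illustration.

      import numpy as np

      def rbf_kernel(A, B, gamma_k):
          d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
          return np.exp(-gamma_k * d2)

      def lssvm_fit(X, y, gamma_reg=10.0, gamma_k=1.0):
          """LS-SVM regression: solve the standard bordered linear system
          [[0, 1^T], [1, K + I/gamma_reg]] [b; alpha] = [0; y]."""
          n = len(y)
          K = rbf_kernel(X, X, gamma_k)
          A = np.zeros((n + 1, n + 1))
          A[0, 1:] = 1.0
          A[1:, 0] = 1.0
          A[1:, 1:] = K + np.eye(n) / gamma_reg
          sol = np.linalg.solve(A, np.concatenate(([0.0], y)))
          b, alpha = sol[0], sol[1:]
          return lambda Xq: rbf_kernel(np.atleast_2d(Xq), X, gamma_k) @ alpha + b

      # Hypothetical training set: [normalized amplitude, phase change,
      # correlation coefficient] -> crack size (mm); values are illustrative only.
      X = np.array([[0.95, 0.02, 0.99], [0.88, 0.05, 0.97],
                    [0.80, 0.09, 0.93], [0.71, 0.14, 0.88]])
      y = np.array([1.0, 2.0, 3.5, 5.0])
      predict = lssvm_fit(X, y)
      print(predict(np.array([0.84, 0.07, 0.95]))[0])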

  6. Multivariate Analysis for Quantification of Plutonium(IV) in Nitric Acid Based on Absorption Spectra

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lines, Amanda M.; Adami, Susan R.; Sinkov, Sergey I.

    Development of more effective, reliable, and fast methods for monitoring process streams is a growing opportunity for analytical applications. Many fields can benefit from on-line monitoring, including the nuclear fuel cycle, where improved methods for monitoring radioactive materials will facilitate maintenance of proper safeguards and ensure safe and efficient processing of materials. On-line process monitoring with a focus on optical spectroscopy can provide a fast, non-destructive method for monitoring chemical species. However, identification and quantification of species can be hindered by the complexity of the solutions if bands overlap or show condition-dependent spectral features. Plutonium(IV) is one example of a species which displays significant spectral variation with changing nitric acid concentration. Single-variate analysis (i.e., Beer's Law) is difficult to apply to the quantification of Pu(IV) unless the nitric acid concentration is known and separate calibration curves have been made for all possible acid strengths. Multivariate, or chemometric, analysis is an approach that allows for the accurate quantification of Pu(IV) without a priori knowledge of the nitric acid concentration.
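
    A hedged sketch of a multivariate (chemometric) calibration of the kind described above, using partial least squares regression from scikit-learn on synthetic spectra. The spectral model, concentration ranges, noise level, and number of components are invented stand-ins; the record does not state which chemometric algorithm was actually used.

      import numpy as np
      from sklearn.cross_decomposition import PLSRegression

      rng = np.random.default_rng(0)

      # Synthetic training spectra: absorbance at 200 wavelengths for solutions
      # spanning a grid of Pu(IV) and HNO3 concentrations (placeholder data
      # standing in for a real calibration set).
      n_samples, n_wavelengths = 40, 200
      pu_conc = rng.uniform(0.001, 0.05, n_samples)      # mol/L, hypothetical
      hno3_conc = rng.uniform(0.5, 8.0, n_samples)        # mol/L, hypothetical
      base = np.linspace(0, 1, n_wavelengths)
      spectra = (pu_conc[:, None]
                 * np.exp(-((base - 0.5) * (8 + hno3_conc[:, None])) ** 2)
                 + 0.002 * rng.standard_normal((n_samples, n_wavelengths)))

      # Multivariate calibration: predict Pu(IV) without knowing the acid strength
      pls = PLSRegression(n_components=4)
      pls.fit(spectra, pu_conc)
      print(pls.predict(spectra[:3]).ravel(), pu_conc[:3])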

  7. Lamb Wave Damage Quantification Using GA-Based LS-SVM

    PubMed Central

    Sun, Fuqiang; Wang, Ning; He, Jingjing; Guan, Xuefei; Yang, Jinsong

    2017-01-01

    Lamb waves have been reported to be an efficient tool for non-destructive evaluations (NDE) for various application scenarios. However, accurate and reliable damage quantification using the Lamb wave method is still a practical challenge, due to the complex underlying mechanism of Lamb wave propagation and damage detection. This paper presents a Lamb wave damage quantification method using a least square support vector machine (LS-SVM) and a genetic algorithm (GA). Three damage sensitive features, namely, normalized amplitude, phase change, and correlation coefficient, were proposed to describe changes of Lamb wave characteristics caused by damage. In view of commonly used data-driven methods, the GA-based LS-SVM model using the proposed three damage sensitive features was implemented to evaluate the crack size. The GA method was adopted to optimize the model parameters. The results of GA-based LS-SVM were validated using coupon test data and lap joint component test data with naturally developed fatigue cracks. Cases of different loading and manufacturer were also included to further verify the robustness of the proposed method for crack quantification. PMID:28773003

  8. Characterization and Quantification of Intact 26S Proteasome Proteins by Real-Time Measurement of Intrinsic Fluorescence Prior to Top-down Mass Spectrometry

    PubMed Central

    Russell, Jason D.; Scalf, Mark; Book, Adam J.; Ladror, Daniel T.; Vierstra, Richard D.; Smith, Lloyd M.; Coon, Joshua J.

    2013-01-01

    Quantification of gas-phase intact protein ions by mass spectrometry (MS) is impeded by highly variable ionization, ion transmission, and ion detection efficiencies. Therefore, quantification of proteins using MS-associated techniques is almost exclusively done after proteolysis, where peptides serve as proxies for estimating protein abundance. Advances in instrumentation, protein separations, and informatics have made large-scale sequencing of intact proteins using top-down proteomics accessible to the proteomics community; yet quantification of proteins using a top-down workflow has largely been unaddressed. Here we describe a label-free approach to determine the abundance of intact proteins separated by nanoflow liquid chromatography prior to MS analysis by using solution-phase measurements of ultraviolet light-induced intrinsic fluorescence (UV-IF). UV-IF is measured directly at the electrospray interface just prior to the capillary exit, where proteins containing at least one tryptophan residue are readily detected. UV-IF quantification was demonstrated using commercially available protein standards and provided more accurate and precise protein quantification than MS ion current. We evaluated the parallel use of UV-IF and top-down tandem MS for quantification and identification of protein subunits and associated proteins from an affinity-purified 26S proteasome sample from Arabidopsis thaliana. We identified 26 unique proteins and quantified 13 tryptophan-containing species. Our analyses discovered previously unidentified N-terminal processing of the β6 (PBF1) and β7 (PBG1) subunits; such processing of PBG1 may generate a heretofore unknown additional protease active site upon cleavage. In addition, our approach permitted the unambiguous identification and quantification of both isoforms of the proteasome-associated protein DSS1. PMID:23536786

  9. Characterization and quantification of intact 26S proteasome proteins by real-time measurement of intrinsic fluorescence prior to top-down mass spectrometry.

    PubMed

    Russell, Jason D; Scalf, Mark; Book, Adam J; Ladror, Daniel T; Vierstra, Richard D; Smith, Lloyd M; Coon, Joshua J

    2013-01-01

    Quantification of gas-phase intact protein ions by mass spectrometry (MS) is impeded by highly variable ionization, ion transmission, and ion detection efficiencies. Therefore, quantification of proteins using MS-associated techniques is almost exclusively done after proteolysis, where peptides serve as proxies for estimating protein abundance. Advances in instrumentation, protein separations, and informatics have made large-scale sequencing of intact proteins using top-down proteomics accessible to the proteomics community; yet quantification of proteins using a top-down workflow has largely been unaddressed. Here we describe a label-free approach to determine the abundance of intact proteins separated by nanoflow liquid chromatography prior to MS analysis by using solution-phase measurements of ultraviolet light-induced intrinsic fluorescence (UV-IF). UV-IF is measured directly at the electrospray interface just prior to the capillary exit, where proteins containing at least one tryptophan residue are readily detected. UV-IF quantification was demonstrated using commercially available protein standards and provided more accurate and precise protein quantification than MS ion current. We evaluated the parallel use of UV-IF and top-down tandem MS for quantification and identification of protein subunits and associated proteins from an affinity-purified 26S proteasome sample from Arabidopsis thaliana. We identified 26 unique proteins and quantified 13 tryptophan-containing species. Our analyses discovered previously unidentified N-terminal processing of the β6 (PBF1) and β7 (PBG1) subunits; such processing of PBG1 may generate a heretofore unknown additional protease active site upon cleavage. In addition, our approach permitted the unambiguous identification and quantification of both isoforms of the proteasome-associated protein DSS1.

  10. A validated Fourier transform infrared spectroscopy method for quantification of total lactones in Inula racemosa and Andrographis paniculata.

    PubMed

    Shivali, Garg; Praful, Lahorkar; Vijay, Gadgil

    2012-01-01

    Fourier transform infrared (FT-IR) spectroscopy is a technique widely used for detection and quantification of various chemical moieties. This paper describes the use of FT-IR spectroscopy for the quantification of total lactones present in Inula racemosa and Andrographis paniculata. The objective was to validate the FT-IR spectroscopy method for quantification of total lactones in I. racemosa and A. paniculata. Dried and powdered I. racemosa roots and A. paniculata plant were extracted with ethanol and dried to remove ethanol completely. The ethanol extract was analysed in a KBr pellet by FT-IR spectroscopy. The FT-IR spectroscopy method was validated and compared with a known spectrophotometric method for quantification of lactones in A. paniculata. By FT-IR spectroscopy, the amount of total lactones was found to be 2.12 ± 0.47% (n = 3) in I. racemosa and 8.65 ± 0.51% (n = 3) in A. paniculata. The method showed comparable results with a known spectrophotometric method used for quantification of such lactones: 8.42 ± 0.36% (n = 3) in A. paniculata. Limits of detection and quantification were 1 µg and 10 µg, respectively, for isoalantolactone, and 1.5 µg and 15 µg, respectively, for andrographolide. Recoveries were over 98%, with good intra- and interday repeatability (RSD ≤ 2%). The FT-IR spectroscopy method proved linear, accurate, precise and specific, with low limits of detection and quantification, for estimation of total lactones, and is less tedious than the UV spectrophotometric method for the compounds tested. This validated FT-IR spectroscopy method is readily applicable for the quality control of I. racemosa and A. paniculata. Copyright © 2011 John Wiley & Sons, Ltd.

  11. 18O-labeled proteome reference as global internal standards for targeted quantification by selected reaction monitoring-mass spectrometry.

    PubMed

    Kim, Jong-Seo; Fillmore, Thomas L; Liu, Tao; Robinson, Errol; Hossain, Mahmud; Champion, Boyd L; Moore, Ronald J; Camp, David G; Smith, Richard D; Qian, Wei-Jun

    2011-12-01

    Selected reaction monitoring (SRM)-MS is an emerging technology for high throughput targeted protein quantification and verification in biomarker discovery studies; however, the cost associated with the application of stable isotope-labeled synthetic peptides as internal standards can be prohibitive for screening a large number of candidate proteins as often required in the preverification phase of discovery studies. Herein we present a proof of concept study using an (18)O-labeled proteome reference as global internal standards (GIS) for SRM-based relative quantification. The (18)O-labeled proteome reference (or GIS) can be readily prepared and contains a heavy isotope ((18)O)-labeled internal standard for every possible tryptic peptide. Our results showed that the percentage of heavy isotope ((18)O) incorporation applying an improved protocol was >99.5% for most peptides investigated. The accuracy, reproducibility, and linear dynamic range of quantification were further assessed based on known ratios of standard proteins spiked into the labeled mouse plasma reference. Reliable quantification was observed with high reproducibility (i.e. coefficient of variance <10%) for analyte concentrations that were set at 100-fold higher or lower than those of the GIS based on the light ((16)O)/heavy ((18)O) peak area ratios. The utility of (18)O-labeled GIS was further illustrated by accurate relative quantification of 45 major human plasma proteins. Moreover, quantification of the concentrations of C-reactive protein and prostate-specific antigen was illustrated by coupling the GIS with standard additions of purified protein standards. Collectively, our results demonstrated that the use of (18)O-labeled proteome reference as GIS provides a convenient, low cost, and effective strategy for relative quantification of a large number of candidate proteins in biological or clinical samples using SRM.
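
    Two small helpers illustrating the quantification logic described above (not code from the study): relative quantification as the 16O/18O peak-area ratio against the labeled GIS, and absolute quantification by standard addition of a purified protein standard, where the endogenous amount is the intercept/slope of the ratio-versus-spike line. All numeric values are hypothetical.

      import numpy as np

      def relative_quant(light_area, heavy_area):
          """16O/18O peak-area ratio of a target versus the 18O-labeled GIS."""
          return light_area / heavy_area

      def standard_addition_conc(spiked_amounts, ratios):
          """Endogenous amount by standard addition: the light/heavy ratio is
          proportional to (endogenous + spike), i.e. ratio = k*spike + k*endogenous,
          so endogenous = intercept / slope."""
          slope, intercept = np.polyfit(spiked_amounts, ratios, deg=1)
          return intercept / slope

      # Hypothetical SRM transition areas and a three-point standard addition
      print(relative_quant(light_area=4.2e5, heavy_area=2.1e5))            # -> 2.0
      print(standard_addition_conc([0.0, 5.0, 10.0], [0.80, 1.62, 2.41]))  # ~5 units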

  12. Microfluidic Device to Quantify the Behavior of Therapeutic Bacteria in Three-Dimensional Tumor Tissue.

    PubMed

    Brackett, Emily L; Swofford, Charles A; Forbes, Neil S

    2016-01-01

    Microfluidic devices enable precise quantification of the interactions between anti-cancer bacteria and tumor tissue. Direct observation of bacterial movement and gene expression in tissue is difficult with either monolayers of cells or tumor-bearing mice. Quantification of these interactions is necessary to understand the inherent mechanisms of bacterial targeting and to develop modified organisms with enhanced therapeutic properties. Here we describe the procedures for designing, printing, and assembling microfluidic tumor-on-a-chip devices. We also describe the procedures for inserting three-dimensional tumor-cell masses, exposure to bacteria, and analyzing the resultant images.

  13. An accurate bacterial DNA quantification assay for HTS library preparation of human biological samples.

    PubMed

    Seashols-Williams, Sarah; Green, Raquel; Wohlfahrt, Denise; Brand, Angela; Tan-Torres, Antonio Limjuco; Nogales, Francy; Brooks, J Paul; Singh, Baneshwar

    2018-05-17

    Sequencing and classification of microbial taxa within forensically relevant biological fluids has the potential for applications in the forensic science and biomedical fields. The quantity of bacterial DNA from human samples is currently estimated based on quantity of total DNA isolated. This method can miscalculate bacterial DNA quantity due to the mixed nature of the sample, and consequently library preparation is often unreliable. We developed an assay that can accurately and specifically quantify bacterial DNA within a mixed sample for reliable 16S ribosomal DNA (16S rDNA) library preparation and high throughput sequencing (HTS). A qPCR method was optimized using universal 16S rDNA primers, and a commercially available bacterial community DNA standard was used to develop a precise standard curve. Following qPCR optimization, 16S rDNA libraries from saliva, vaginal and menstrual secretions, urine, and fecal matter were amplified and evaluated at various DNA concentrations; successful HTS data were generated with as low as 20 pg of bacterial DNA. Changes in bacterial DNA quantity did not impact observed relative abundances of major bacterial taxa, but relative abundance changes of minor taxa were observed. Accurate quantification of microbial DNA resulted in consistent, successful library preparations for HTS analysis. © 2018 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
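
    A minimal sketch of the qPCR calibration step implied above: fit a standard curve from a dilution series of the bacterial community DNA standard, derive the amplification efficiency from the slope, and quantify unknowns from their Cq. The dilution series, Cq values, and function names are hypothetical.

      import numpy as np

      def fit_standard_curve(known_pg, cq_values):
          """Fit Cq = slope * log10(input) + intercept over the dilution series."""
          slope, intercept = np.polyfit(np.log10(known_pg), cq_values, deg=1)
          efficiency = 10 ** (-1.0 / slope) - 1.0   # 1.0 corresponds to 100%
          return slope, intercept, efficiency

      def quantify(cq, slope, intercept):
          """Bacterial DNA input (same units as the standards) for an unknown Cq."""
          return 10 ** ((cq - intercept) / slope)

      # Hypothetical 10-fold dilution series of the community standard (pg/reaction)
      standards = [10000, 1000, 100, 10]
      cqs       = [18.1, 21.5, 24.9, 28.3]
      slope, intercept, eff = fit_standard_curve(standards, cqs)
      print(f"efficiency ~ {eff:.2f}; Cq 26.0 -> {quantify(26.0, slope, intercept):.0f} pg")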

  14. Investigating the limits of PET/CT imaging at very low true count rates and high random fractions in ion-beam therapy monitoring

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kurz, Christopher, E-mail: Christopher.Kurz@physik.uni-muenchen.de; Bauer, Julia; Conti, Maurizio

    Purpose: External beam radiotherapy with protons and heavier ions enables a tighter conformation of the applied dose to arbitrarily shaped tumor volumes with respect to photons, but is more sensitive to uncertainties in the radiotherapeutic treatment chain. Consequently, an independent verification of the applied treatment is highly desirable. For this purpose, the irradiation-induced β⁺-emitter distribution within the patient is detected shortly after irradiation by a commercial full-ring positron emission tomography/x-ray computed tomography (PET/CT) scanner installed next to the treatment rooms at the Heidelberg Ion-Beam Therapy Center (HIT). A major challenge to this approach is posed by the small number of detected coincidences. This contribution aims at characterizing the performance of the used PET/CT device and identifying the best-performing reconstruction algorithm under the particular statistical conditions of PET-based treatment monitoring. Moreover, this study addresses the impact of radiation background from the intrinsically radioactive lutetium-oxyorthosilicate (LSO)-based detectors at low counts. Methods: The authors have acquired 30 subsequent PET scans of a cylindrical phantom emulating a patient-like activity pattern and spanning the entire patient counting regime in terms of true coincidences and random fractions (RFs). Accuracy and precision of activity quantification, image noise, and geometrical fidelity of the scanner have been investigated for various reconstruction algorithms and settings in order to identify a practical, well-suited reconstruction scheme for PET-based treatment verification. Truncated listmode data have been utilized for separating the effects of small true count numbers and high RFs on the reconstructed images. A corresponding simulation study enabled extending the results to an even wider range of counting statistics and to additionally investigate the impact of scatter coincidences. Eventually, the recommended reconstruction scheme has been applied to exemplary postirradiation patient data sets. Results: Among the investigated reconstruction options, the overall best results in terms of image noise, activity quantification, and accurate geometrical recovery were achieved using the ordered subset expectation maximization reconstruction algorithm with time-of-flight (TOF) and point-spread function (PSF) information. For this algorithm, reasonably accurate (better than 5%) and precise (uncertainty of the mean activity below 10%) imaging can be provided down to 80 000 true coincidences at 96% RF. Image noise and geometrical fidelity are generally improved for fewer iterations. The main limitation for PET-based treatment monitoring has been identified in the small number of true coincidences, rather than the high intrinsic random background. Application of the optimized reconstruction scheme to patient data sets results in a 25%-50% reduced image noise at a comparable activity quantification accuracy and an improved geometrical performance with respect to the formerly used reconstruction scheme at HIT, adopted from nuclear medicine applications. Conclusions: Under the poor statistical conditions in PET-based treatment monitoring, improved results can be achieved by considering PSF and TOF information during image reconstruction and by applying fewer iterations than in conventional nuclear medicine imaging. Geometrical fidelity and image noise are mainly limited by the low number of true coincidences, not the high LSO-related random background. The retrieved results might also impact other emerging PET applications at low counting statistics.

  15. MetaQuant: a tool for the automatic quantification of GC/MS-based metabolome data.

    PubMed

    Bunk, Boyke; Kucklick, Martin; Jonas, Rochus; Münch, Richard; Schobert, Max; Jahn, Dieter; Hiller, Karsten

    2006-12-01

    MetaQuant is a Java-based program for the automatic and accurate quantification of GC/MS-based metabolome data. In contrast to other programs MetaQuant is able to quantify hundreds of substances simultaneously with minimal manual intervention. The integration of a self-acting calibration function allows the parallel and fast calibration for several metabolites simultaneously. Finally, MetaQuant is able to import GC/MS data in the common NetCDF format and to export the results of the quantification into Systems Biology Markup Language (SBML), Comma Separated Values (CSV) or Microsoft Excel (XLS) format. MetaQuant is written in Java and is available under an open source license. Precompiled packages for the installation on Windows or Linux operating systems are freely available for download. The source code as well as the installation packages are available at http://bioinformatics.org/metaquant

  16. Evaluation of the performance of quantitative detection of the Listeria monocytogenes prfA locus with droplet digital PCR.

    PubMed

    Witte, Anna Kristina; Fister, Susanne; Mester, Patrick; Schoder, Dagmar; Rossmanith, Peter

    2016-11-01

    Fast and reliable pathogen detection is an important issue for human health. Since conventional microbiological methods are rather slow, there is growing interest in detection and quantification using molecular methods. The droplet digital polymerase chain reaction (ddPCR) is a relatively new PCR method for absolute and accurate quantification without external standards. Using the Listeria monocytogenes-specific prfA assay, we focused on the questions of whether the assay was directly transferable to ddPCR and whether ddPCR was suitable for samples derived from heterogeneous matrices, such as foodstuffs, which often include inhibitors and a non-target bacterial background flora. Although the prfA assay showed suboptimal cluster formation, use of ddPCR for quantification of L. monocytogenes from pure bacterial cultures, artificially contaminated cheese, and naturally contaminated foodstuff was satisfactory over a relatively broad dynamic range. Moreover, the results demonstrated the outstanding detection limit of one copy. However, while poorer DNA quality, such as that resulting from longer storage, can impair ddPCR, quantification of the internal amplification control (IAC) of prfA, which is integrated into the genome of L. monocytogenes ΔprfA, by ddPCR showed even slightly better quantification over a broader dynamic range. Graphical abstract: Evaluating the absolute quantification potential of ddPCR targeting Listeria monocytogenes prfA.
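
    Absolute quantification in ddPCR rests on Poisson statistics over the droplet partitions; the sketch below shows that calculation. The per-droplet volume is an assumed nominal value (instrument-specific), and the droplet counts and function name are hypothetical.

      import math

      def ddpcr_concentration(positive, total, droplet_volume_nl=0.85):
          """Absolute target concentration (copies/uL) from a ddPCR run, using the
          Poisson correction for droplets that contain more than one copy.
          droplet_volume_nl is a nominal per-droplet volume (assumed here)."""
          p = positive / total
          lam = -math.log(1.0 - p)                  # mean copies per droplet
          return lam / (droplet_volume_nl * 1e-3)   # copies per microliter

      # Hypothetical run: 2,500 positive droplets out of 15,000 accepted droplets
      print(f"{ddpcr_concentration(2500, 15000):.0f} copies/uL")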

  17. Evaluation of digital PCR for absolute RNA quantification.

    PubMed

    Sanders, Rebecca; Mason, Deborah J; Foy, Carole A; Huggett, Jim F

    2013-01-01

    Gene expression measurements detailing mRNA quantities are widely employed in molecular biology and are increasingly important in diagnostic fields. Reverse transcription (RT), necessary for generating complementary DNA, can be both inefficient and imprecise, but remains a quintessential RNA analysis tool using qPCR. This study developed a Transcriptomic Calibration Material and assessed the RT reaction using digital (d)PCR for RNA measurement. While many studies characterise dPCR capabilities for DNA quantification, less work has been performed investigating similar parameters using RT-dPCR for RNA analysis. RT-dPCR measurement using three, one-step RT-qPCR kits was evaluated using single and multiplex formats when measuring endogenous and synthetic RNAs. The best performing kit was compared to UV quantification and sensitivity and technical reproducibility investigated. Our results demonstrate assay and kit dependent RT-dPCR measurements differed significantly compared to UV quantification. Different values were reported by different kits for each target, despite evaluation of identical samples using the same instrument. RT-dPCR did not display the strong inter-assay agreement previously described when analysing DNA. This study demonstrates that, as with DNA measurement, RT-dPCR is capable of accurate quantification of low copy RNA targets, but the results are both kit and target dependent supporting the need for calibration controls.

  18. Normal Databases for the Relative Quantification of Myocardial Perfusion

    PubMed Central

    Rubeaux, Mathieu; Xu, Yuan; Germano, Guido; Berman, Daniel S.; Slomka, Piotr J.

    2016-01-01

    Purpose of review: Myocardial perfusion imaging (MPI) with SPECT is performed clinically worldwide to detect and monitor coronary artery disease (CAD). MPI allows an objective quantification of myocardial perfusion at stress and rest. This established technique relies on normal databases to compare patient scans against reference normal limits. In this review, we aim to introduce the process of MPI quantification with normal databases and describe the associated quantitative perfusion measures that are used. Recent findings: New equipment and new software reconstruction algorithms have been introduced that require the development of new normal limits. The appearance and regional count variations of a normal MPI scan may differ between these new scanners and standard Anger cameras. Therefore, these new systems may require the determination of new normal limits to achieve optimal accuracy in relative myocardial perfusion quantification. Accurate diagnostic and prognostic results rivaling those obtained by expert readers can be obtained by this widely used technique. Summary: Throughout this review, we emphasize the importance of the different normal databases and the need for specific databases relative to distinct imaging procedures. Use of appropriate normal limits allows optimal quantification of MPI by taking into account subtle image differences due to the hardware and software used, and the population studied. PMID:28138354

  19. Digital PCR as a tool to measure HIV persistence.

    PubMed

    Rutsaert, Sofie; Bosman, Kobus; Trypsteen, Wim; Nijhuis, Monique; Vandekerckhove, Linos

    2018-01-30

    Although antiretroviral therapy is able to suppress HIV replication in infected patients, the virus persists and rebounds when treatment is stopped. In order to find a cure that can eradicate the latent reservoir, one must be able to quantify the persisting virus. Traditionally, HIV persistence studies have used real-time PCR (qPCR) to measure the viral reservoir represented by HIV DNA and RNA. More recently, digital PCR has been gaining popularity as a novel approach to nucleic acid quantification, as it allows absolute target quantification. Various commercial platforms implementing the principle of digital PCR are now available, of which Bio-Rad's QX200 ddPCR is currently the most used platform in HIV research. Quantification of HIV by digital PCR is proving to be a valuable improvement over qPCR, as it is argued to have a higher robustness to mismatches between the primer-probe set and heterogeneous HIV sequences, and it eliminates the need for a standard curve, both of which are known to complicate reliable quantification. However, currently available digital PCR platforms occasionally struggle with unexplained false-positive partitions, and reliable segregation between positive and negative droplets remains disputed. Future developments and advancements of digital PCR technology promise to aid in the accurate quantification and characterization of the persistent HIV reservoir.

  20. Quantification of optical absorption coefficient from acoustic spectra in the optical diffusive regime using photoacoustic microscopy

    NASA Astrophysics Data System (ADS)

    Guo, Zijian; Favazza, Christopher; Wang, Lihong V.

    2012-02-01

    Photoacoustic (PA) tomography (PAT) can image optical absorption contrast with ultrasonic spatial resolution in the optical diffusive regime. Multi-wavelength PAT can noninvasively monitor hemoglobin oxygen saturation (sO2) with high sensitivity and fine spatial resolution. However, accurate quantification in PAT requires knowledge of the optical fluence distribution, acoustic wave attenuation, and detection system bandwidth. We propose a method to circumvent this requirement using acoustic spectra of PA signals acquired at two optical wavelengths. With the acoustic spectral method, the absorption coefficients of an oxygenated bovine blood phantom at 560 and 575 nm were quantified with errors of <5%.
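
    Once absorption coefficients at two wavelengths are available, recovering sO2 reduces to a small linear system in the oxy- and deoxyhemoglobin concentrations. The sketch below illustrates that inversion; the extinction coefficients and measured values are placeholder numbers for illustration only, not the tabulated constants or data used in the paper.

    ```python
    import numpy as np

    # Placeholder molar extinction coefficients [cm^-1 M^-1] at 560 and 575 nm.
    # Real values should be taken from tabulated hemoglobin spectra.
    eps = np.array([
        [32600.0, 39000.0],   # 560 nm: [HbO2, Hb] (illustrative numbers)
        [45000.0, 37000.0],   # 575 nm: [HbO2, Hb] (illustrative numbers)
    ])

    # Absorption coefficients [cm^-1] recovered from the PA acoustic spectra
    # at the two wavelengths (hypothetical measurements).
    mu_a = np.array([210.0, 240.0])

    # Solve eps @ [C_HbO2, C_Hb] = mu_a for the two concentrations.
    # A ln(10) factor applies if decadic extinction coefficients are used,
    # but it cancels in the sO2 ratio.
    c_hbo2, c_hb = np.linalg.solve(eps, mu_a)

    so2 = c_hbo2 / (c_hbo2 + c_hb)
    print(f"Estimated sO2: {so2:.2%}")
    ```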

  1. ddpcr: an R package and web application for analysis of droplet digital PCR data.

    PubMed

    Attali, Dean; Bidshahri, Roza; Haynes, Charles; Bryan, Jennifer

    2016-01-01

    Droplet digital polymerase chain reaction (ddPCR) is a novel platform for exact quantification of DNA that holds great promise in clinical diagnostics. It is increasingly popular due to its digital nature, which provides more accurate quantification and higher sensitivity than traditional real-time PCR. However, clinical adoption has been slowed in part by the lack of software tools available for analyzing ddPCR data. Here, we present ddpcr, a new R package for ddPCR visualization and analysis. In addition, ddpcr includes a web application (powered by the Shiny R package) that allows users to analyze ddPCR data using an interactive graphical interface.

  2. Localized 2D COSY sequences: Method and experimental evaluation for a whole metabolite quantification approach

    NASA Astrophysics Data System (ADS)

    Martel, Dimitri; Tse Ve Koon, K.; Le Fur, Yann; Ratiney, Hélène

    2015-11-01

    Two-dimensional spectroscopy offers the possibility to unambiguously distinguish metabolites by spreading out the multiplet structure of J-coupled spin systems into a second dimension. Quantification methods that perform parametric fitting of the 2D MRS signal have recently been proposed for J-resolved PRESS (JPRESS) but not explicitly for Localized Correlation Spectroscopy (LCOSY). Here, through a whole metabolite quantification approach, the quantification performance of correlation spectroscopy is studied. The ability to quantify metabolite relaxation time constants is studied for three localized 2D MRS sequences (LCOSY, LCTCOSY and JPRESS) in vitro on preclinical MR systems. The issues encountered during implementation and the quantification strategies are discussed with the help of the Fisher matrix formalism. The described parameterized models enable the computation of the lower bound on error variance, generally known as the Cramér-Rao bounds (CRBs), a standard of precision, for the parameters estimated from these 2D MRS signal fittings. LCOSY has a theoretical net signal loss of a factor of two per unit of acquisition time compared to JPRESS. A quick analysis might suggest that the relative CRBs of LCOSY compared to JPRESS (expressed as a percentage of the concentration values) should therefore be doubled, but we show that this is not necessarily true. Finally, the LCOSY quantification procedure has been applied to data acquired in vivo on a mouse brain.
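
    As a reminder of the general formalism (not the paper's exact parameterization), for a signal model $s(t;\boldsymbol{\theta})$ sampled at points $t_n$ in Gaussian noise of variance $\sigma^2$, the Cramér-Rao bounds follow from the Fisher information matrix:

    $$ F_{jk} = \frac{1}{\sigma^{2}} \sum_{n} \frac{\partial s(t_n;\boldsymbol{\theta})}{\partial \theta_j}\,\frac{\partial s(t_n;\boldsymbol{\theta})}{\partial \theta_k}, \qquad \operatorname{Var}(\hat{\theta}_j) \;\geq\; \mathrm{CRB}(\theta_j) = \left[F^{-1}\right]_{jj}, $$

    where for 2D MRS the sum runs over both encoding times $(t_1, t_2)$; a relative CRB, expressed as a percentage of the parameter value as in the abstract, is then $100\,\sqrt{[F^{-1}]_{jj}}/\theta_j$.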

  3. Label-free SPR detection of gluten peptides in urine for non-invasive celiac disease follow-up.

    PubMed

    Soler, Maria; Estevez, M-Carmen; Moreno, Maria de Lourdes; Cebolla, Angel; Lechuga, Laura M

    2016-05-15

    Motivated by the necessity of new and efficient methods for dietary gluten control of celiac patients, we have developed a simple and highly sensitive SPR biosensor for the detection of gluten peptides in urine. The sensing methodology enables rapid and label-free quantification of the gluten immunogenic peptides (GIP) by using the G12 monoclonal antibody. The overall performance of the biosensor has been optimized in depth and evaluated in terms of sensitivity, selectivity and reproducibility, reaching a limit of detection of 0.33 ng mL(-1). In addition, the robustness and stability of the methodology permit the continuous use of the biosensor for more than 100 cycles with excellent repeatability. Special efforts have been focused on preventing and minimizing possible interferences from the urine matrix, enabling direct analysis in this fluid without requiring extraction or purification procedures. Our SPR biosensor has proven able to detect and identify gluten consumption by evaluating urine samples from healthy and celiac individuals with different dietary gluten conditions. This biosensor methodology represents a novel approach to quantifying digested gluten peptides in human urine with outstanding sensitivity in a rapid and non-invasive manner. Our technique should be considered a promising opportunity to develop Point-of-Care (POC) devices for efficient, simple and accurate gluten-free diet (GFD) monitoring as well as therapy follow-up of celiac disease patients. Copyright © 2015 Elsevier B.V. All rights reserved.

  4. Digital PCR Modeling for Maximal Sensitivity, Dynamic Range and Measurement Precision

    PubMed Central

    Majumdar, Nivedita; Wessel, Thomas; Marks, Jeffrey

    2015-01-01

    The great promise of digital PCR is the potential for unparalleled precision enabling accurate measurements for genetic quantification. A challenge associated with digital PCR experiments, when testing unknown samples, is to perform experiments at dilutions allowing the detection of one or more targets of interest at a desired level of precision. While theory states that optimal precision (Po) is achieved by targeting ~1.59 mean copies per partition (λ), and that dynamic range (R) includes the space spanning one positive (λL) to one negative (λU) result from the total number of partitions (n), these results are tempered for the practitioner seeking to construct digital PCR experiments in the laboratory. A mathematical framework is presented elucidating the relationships between precision, dynamic range, number of partitions, interrogated volume, and sensitivity in digital PCR. The impact that false reaction calls and volumetric variation have on sensitivity and precision is next considered. The resultant effects on sensitivity and precision are established via Monte Carlo simulations reflecting the real-world likelihood of encountering such scenarios in the laboratory. The simulations provide insight to the practitioner on how to adapt experimental loading concentrations to counteract any one of these conditions. The framework is augmented with a method of extending the dynamic range of digital PCR, with and without increasing n, via the use of dilutions. An example experiment demonstrating the capabilities of the framework is presented enabling detection across 3.33 logs of starting copy concentration. PMID:25806524
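
    The λ ≈ 1.59 optimum quoted above can be reproduced directly from the Poisson statistics of partitioning. The short sketch below is a generic illustration, not the authors' code: it estimates λ from the fraction of positive partitions and scans λ to locate the minimum of the relative uncertainty.

    ```python
    import numpy as np

    def lambda_from_positives(k_positive, n_partitions):
        """Poisson-corrected mean copies per partition from the positive count."""
        p_hat = k_positive / n_partitions
        return -np.log(1.0 - p_hat)

    def relative_precision(lam, n_partitions):
        """Approximate relative standard error of the lambda estimate (delta method):
        sigma_lambda / lambda = sqrt(exp(lambda) - 1) / (lambda * sqrt(n))."""
        return np.sqrt(np.exp(lam) - 1.0) / (lam * np.sqrt(n_partitions))

    n = 20000  # e.g. droplets in a ddPCR well
    lam_grid = np.linspace(0.05, 6.0, 2000)
    best = lam_grid[np.argmin(relative_precision(lam_grid, n))]
    print(f"Optimal lambda ~ {best:.2f}")   # ~1.59, matching the value cited above
    print(f"Example: 9000/20000 positive partitions -> lambda = "
          f"{lambda_from_positives(9000, n):.3f}")
    ```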

  5. Separation and dual detection of prostate cancer cells and protein biomarkers using a microchip device.

    PubMed

    Huang, Wanfeng; Chang, Chun-Li; Brault, Norman D; Gur, Onur; Wang, Zhe; Jalal, Shadia I; Low, Philip S; Ratliff, Timothy L; Pili, Roberto; Savran, Cagri A

    2017-01-31

    Current efforts for the detection of prostate cancer using only prostate specific antigen are not ideal and indicate a need to develop new assays - using multiple targets - that can more accurately stratify disease states. We previously introduced a device capable of the concurrent detection of cellular and molecular markers from a single sample fluid. Here, an improved design, which achieves affinity as well as size-based separation of captured targets using antibody-conjugated magnetic beads and a silicon chip containing micro-apertures, is presented. Upon injection of the sample, the integration of magnetic attraction with the micro-aperture chip permits larger cell-bead complexes to be isolated in an upper chamber with the smaller protein-bead complexes and remaining beads passing through the micro-apertures into the lower chamber. This enhances captured cell purity for on chip quantification, allows the separate retrieval of captured cells and proteins for downstream analysis, and enables higher bead concentrations for improved multiplexed ligand targeting. Using LNCaP cells and prostate specific membrane antigen (PSMA) to model prostate cancer, the device was able to detect 34 pM of spiked PSMA and achieve a cell capture efficiency of 93% from culture media. LNCaP cells and PSMA were then spiked into diluted healthy human blood to mimic a cancer patient. The device enabled the detection of spiked PSMA (relative to endogenous PSMA) while recovering 85-90% of LNCaP cells which illustrated the potential of new assays for the diagnosis of prostate cancer.

  6. Lights, camera, action: high-throughput plant phenotyping is ready for a close-up

    USDA-ARS?s Scientific Manuscript database

    Modern techniques for crop improvement rely on both DNA sequencing and accurate quantification of plant traits to identify genes and germplasm of interest. With rapid advances in DNA sequencing technologies, plant phenotyping is now a bottleneck in advancing crop yields [1,2]. Furthermore, the envir...

  7. Multi-scale geospatial agroecosystem modeling: A case study on the influence of soil data resolution on carbon budget estimates

    EPA Science Inventory

    The development of effective measures to stabilize atmospheric CO2 concentration and mitigate negative impacts of climate change requires accurate quantification of the spatial variation and magnitude of the terrestrial carbon (C) flux. However, the spatial pattern and strengt...

  8. Memory for Generic and Quantified Sentences in Spanish-Speaking Children and Adults

    ERIC Educational Resources Information Center

    Gelman, Susan A.; Tapia, Ingrid Sánchez; Leslie, Sarah-Jane

    2016-01-01

    Generic language ("Owls eat at night") expresses knowledge about categories and may represent a cognitively default mode of generalization. English-speaking children and adults more accurately recall generic than quantified sentences ("All owls eat at night") and tend to recall quantified sentences as generic. However, generics…

  9. Numerical Quantification of Perkinsus Marinus in the American Oyster Crassostrea virginica (Gmelin 1791) (Mollusca: Bivalvia) by Modern Stereology

    EPA Science Inventory

    Species of Perkinsus are responsible for high mortalities of bivalve molluscs world-wide. Techniques to accurately estimate parasites in tissues are required to improve understanding of perkinsosis. This study quantifies the number and tissue distribution of Perkinsus marinus in ...

  10. SPECIES SPECIFIC DIETARY ARSENIC EXPOSURE ASSESSMENT: THE NEED TO ESTIMATE BIOACCESSIBILITY AND ASSESSING THE IMPLIED PRESYSTEMIC METABOLISM IMPLICATIONS

    EPA Science Inventory

    The chemical form specific toxicity of arsenic dictates the need for species specific quantification in order to accurately assess the risk from an exposure. The literature has begun to produce preliminary species specific databases for certain dietary sources, but a quantitativ...

  11. Using the CPTAC Assay Portal to identify and implement highly characterized targeted proteomics assays

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Whiteaker, Jeffrey R.; Halusa, Goran; Hoofnagle, Andrew N.

    2016-02-12

    The Clinical Proteomic Tumor Analysis Consortium (CPTAC) of the National Cancer Institute (NCI) has launched an Assay Portal (http://assays.cancer.gov) to serve as an open-source repository of well-characterized targeted proteomic assays. The portal is designed to curate and disseminate highly characterized, targeted mass spectrometry (MS)-based assays by providing detailed assay performance characterization data, standard operating procedures, and access to reagents. Assay content is accessed via the portal through queries to find assays targeting proteins associated with specific cellular pathways, protein complexes, or specific chromosomal regions. The positions of the peptide analytes for which assays are available are mapped relative to other features of interest in the protein, such as sequence domains, isoforms, single nucleotide polymorphisms, and post-translational modifications. The overarching goals are to enable robust quantification of all human proteins and to standardize the quantification of targeted MS-based assays to ultimately enable harmonization of results over time and across laboratories.

  12. Using the CPTAC Assay Portal to Identify and Implement Highly Characterized Targeted Proteomics Assays.

    PubMed

    Whiteaker, Jeffrey R; Halusa, Goran N; Hoofnagle, Andrew N; Sharma, Vagisha; MacLean, Brendan; Yan, Ping; Wrobel, John A; Kennedy, Jacob; Mani, D R; Zimmerman, Lisa J; Meyer, Matthew R; Mesri, Mehdi; Boja, Emily; Carr, Steven A; Chan, Daniel W; Chen, Xian; Chen, Jing; Davies, Sherri R; Ellis, Matthew J C; Fenyö, David; Hiltke, Tara; Ketchum, Karen A; Kinsinger, Chris; Kuhn, Eric; Liebler, Daniel C; Liu, Tao; Loss, Michael; MacCoss, Michael J; Qian, Wei-Jun; Rivers, Robert; Rodland, Karin D; Ruggles, Kelly V; Scott, Mitchell G; Smith, Richard D; Thomas, Stefani; Townsend, R Reid; Whiteley, Gordon; Wu, Chaochao; Zhang, Hui; Zhang, Zhen; Rodriguez, Henry; Paulovich, Amanda G

    2016-01-01

    The Clinical Proteomic Tumor Analysis Consortium (CPTAC) of the National Cancer Institute (NCI) has launched an Assay Portal (http://assays.cancer.gov) to serve as an open-source repository of well-characterized targeted proteomic assays. The portal is designed to curate and disseminate highly characterized, targeted mass spectrometry (MS)-based assays by providing detailed assay performance characterization data, standard operating procedures, and access to reagents. Assay content is accessed via the portal through queries to find assays targeting proteins associated with specific cellular pathways, protein complexes, or specific chromosomal regions. The positions of the peptide analytes for which assays are available are mapped relative to other features of interest in the protein, such as sequence domains, isoforms, single nucleotide polymorphisms, and posttranslational modifications. The overarching goals are to enable robust quantification of all human proteins and to standardize the quantification of targeted MS-based assays to ultimately enable harmonization of results over time and across laboratories.

  13. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Kress, Joel David

    The development and scale-up of cost-effective carbon capture processes is of paramount importance to enable the widespread deployment of these technologies to significantly reduce greenhouse gas emissions. The U.S. Department of Energy initiated the Carbon Capture Simulation Initiative (CCSI) in 2011 with the goal of developing a computational toolset that would enable industry to more effectively identify, design, scale up, operate, and optimize promising concepts. The first half of the presentation will introduce the CCSI Toolset consisting of basic data submodels, steady-state and dynamic process models, process optimization and uncertainty quantification tools, an advanced dynamic process control framework, and high-resolution filtered computational fluid dynamics (CFD) submodels. The second half of the presentation will describe a high-fidelity model of a mesoporous silica supported, polyethylenimine (PEI)-impregnated solid sorbent for CO2 capture. The sorbent model includes a detailed treatment of transport and amine-CO2-H2O interactions based on quantum chemistry calculations. Using a Bayesian approach for uncertainty quantification, we calibrate the sorbent model to thermogravimetric analysis (TGA) data.

  14. Designing Control System Application Software for Change

    NASA Technical Reports Server (NTRS)

    Boulanger, Richard

    2001-01-01

    The Unified Modeling Language (UML) was used to design the Environmental Systems Test Stand (ESTS) control system software. The UML was chosen for its ability to facilitate a clear dialog between software designer and customer, from which requirements are discovered and documented in a manner that transposes directly to program objects. Applying the UML to control system software design has resulted in a baseline set of documents from which change, and the effort of that change, can be accurately measured. As the Environmental Systems Test Stand evolves, accurate estimates of the time and effort required to change the control system software will be made. Accurate quantification of the cost of software change can thus be made before implementation, improving schedule and budget accuracy.

  15. Generating standardized image data for testing and calibrating quantification of volumes, surfaces, lengths, and object counts in fibrous and porous materials using X-ray microtomography.

    PubMed

    Jiřík, Miroslav; Bartoš, Martin; Tomášek, Petr; Malečková, Anna; Kural, Tomáš; Horáková, Jana; Lukáš, David; Suchý, Tomáš; Kochová, Petra; Hubálek Kalbáčová, Marie; Králíčková, Milena; Tonar, Zbyněk

    2018-06-01

    Quantification of the structure and composition of biomaterials using micro-CT requires image segmentation due to the low contrast and overlapping radioopacity of biological materials. The amount of bias introduced by segmentation procedures is generally unknown. We aim to develop software that generates three-dimensional models of fibrous and porous structures with known volumes, surfaces, lengths, and object counts in fibrous materials and to provide a software tool that calibrates quantitative micro-CT assessments. Virtual image stacks were generated using the newly developed software TeIGen, enabling the simulation of micro-CT scans of unconnected tubes, connected tubes, and porosities. A realistic noise generator was incorporated. Forty image stacks were evaluated using micro-CT, and the error between the true known and estimated data was quantified. Starting with geometric primitives, the error of the numerical estimation of surfaces and volumes was eliminated, thereby enabling the quantification of volumes and surfaces of colliding objects. Analysis of the sensitivity of thresholding to the parameters of the generated test image sets revealed the effects of decreasing resolution and increasing noise on the accuracy of the micro-CT quantification. The size of the error increased with decreasing resolution once the voxel size exceeded 1/10 of the typical object size, which simulated the smallest details that could still be reliably quantified. Open-source software for calibrating quantitative micro-CT assessments by producing and saving virtually generated image data sets with known morphometric data was made freely available to researchers involved in the morphometry of three-dimensional fibrillar and porous structures in micro-CT scans. © 2018 Wiley Periodicals, Inc.

  16. PaCeQuant: A Tool for High-Throughput Quantification of Pavement Cell Shape Characteristics.

    PubMed

    Möller, Birgit; Poeschl, Yvonne; Plötner, Romina; Bürstenbinder, Katharina

    2017-11-01

    Pavement cells (PCs) are the most frequently occurring cell type in the leaf epidermis and play important roles in leaf growth and function. In many plant species, PCs form highly complex jigsaw-puzzle-shaped cells with interlocking lobes. Understanding of their development is of high interest for plant science research because of their importance for leaf growth and hence for plant fitness and crop yield. Studies of PC development, however, are limited, because robust methods are lacking that enable automatic segmentation and quantification of PC shape parameters suitable to reflect their cellular complexity. Here, we present our new ImageJ-based tool, PaCeQuant, which provides a fully automatic image analysis workflow for PC shape quantification. PaCeQuant automatically detects cell boundaries of PCs from confocal input images and enables manual correction of automatic segmentation results or direct import of manually segmented cells. PaCeQuant simultaneously extracts 27 shape features that include global, contour-based, skeleton-based, and PC-specific object descriptors. In addition, we included a method for classification and analysis of lobes at two-cell junctions and three-cell junctions, respectively. We provide an R script for graphical visualization and statistical analysis. We validated PaCeQuant by extensive comparative analysis to manual segmentation and existing quantification tools and demonstrated its usability to analyze PC shape characteristics during development and between different genotypes. PaCeQuant thus provides a platform for robust, efficient, and reproducible quantitative analysis of PC shape characteristics that can easily be applied to study PC development in large data sets. © 2017 American Society of Plant Biologists. All Rights Reserved.

  17. Quantification of liver fat in the presence of iron overload.

    PubMed

    Horng, Debra E; Hernando, Diego; Reeder, Scott B

    2017-02-01

    To evaluate the accuracy of R2* models (R2* = 1/T2*) for chemical shift-encoded magnetic resonance imaging (CSE-MRI)-based proton density fat-fraction (PDFF) quantification in patients with fatty liver and iron overload, using MR spectroscopy (MRS) as the reference standard. Two Monte Carlo simulations were implemented to compare the root-mean-squared-error (RMSE) performance of single-R2* and dual-R2* correction in a theoretical liver environment with high iron. Fatty liver was defined as hepatic PDFF >5.6% based on MRS; only subjects with fatty liver were considered for analyses involving fat. From a group of 40 patients with known/suspected iron overload, nine patients were identified at 1.5T, and 13 at 3.0T with fatty liver. MRS linewidth measurements were used to estimate R2* values for water and fat peaks. PDFF was measured from CSE-MRI data using single-R2* and dual-R2* correction with magnitude and complex fitting. Spectroscopy-based R2* analysis demonstrated that the R2* of water and fat remain close in value, both increasing as iron overload increases: linear regression between R2*W and R2*F resulted in slope = 0.95 [0.79-1.12] (95% limits of agreement) at 1.5T and slope = 0.76 [0.49-1.03] at 3.0T. MRI-PDFF using dual-R2* correction had severe artifacts. MRI-PDFF using single-R2* correction had good agreement with MRS-PDFF: Bland-Altman analysis resulted in -0.7% (bias) ± 2.9% (95% limits of agreement) for magnitude-fit and -1.3% ± 4.3% for complex-fit at 1.5T, and -1.5% ± 8.4% for magnitude-fit and -2.2% ± 9.6% for complex-fit at 3.0T. Single-R2* modeling enables accurate PDFF quantification, even in patients with iron overload. J. Magn. Reson. Imaging 2017;45:428-439. © 2016 International Society for Magnetic Resonance in Medicine.
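
    For context on what single-R2* correction means here, a commonly used form of the chemical shift-encoded signal model (a generic statement of the model class, not necessarily the exact implementation used in this study) at echo time $t_n$ is

    $$ s(t_n) = \left( W + F \sum_{m=1}^{M} \alpha_m\, e^{\,i 2\pi f_m t_n} \right) e^{-R_2^{*} t_n}\, e^{\,i 2\pi \psi t_n}, \qquad \mathrm{PDFF} = \frac{|F|}{|W| + |F|}, $$

    where $W$ and $F$ are the water and fat signal amplitudes, $f_m$ and $\alpha_m$ are the known frequencies and relative amplitudes of the multipeak fat spectrum, $\psi$ is the B0 field map, and a single $R_2^{*}$ is shared by water and fat; dual-R2* correction instead assigns separate decay rates $R_2^{*}{}_W$ and $R_2^{*}{}_F$ to the two terms.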

  18. Label-free in vivo in situ diagnostic imaging by cellular metabolism quantification with a flexible multiphoton endomicroscope (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Leclerc, Pierre; Hage, Charles-Henri; Fabert, Marc; Brevier, Julien; O'Connor, Rodney P.; Bardet-Coste, Sylvia M.; Habert, Rémi; Braud, Flavie; Kudlinski, Alexandre; Louradour, Frederic

    2017-02-01

    Multiphoton microscopy is a cutting-edge imaging modality leading to increasing advances in biology and also in the clinical field. To use it at its full potential and at the very heart of clinical practice, several fiber-based multiphoton microendoscopes have been developed. The application of those probes is now limited by a few major restrictions, such as the difficulty of collecting autofluorescence signals from tissues and cells, these being inherently weak (e.g. the signals from intracellular NADH or FAD metabolites). This limitation reduces the usefulness of microendoscopy in general, effectively restricting it to a morphological imaging modality requiring staining of the tissues. Our aim is to go beyond this limitation, showing for the first time label-free cellular metabolism monitoring, in vivo in situ in real time. The experimental setup is an upgrade of a recently published one (Ducourthial et al., Scientific Reports, 2016) in which femtosecond pulse fiber delivery is further optimized thanks to a new transmissive-GRISM-based pulse stretcher permitting high energy throughput and wide bandwidth. This device allows fast sequential operation with two different excitation wavelengths for efficient two-photon excited NADH and FAD autofluorescence endoscopic detection (i.e. 860 nm for FAD and 760 nm for NADH), enabling cellular optical redox ratio quantification at 8 frames/s. The results obtained on cell models in vitro and on animal models in vivo (e.g. neurons of a living mouse) demonstrate that we can accurately assess the level of NADH and FAD at subcellular resolution through a 3-meter-long fiber with our miniaturized probe (O.D. = 2.2 mm).

  19. Smartphone-based rapid quantification of viable bacteria by single-cell microdroplet turbidity imaging.

    PubMed

    Cui, Xiaonan; Ren, Lihui; Shan, Yufei; Wang, Xixian; Yang, Zhenlong; Li, Chunyu; Xu, Jian; Ma, Bo

    2018-05-18

    Standard plate count (SPC) has been recognized as the gold standard for the quantification of viable bacteria. However, SPC usually takes one to several days to grow individual cells into visible colonies, which greatly hampers its application in rapid bacteria enumeration. Here we present a microdroplet turbidity imaging based digital standard plate count (dSPC) method to overcome this hurdle. Instead of cultivating on agar plates, bacteria are encapsulated in monodisperse microdroplets for single-cell cultivation. Proliferation of the encapsulated bacterial cell produces a detectable change in microdroplet turbidity, which allows, after just a few bacterial doubling cycles (i.e., a few hours), enumeration of viable bacteria by visible-light imaging. Furthermore, a dSPC platform integrating a power-free droplet generator with smartphone-based turbidity imaging was established. As proof-of-concept demonstrations, a series of Gram-negative bacteria (Escherichia coli) and Gram-positive bacteria (Bacillus subtilis) samples were quantified accurately via the smartphone dSPC within 6 hours, with a detection sensitivity of 100 CFU ml-1 and an at least 3-fold reduction in assay time. In addition, Enterobacter sakazakii (E. sakazakii) in infant milk powder, as a real sample, was enumerated within 6 hours, in contrast to the 24 hours needed in traditional SPC. Results with high accuracy and reproducibility were achieved, with no difference in counts found between dSPC and SPC. By enabling label-free, rapid, portable and low-cost enumeration and cultivation of viable bacteria onsite, smartphone dSPC forms the basis for a temporally and geographically trackable network for surveying live microbes globally, where every citizen with a cellphone can contribute anytime and anywhere.
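
    As with digital PCR, the droplet count converts to a viable-cell concentration through a Poisson correction for droplets that initially received more than one cell. A minimal sketch with hypothetical droplet parameters (droplet volume and counts are illustrative, not the paper's values):

    ```python
    import math

    def cfu_per_ml(n_positive, n_droplets, droplet_volume_nl, dilution_factor=1.0):
        """Viable-cell concentration from droplet turbidity counts.

        The Poisson correction accounts for droplets seeded with more than one cell.
        droplet_volume_nl is the single-droplet volume in nanolitres (hypothetical).
        """
        lam = -math.log(1.0 - n_positive / n_droplets)   # mean cells per droplet
        cells_per_nl = lam / droplet_volume_nl
        return cells_per_nl * 1e6 * dilution_factor      # nL -> mL

    # Example: 350 turbid droplets out of 10,000, 1 nL droplets, 10x diluted sample.
    print(f"{cfu_per_ml(350, 10000, 1.0, dilution_factor=10):.2e} CFU/mL")
    ```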

  20. The carbon stock of harvested wood products in US residential houses is substantial

    NASA Astrophysics Data System (ADS)

    Xie, S. H.; Kurz, W. A.; McFarlane, P. N.

    2016-12-01

    Harvested wood products (HWP) provide humans with services that can substitute for emissions-intensive products, while storing carbon sequestered from the atmosphere by forests. Nearly half of HWP in the US have been used for construction purposes. Due to the long-lived nature of houses, the wood within these buildings can store carbon for many decades. This study developed a new methodology to model the decay and half-lives of houses based on national census data. Six different models were evaluated, and the inverse sigmoidal decay pattern of houses was best represented by a Gamma distribution model. It adequately modelled the decay pattern of houses from the US, Canada and Norway and enabled the quantification of structural HWP carbon stocks in residential houses. For the US, it was estimated that residential houses would take about 140 years to reach 50% removal of the number of houses initially constructed and 390 years to reach 95% removal. At the end of 2009, the carbon stock of structural HWP in US residential houses was estimated to be 668 MtC, and the average rate of carbon storage from 1990 to 2006 was 44.7 Mt CO2e yr-1. The utilization of HWP for long-lived uses has the potential to make a major contribution to mitigating greenhouse gas emissions through carbon storage and the substitution of emissions from other products such as concrete and steel. With the same amount of HWP input, structural wood use can produce a carbon pool that is 48 times larger than pulp and paper use, or 3 times larger than furniture use. In addition, this pool takes much longer to saturate. Accurate quantification of the structural HWP pool is therefore an important topic worthy of detailed investigation.
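
    The Gamma-distribution retention model mentioned above can be written down compactly: the fraction of houses still standing after t years is the Gamma survival function, and the 50% and 95% removal times are its quantiles. The sketch below uses illustrative shape/scale parameters, not the values fitted to census data in the study.

    ```python
    from scipy.stats import gamma

    # Hypothetical Gamma parameters for the housing retention curve
    # (illustrative only; the study fits these to national census data).
    shape, scale = 2.5, 80.0   # scale in years

    retention = gamma.sf   # survival function S(t) = 1 - CDF(t)

    t50 = gamma.ppf(0.50, shape, scale=scale)   # time to 50% removal
    t95 = gamma.ppf(0.95, shape, scale=scale)   # time to 95% removal

    print(f"Fraction remaining after 100 years: {retention(100, shape, scale=scale):.2f}")
    print(f"50% removed after ~{t50:.0f} years, 95% removed after ~{t95:.0f} years")
    ```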

  1. Characterization of drop aerodynamic fragmentation in the bag and sheet-thinning regimes by crossed-beam, two-view, digital in-line holography

    DOE PAGES

    Guildenbecher, Daniel R.; Gao, Jian; Chen, Jun; ...

    2017-04-19

    When a spherical liquid drop is subjected to a step change in relative gas velocity, aerodynamic forces lead to drop deformation and possible breakup into a number of secondary fragments. In order to investigate this flow, a digital in-line holography (DIH) diagnostic is proposed which enables rapid quantification of spatial statistics with limited experimental repetition. To overcome the high uncertainty in the depth direction experienced in previous applications of DIH, a crossed-beam, two-view configuration is introduced. With appropriate calibration, this diagnostic is shown to provide accurate quantification of fragment sizes, three-dimensional positions and three-component velocities in a large measurement volume. We apply these capabilities in order to investigate the aerodynamic breakup of drops at two non-dimensional Weber numbers, We, corresponding to the bag (We = 14) and sheet-thinning (We = 55) regimes. Ensemble average results show the evolution of fragment size and velocity statistics during the course of breakup. Our results indicate that mean fragment sizes increase throughout the course of breakup. For the bag breakup case, the evolution of a multi-mode fragment size probability density is observed. This is attributed to separate fragmentation mechanisms for the bag and rim structures. In contrast, for the sheet-thinning case, the fragment size probability density shows only one distinct peak indicating a single fragmentation mechanism. Compared to previous related investigations of this flow, many orders of magnitude more fragments are measured per condition, resulting in a significant improvement in data fidelity. For this reason, this experimental dataset is likely to provide new opportunities for detailed validation of analytic and computational models of this flow.

  2. Quantification of mevalonate-5-phosphate using UPLC-MS/MS for determination of mevalonate kinase activity.

    PubMed

    Reitzle, Lukas; Maier, Barbara; Stojanov, Silvia; Teupser, Daniel; Muntau, Ania C; Vogeser, Michael; Gersting, Søren W

    2015-08-01

    Mevalonate kinase deficiency, a rare autosomal recessive autoinflammatory disease, is caused by mutations in the MVK gene encoding mevalonate kinase (MK). MK catalyzes the phosphorylation of mevalonic acid to mevalonate-5-phosphate (MVAP) in the pathway of isoprenoid and sterol synthesis. The disease phenotype correlates with residual activity ranging from <0.5% for mevalonic aciduria to 1-7% for the milder hyperimmunoglobulinemia D and periodic fever syndrome (HIDS). Hence, assessment of loss-of-function requires high accuracy measurements. We describe a method using isotope dilution UPLC-MS/MS for precise and sensitive determination of MK activity. Wild-type MK and the variant V261A, which is associated with HIDS, were recombinantly expressed in Escherichia coli. Enzyme activity was determined by formation of MVAP over time quantified by isotope dilution UPLC-MS/MS. The method was validated according to the FDA Guidance for Bioanalytical Method Validation. Sensitivity for detection of MVAP by UPLC-MS/MS was improved by derivatization with butanol-HCl (LLOQ, 5.0 fmol) and the method was linear from 0.5 to 250 μmol/L (R(2) > 0.99) with a precision of ≥ 89% and an accuracy of ± 2.7%. The imprecision of the activity assay, including the enzymatic reaction and the UPLC-MS/MS quantification, was 8.3%. The variant V261A showed a significantly decreased activity of 53.1%. Accurate determination of MK activity was enabled by sensitive and reproducible detection of MVAP using UPLC-MS/MS. The novel method may improve molecular characterization of MVK mutations, provide robust genotype-phenotype correlations, and accelerate compound screening for drug candidates restoring variant MK activity. Copyright © 2015 The Canadian Society of Clinical Chemists. Published by Elsevier Inc. All rights reserved.

  3. MetaPro-IQ: a universal metaproteomic approach to studying human and mouse gut microbiota.

    PubMed

    Zhang, Xu; Ning, Zhibin; Mayne, Janice; Moore, Jasmine I; Li, Jennifer; Butcher, James; Deeke, Shelley Ann; Chen, Rui; Chiang, Cheng-Kang; Wen, Ming; Mack, David; Stintzi, Alain; Figeys, Daniel

    2016-06-24

    The gut microbiota has been shown to be closely associated with human health and disease. While next-generation sequencing can be readily used to profile the microbiota taxonomy and metabolic potential, metaproteomics is better suited for deciphering microbial biological activities. However, the application of gut metaproteomics has largely been limited due to the low efficiency of protein identification. Thus, a high-performance and easy-to-implement gut metaproteomic approach is required. In this study, we developed a high-performance and universal workflow for gut metaproteome identification and quantification (named MetaPro-IQ) by using the close-to-complete human or mouse gut microbial gene catalog as database and an iterative database search strategy. An average of 38 and 33 % of the acquired tandem mass spectrometry (MS) spectra was confidently identified for the studied mouse stool and human mucosal-luminal interface samples, respectively. In total, we accurately quantified 30,749 protein groups for the mouse metaproteome and 19,011 protein groups for the human metaproteome. Moreover, the MetaPro-IQ approach enabled comparable identifications with the matched metagenome database search strategy that is widely used but needs prior metagenomic sequencing. The response of gut microbiota to high-fat diet in mice was then assessed, which showed distinct metaproteome patterns for high-fat-fed mice and identified 849 proteins as significant responders to high-fat feeding in comparison to low-fat feeding. We present MetaPro-IQ, a metaproteomic approach for highly efficient intestinal microbial protein identification and quantification, which functions as a universal workflow for metaproteomic studies, and will thus facilitate the application of metaproteomics for better understanding the functions of gut microbiota in health and disease.

  4. Quantification of incisal tooth wear in upper anterior teeth: conventional vs new method using toolmakers microscope and a three-dimensional measuring technique.

    PubMed

    Al-Omiri, Mahmoud K; Sghaireen, Mohd G; Alzarea, Bader K; Lynch, Edward

    2013-12-01

    This study aimed to quantify tooth wear in upper anterior teeth using a new CAD-CAM laser scanning machine, a toolmaker microscope and a conventional tooth wear index. Fifty participants (25 males and 25 females, mean age = 25 ± 4 years) were assessed for incisal tooth wear of upper anterior teeth using the Smith and Knight clinical tooth wear index (TWI) on two occasions, the study baseline and 1 year later. Stone dies for each tooth were prepared and scanned using the CAD-CAM Laser Cercon System. Scanned images were printed and examined under a toolmaker microscope to quantify tooth wear, and then the dies were directly assessed under the microscope to measure tooth wear. The Wilcoxon Signed Ranks Test was used to analyze the data. TWI scores for incisal edges were 0-3 and were similar on both occasions. Score 4 was not detected. Wear values measured by directly assessing the dies under the toolmaker microscope (range = 113-150 μm, mean = 130 ± 20 μm) were significantly greater than those measured from Cercon Digital Machine images (range = 52-80 μm, mean = 68 ± 23 μm), and both showed significant differences between the two occasions. Wear progression in upper anterior teeth was effectively detected by directly measuring the dies or the images of dies under the toolmaker microscope. Measuring the dies of worn dentition directly under the toolmaker microscope enabled detection of wear progression more accurately than measuring die images obtained with the Cercon Digital Machine. The conventional method was the least sensitive for tooth wear quantification and was unable to identify wear progression in most cases. Copyright © 2013 Elsevier Ltd. All rights reserved.

  5. Quantifying uncertainties of seismic Bayesian inversion of Northern Great Plains

    NASA Astrophysics Data System (ADS)

    Gao, C.; Lekic, V.

    2017-12-01

    Elastic waves excited by earthquakes are the fundamental observations of seismological studies. Seismologists measure information such as travel time, amplitude, and polarization to infer the properties of the earthquake source, seismic wave propagation, and subsurface structure. Across numerous applications, seismic imaging has been able to take advantage of complementary seismic observables to constrain profiles and lateral variations of Earth's elastic properties. Moreover, seismic imaging plays a unique role in multidisciplinary studies of geoscience by providing direct constraints on the unreachable interior of the Earth. Accurate quantification of the uncertainties of inferences made from seismic observations is of paramount importance for interpreting seismic images and testing geological hypotheses. However, such quantification remains challenging and subjective due to the non-linearity and non-uniqueness of the geophysical inverse problem. In this project, we apply a reversible jump Markov chain Monte Carlo (rjMCMC) algorithm for a transdimensional Bayesian inversion of continental lithosphere structure. Such an inversion allows us to quantify the uncertainties of the results by inverting for an ensemble solution. It also yields an adaptive parameterization that enables simultaneous inversion of different elastic properties without imposing strong prior information on the relationship between them. We present retrieved profiles of shear velocity (Vs) and radial anisotropy in the Northern Great Plains using measurements from USArray stations. We use both seismic surface wave dispersion and receiver function data due to their complementary constraints on lithosphere structure. Furthermore, we analyze the uncertainties of both individual and joint inversion of those two data types to quantify the benefit of joint inversion. As an application, we infer the variation of Moho depths and crustal layering across the northern Great Plains.
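
    For readers unfamiliar with the transdimensional step, the reversible-jump move from model $(\mathbf{m}, k)$ to $(\mathbf{m}', k')$ is accepted with probability (standard rjMCMC form, not specific to this study):

    $$ \alpha = \min\!\left\{1,\; \frac{p(\mathbf{d}\,|\,\mathbf{m}',k')\, p(\mathbf{m}',k')\, q(\mathbf{m},k \,|\, \mathbf{m}',k')}{p(\mathbf{d}\,|\,\mathbf{m},k)\, p(\mathbf{m},k)\, q(\mathbf{m}',k' \,|\, \mathbf{m},k)}\, \bigl|\mathbf{J}\bigr| \right\}, $$

    where $p(\mathbf{d}\,|\,\cdot)$ is the likelihood, $p(\cdot)$ the prior, $q$ the proposal, and $|\mathbf{J}|$ the Jacobian of the dimension-changing transformation; the ensemble of accepted models then provides the posterior from which uncertainties on Vs and radial anisotropy are quantified.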

  6. Photoacoustic bio-quantification of graphene based nanomaterials at a single cell level (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Nedosekin, Dmitry A.; Nolan, Jacqueline; Biris, Alexandru S.; Zharov, Vladimir P.

    2017-03-01

    The Arkansas Nanomedicine Center at the University of Arkansas for Medical Sciences, in collaboration with other Arkansas universities and the FDA-based National Center for Toxicological Research in Jefferson, AR, is developing novel techniques for rapid quantification of graphene-based nanomaterials (GBNs) in various biological samples. All-carbon GBNs have a wide range of potential applications in industry, agriculture, food processing and medicine; however, quantification of GBNs is difficult in carbon-rich biological tissues. The accurate quantification of GBNs is essential for research on material toxicity and the development of GBN-based drug delivery platforms. We have developed microscopy and cytometry platforms for detection and quantification of GBNs in single cells, tissue and blood samples using the photoacoustic contrast of GBNs. We demonstrated PA quantification of individual graphene uptake by single cells. High-resolution PA microscopy provided mapping of GBN distribution within live cells to establish correlation with intracellular toxic phenomena using apoptotic and necrotic assays. This new methodology and the corresponding technical platform provide insight into the possible toxicological risks of GBNs at the single-cell level. In addition, in vivo PA image flow cytometry demonstrated the capability to monitor GBN pharmacokinetics in a mouse model and to map the resulting biodistribution of GBNs in mouse tissues. The integrated PA platform provided an unprecedented sensitivity toward GBNs and allowed conventional toxicology research to be enhanced by providing a direct correlation between uptake of GBNs at the single-cell level and cell viability status.

  7. Identification of cellular MMP substrates using quantitative proteomics: isotope-coded affinity tags (ICAT) and isobaric tags for relative and absolute quantification (iTRAQ).

    PubMed

    Butler, Georgina S; Dean, Richard A; Morrison, Charlotte J; Overall, Christopher M

    2010-01-01

    Identification of protease substrates is essential to understand the functional consequences of normal proteolytic processing and dysregulated proteolysis in disease. Quantitative proteomics and mass spectrometry can be used to identify protease substrates in the cellular context. Here we describe the use of two protein labeling techniques, Isotope-Coded Affinity Tags (ICAT) and Isobaric Tags for Relative and Absolute Quantification (iTRAQ), which we have used successfully to identify novel matrix metalloproteinase (MMP) substrates in cell culture systems (1-4). ICAT and iTRAQ can label proteins and protease cleavage products of secreted proteins, protein domains shed from the cell membrane or pericellular matrix of protease-transfected cells that have accumulated in conditioned medium, or cell surface proteins in membrane preparations; isotopically distinct labels are used for control cells. Tryptic digestion and tandem mass spectrometry of the generated fragments enable sequencing of differentially labeled but otherwise identical pooled peptides. The isotopic tag, which is unique for each label, identifies the peptides originating from each sample, for instance, protease-transfected or control cells, and comparison of the peak areas enables relative quantification of the peptide in each sample. Thus proteins present in altered amounts between protease-expressing and null cells are implicated as protease substrates and can be further validated as such.

  8. Selective classification and quantification model of C&D waste from material resources consumed in residential building construction.

    PubMed

    Mercader-Moyano, Pilar; Ramírez-de-Arellano-Agudo, Antonio

    2013-05-01

    The unfortunate economic situation involving Spain and the European Union is, among other factors, the result of intensive construction activity over recent years. The excessive consumption of natural resources, together with the impact caused by the uncontrolled dumping of untreated C&D waste in illegal landfills, has caused environmental pollution and a deterioration of the landscape. The objective of this research was to generate a selective classification and quantification model of C&D waste based on the material resources consumed in the construction of residential buildings, either new or renovated, namely the Conventional Constructive Model (CCM). A practical example carried out on ten residential buildings in Seville, Spain, enabled the identification and quantification of the C&D waste generated in their construction and the origin of the waste, in terms of the building material from which it originated and its impact per m2 constructed. This model enables other researchers to establish comparisons between the various improvements proposed for minimizing the environmental impact produced by constructing a CCM building, new corrective measures to be proposed in future policies that regulate the production and management of C&D waste generated in construction from the design stage to the completion of the construction process, and the establishment of sustainable management of C&D waste and of the selection of materials for the construction of planned or renovated buildings.

  9. A tool for selective inline quantification of co-eluting proteins in chromatography using spectral analysis and partial least squares regression.

    PubMed

    Brestrich, Nina; Briskot, Till; Osberghaus, Anna; Hubbuch, Jürgen

    2014-07-01

    Selective quantification of co-eluting proteins in chromatography is usually performed by offline analytics. This is time-consuming and can lead to late detection of irregularities in chromatography processes. To overcome this analytical bottleneck, a methodology for selective protein quantification in multicomponent mixtures by means of spectral data and partial least squares regression was presented in two previous studies. In this paper, a powerful integration of software and chromatography hardware is introduced that enables the applicability of this methodology for selective inline quantification of co-eluting proteins in chromatography. A specific setup consisting of a conventional liquid chromatography system, a diode array detector, and a software interface to Matlab® was developed. The established tool for selective inline quantification was successfully applied for peak deconvolution of a co-eluting ternary protein mixture consisting of lysozyme, ribonuclease A, and cytochrome c on SP Sepharose FF. Compared to common offline analytics based on collected fractions, no loss of information regarding the retention volumes and peak flanks was observed. A comparison between the mass balances of both analytical methods showed that the inline quantification tool can be applied for a rapid determination of pool yields. Finally, the achieved inline peak deconvolution was successfully applied to make product purity-based real-time pooling decisions. This makes the established tool for selective inline quantification a valuable approach for inline monitoring and control of chromatographic purification steps and just-in-time reaction to process irregularities. © 2014 Wiley Periodicals, Inc.
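
    The core of this kind of inline quantification is a partial least squares model mapping absorbance spectra to individual protein concentrations. A minimal sketch of that calibration/prediction step with scikit-learn is shown below; it is a generic illustration, not the authors' Matlab implementation, and the calibration data are random placeholders.

    ```python
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression

    # Hypothetical calibration data: absorbance spectra (rows) recorded by the
    # diode array detector and the known concentrations [g/L] of the three
    # co-eluting proteins (lysozyme, ribonuclease A, cytochrome c).
    rng = np.random.default_rng(0)
    X_cal = rng.random((60, 200))   # 60 calibration spectra, 200 wavelengths
    Y_cal = rng.random((60, 3))     # matching protein concentrations

    pls = PLSRegression(n_components=8)
    pls.fit(X_cal, Y_cal)

    # During chromatography, each newly acquired spectrum is projected through
    # the model to yield selective, inline concentration estimates.
    x_new = rng.random((1, 200))
    c_lys, c_rnase, c_cytc = pls.predict(x_new)[0]
    print(c_lys, c_rnase, c_cytc)
    ```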

  10. Microvolume protein concentration determination using the NanoDrop 2000c spectrophotometer.

    PubMed

    Desjardins, Philippe; Hansen, Joel B; Allen, Michael

    2009-11-04

    Traditional spectrophotometry requires placing samples into cuvettes or capillaries. This is often impractical due to the limited sample volumes typically used for protein analysis. The Thermo Scientific NanoDrop 2000c Spectrophotometer solves this issue with an innovative sample retention system that holds microvolume samples between two measurement surfaces using the surface tension properties of liquids, enabling the quantification of samples in volumes as low as 0.5-2 microL. The elimination of cuvettes or capillaries allows real-time changes in path length, which reduces the measurement time while greatly increasing the dynamic range of protein concentrations that can be measured. The need for dilutions is also eliminated, and preparation for sample quantification is relatively easy, as the measurement surfaces can simply be wiped with a laboratory wipe. This video article presents modifications to traditional protein concentration determination methods for quantification of microvolume amounts of protein using A280 absorbance readings or the BCA colorimetric assay.
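
    Concentration from an A280 reading follows the Beer-Lambert law; a variable path length simply rescales the absorbance to the standard 10 mm value. A small illustration follows; the mass extinction coefficient is an assumed, protein-specific value and must be replaced for real samples.

    ```python
    def protein_conc_mg_per_ml(a280, pathlength_mm=10.0, ext_coeff=1.0):
        """Beer-Lambert estimate of protein concentration.

        a280          : absorbance at 280 nm measured over pathlength_mm
        pathlength_mm : optical path length of the measurement (10 mm = 1 cm standard)
        ext_coeff     : mass extinction coefficient in (mg/mL)^-1 cm^-1
                        (protein-specific; 1.0 is only a generic placeholder)
        """
        a280_normalized = a280 * (10.0 / pathlength_mm)   # rescale to a 1 cm path
        return a280_normalized / ext_coeff

    # Example: A280 = 0.075 measured over a 0.5 mm microvolume path
    print(protein_conc_mg_per_ml(0.075, pathlength_mm=0.5))   # ~1.5 mg/mL
    ```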

  11. A fault tree model to assess probability of contaminant discharge from shipwrecks.

    PubMed

    Landquist, H; Rosén, L; Lindhe, A; Norberg, T; Hassellöv, I-M; Lindgren, J F; Dahllöf, I

    2014-11-15

    Shipwrecks on the sea floor around the world may contain hazardous substances that can cause harm to the marine environment. Today there are no comprehensive methods for environmental risk assessment of shipwrecks, and thus there is poor support for decision-making on prioritization of mitigation measures. The purpose of this study was to develop a tool for quantitative risk estimation of potentially polluting shipwrecks, and in particular an estimation of the annual probability of hazardous substance discharge. The assessment of the probability of discharge is performed using fault tree analysis, facilitating quantification of the probability with respect to a set of identified hazardous events. This approach enables a structured assessment providing transparent uncertainty and sensitivity analyses. The model facilitates quantification of risk, quantification of the uncertainties in the risk calculation and identification of parameters to be investigated further in order to obtain a more reliable risk calculation. Copyright © 2014 Elsevier Ltd. All rights reserved.
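
    The fault-tree calculation itself reduces to combining basic-event probabilities through AND/OR gates under an independence assumption. A minimal sketch follows; the event names and probabilities are hypothetical, not taken from the paper.

    ```python
    from functools import reduce

    def and_gate(probs):
        """All input events must occur (independent events)."""
        return reduce(lambda a, b: a * b, probs, 1.0)

    def or_gate(probs):
        """At least one input event occurs (independent events)."""
        return 1.0 - reduce(lambda a, b: a * (1.0 - b), probs, 1.0)

    # Hypothetical annual probabilities of hazardous events for one wreck
    p_hull_corrosion_breach   = 0.04
    p_trawling_damage         = 0.01
    p_anchor_strike           = 0.005
    p_tank_still_contains_oil = 0.6

    p_structural_opening = or_gate([p_hull_corrosion_breach,
                                    p_trawling_damage,
                                    p_anchor_strike])

    # Top event: discharge requires a structural opening AND remaining contaminant
    p_annual_discharge = and_gate([p_structural_opening,
                                   p_tank_still_contains_oil])
    print(f"Annual probability of discharge: {p_annual_discharge:.3f}")
    ```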

  12. Thyroid tissue constituents characterization and application to in vivo studies by broadband (600-1200 nm) diffuse optical spectroscopy

    NASA Astrophysics Data System (ADS)

    Konugolu Venkata Sekar, Sanathana; Farina, Andrea; dalla Mora, Alberto; Taroni, Paola; Lindner, Claus; Mora, Mireia; Farzam, Parisa; Pagliazzi, Marco; Squarcia, Mattia; Halperin, Irene; Hanzu, Felicia A.; Dehghani, Hamid; Durduran, Turgut; Pifferi, Antonio

    2017-07-01

    We present the first broadband (600-1100 nm) diffuse optical characterization of thyroglobulin and tyrosine, which are thyroid-specific tissue constituents. In-vivo measurements at the thyroid region enabled their quantification for functional and diagnostic applications.

  13. Identification and absolute quantification of enzymes in laundry detergents by liquid chromatography tandem mass spectrometry.

    PubMed

    Gaubert, Alexandra; Jeudy, Jérémy; Rougemont, Blandine; Bordes, Claire; Lemoine, Jérôme; Casabianca, Hervé; Salvador, Arnaud

    2016-07-01

    In a stricter legislative context, greener detergent formulations are being developed. To this end, synthetic surfactants are frequently replaced by bio-sourced surfactants and/or used at lower concentrations in combination with enzymes. In this paper, an LC-MS/MS method was developed for the identification and quantification of enzymes in laundry detergents. Prior to the LC-MS/MS analyses, a specific sample preparation protocol was developed because of the matrix complexity (high surfactant percentages). Then, for each enzyme family mainly used in detergent formulations (protease, amylase, cellulase, and lipase), specific peptides were identified on a high resolution platform. An LC-MS/MS method was then developed in selected reaction monitoring (SRM) MS mode for the light and corresponding heavy peptides. The method was linear over the peptide concentration ranges 25-1000 ng/mL for protease, lipase, and cellulase; 50-1000 ng/mL for amylase; and 5-1000 ng/mL for cellulase in both water and laundry detergent matrices. The application of the developed analytical strategy to real commercial laundry detergents enabled enzyme identification and absolute quantification. For the first time, identification and absolute quantification of enzymes in laundry detergent were achieved by LC-MS/MS in a single run. Graphical Abstract: Identification and quantification of enzymes by LC-MS/MS.
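
    Absolute quantification in SRM mode of this kind typically rests on the ratio of the light (endogenous) peptide peak area to its spiked heavy-labeled counterpart of known concentration. A minimal single-point sketch with hypothetical numbers (not data from the paper):

    ```python
    def peptide_conc_ng_per_ml(area_light, area_heavy, heavy_spike_ng_per_ml):
        """Single-point isotope-dilution estimate: the light/heavy peak-area ratio
        multiplied by the known concentration of the spiked heavy peptide."""
        return (area_light / area_heavy) * heavy_spike_ng_per_ml

    # Hypothetical SRM peak areas for a protease signature peptide
    area_light = 1.8e6   # endogenous (light) peptide
    area_heavy = 2.4e6   # spiked heavy-labeled standard
    spike = 250.0        # ng/mL of heavy peptide added to the sample

    print(f"Protease peptide: "
          f"{peptide_conc_ng_per_ml(area_light, area_heavy, spike):.0f} ng/mL")
    ```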

  14. Normalized Quantitative Western Blotting Based on Standardized Fluorescent Labeling.

    PubMed

    Faden, Frederik; Eschen-Lippold, Lennart; Dissmeyer, Nico

    2016-01-01

    Western blot (WB) analysis is the most widely used method to monitor expression of proteins of interest in highly complex protein extracts derived from diverse experimental setups. WB allows the rapid and specific detection of a target protein, such as non-tagged endogenous proteins as well as protein-epitope tag fusions, depending on the availability of specific antibodies. To generate quantitative data from independent samples within one experiment and to allow accurate inter-experimental quantification, a reliable and reproducible method to standardize and normalize WB data is indispensable. To date, it is standard procedure to normalize individual bands of immunodetected proteins of interest from a WB lane to other individual bands of so-called housekeeping proteins in the same sample lane. These are usually detected by an independent antibody or by colorimetric detection and do not reflect the real total protein of a sample. Housekeeping proteins, assumed to be constitutively expressed largely independent of developmental and environmental states, can in fact differ greatly in their expression under these various conditions. Therefore, they do not represent a reliable reference for normalizing the target protein's abundance to the total amount of protein contained in each lane of a blot. Here, we demonstrate the Smart Protein Layers (SPL) technology, a combination of fluorescent standards and stain-free fluorescence-based visualization of total protein in gels and after transfer via WB. SPL allows rapid and highly sensitive protein visualization and quantification with a sensitivity comparable to conventional silver staining but with a 1000-fold higher dynamic range. For normalization, standardization and quantification of protein gels and WBs, a sample-dependent bi-fluorescent standard reagent is applied, and, for accurate quantification of data derived from different experiments, a second calibration standard is used. Together, this facilitates precise quantification of protein expression by lane-to-lane, gel-to-gel, and blot-to-blot comparisons, especially for experiments in the area of proteostasis dealing with highly variable protein levels and involving protein degradation mutants and treatments modulating protein abundance.
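
    In practice, total-protein normalization of this kind amounts to dividing each target band by the total fluorescence of its lane and rescaling by a per-blot calibration standard so that independent blots become comparable. A minimal numeric sketch (all intensities hypothetical, not SPL data):

    ```python
    def normalized_signal(target_band, lane_total_protein, calibration_standard):
        """Target band normalized to total lane protein and to a per-blot
        calibration standard, enabling blot-to-blot comparison."""
        return (target_band / lane_total_protein) / calibration_standard

    # Hypothetical fluorescence intensities from two independent blots
    blot1 = normalized_signal(target_band=5.2e4, lane_total_protein=8.1e6,
                              calibration_standard=1.00)
    blot2 = normalized_signal(target_band=4.4e4, lane_total_protein=6.0e6,
                              calibration_standard=1.12)
    print(f"Relative abundance blot2/blot1: {blot2 / blot1:.2f}")
    ```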

  15. Quantification and characterization of leakage errors

    NASA Astrophysics Data System (ADS)

    Wood, Christopher J.; Gambetta, Jay M.

    2018-03-01

    We present a general framework for the quantification and characterization of leakage errors that result when a quantum system is encoded in the subspace of a larger system. To do this we introduce metrics for quantifying the coherent and incoherent properties of the resulting errors and we illustrate this framework with several examples relevant to superconducting qubits. In particular, we propose two quantities, the leakage and seepage rates, which together with average gate fidelity allow for characterizing the average performance of quantum gates in the presence of leakage and show how the randomized benchmarking protocol can be modified to enable the robust estimation of all three quantities for a Clifford gate set.
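
    In that framework, if $\mathbb{1}_1$ and $\mathbb{1}_2$ denote the projectors onto the computational and leakage subspaces ($d_1$, $d_2$ their dimensions), the leakage and seepage rates of a channel $\mathcal{E}$ can be written (stated here in one common form; see the paper for the precise definitions):

    $$ L_1(\mathcal{E}) = \operatorname{Tr}\!\left[\mathbb{1}_2\, \mathcal{E}\!\left(\tfrac{\mathbb{1}_1}{d_1}\right)\right], \qquad L_2(\mathcal{E}) = \operatorname{Tr}\!\left[\mathbb{1}_1\, \mathcal{E}\!\left(\tfrac{\mathbb{1}_2}{d_2}\right)\right], $$

    i.e., the average population transferred out of, and back into, the computational subspace, which the modified randomized benchmarking protocol estimates together with the average gate fidelity.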

  16. Estimation of big sagebrush leaf area index with terrestrial laser scanning

    USDA-ARS?s Scientific Manuscript database

    Accurate monitoring and quantification of the structure and function of semiarid ecosystems is necessary to improve carbon and water flux models that help describe how these systems will respond in the future. The leaf area index (LAI, m2 m-2) is an important indicator of energy, water, and carbon e...

  17. Identification and evaluation of reliable reference genes for quantitative real-time PCR analysis in tea plant (Camellia sinensis (L.) O. Kuntze)

    USDA-ARS?s Scientific Manuscript database

    Quantitative real-time polymerase chain reaction (qRT-PCR) is a commonly used technique for measuring gene expression levels due to its simplicity, specificity, and sensitivity. Reliable reference selection for the accurate quantification of gene expression under various experimental conditions is a...

  18. Simultaneous quantification of neuroactive dopamine serotonin and kynurenine pathway metabolites in gender-specific youth urine by ultra performance liquid chromatography tandem high resolution mass spectrometry.

    PubMed

    Lu, Haihua; Yu, Jing; Wang, Jun; Wu, Linlin; Xiao, Hang; Gao, Rong

    2016-04-15

    Neuroactive metabolites in the dopamine, serotonin and kynurenine metabolic pathways play key roles in several physiological processes, and their imbalances have been implicated in the pathophysiology of a wide range of disorders. The association of alterations in these metabolites with various pathologies has raised interest in analytical methods for their accurate quantification in biological fluids. However, simultaneous measurement of various neuroactive metabolites presents great challenges due to their trace levels, high polarity and instability. In this study, an analytical method was developed and validated for accurately quantifying 12 neuroactive metabolites covering three metabolic pathways in youth urine by ultra performance liquid chromatography coupled to electrospray tandem high resolution mass spectrometry (UPLC-ESI-HRMS/MS). A strategy of dansyl chloride derivatization followed by solid phase extraction on C18 cartridges was employed to reduce matrix interference and improve extraction efficiency. Reverse-phase chromatographic separation was achieved with a gradient elution program in 20 min. A high resolution mass spectrometer (Q Exactive) was employed, with confirmation and quantification performed in Target-MS/MS scan mode. Urine samples collected from 100 healthy young volunteers (female:male = 1:1) were analyzed to explore differences in metabolite profiles and turnover between genders. The results demonstrated that the UPLC-ESI-HRMS/MS method is sensitive and robust, suitable for monitoring a large panel of metabolites and for discovering new biomarkers in the medical field. Copyright © 2016 Elsevier B.V. All rights reserved.

  19. Quantification of Wilms' tumor 1 mRNA by digital polymerase chain reaction.

    PubMed

    Koizumi, Yuki; Furuya, Daisuke; Endo, Teruo; Asanuma, Kouichi; Yanagihara, Nozomi; Takahashi, Satoshi

    2018-02-01

    Wilms' tumor 1 (WT1) is overexpressed in various hematopoietic tumors and is widely used as a marker of minimal residual disease. WT1 mRNA has conventionally been analyzed using quantitative real-time polymerase chain reaction (real-time PCR). In the present study, we analyzed 40 peripheral blood and bone marrow samples obtained from cases of acute myeloid leukemia, acute lymphoblastic leukemia, and myelodysplastic syndrome at Sapporo Medical University Hospital from April 2012 to January 2015. Quantification of WT1 was performed using the QuantStudio 3D Digital PCR System (Thermo Fisher Scientific), and the results obtained by digital PCR and real-time PCR were compared. The correlation between digital PCR and real-time PCR was very strong (R = 0.99), and the detection limits of the two methods were equivalent. Digital PCR was able to accurately detect lower WT1 levels than real-time PCR. Digital PCR technology can thus be utilized to determine WT1/ABL1 expression levels accurately and should be useful for diagnosis or the evaluation of drug efficacy in patients with leukemia.

  20. Absolute Quantification of Selected Proteins in the Human Osteoarthritic Secretome

    PubMed Central

    Peffers, Mandy J.; Beynon, Robert J.; Clegg, Peter D.

    2013-01-01

    Osteoarthritis (OA) is characterized by a loss of extracellular matrix, which is driven by catabolic cytokines. Proteomic analysis of the OA cartilage secretome enables the global study of secreted proteins. These are an important class of molecules with roles in numerous pathological mechanisms. Although cartilage studies have identified profiles of secreted proteins, quantitative proteomics techniques that would enable further biological questions to be addressed have rarely been implemented. To overcome this limitation, we used the secretome from human OA cartilage explants stimulated with IL-1β and compared proteins released into the media using a label-free LC-MS/MS-based strategy. We employed QconCAT technology to quantify specific proteins using selected reaction monitoring. A total of 252 proteins were identified, nine of which were differentially expressed following IL-1β stimulation. Selected protein candidates were quantified in absolute amounts using QconCAT. These findings confirmed a significant reduction in TIMP-1 in the secretome following IL-1β stimulation. Label-free and QconCAT analyses produced equivalent results, indicating no effect of cytokine stimulation on aggrecan, cartilage oligomeric matrix protein, fibromodulin, matrix metalloproteinases 1 and 3, or plasminogen release. This study enabled comparative protein profiling and absolute quantification of proteins involved in molecular pathways pertinent to understanding the pathogenesis of OA. PMID:24132152

  1. Integrating Biology into the General Chemistry Laboratory: Fluorometric Analysis of Chlorophyll "a"

    ERIC Educational Resources Information Center

    Wesolowski, Meredith C.

    2014-01-01

    A laboratory experiment that introduces fluorometry of chlorophyll "a" at the general chemistry level is described. The use of thin-layer chromatography to isolate chlorophyll "a" from spirulina and leaf matter enables quantification of small amounts of chlorophyll "a" via fluorometry. Student results were reasonably…

  2. Functional DNA quantification guides accurate next-generation sequencing mutation detection in formalin-fixed, paraffin-embedded tumor biopsies

    PubMed Central

    2013-01-01

    The formalin-fixed, paraffin-embedded (FFPE) biopsy is a challenging sample for molecular assays such as targeted next-generation sequencing (NGS). We compared three methods for FFPE DNA quantification, including a novel PCR assay (‘QFI-PCR’) that measures the absolute copy number of amplifiable DNA, across 165 residual clinical specimens. The results reveal the limitations of commonly used approaches, and demonstrate the value of an integrated workflow using QFI-PCR to improve the accuracy of NGS mutation detection and guide changes in input that can rescue low quality FFPE DNA. These findings address a growing need for improved quality measures in NGS-based patient testing. PMID:24001039

  3. Carotenoid Extraction and Quantification from Capsicum annuum.

    PubMed

    Richins, Richard D; Kilcrease, James; Rodgriguez-Uribe, Laura; O'Connell, Mary A

    2014-10-05

    Carotenoids are ubiquitous pigments that play key roles in photosynthesis and also accumulate to high levels in fruit and flowers. Specific carotenoids play essential roles in human health, as these compounds are precursors for vitamin A; other specific carotenoids are important sources of macular pigments, and all carotenoids are important antioxidants. Accurate determination of the composition and concentration of this complex set of natural products is therefore important in many different scientific areas. One of the richest sources of these compounds is the fruit of Capsicum; these red, yellow and orange fruit accumulate multiple carotenes and xanthophylls. This report describes a detailed method for the extraction and quantification of specific carotenes and xanthophylls.

  4. Simultaneous quantification of amoxicillin and potassium clavulanate in different commercial drugs using PIXE technique

    NASA Astrophysics Data System (ADS)

    Bejjani, A.; Roumié, M.; Akkad, S.; El-Yazbi, F.; Nsouli, B.

    2016-03-01

    We have demonstrated in previous studies that Particle Induced X-ray Emission (PIXE) is one of the most rapid and accurate techniques for quantifying an active ingredient in a solid drug, based on the X-ray emission induced from its specific heteroatom, using pellets made from the original tablets. In this work, PIXE is used for the first time for the simultaneous quantification of two active ingredients, amoxicillin trihydrate and potassium clavulanate, in six different commercial antibiotic drugs. Since the quality control process of a drug covers a large number of samples, the scope of this study was also to find the most rapid and lowest-cost sample preparation needed to analyze these drugs with good precision. The chosen drugs were analyzed in their "as received" tablet form, in pellets made from the powdered tablets, and in pellets made from the powdered tablets heated to 70 °C (to avoid any molecular degradation) until constant weight, in order to remove humidity. The validity of the quantification with respect to each sample preparation (homogeneity of the drug components and humidity) is presented and discussed.

  5. Multiplex quantification of four DNA targets in one reaction with Bio-Rad droplet digital PCR system for GMO detection

    NASA Astrophysics Data System (ADS)

    Dobnik, David; Štebih, Dejan; Blejec, Andrej; Morisset, Dany; Žel, Jana

    2016-10-01

    The advantages of digital PCR technology are by now well documented. One way to achieve better cost efficiency with the technique is to use it in a multiplexing strategy. Droplet digital PCR platforms, which include two fluorescence filters, support at least duplex reactions; with further development and optimization, higher-order multiplexing is possible. The present study not only describes the development of multiplex assays in droplet digital PCR but also presents the first thorough evaluation of several parameters of such multiplex digital PCR. Two 4-plex assays were developed for quantification of eight different DNA targets (seven genetically modified maize events and a maize endogene). In each assay, two of the targets were labelled with one fluorophore and two with another. As current analysis software does not support analysis beyond duplex reactions, a new R- and Shiny-based web application analysis tool (http://bit.ly/ddPCRmulti) was developed that automates the analysis of 4-plex results. In conclusion, the two developed multiplex assays are suitable for quantification of GMO maize events, and the same approach can be used in any other field with a need for accurate and reliable quantification of multiple DNA targets.
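
    Underlying any droplet digital PCR readout is a Poisson correction from the fraction of negative droplets to a target concentration. The sketch below shows that calculation only; it is not the authors' ddPCRmulti tool, and both the droplet counts and the droplet volume (roughly 0.85 nL is often cited for the Bio-Rad QX system) are assumptions for illustration.

    ```python
    # Poisson-based target quantification from droplet counts in ddPCR.
    # Droplet volume and counts below are assumptions, not measured values.

    import math

    DROPLET_VOLUME_UL = 0.00085  # assumed droplet volume in microliters

    def copies_per_ul(positive: int, total: int,
                      droplet_volume_ul: float = DROPLET_VOLUME_UL) -> float:
        """Estimate target concentration (copies/uL of reaction) from the
        fraction of negative droplets via lambda = -ln(negatives / total)."""
        negatives = total - positive
        if negatives <= 0:
            raise ValueError("all droplets positive; sample too concentrated")
        lam = -math.log(negatives / total)   # mean copies per droplet
        return lam / droplet_volume_ul

    # Hypothetical readout: one GM event target and the endogene.
    print("GM event :", round(copies_per_ul(positive=1200, total=15000), 1), "copies/uL")
    print("endogene :", round(copies_per_ul(positive=4800, total=15000), 1), "copies/uL")
    ```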

  6. Multiplex quantification of four DNA targets in one reaction with Bio-Rad droplet digital PCR system for GMO detection.

    PubMed

    Dobnik, David; Štebih, Dejan; Blejec, Andrej; Morisset, Dany; Žel, Jana

    2016-10-14

    The advantages of digital PCR technology are by now well documented. One way to achieve better cost efficiency with the technique is to use it in a multiplexing strategy. Droplet digital PCR platforms, which include two fluorescence filters, support at least duplex reactions; with further development and optimization, higher-order multiplexing is possible. The present study not only describes the development of multiplex assays in droplet digital PCR but also presents the first thorough evaluation of several parameters of such multiplex digital PCR. Two 4-plex assays were developed for quantification of eight different DNA targets (seven genetically modified maize events and a maize endogene). In each assay, two of the targets were labelled with one fluorophore and two with another. As current analysis software does not support analysis beyond duplex reactions, a new R- and Shiny-based web application analysis tool (http://bit.ly/ddPCRmulti) was developed that automates the analysis of 4-plex results. In conclusion, the two developed multiplex assays are suitable for quantification of GMO maize events, and the same approach can be used in any other field with a need for accurate and reliable quantification of multiple DNA targets.

  7. Application of Stochastic Labeling with Random-Sequence Barcodes for Simultaneous Quantification and Sequencing of Environmental 16S rRNA Genes.

    PubMed

    Hoshino, Tatsuhiko; Inagaki, Fumio

    2017-01-01

    Next-generation sequencing (NGS) is a powerful tool for analyzing environmental DNA and provides a comprehensive molecular view of microbial communities. For obtaining the copy number of particular sequences in an NGS library, however, additional quantitative analysis such as quantitative PCR (qPCR) or digital PCR (dPCR) is required. Furthermore, the number of sequences in a library does not always reflect the original copy number of a target gene because of biases introduced by PCR amplification, making it difficult to convert the proportion of particular sequences in the NGS library into copy numbers using the mass of input DNA. To address this issue, we applied a stochastic labeling approach with random-tag sequences and developed an NGS-based quantification protocol that enables simultaneous sequencing and quantification of the targeted DNA. This quantitative sequencing (qSeq) is initiated from single-primer extension (SPE) using a primer with a random tag adjacent to the 5' end of the target-specific sequence. During SPE, each DNA molecule is stochastically labeled with the random tag. Subsequently, first-round PCR is conducted, specifically targeting the SPE product, followed by second-round PCR to index the products for NGS. The number of random tags is determined only during the SPE step and is therefore not affected by the two rounds of PCR, which may introduce amplification biases. In the case of 16S rRNA genes, after NGS sequencing and taxonomic classification, the absolute number of 16S rRNA gene copies for a target phylotype can be estimated by Poisson statistics from the random tags counted at the end of each sequence. To test the feasibility of this approach, the 16S rRNA gene of Sulfolobus tokodaii was subjected to qSeq, which resulted in accurate quantification of 5.0 × 10³ to 5.0 × 10⁴ copies of the 16S rRNA gene. Furthermore, qSeq was applied to mock microbial communities and environmental samples, and the results were comparable to those obtained using digital PCR and to relative abundances based on a standard sequence library. We demonstrated that the qSeq protocol proposed here is advantageous in providing less-biased absolute copy numbers for each target DNA together with NGS sequencing in a single workflow. With this new experimental scheme, microbial community compositions can be explored in a more quantitative manner, thus expanding our knowledge of microbial ecosystems in natural environments.
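
    The Poisson step in stochastic labeling is usually an occupancy correction: with a pool of T equally likely tags and k distinct tags observed, the number of labeled molecules is estimated as -T ln(1 - k/T). The sketch below illustrates that estimator only; the tag length and observed counts are hypothetical and the details of the qSeq pipeline (read processing, tag filtering) are not reproduced here.

    ```python
    # Occupancy (Poisson) estimate of absolute molecule counts from the number
    # of distinct random tags observed. Tag length and counts are hypothetical.

    import math

    def estimate_molecules(distinct_tags_observed: int, tag_length: int = 8) -> float:
        """Estimate the number of labeled molecules from k distinct random tags
        drawn from a pool of T = 4**tag_length possible sequences."""
        T = 4 ** tag_length
        k = distinct_tags_observed
        if not 0 <= k < T:
            raise ValueError("observed tag count must be in [0, T)")
        return -T * math.log(1.0 - k / T)

    # Example: 48,000 distinct 8-mer tags observed for one phylotype's 16S gene.
    print(f"estimated copies: {estimate_molecules(48_000):.0f}")
    ```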

  8. Advances in targeted proteomics and applications to biomedical research

    PubMed Central

    Shi, Tujin; Song, Ehwang; Nie, Song; Rodland, Karin D.; Liu, Tao; Qian, Wei-Jun; Smith, Richard D.

    2016-01-01

    Targeted proteomics has emerged as a powerful protein quantification tool in systems biology and biomedical research, and increasingly for clinical applications. The most widely used targeted proteomics approach, selected reaction monitoring (SRM), also known as multiple reaction monitoring (MRM), can be used for quantification of cellular signaling networks and preclinical verification of candidate protein biomarkers. As an extension to our previous review on advances in SRM sensitivity, herein we review recent advances in methods and technology for further enhancing SRM sensitivity (from 2012 to the present) and highlight its broad biomedical applications in human bodily fluids, tissues and cell lines. Furthermore, we also review two recently introduced targeted proteomics approaches, parallel reaction monitoring (PRM) and data-independent acquisition (DIA) with targeted data extraction on fast-scanning high-resolution accurate-mass (HR/AM) instruments. Such HR/AM targeted quantification, which monitors all target product ions, effectively addresses the SRM limitations in specificity and multiplexing; compared with SRM, however, PRM and DIA are still in their infancy, with a limited number of applications. Thus, for HR/AM targeted quantification we focus our discussion on method development, data processing and analysis, and its advantages and limitations in targeted proteomics. Finally, general perspectives on the potential of achieving both high sensitivity and high sample throughput for large-scale quantification of hundreds of target proteins are discussed. PMID:27302376

  9. AQuA: An Automated Quantification Algorithm for High-Throughput NMR-Based Metabolomics and Its Application in Human Plasma.

    PubMed

    Röhnisch, Hanna E; Eriksson, Jan; Müllner, Elisabeth; Agback, Peter; Sandström, Corine; Moazzami, Ali A

    2018-02-06

    A key limiting step for high-throughput NMR-based metabolomics is the lack of rapid and accurate tools for absolute quantification of many metabolites. We developed, implemented, and evaluated an algorithm, AQuA (Automated Quantification Algorithm), for targeted metabolite quantification from complex ¹H NMR spectra. AQuA operates based on spectral data extracted from a library consisting of one standard calibration spectrum for each metabolite. It uses one preselected NMR signal per metabolite for determining absolute concentrations and does so by effectively accounting for interferences caused by other metabolites. AQuA was implemented and evaluated using experimental NMR spectra from human plasma. The accuracy of AQuA was tested and confirmed in comparison with a manual spectral fitting approach using the ChenomX software, in which 61 out of 67 metabolites quantified in 30 human plasma spectra showed a goodness-of-fit (r²) close to or exceeding 0.9 between the two approaches. In addition, three quality indicators generated by AQuA, namely occurrence, interference, and positional deviation, were studied. These quality indicators permit evaluation of the results each time the algorithm is operated. The efficiency was tested and confirmed by implementing AQuA for quantification of 67 metabolites in a large data set comprising 1342 experimental spectra from human plasma, in which the whole computation took less than 1 s.
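
    One way to read "accounting for interferences caused by other metabolites" is as a small linear system: each preselected signal integral is modeled as a sum of library-derived contributions from all metabolites, and concentrations follow from solving that system. The sketch below is a hedged interpretation in that spirit, not the published AQuA code; the metabolite names, contribution matrix, and integrals are invented.

    ```python
    # Hedged sketch of interference-corrected quantification: solve a linear
    # system built from library spectra. All numbers are invented.

    import numpy as np

    metabolites = ["alanine", "lactate", "glucose"]

    # A[i, j] = contribution of metabolite j (at unit concentration) to the
    # integral of the signal preselected for metabolite i (library-derived).
    A = np.array([
        [1.00, 0.05, 0.00],
        [0.02, 1.00, 0.10],
        [0.00, 0.08, 1.00],
    ])

    # Observed integrals of the three preselected signals in one plasma spectrum.
    b = np.array([0.52, 1.34, 0.81])

    concentrations = np.linalg.solve(A, b)
    for name, c in zip(metabolites, concentrations):
        print(f"{name}: {c:.3f} (arbitrary example units)")
    ```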

  10. A universal real-time PCR assay for the quantification of group-M HIV-1 proviral load.

    PubMed

    Malnati, Mauro S; Scarlatti, Gabriella; Gatto, Francesca; Salvatori, Francesca; Cassina, Giulia; Rutigliano, Teresa; Volpi, Rosy; Lusso, Paolo

    2008-01-01

    Quantification of human immunodeficiency virus type-1 (HIV-1) proviral DNA is increasingly used to measure the HIV-1 cellular reservoirs, a helpful marker to evaluate the efficacy of antiretroviral therapeutic regimens in HIV-1-infected individuals. Furthermore, the proviral DNA load represents a specific marker for the early diagnosis of perinatal HIV-1 infection and might be predictive of HIV-1 disease progression independently of plasma HIV-1 RNA levels and CD4(+) T-cell counts. The high degree of genetic variability of HIV-1 poses a serious challenge for the design of a universal quantitative assay capable of detecting all the genetic subtypes within the main (M) HIV-1 group with similar efficiency. Here, we describe a highly sensitive real-time PCR protocol that allows for the correct quantification of virtually all group-M HIV-1 strains with a higher degree of accuracy compared with other methods. The protocol involves three stages, namely DNA extraction/lysis, cellular DNA quantification and HIV-1 proviral load assessment. Owing to the robustness of the PCR design, this assay can be performed on crude cellular extracts, and therefore it may be suitable for the routine analysis of clinical samples even in developing countries. An accurate quantification of the HIV-1 proviral load can be achieved within 1 d from blood withdrawal.
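
    The protocol quantifies both cellular DNA and HIV-1 proviral DNA; proviral load is then commonly reported per million cells. The sketch below shows that final normalization only, assuming the cellular reference target is present at two copies per diploid cell; the copy numbers are hypothetical and this is not the published assay's own reporting code.

    ```python
    # Normalizing an HIV-1 proviral DNA measurement to copies per 1e6 cells.
    # The two-copies-per-cell assumption and all numbers are illustrative.

    def proviral_load_per_million_cells(hiv_copies: float,
                                        cellular_gene_copies: float,
                                        copies_per_cell: int = 2) -> float:
        """Convert raw PCR copy numbers into HIV-1 DNA copies per 1e6 cells."""
        cells = cellular_gene_copies / copies_per_cell
        if cells <= 0:
            raise ValueError("no cellular DNA detected")
        return hiv_copies / cells * 1e6

    # Example: 150 HIV-1 copies and 400,000 copies of the cellular target
    # measured in the same crude extract.
    print(f"{proviral_load_per_million_cells(150, 400_000):.0f} copies / 1e6 cells")
    ```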

  11. Advances in targeted proteomics and applications to biomedical research

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Shi, Tujin; Song, Ehwang; Nie, Song

    Targeted proteomics technique has emerged as a powerful protein quantification tool in systems biology, biomedical research, and increasing for clinical applications. The most widely used targeted proteomics approach, selected reaction monitoring (SRM), also known as multiple reaction monitoring (MRM), can be used for quantification of cellular signaling networks and preclinical verification of candidate protein biomarkers. As an extension to our previous review on advances in SRM sensitivity (Shi et al., Proteomics, 12, 1074–1092, 2012) herein we review recent advances in the method and technology for further enhancing SRM sensitivity (from 2012 to present), and highlighting its broad biomedical applications inmore » human bodily fluids, tissue and cell lines. Furthermore, we also review two recently introduced targeted proteomics approaches, parallel reaction monitoring (PRM) and data-independent acquisition (DIA) with targeted data extraction on fast scanning high-resolution accurate-mass (HR/AM) instruments. Such HR/AM targeted quantification with monitoring all target product ions addresses SRM limitations effectively in specificity and multiplexing; whereas when compared to SRM, PRM and DIA are still in the infancy with a limited number of applications. Thus, for HR/AM targeted quantification we focus our discussion on method development, data processing and analysis, and its advantages and limitations in targeted proteomics. Finally, general perspectives on the potential of achieving both high sensitivity and high sample throughput for large-scale quantification of hundreds of target proteins are discussed.« less

  12. Automated quantification of Epstein-Barr Virus in whole blood of hematopoietic stem cell transplant patients using the Abbott m2000 system.

    PubMed

    Salmona, Maud; Fourati, Slim; Feghoul, Linda; Scieux, Catherine; Thiriez, Aline; Simon, François; Resche-Rigon, Matthieu; LeGoff, Jérôme

    2016-08-01

    Accurate quantification of Epstein-Barr virus (EBV) load in blood is essential for the management of post-transplant lymphoproliferative disorders. The automation of DNA extraction and amplification may improve accuracy and reproducibility. We evaluated the EBV PCR Kit V1 with fully automated DNA extraction and amplification on the m2000 system (Abbott assay). Conversion factor between copies and international units (IU), lower limit of quantification, imprecision and linearity were determined in a whole blood (WB) matrix. Results from 339 clinical WB specimens were compared with a home-brew real-time PCR assay used in our laboratory (in-house assay). The conversion factor between copies and IU was 3.22 copies/IU. The lower limit of quantification (LLQ) was 1000 copies/mL. Intra- and inter-assay coefficients of variation were 3.1% and 7.9% respectively for samples with EBV load higher than the LLQ. The comparison between Abbott assay and in-house assay showed a good concordance (kappa = 0.77). Loads were higher with the Abbott assay (mean difference = 0.62 log10 copies/mL). The EBV PCR Kit V1 assay on the m2000 system provides a reliable and easy-to-use method for quantification of EBV DNA in WB. Copyright © 2016 Elsevier Inc. All rights reserved.
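
    Using the conversion factor reported above (3.22 copies/IU), a copies-based result can be re-expressed in international units; the sketch below shows only that arithmetic, with a hypothetical whole-blood load.

    ```python
    # Converting an EBV load between copies/mL and IU/mL with the factor
    # reported above (3.22 copies/IU). The sample value is hypothetical.

    import math

    COPIES_PER_IU = 3.22  # conversion factor determined for this assay/matrix

    def copies_to_iu(copies_per_ml: float) -> float:
        return copies_per_ml / COPIES_PER_IU

    load_copies = 25_000.0  # hypothetical whole-blood EBV load
    load_iu = copies_to_iu(load_copies)
    print(f"{load_copies:.0f} copies/mL = {load_iu:.0f} IU/mL "
          f"({math.log10(load_iu):.2f} log10 IU/mL)")
    ```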

  13. Modeling qRT-PCR dynamics with application to cancer biomarker quantification.

    PubMed

    Chervoneva, Inna; Freydin, Boris; Hyslop, Terry; Waldman, Scott A

    2017-01-01

    Quantitative reverse transcription polymerase chain reaction (qRT-PCR) is widely used for molecular diagnostics and evaluating prognosis in cancer. The utility of mRNA expression biomarkers relies heavily on the accuracy and precision of quantification, which is still challenging for low-abundance transcripts. The critical step for quantification is accurate estimation of the amplification efficiency needed to compute relative qRT-PCR expression. We propose a new approach to estimating qRT-PCR efficiency based on modeling the dynamics of polymerase chain reaction amplification. In contrast, only models of fluorescence intensity as a function of PCR cycle have been used for quantification so far. The dynamics of qRT-PCR efficiency are modeled using an ordinary differential equation (ODE) model, and the fitted ODE model is used to obtain effective PCR efficiency estimates needed for efficiency-adjusted quantification. The proposed new qRT-PCR efficiency estimates were used to quantify GUCY2C (Guanylate Cyclase 2C) mRNA expression in the blood of colorectal cancer patients. Time to recurrence and GUCY2C expression ratios were analyzed in a joint model for survival and longitudinal outcomes. The joint model with GUCY2C quantified using the proposed PCR efficiency estimates provided clinically meaningful results for the association between time to recurrence and longitudinal trends in GUCY2C expression.
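
    To make explicit where efficiency estimates enter, the sketch below shows the standard efficiency-adjusted (Pfaffl-style) expression ratio; this is not the paper's ODE model itself, and the Ct values and efficiencies are hypothetical placeholders for estimates such as those the fitted ODE model would supply.

    ```python
    # Efficiency-adjusted relative quantification (standard Pfaffl-style ratio).
    # Ct values and efficiencies are hypothetical.

    def expression_ratio(ct_target_control: float, ct_target_sample: float,
                         ct_ref_control: float, ct_ref_sample: float,
                         eff_target: float, eff_ref: float) -> float:
        """Relative expression of a target gene (e.g., GUCY2C) versus a
        reference gene, adjusted for per-assay amplification efficiencies
        (eff = 1.0 means perfect doubling per cycle, i.e., base 2.0)."""
        base_target = 1.0 + eff_target
        base_ref = 1.0 + eff_ref
        return (base_target ** (ct_target_control - ct_target_sample) /
                base_ref ** (ct_ref_control - ct_ref_sample))

    print(f"ratio = {expression_ratio(32.1, 30.4, 21.0, 20.8, 0.92, 0.98):.2f}")
    ```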

  14. New approach for the quantification of processed animal proteins in feed using light microscopy.

    PubMed

    Veys, P; Baeten, V

    2010-07-01

    A revision of the European Union's total feed ban on animal proteins will need robust quantification methods, especially for control analyses, if tolerance levels are to be introduced, as for fishmeal in ruminant feed. In 2006, a study conducted by the Community Reference Laboratory for Animal Proteins in feedstuffs (CRL-AP) demonstrated the deficiency of the official quantification method based on light microscopy. The study concluded that the method had to be revised. This paper puts forward an improved quantification method based on three elements: (1) the preparation of permanent slides with an optical adhesive that preserves all morphological markers of bones necessary for accurate identification and precise counting; (2) the use of a counting-grid eyepiece reticle; and (3) new definitions of the correction factors for the estimated portions of animal particles in the sediment. This revised quantification method was tested on feeds adulterated at different levels with bovine meat and bone meal (MBM) and fishmeal, and it proved straightforward to apply. The results obtained were very close to the expected contamination levels for both types of adulteration (MBM or fishmeal). Calculated values were not only replicable but also reproducible. The advantages of the new approach, including the benefits of the optical adhesive used for permanent slide mounting and the experimental conditions that need to be met to implement the new method correctly, are discussed.

  15. Introducing AAA-MS, a rapid and sensitive method for amino acid analysis using isotope dilution and high-resolution mass spectrometry.

    PubMed

    Louwagie, Mathilde; Kieffer-Jaquinod, Sylvie; Dupierris, Véronique; Couté, Yohann; Bruley, Christophe; Garin, Jérôme; Dupuis, Alain; Jaquinod, Michel; Brun, Virginie

    2012-07-06

    Accurate quantification of pure peptides and proteins is essential for biotechnology, clinical chemistry, proteomics, and systems biology. The reference method to quantify peptides and proteins is amino acid analysis (AAA). This consists of an acidic hydrolysis followed by chromatographic separation and spectrophotometric detection of amino acids. Although widely used, this method displays some limitations, in particular the need for large amounts of starting material. Driven by the need to quantify isotope-dilution standards used for absolute quantitative proteomics, particularly stable isotope-labeled (SIL) peptides and PSAQ proteins, we developed a new AAA assay (AAA-MS). This method requires neither derivatization nor chromatographic separation of amino acids. It is based on rapid microwave-assisted acidic hydrolysis followed by high-resolution mass spectrometry analysis of amino acids. Quantification is performed by comparing MS signals from labeled amino acids (SIL peptide- and PSAQ-derived) with those of unlabeled amino acids originating from co-hydrolyzed NIST standard reference materials. For both SIL peptides and PSAQ standards, AAA-MS quantification results were consistent with classical AAA measurements. Compared to AAA assay, AAA-MS was much faster and was 100-fold more sensitive for peptide and protein quantification. Finally, thanks to the development of a labeled protein standard, we also extended AAA-MS analysis to the quantification of unlabeled proteins.

  16. The Qiagen Investigator® Quantiplex HYres as an alternative kit for DNA quantification.

    PubMed

    Frégeau, Chantal J; Laurin, Nancy

    2015-05-01

    The Investigator® Quantiplex HYres kit was evaluated as a potential replacement for dual DNA quantification of casework samples. The kit was determined to be highly sensitive, with a limit of quantification and limit of detection of 0.0049 ng/μL and 0.0003 ng/μL, respectively, for both human and male DNA, using full or half reaction volumes. It was also accurate in assessing the amount of male DNA present in 96 mock and actual casework male:female mixtures (various ratios) processed in this exercise. The close correlation between the male/human DNA ratios (expressed as percentages) derived from the Investigator® Quantiplex HYres quantification results and the male DNA proportion calculated in mixed AmpFlSTR® Profiler® Plus or AmpFlSTR® Identifiler® Plus profiles, using the Amelogenin Y peak and STR loci, allowed guidelines to be developed to facilitate decisions on when to submit samples to Y-STR rather than autosomal STR profiling. The internal control (IC) target was shown to be more sensitive to inhibitors than the human and male DNA targets included in the Investigator® Quantiplex HYres kit, making it a good quality indicator for DNA extracts. The new kit met our criteria of enhanced sensitivity, accuracy, consistency, reliability and robustness for casework DNA quantification. Crown Copyright © 2015. Published by Elsevier Ireland Ltd. All rights reserved.

  17. Numerical Modeling of Artificial Recharge: Determining Spatial/Temporal Sampling Resolution to Quantify Infiltration Rates and Effective Hydraulic Conductivity

    NASA Astrophysics Data System (ADS)

    Glose, T. J.; Hausner, M. B.; Lowry, C.

    2016-12-01

    The accurate, fine-scale quantification of groundwater-surface water (GW-SW) interactions over large expanses of hydrologic systems is a fundamental need in order to accurately characterize critical zones of biogeochemical transformation and fluxes, as well as to provide insight into near-surface geologic heterogeneity. Paired fiber-optic distributed temperature sensing (FO-DTS) is a tool that is capable of synoptically sampling hydrologic systems, allowing GW-SW interactions to be examined at a fine scale over large distances. Within managed aquifer recharge (MAR) sites, differential recharge dynamics controlled by bed clogging and subsurface heterogeneity dictate the effectiveness of these sites at infiltrating water. Numerical modeling indicates that the use of paired FO-DTS at an MAR site can provide accurate quantification of flux at the GW-SW interface, as well as insight into the areal extent of geologic heterogeneity in the subsurface. However, the lateral and vertical separation of the fiber-optic cables is of vital importance. Here we present a 2-D, fully coupled groundwater flow and heat transport model with prescribed heterogeneity. Following a forward modeling approach, realizations simulating varying fiber-optic cable positioning, differential bed clogging, and hydraulic conductivity variability were analyzed over a suite of scenarios. The results from the model were then used as observations to calculate groundwater recharge rates and as calibration targets for an inverse model to estimate subsurface heterogeneity.

  18. Low-power, low-cost urinalysis system with integrated dipstick evaluation and microscopic analysis.

    PubMed

    Smith, Gennifer T; Li, Linkai; Zhu, Yue; Bowden, Audrey K

    2018-06-21

    We introduce a coupled dipstick and microscopy device for analyzing urine samples. The device is capable of accurately assessing urine dipstick results while simultaneously imaging the microscopic contents within the sample. We introduce a long working distance, cellphone-based microscope in combination with an oblique illumination scheme to accurately visualize and quantify particles within the urine sample. To facilitate accurate quantification, we couple the imaging set-up with a power-free filtration system. The proposed device is reusable, low-cost, and requires very little power. We show that results obtained with the proposed device and custom-built app are consistent with those obtained with the standard clinical protocol, suggesting the potential clinical utility of the device.

  19. Improved Precision and Accuracy of Quantification of Rare Earth Element Abundances via Medium-Resolution LA-ICP-MS.

    PubMed

    Funderburg, Rebecca; Arevalo, Ricardo; Locmelis, Marek; Adachi, Tomoko

    2017-11-01

    Laser ablation ICP-MS enables streamlined, high-sensitivity measurements of rare earth element (REE) abundances in geological materials. However, many REE isotope mass stations are plagued by isobaric interferences, particularly from diatomic oxides and argides. In this study, we compare REE abundances quantitated from mass spectra collected with low-resolution (m/Δm = 300 at 5% peak height) and medium-resolution (m/Δm = 2500) mass discrimination. A wide array of geological samples was analyzed, including USGS and NIST glasses ranging from mafic to felsic in composition, with NIST 610 employed as the bracketing calibrating reference material. The medium-resolution REE analyses are shown to be significantly more accurate and precise (at the 95% confidence level) than low-resolution analyses, particularly in samples characterized by low (<μg/g levels) REE abundances. A list of preferred mass stations that are least susceptible to isobaric interferences is reported. These findings impact the reliability of REE abundances derived from LA-ICP-MS methods, particularly those relying on mass analyzers that do not offer tuneable mass-resolution and/or collision cell technologies that can reduce oxide and/or argide formation.

  20. Improved Precision and Accuracy of Quantification of Rare Earth Element Abundances via Medium-Resolution LA-ICP-MS

    NASA Astrophysics Data System (ADS)

    Funderburg, Rebecca; Arevalo, Ricardo; Locmelis, Marek; Adachi, Tomoko

    2017-07-01

    Laser ablation ICP-MS enables streamlined, high-sensitivity measurements of rare earth element (REE) abundances in geological materials. However, many REE isotope mass stations are plagued by isobaric interferences, particularly from diatomic oxides and argides. In this study, we compare REE abundances quantitated from mass spectra collected with low-resolution (m/Δm = 300 at 5% peak height) and medium-resolution (m/Δm = 2500) mass discrimination. A wide array of geological samples was analyzed, including USGS and NIST glasses ranging from mafic to felsic in composition, with NIST 610 employed as the bracketing calibrating reference material. The medium-resolution REE analyses are shown to be significantly more accurate and precise (at the 95% confidence level) than low-resolution analyses, particularly in samples characterized by low (<μg/g levels) REE abundances. A list of preferred mass stations that are least susceptible to isobaric interferences is reported. These findings impact the reliability of REE abundances derived from LA-ICP-MS methods, particularly those relying on mass analyzers that do not offer tuneable mass-resolution and/or collision cell technologies that can reduce oxide and/or argide formation.

  1. Optimized Lateral Flow Immunoassay Reader for the Detection of Infectious Diseases in Developing Countries.

    PubMed

    Pilavaki, Evdokia; Demosthenous, Andreas

    2017-11-20

    Detection and control of infectious diseases is a major problem, especially in developing countries. Lateral flow immunoassays can be used with great success for the detection of infectious diseases. However, for the quantification of their results an electronic reader is required. This paper presents an optimized handheld electronic reader for developing countries. It features a potentially low-cost, low-power, battery-operated device with no added optical accessories. The operation of this proof of concept device is based on measuring the reflected light from the lateral flow immunoassay and translating it into the concentration of the specific analyte of interest. Characterization of the surface of the lateral flow immunoassay has been performed in order to accurately model its response to the incident light. Ray trace simulations have been performed to optimize the system and achieve maximum sensitivity by placing all the components in optimum positions. A microcontroller enables all the signal processing to be performed on the device and a Bluetooth module allows transmission of the results wirelessly to a mobile phone app. Its performance has been validated using lateral flow immunoassays with influenza A nucleoprotein in the concentration range of 0.5 ng/mL to 200 ng/mL.

  2. Tunable Diode Laser Atomic Absorption Spectroscopy for Detection of Potassium under Optically Thick Conditions.

    PubMed

    Qu, Zhechao; Steinvall, Erik; Ghorbani, Ramin; Schmidt, Florian M

    2016-04-05

    Potassium (K) is an important element related to ash and fine-particle formation in biomass combustion processes. In situ measurements of gaseous atomic potassium, K(g), using robust optical absorption techniques can provide valuable insight into the K chemistry. However, for typical parts per billion K(g) concentrations in biomass flames and reactor gases, the product of atomic line strength and absorption path length can give rise to such high absorbance that the sample becomes opaque around the transition line center. We present a tunable diode laser atomic absorption spectroscopy (TDLAAS) methodology that enables accurate, calibration-free species quantification even under optically thick conditions, given that Beer-Lambert's law is valid. Analyte concentration and collisional line shape broadening are simultaneously determined by a least-squares fit of simulated to measured absorption profiles. Method validation measurements of K(g) concentrations in saturated potassium hydroxide vapor in the temperature range 950-1200 K showed excellent agreement with equilibrium calculations, and a dynamic range from 40 pptv cm to 40 ppmv cm. The applicability of the compact TDLAAS sensor is demonstrated by real-time detection of K(g) concentrations close to biomass pellets during atmospheric combustion in a laboratory reactor.
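
    Calibration-free retrieval in this setting rests on the Beer-Lambert law: the integrated absorbance of a line equals S(T) × N × L, so the number density N follows from the fitted line area, a known line strength S(T), and the path length L. The sketch below shows that final step only; the line strength, fitted area, and path length are placeholders, not values from the paper, and the line-shape fitting itself is not shown.

    ```python
    # Beer-Lambert number-density retrieval from an integrated absorbance.
    # Line strength, fitted area, and path length are placeholders.

    def number_density(integrated_absorbance_cm_inv: float,
                       line_strength_cm_per_atom: float,
                       path_length_cm: float) -> float:
        """Return atom number density (atoms/cm^3) from the integrated
        absorbance (area of -ln(I/I0) over wavenumber, in cm^-1)."""
        return integrated_absorbance_cm_inv / (line_strength_cm_per_atom * path_length_cm)

    # Hypothetical values: fitted line area, line strength, 10 cm optical path.
    N = number_density(integrated_absorbance_cm_inv=2.0e-3,
                       line_strength_cm_per_atom=1.0e-13,
                       path_length_cm=10.0)
    print(f"K(g) number density ≈ {N:.2e} atoms/cm^3")
    ```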

  3. Analyses of air samples for ascospores of Leptosphaeria maculans and L.biglobosa by light microscopy and molecular techniques.

    PubMed

    Kaczmarek, J; Jedryczka, M; Fitt, B D L; Lucas, J A; Latunde-Dada, A O

    2009-01-01

    Spores of many fungal pathogens are dispersed by wind. Detection of these airborne inocula is important in forecasting both the onset and the risk of epiphytotics. Species-specific primers targeted at the internal transcribed spacer (ITS) region of Leptosphaeria maculans and L. biglobosa (the causal organisms of phoma stem canker and stem lesions of Brassica spp., including oilseed rape) were used to detect DNA extracted from particles deposited on tapes obtained from a spore trap operated in Rarwino (northwest Poland) from September to November in 2004 and 2006. The quantities of DNA assessed by traditional end-point PCR and quantitative real-time PCR were compared to microscopic counts of airborne ascospores. Results of this study showed that fluctuations in timing of ascospore release corresponded to the dynamics of combined concentrations of DNA from L. maculans and L. biglobosa, with significant positive correlations between ascospore number and DNA yield. Thus the utilization of PCR-based molecular diagnostic techniques enabled the detection, identification, and accurate quantification of airborne inoculum at the species level. Moreover, real-time PCR was more sensitive than traditional PCR, especially in years with low ascospore numbers.

  4. Simultaneous analysis of carotenoids and tocopherols in botanical species using one step solid-liquid extraction followed by high performance liquid chromatography.

    PubMed

    Valdivielso, Izaskun; Bustamante, María Ángeles; Ruiz de Gordoa, Juan Carlos; Nájera, Ana Isabel; de Renobales, Mertxe; Barron, Luis Javier R

    2015-04-15

    Carotenoids and tocopherols from botanical species abundant in Atlantic mountain grasslands were simultaneously extracted using a one-step solid-liquid extraction. A single n-hexane/2-propanol extract containing both types of compounds was injected twice under two different sets of HPLC conditions to separate the tocopherols by normal-phase chromatography and the carotenoids by reverse-phase chromatography. The method allowed reproducible quantification in plant samples of very low amounts of α-, β-, γ- and δ-tocopherols (LOD from 0.0379 to 0.0720 μg g(-1) DM) and of over 15 different xanthophylls and carotene isomers. The simplified one-step extraction without saponification significantly increased the recovery of tocopherols and carotenoids, thereby enabling the determination of α-tocopherol acetate in plant samples. The two different sets of chromatographic analyses provided near-baseline separation of individual compounds without interference from other lipid compounds extracted from the plants, and a very sensitive and accurate detection of tocopherols and carotenoids. The detection of minor individual components in botanical species from grasslands is currently of high interest in the search for biomarkers for foods derived from grazing animals. Copyright © 2014 Elsevier Ltd. All rights reserved.

  5. Remodeling and homeostasis of the extracellular matrix: implications for fibrotic diseases and cancer

    PubMed Central

    Cox, Thomas R.; Erler, Janine T.

    2011-01-01

    Dynamic remodeling of the extracellular matrix (ECM) is essential for development, wound healing and normal organ homeostasis. Life-threatening pathological conditions arise when ECM remodeling becomes excessive or uncontrolled. In this Perspective, we focus on how ECM remodeling contributes to fibrotic diseases and cancer, which both present challenging obstacles with respect to clinical treatment, to illustrate the importance and complexity of cell-ECM interactions in the pathogenesis of these conditions. Fibrotic diseases, which include pulmonary fibrosis, systemic sclerosis, liver cirrhosis and cardiovascular disease, account for over 45% of deaths in the developed world. ECM remodeling is also crucial for tumor malignancy and metastatic progression, which ultimately cause over 90% of deaths from cancer. Here, we discuss current methodologies and models for understanding and quantifying the impact of environmental cues provided by the ECM on disease progression, and how improving our understanding of ECM remodeling in these pathological conditions is crucial for uncovering novel therapeutic targets and treatment strategies. This can only be achieved through the use of appropriate in vitro and in vivo models to mimic disease, and with technologies that enable accurate monitoring, imaging and quantification of the ECM. PMID:21324931

  6. Validation metrics for turbulent plasma transport

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Holland, C., E-mail: chholland@ucsd.edu

    Developing accurate models of plasma dynamics is essential for confident predictive modeling of current and future fusion devices. In modern computer science and engineering, formal verification and validation processes are used to assess model accuracy and establish confidence in the predictive capabilities of a given model. This paper provides an overview of the key guiding principles and best practices for the development of validation metrics, illustrated using examples from investigations of turbulent transport in magnetically confined plasmas. Particular emphasis is given to the importance of uncertainty quantification and its inclusion within the metrics, and the need for utilizing synthetic diagnostics to enable quantitatively meaningful comparisons between simulation and experiment. As a starting point, the structure of commonly used global transport model metrics and their limitations is reviewed. An alternate approach is then presented, which focuses upon comparisons of predicted local fluxes, fluctuations, and equilibrium gradients against observation. The utility of metrics based upon these comparisons is demonstrated by applying them to gyrokinetic predictions of turbulent transport in a variety of discharges performed on the DIII-D tokamak [J. L. Luxon, Nucl. Fusion 42, 614 (2002)], as part of a multi-year transport model validation activity.

  7. MR fingerprinting using fast imaging with steady state precession (FISP) with spiral readout.

    PubMed

    Jiang, Yun; Ma, Dan; Seiberlich, Nicole; Gulani, Vikas; Griswold, Mark A

    2015-12-01

    This study explores the possibility of using gradient echo-based sequences other than balanced steady-state free precession (bSSFP) in the magnetic resonance fingerprinting (MRF) framework to quantify the relaxation parameters. An MRF method based on a fast imaging with steady-state precession (FISP) sequence structure is presented. A dictionary containing possible signal evolutions within the physiological range of T1 and T2 was created using the extended phase graph formalism according to the acquisition parameters. The proposed method was evaluated in a phantom and a human brain. T1, T2, and proton density were quantified directly from the undersampled data by the pattern recognition algorithm. T1 and T2 values from the phantom demonstrate that the results of MRF-FISP are in good agreement with the traditional gold-standard methods. T1 and T2 values in brain are within the range of previously reported values. MRF-FISP enables a fast and accurate quantification of the relaxation parameters. It is immune to the banding artifact of bSSFP due to B0 inhomogeneities, which could improve the ability to use MRF for applications beyond brain imaging. © 2014 Wiley Periodicals, Inc.
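
    The "pattern recognition" step in MRF is typically a normalized inner-product match of the measured voxel signal against every dictionary entry. The sketch below demonstrates only that matching machinery; real MRF dictionaries are simulated with the extended phase graph or Bloch equations, whereas the toy signal model here is an arbitrary placeholder, as are the grid ranges and noise level.

    ```python
    # Hedged sketch of MRF dictionary matching via normalized inner products.
    # The toy signal model is a placeholder, not an EPG/Bloch simulation.

    import numpy as np

    def toy_signal(t1_ms: float, t2_ms: float, n_frames: int = 200) -> np.ndarray:
        """Placeholder signal model: a smooth function of T1 and T2 used only
        to demonstrate the matching step."""
        t = np.arange(1, n_frames + 1, dtype=float) * 10.0  # ms
        return np.exp(-t / t2_ms) + 0.5 * (1.0 - np.exp(-t / t1_ms))

    # Build a small dictionary over a grid of T1/T2 values.
    t1_grid = np.arange(200.0, 2001.0, 100.0)
    t2_grid = np.arange(20.0, 301.0, 20.0)
    entries = [(t1, t2) for t1 in t1_grid for t2 in t2_grid if t2 < t1]
    dictionary = np.stack([toy_signal(t1, t2) for t1, t2 in entries])
    dictionary /= np.linalg.norm(dictionary, axis=1, keepdims=True)

    # "Measured" voxel signal: a noisy dictionary-like evolution.
    rng = np.random.default_rng(0)
    measured = toy_signal(1100.0, 90.0) + 0.02 * rng.standard_normal(200)
    measured /= np.linalg.norm(measured)

    best = int(np.argmax(dictionary @ measured))
    print("matched T1/T2 (ms):", entries[best])
    ```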

  8. MR Fingerprinting Using Fast Imaging with Steady State Precession (FISP) with Spiral Readout

    PubMed Central

    Jiang, Yun; Ma, Dan; Seiberlich, Nicole; Gulani, Vikas; Griswold, Mark A.

    2015-01-01

    Purpose: This study explores the possibility of using gradient echo-based sequences other than bSSFP in the magnetic resonance fingerprinting (MRF) framework to quantify the relaxation parameters. Methods: An MRF method based on a fast imaging with steady state precession (FISP) sequence structure is presented. A dictionary containing possible signal evolutions within the physiological range of T1 and T2 was created using the extended phase graph (EPG) formalism according to the acquisition parameters. The proposed method was evaluated in a phantom and a human brain. T1, T2 and proton density were quantified directly from the undersampled data by the pattern recognition algorithm. Results: T1 and T2 values from the phantom demonstrate that the results of MRF-FISP are in good agreement with the traditional gold-standard methods. T1 and T2 values in brain are within the range of previously reported values. Conclusion: MRF-FISP enables a fast and accurate quantification of the relaxation parameters while being immune to the banding artifact of bSSFP due to B0 inhomogeneities, which could improve the ability to use MRF for applications beyond brain imaging. PMID:25491018

  9. Establishing Ion Ratio Thresholds Based on Absolute Peak Area for Absolute Protein Quantification using Protein Cleavage Isotope Dilution Mass Spectrometry

    PubMed Central

    Loziuk, Philip L.; Sederoff, Ronald R.; Chiang, Vincent L.; Muddiman, David C.

    2014-01-01

    Quantitative mass spectrometry has become central to the fields of proteomics and metabolomics. Selected reaction monitoring is a widely used method for the absolute quantification of proteins and metabolites. This method renders high specificity using several product ions measured simultaneously. With growing interest in quantification of molecular species in complex biological samples, confident identification and quantitation have been of particular concern. A method to confirm purity or contamination of product ion spectra has become necessary for achieving accurate and precise quantification. Ion abundance ratio assessments were introduced to alleviate some of these issues. Ion abundance ratios are based on the consistent relative abundance (RA) of specific product ions with respect to the total abundance of all product ions. To date, no standardized method of implementing ion abundance ratios has been established. Thresholds by which product ion contamination is confirmed vary widely and are often arbitrary. This study sought to establish criteria by which the relative abundance of product ions can be evaluated in an absolute quantification experiment. These findings suggest that evaluation of the absolute ion abundance for any given transition is necessary in order to effectively implement RA thresholds. Overall, the variation of the RA value was observed to be relatively constant beyond an absolute threshold ion abundance. Finally, these RA values were observed to fluctuate significantly over a 3-year period, suggesting that they should be assessed as close as possible to the time at which data are collected for quantification. PMID:25154770

  10. Quantification of pericardial effusions by echocardiography and computed tomography.

    PubMed

    Leibowitz, David; Perlman, Gidon; Planer, David; Gilon, Dan; Berman, Philip; Bogot, Naama

    2011-01-15

    Echocardiography is a well-accepted tool for the diagnosis and quantification of pericardial effusion (PEff). Given the increasing use of computed tomographic (CT) scanning, more PEffs are being initially diagnosed by computed tomography. No study has compared quantification of PEff by computed tomography and echocardiography. The objective of this study was to assess the accuracy of quantification of PEff by 2-dimensional echocardiography and computed tomography compared to the amount of pericardial fluid drained at pericardiocentesis. We retrospectively reviewed an institutional database to identify patients who underwent chest computed tomography and echocardiography before percutaneous pericardiocentesis with documentation of the amount of fluid withdrawn. Digital 2-dimensional echocardiographic and CT images were retrieved, and PEff volume was estimated by applying the formula for the volume of a prolate ellipsoid, π × 4/3 × maximal long-axis dimension/2 × maximal transverse dimension/2 × maximal anteroposterior dimension/2, to the pericardial sac and to the heart. Nineteen patients meeting the study criteria were included. The amount of PEff drained was 200 to 1,700 ml (mean 674 ± 340). Echocardiographically calculated PEff volume correlated relatively well with the drained volume (r = 0.73, p <0.001, mean difference -41 ± 225 ml). There was only moderate correlation between CT volume quantification and the actual volume drained (r = 0.4, p = 0.004, mean difference 158 ± 379 ml). In conclusion, echocardiography appears to be a more accurate imaging technique than computed tomography for the quantitative assessment of nonloculated PEffs and should continue to be the primary imaging modality in these patients. Copyright © 2011 Elsevier Inc. All rights reserved.
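
    The abstract's volume formula is the ellipsoid volume applied to maximal dimensions, with the effusion taken as the pericardial-sac volume minus the heart volume. A minimal worked sketch follows; the example dimensions (in cm) are hypothetical, not patient data.

    ```python
    # Ellipsoid-based effusion volume estimate, as described in the abstract.
    # Example dimensions are hypothetical.

    import math

    def ellipsoid_volume_ml(long_axis_cm: float, transverse_cm: float,
                            anteroposterior_cm: float) -> float:
        """Volume (mL, since 1 cm^3 = 1 mL) from the three maximal dimensions:
        4/3 * pi * (a/2) * (b/2) * (c/2)."""
        return (4.0 / 3.0) * math.pi * (long_axis_cm / 2) * (transverse_cm / 2) * (anteroposterior_cm / 2)

    sac_volume = ellipsoid_volume_ml(16.0, 14.0, 12.0)
    heart_volume = ellipsoid_volume_ml(12.0, 10.0, 8.0)
    print(f"estimated effusion volume ≈ {sac_volume - heart_volume:.0f} mL")
    ```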

  11. A Set of Handwriting Features for Use in Automated Writer Identification.

    PubMed

    Miller, John J; Patterson, Robert Bradley; Gantz, Donald T; Saunders, Christopher P; Walch, Mark A; Buscaglia, JoAnn

    2017-05-01

    A writer's biometric identity can be characterized through the distribution of physical feature measurements ("writer's profile"); a graph-based system that facilitates the quantification of these features is described. To accomplish this quantification, handwriting is segmented into basic graphical forms ("graphemes"), which are "skeletonized" to yield the graphical topology of the handwritten segment. The graph-based matching algorithm compares the graphemes first by their graphical topology and then by their geometric features. Graphs derived from known writers can be compared against graphs extracted from unknown writings. The process is computationally intensive and relies heavily upon statistical pattern recognition algorithms. This article focuses on the quantification of these physical features and the construction of the associated pattern recognition methods for using the features to discriminate among writers. The graph-based system described in this article has been implemented in a highly accurate and approximately language-independent biometric recognition system of writers of cursive documents. © 2017 American Academy of Forensic Sciences.

  12. TRIC: an automated alignment strategy for reproducible protein quantification in targeted proteomics.

    PubMed

    Röst, Hannes L; Liu, Yansheng; D'Agostino, Giuseppe; Zanella, Matteo; Navarro, Pedro; Rosenberger, George; Collins, Ben C; Gillet, Ludovic; Testa, Giuseppe; Malmström, Lars; Aebersold, Ruedi

    2016-09-01

    Next-generation mass spectrometric (MS) techniques such as SWATH-MS have substantially increased the throughput and reproducibility of proteomic analysis, but ensuring consistent quantification of thousands of peptide analytes across multiple liquid chromatography-tandem MS (LC-MS/MS) runs remains a challenging and laborious manual process. To produce highly consistent and quantitatively accurate proteomics data matrices in an automated fashion, we developed TRIC (http://proteomics.ethz.ch/tric/), a software tool that utilizes fragment-ion data to perform cross-run alignment, consistent peak-picking and quantification for high-throughput targeted proteomics. TRIC reduced the identification error compared to a state-of-the-art SWATH-MS analysis without alignment by more than threefold at constant recall while correcting for highly nonlinear chromatographic effects. On a pulsed-SILAC experiment performed on human induced pluripotent stem cells, TRIC was able to automatically align and quantify thousands of light and heavy isotopic peak groups. Thus, TRIC fills a gap in the pipeline for automated analysis of massively parallel targeted proteomics data sets.

  13. Simultaneous quantification of withanolides in Withania somnifera by a validated high-performance thin-layer chromatographic method.

    PubMed

    Srivastava, Pooja; Tiwari, Neerja; Yadav, Akhilesh K; Kumar, Vijendra; Shanker, Karuna; Verma, Ram K; Gupta, Madan M; Gupta, Anil K; Khanuja, Suman P S

    2008-01-01

    This paper describes a sensitive, selective, specific, robust, and validated densitometric high-performance thin-layer chromatographic (HPTLC) method for the simultaneous determination of 3 key withanolides, namely, withaferin-A, 12-deoxywithastramonolide, and withanolide-A, in Ashwagandha (Withania somnifera) plant samples. The separation was performed on aluminum-backed silica gel 60F254 HPTLC plates using dichloromethane-methanol-acetone-diethyl ether (15 + 1 + 1 + 1, v/v/v/v) as the mobile phase. The withanolides were quantified by densitometry in the reflection/absorption mode at 230 nm. Precise and accurate quantification could be performed in the linear working concentration range of 66-330 ng/band with good correlation (r² = 0.997, 0.999, and 0.996, respectively). The method was validated for recovery, precision, accuracy, robustness, limit of detection, limit of quantitation, and specificity according to International Conference on Harmonization guidelines. Specificity of quantification was confirmed using retention factor (Rf) values, UV-Vis spectral correlation, and electrospray ionization mass spectra of marker compounds in sample tracks.

  14. X-ray fluorescence at nanoscale resolution for multicomponent layered structures: A solar cell case study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    West, Bradley M.; Stuckelberger, Michael; Jeffries, April

    The study of a multilayered and multicomponent system by spatially resolved X-ray fluorescence microscopy poses unique challenges in achieving accurate quantification of elemental distributions. This is particularly true for the quantification of materials with high X-ray attenuation coefficients, depth-dependent composition variations and thickness variations. A widely applicable procedure for use after spectrum fitting and quantification is described. This procedure corrects the elemental distribution from the measured fluorescence signal, taking into account attenuation of the incident beam and generated fluorescence from multiple layers, and accounts for sample thickness variations. Deriving from Beer–Lambert's law, formulae are presented in a general integral form and numerically applicable framework. Here, the procedure is applied using experimental data from a solar cell with a Cu(In,Ga)Se2 absorber layer, measured at two separate synchrotron beamlines with varied measurement geometries. This example shows the importance of these corrections in real material systems, which can change the interpretation of the measured distributions dramatically.
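
    As a minimal numerical sketch of the type of correction described above, the following assumes a single uniform layer, a fixed two-angle geometry and illustrative attenuation coefficients (all values are hypothetical; the published procedure handles multiple layers and measured thickness variations):

        import math

        def xrf_self_absorption_factor(mu_in, mu_out, thickness_cm,
                                       theta_in_deg=90.0, theta_out_deg=75.0):
            # Average Beer-Lambert attenuation factor for fluorescence generated
            # uniformly throughout a layer of thickness T:
            #   A = (1/T) * integral_0^T exp(-k*t) dt = (1 - exp(-k*T)) / (k*T),
            # where k combines attenuation of the incident beam (mu_in) and of the
            # emitted fluorescence line (mu_out) along their respective path angles.
            k = (mu_in / math.sin(math.radians(theta_in_deg))
                 + mu_out / math.sin(math.radians(theta_out_deg)))
            x = k * thickness_cm
            return (1.0 - math.exp(-x)) / x

        # Illustrative numbers only: a 2 micrometre absorber with strong attenuation.
        A = xrf_self_absorption_factor(mu_in=900.0, mu_out=1500.0, thickness_cm=2e-4)
        print(f"attenuation factor = {A:.3f}; measured signal scaled by 1/A = {1/A:.2f}")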

  15. X-ray fluorescence at nanoscale resolution for multicomponent layered structures: A solar cell case study

    DOE PAGES

    West, Bradley M.; Stuckelberger, Michael; Jeffries, April; ...

    2017-01-01

    The study of a multilayered and multicomponent system by spatially resolved X-ray fluorescence microscopy poses unique challenges in achieving accurate quantification of elemental distributions. This is particularly true for the quantification of materials with high X-ray attenuation coefficients, depth-dependent composition variations and thickness variations. A widely applicable procedure for use after spectrum fitting and quantification is described. This procedure corrects the elemental distribution from the measured fluorescence signal, taking into account attenuation of the incident beam and generated fluorescence from multiple layers, and accounts for sample thickness variations. Deriving from Beer–Lambert's law, formulae are presented in a general integral form and numerically applicable framework. Here, the procedure is applied using experimental data from a solar cell with a Cu(In,Ga)Se2 absorber layer, measured at two separate synchrotron beamlines with varied measurement geometries. This example shows the importance of these corrections in real material systems, which can change the interpretation of the measured distributions dramatically.

  16. Determination of rifampicin in human plasma by high-performance liquid chromatography coupled with ultraviolet detection after automatized solid-liquid extraction.

    PubMed

    Louveau, B; Fernandez, C; Zahr, N; Sauvageon-Martre, H; Maslanka, P; Faure, P; Mourah, S; Goldwirt, L

    2016-12-01

    A precise and accurate high-performance liquid chromatography (HPLC) method for the quantification of rifampicin in human plasma was developed and validated using ultraviolet detection after automated solid-phase extraction. The method was validated with respect to selectivity, extraction recovery, linearity, intra- and inter-day precision, accuracy, lower limit of quantification and stability. Chromatographic separation was performed on a Chromolith RP 8 column using a mixture of 0.05 M acetate buffer pH 5.7-acetonitrile (35:65, v/v) as mobile phase. The compounds were detected at a wavelength of 335 nm with a lower limit of quantification of 0.05 mg/L in human plasma. Retention times for rifampicin and 6,7-dimethyl-2,3-di(2-pyridyl)quinoxaline, used as internal standard, were 3.77 and 4.81 min, respectively. This robust and accurate method was successfully applied in routine practice for therapeutic drug monitoring in patients treated with rifampicin. Copyright © 2016 John Wiley & Sons, Ltd.

  17. Analytical Methods for Quantification of Vitamin D and Implications for Research and Clinical Practice.

    PubMed

    Stokes, Caroline S; Lammert, Frank; Volmer, Dietrich A

    2018-02-01

    A plethora of contradictory research surrounds vitamin D and its influence on health and disease. This may, in part, result from analytical difficulties with regard to measuring vitamin D metabolites in serum. Indeed, variation exists between the analytical techniques and assays used for the determination of serum 25-hydroxyvitamin D. Research studies into the effects of vitamin D on clinical endpoints rely heavily on the accurate assessment of vitamin D status. This has important implications, as findings from vitamin D-related studies to date may potentially have been hampered by the quantification techniques used. Likewise, healthcare professionals are increasingly incorporating vitamin D testing and supplementation regimens into their practice, and measurement errors may also be confounding clinical decisions. Importantly, the Vitamin D Standardisation Programme is an initiative that aims to standardise the measurement of vitamin D metabolites. Such a programme is anticipated to eliminate the inaccuracies surrounding vitamin D quantification. Copyright© 2018, International Institute of Anticancer Research (Dr. George J. Delinasios), All rights reserved.

  18. A novel approach to quantify cybersecurity for electric power systems

    NASA Astrophysics Data System (ADS)

    Kaster, Paul R., Jr.

    Electric Power grid cybersecurity is a topic gaining increased attention in academia, industry, and government circles, yet a method of quantifying and evaluating a system's security is not yet commonly accepted. In order to be useful, a quantification scheme must be able to accurately reflect the degree to which a system is secure, simply determine the level of security in a system using real-world values, model a wide variety of attacker capabilities, be useful for planning and evaluation, allow a system owner to publish information without compromising the security of the system, and compare relative levels of security between systems. Published attempts at quantifying cybersecurity fail at one or more of these criteria. This document proposes a new method of quantifying cybersecurity that meets those objectives. This dissertation evaluates the current state of cybersecurity research, discusses the criteria mentioned previously, proposes a new quantification scheme, presents an innovative method of modeling cyber attacks, demonstrates that the proposed quantification methodology meets the evaluation criteria, and proposes a line of research for future efforts.

  19. Quantitative determination of polycyclic aromatic hydrocarbons in barbecued meat sausages by gas chromatography coupled to mass spectrometry.

    PubMed

    Mottier, P; Parisod, V; Turesky, R J

    2000-04-01

    A method is described for the analysis of the 16 polycyclic aromatic hydrocarbons (PAHs) prioritized by the US EPA in meat sausages grilled under common barbecue practices. Quantification was done by GC-MS using perdeuterated internal standards (IS). Validation was done by spiking the matrix at the 0.5 and 1.0 microg/kg levels. Recoveries, expressed as a percentage of the expected values, ranged from 60 to 134% (median 84%) at the 0.5 microg/kg level and from 69 to 121% (median 96%) at the 1.0 microg/kg level. The median limits of detection and quantification were 0.06 and 0.20 microg/kg, respectively, for a 4-g test portion. The carcinogenic PAHs were below the quantification limit in all products except one lamb sausage. Comparison of estimates when either 1, 5, or 16 perdeuterated PAHs were used as IS showed that the most accurate determination of PAHs required that each compound be quantified against its corresponding perdeuterated analogue.
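
    A minimal sketch of quantification against a perdeuterated internal standard, assuming a one-point response factor from a spiked calibration sample (all peak areas, amounts and names below are invented for illustration):

        def quantify_against_is(analyte_area, is_area, is_amount_ng,
                                response_factor=1.0, sample_mass_g=4.0):
            # Isotope-dilution style calculation: the analyte amount equals the
            # analyte/IS area ratio scaled by the spiked IS amount and a response
            # factor determined from calibration. Returns ug/kg (equivalently ng/g).
            amount_ng = (analyte_area / is_area) * is_amount_ng / response_factor
            return amount_ng / sample_mass_g

        # Hypothetical GC-MS peak areas for one PAH and its perdeuterated analogue.
        print(f"{quantify_against_is(analyte_area=18500, is_area=42000, is_amount_ng=2.0):.2f} ug/kg")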

  20. Automated design of paralogue ratio test assays for the accurate and rapid typing of copy number variation

    PubMed Central

    Veal, Colin D.; Xu, Hang; Reekie, Katherine; Free, Robert; Hardwick, Robert J.; McVey, David; Brookes, Anthony J.; Hollox, Edward J.; Talbot, Christopher J.

    2013-01-01

    Motivation: Genomic copy number variation (CNV) can influence susceptibility to common diseases. High-throughput measurement of gene copy number on large numbers of samples is a challenging, yet critical, stage in confirming observations from sequencing or array Comparative Genome Hybridization (CGH). The paralogue ratio test (PRT) is a simple, cost-effective method of accurately determining copy number by quantifying the amplification ratio between a target and reference amplicon. PRT has been successfully applied to several studies analyzing common CNV. However, its use has not been widespread because of difficulties in assay design. Results: We present PRTPrimer (www.prtprimer.org) software for automated PRT assay design. In addition to stand-alone software, the web site includes a database of pre-designed assays for the human genome at an average spacing of 6 kb and a web interface for custom assay design. Other reference genomes can also be analyzed through local installation of the software. The usefulness of PRTPrimer was tested within known CNV, and showed reproducible quantification. This software and database provide assays that can rapidly genotype CNV, cost-effectively, on a large number of samples and will enable the widespread adoption of PRT. Availability: PRTPrimer is available in two forms: a Perl script (version 5.14 and higher) that can be run from the command line on Linux systems and as a service on the PRTPrimer web site (www.prtprimer.org). Contact: cjt14@le.ac.uk Supplementary Information: Supplementary data are available at Bioinformatics online. PMID:23742985
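
    As a rough illustration of the ratio logic behind a paralogue ratio test (not the PRTPrimer code itself), the sketch below converts a target:reference peak ratio into a copy-number estimate, assuming the reference amplicon is present at two copies per diploid genome and that a calibrator of known copy number is available (all values hypothetical):

        def prt_copy_number(target_area, reference_area,
                            calibrator_ratio, calibrator_copies=2):
            # Copy number estimated from the target/reference amplification ratio,
            # normalized to the same ratio measured on a calibrator sample whose
            # copy number is known.
            ratio = target_area / reference_area
            return calibrator_copies * ratio / calibrator_ratio

        # Hypothetical capillary electrophoresis peak areas.
        print(prt_copy_number(target_area=5200, reference_area=2500,
                              calibrator_ratio=1.05))   # ~3.96, i.e. four copies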

  1. Quantification of Pulmonary Fibrosis in a Bleomycin Mouse Model Using Automated Histological Image Analysis.

    PubMed

    Gilhodes, Jean-Claude; Julé, Yvon; Kreuz, Sebastian; Stierstorfer, Birgit; Stiller, Detlef; Wollin, Lutz

    2017-01-01

    Current literature on pulmonary fibrosis induced in animal models highlights the need for an accurate, reliable and reproducible histological quantitative analysis. One of the major limits of histological scoring is that it is observer-dependent and consequently subject to variability, which may preclude comparative studies between different laboratories. To achieve a reliable and observer-independent quantification of lung fibrosis, we developed automated software for histological image analysis performed on digital images of entire lung sections. This automated analysis was compared to standard evaluation methods with regard to its validation as an end-point measure of fibrosis. Lung fibrosis was induced in mice by intratracheal administration of bleomycin (BLM) at 0.25, 0.5, 0.75 and 1 mg/kg. A detailed characterization of BLM-induced fibrosis was performed 14 days after BLM administration using lung function testing, micro-computed tomography and Ashcroft scoring analysis. Quantification of fibrosis by automated analysis was assessed based on pulmonary tissue density measured from thousands of micro-tiles processed from digital images of entire lung sections. Prior to analysis, large bronchi and vessels were manually excluded from the original images. Measurement of fibrosis was expressed by two indexes: the mean pulmonary tissue density and the high pulmonary tissue density frequency. We showed that these tissue density indexes provided a very accurate and reliable quantification of the morphological changes induced by BLM, even at the lowest concentration used (0.25 mg/kg). A 2D image of the entire lung section was reconstructed at high resolution (3.6 μm/pixel) from the tissue density values, allowing visualization of their distribution throughout fibrotic and non-fibrotic regions. A significant correlation (p<0.0001) was found between the automated analysis and the above standard evaluation methods. This correlation establishes automated analysis as a novel end-point measure of BLM-induced lung fibrosis in mice, which will be very valuable for future preclinical drug explorations.
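
    A minimal sketch of the two tissue-density indexes described above, computed from per-micro-tile density values; the threshold defining "high density" is a placeholder, since the published analysis derives its cut-off from the full study data:

        import numpy as np

        def tissue_density_indexes(tile_densities, high_threshold=0.5):
            # Returns (mean pulmonary tissue density, high-density tile frequency)
            # from per-tile tissue density values scaled to [0, 1].
            d = np.asarray(tile_densities, dtype=float)
            return d.mean(), float((d > high_threshold).mean())

        # Hypothetical tiles from one lung section: mostly aerated tissue plus a
        # denser, fibrotic-like subpopulation.
        tiles = np.concatenate([np.random.beta(2, 8, 8000),
                                np.random.beta(6, 3, 2000)])
        print(tissue_density_indexes(tiles))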

  2. Quantification of Pulmonary Fibrosis in a Bleomycin Mouse Model Using Automated Histological Image Analysis

    PubMed Central

    Gilhodes, Jean-Claude; Kreuz, Sebastian; Stierstorfer, Birgit; Stiller, Detlef; Wollin, Lutz

    2017-01-01

    Current literature on pulmonary fibrosis induced in animal models highlights the need for an accurate, reliable and reproducible histological quantitative analysis. One of the major limits of histological scoring is that it is observer-dependent and consequently subject to variability, which may preclude comparative studies between different laboratories. To achieve a reliable and observer-independent quantification of lung fibrosis, we developed automated software for histological image analysis performed on digital images of entire lung sections. This automated analysis was compared to standard evaluation methods with regard to its validation as an end-point measure of fibrosis. Lung fibrosis was induced in mice by intratracheal administration of bleomycin (BLM) at 0.25, 0.5, 0.75 and 1 mg/kg. A detailed characterization of BLM-induced fibrosis was performed 14 days after BLM administration using lung function testing, micro-computed tomography and Ashcroft scoring analysis. Quantification of fibrosis by automated analysis was assessed based on pulmonary tissue density measured from thousands of micro-tiles processed from digital images of entire lung sections. Prior to analysis, large bronchi and vessels were manually excluded from the original images. Measurement of fibrosis was expressed by two indexes: the mean pulmonary tissue density and the high pulmonary tissue density frequency. We showed that these tissue density indexes provided a very accurate and reliable quantification of the morphological changes induced by BLM, even at the lowest concentration used (0.25 mg/kg). A 2D image of the entire lung section was reconstructed at high resolution (3.6 μm/pixel) from the tissue density values, allowing visualization of their distribution throughout fibrotic and non-fibrotic regions. A significant correlation (p<0.0001) was found between the automated analysis and the above standard evaluation methods. This correlation establishes automated analysis as a novel end-point measure of BLM-induced lung fibrosis in mice, which will be very valuable for future preclinical drug explorations. PMID:28107543

  3. An on-spot internal standard addition approach for accurately determining colistin A and colistin B in dried blood spots using ultra high-performance liquid chromatography-tandem mass spectrometry.

    PubMed

    Tsai, I-Lin; Kuo, Ching-Hua; Sun, Hsin-Yun; Chuang, Yu-Chung; Chepyala, Divyabharathi; Lin, Shu-Wen; Tsai, Yun-Jung

    2017-10-25

    Outbreaks of multidrug-resistant Gram-negative bacterial infections have been reported worldwide. Colistin, an antibiotic with known nephrotoxicity and neurotoxicity, is now being used to treat multidrug-resistant Gram-negative strains. In this study, we applied an on-spot internal standard addition approach coupled with an ultra high-performance liquid chromatography-tandem mass spectrometry (LC-MS/MS) method to quantify colistin A and B from dried blood spots (DBSs). Only 15 μL of whole blood was required for each sample. An internal standard with the same extraction recovery as colistin was added to the spot before sample extraction for accurate quantification. Formic acid in water (0.15%) with an equal volume of acetonitrile (50:50, v/v) was used as the extraction solution. With the optimized extraction process and LC-MS/MS conditions, colistin A and B could be quantified from a DBS with respective limits of quantification of 0.13 and 0.27 μg/mL, and the retention times were < 2 min. The relative standard deviations of within-run and between-run precisions for peak area ratios were all < 17.3%. Accuracies were 91.5-111.2% for the lower limit of quantification, low, medium, and high QC samples. The stability of the easily hydrolyzed prodrug, colistin methanesulfonate, was investigated in DBSs. Less than 4% of the prodrug was found to be hydrolyzed in DBSs at room temperature after 48 h. The developed method applied an on-spot internal standard addition approach, which benefited the precision and accuracy. Results showed that DBS sampling coupled with the sensitive LC-MS/MS method has the potential to be an alternative approach for colistin quantification, where the bias from prodrug hydrolysis in liquid samples is decreased. Copyright © 2017 Elsevier B.V. All rights reserved.

  4. Automated Image Analysis Corrosion Working Group Update: February 1, 2018

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wendelberger, James G.

    These are slides for the automated image analysis corrosion working group update. The overall goals were: automate the detection and quantification of features in images (faster, more accurate), how to do this (obtain data, analyze data), focus on Laser Scanning Confocal Microscope (LCM) data (laser intensity, laser height/depth, optical RGB, optical plus laser RGB).

  5. Quantification of betaglucans, lipid and protein contents in whole oat groats (Avena sativa L.) using near infrared reflectance spectroscopy

    USDA-ARS?s Scientific Manuscript database

    Whole oat has been described as an important healthy food for humans due to its beneficial nutritional components. Near infrared reflectance spectroscopy (NIRS) is a powerful, fast, accurate and non-destructive analytical tool that can be substituted for some traditional chemical analysis. A total o...

  6. A Standardized System of Training Intensity Guidelines for the Sports of Track and Field and Cross Country

    ERIC Educational Resources Information Center

    Belcher, Christopher P.; Pemberton, Cynthia Lee A.

    2012-01-01

    Accurate quantification of training intensity is an essential component of a training program (Rowbottom, 2000). A training program designed to optimize athlete performance abilities cannot be practically planned or implemented without a valid and reliable indication of training intensity and its effect on the physiological mechanisms of the human…

  7. Determination of humic and fulvic acids in commercial solid and liquid humic products by alkaline extraction and gravimetric determination

    USDA-ARS?s Scientific Manuscript database

    Increased use of humic substances in agriculture has generated intense interest among producers, consumers, and regulators for an accurate and reliable method for quantification of humic (HA) and fulvic acids (FA) in raw ores and products. Here we present a thoroughly validated method, the Humic Pro...

  8. Quantifying and partitioning the soil phosphorus of seven Hawaiian soils as extracted by the Olsen and Truog methods

    USDA-ARS?s Scientific Manuscript database

    Accurate quantification of available phosphorus (P) in tropical soils is important for fertilizer P recommendation. Two intrinsic P pools including weakly and tightly adsorbed P pools were recently proposed to quantify soil available P, and the weakly adsorbed P pool can be measured with the Olsen m...

  9. Near-source mobile methane emission estimates using EPA Method33a and a novel probabilistic approach as a basis for leak quantification in urban areas

    EPA Science Inventory

    Methane emissions from underground pipeline leaks remain an ongoing issue in the development of accurate methane emission inventories for the natural gas supply chain. Application of mobile methods during routine street surveys would help address this issue, but there are large ...

  10. Evaluating the remote sensing and inventory-based estimation of biomass in the western Carpathians

    Treesearch

    Magdalena Main-Knorn; Gretchen G. Moisen; Sean P. Healey; William S. Keeton; Elizabeth A. Freeman; Patrick Hostert

    2011-01-01

    Understanding the potential of forest ecosystems as global carbon sinks requires a thorough knowledge of forest carbon dynamics, including both sequestration and fluxes among multiple pools. The accurate quantification of biomass is important to better understand forest productivity and carbon cycling dynamics. Stand-based inventories (SBIs) are widely used for...

  11. Experimental evaluation of several key factors affecting root biomass estimation by 1500 MHz ground penetrating radar

    Treesearch

    John Bain; Frank Day; John Butnor

    2017-01-01

    Accurate quantification of coarse roots without disturbance represents a gap in our understanding of belowground ecology. Ground penetrating radar (GPR) has shown significant promise for coarse root detection and measurement, however root orientation relative to scanning transect direction, the difficulty identifying dead root mass, and the effects of root shadowing...

  12. Quantification and visualization of coordination during non-cyclic upper extremity motion.

    PubMed

    Fineman, Richard A; Stirling, Leia A

    2017-10-03

    There are many design challenges in creating at-home tele-monitoring systems that enable quantification and visualization of complex biomechanical behavior. One such challenge is robustly quantifying joint coordination in a way that is intuitive and supports clinical decision-making. This work defines a new measure of coordination called the relative coordination metric (RCM) and its accompanying normalization schemes. RCM enables quantification of coordination during non-constrained discrete motions. Here RCM is applied to a grasping task. Fifteen healthy participants performed a reach, grasp, transport, and release task with a cup and a pen. The measured joint angles were then time-normalized and the RCM time-series were calculated between the shoulder-elbow, shoulder-wrist, and elbow-wrist joint pairs. RCM was normalized using four differing criteria: the selected joint degree of freedom, angular velocity, angular magnitude, and range of motion. Percent time spent in specified RCM ranges was used as a composite metric and was evaluated for each trial. RCM was found to vary based on: (1) chosen normalization scheme, (2) the stage within the task, (3) the object grasped, and (4) the trajectory of the motion. The RCM addresses some of the limitations of current measures of coordination because it is applicable to discrete motions, does not rely on cyclic repetition, and uses velocity-based measures. Future work will explore clinically relevant differences in the RCM as it is expanded to evaluate different tasks and patient populations. Copyright © 2017. Published by Elsevier Ltd.

  13. Simultaneous determination of tryptophan and 8 metabolites in human plasma by liquid chromatography/tandem mass spectrometry.

    PubMed

    Boulet, Lysiane; Faure, Patrice; Flore, Patrice; Montérémal, Julien; Ducros, Véronique

    2017-06-01

    Tryptophan (Trp) is an essential amino-acid and the precursor of many biologically active substances such as kynurenine (KYN) and serotonin (5HT). Its metabolism is involved in different physiopathological states, such as cardiovascular diseases, cancer, immunomodulation or depression. Hence, the quantification of Trp catabolites, from both KYN and 5HT pathways, might be useful for the discovery of novel diagnostic and follow-up biomarkers. We have developed a simple method for quantification of Trp and 8 of its metabolites, involved in both KYN and 5HT pathways, using liquid chromatography coupled to tandem mass spectrometry. We also validated the method in human plasma samples, according to NF EN ISO 15189 criteria. Our method shows acceptable intra- and inter-day coefficients of variation (CV) (<12% and <16%, respectively). The linearity entirely covers the human plasma range. Stabilities of whole blood and of residues were determined, as well as the use of 2 different types of collection tube, enabling us to adapt our process. Matrix effects and reference values showed good agreement compared to the literature. We propose here a method allowing the simultaneous quantification of a panel of Trp catabolites, never used before to our knowledge. This method, with a quick chromatographic runtime (15 min) and simple sample preparation, has been validated according to NF EN ISO 15189 criteria. The method enables the detailed analysis of these metabolic pathways, which are thought to be involved in a number of pathological conditions. Copyright © 2017 Elsevier B.V. All rights reserved.

  14. Quantification of confocal images of biofilms grown on irregular surfaces

    PubMed Central

    Ross, Stacy Sommerfeld; Tu, Mai Han; Falsetta, Megan L.; Ketterer, Margaret R.; Kiedrowski, Megan R.; Horswill, Alexander R.; Apicella, Michael A.; Reinhardt, Joseph M.; Fiegel, Jennifer

    2014-01-01

    Bacterial biofilms grow on many types of surfaces, including flat surfaces such as glass and metal and irregular surfaces such as rocks, biological tissues and polymers. While laser scanning confocal microscopy can provide high-resolution images of biofilms grown on any surface, quantification of biofilm-associated bacteria is currently limited to bacteria grown on flat surfaces. This can limit researchers studying irregular surfaces to qualitative analysis or quantification of only the total bacteria in an image. In this work, we introduce a new algorithm called modified connected volume filtration (MCVF) to quantify bacteria grown on top of an irregular surface that is fluorescently labeled or reflective. Using the MCVF algorithm, two new quantification parameters are introduced. The modified substratum coverage parameter enables quantification of the connected-biofilm bacteria on top of the surface and on the imaging substratum. The utility of MCVF and the modified substratum coverage parameter were shown with Pseudomonas aeruginosa and Staphylococcus aureus biofilms grown on human airway epithelial cells. A second parameter, the percent association, provides quantified data on the colocalization of the bacteria with a labeled component, including bacteria within a labeled tissue. The utility of quantifying the bacteria associated with the cell cytoplasm was demonstrated with Neisseria gonorrhoeae biofilms grown on cervical epithelial cells. This algorithm provides more flexibility and quantitative ability to researchers studying biofilms grown on a variety of irregular substrata. PMID:24632515

  15. Validated reverse transcription droplet digital PCR serves as a higher order method for absolute quantification of Potato virus Y strains.

    PubMed

    Mehle, Nataša; Dobnik, David; Ravnikar, Maja; Pompe Novak, Maruša

    2018-05-03

    RNA viruses have a great potential for high genetic variability and rapid evolution that is generated by mutation and recombination under selection pressure. This is also the case of Potato virus Y (PVY), which comprises a high diversity of different recombinant and non-recombinant strains. Consequently, it is hard to develop reverse transcription real-time quantitative PCR (RT-qPCR) with the same amplification efficiencies for all PVY strains which would enable their equilibrate quantification; this is specially needed in mixed infections and other studies of pathogenesis. To achieve this, we initially transferred the PVY universal RT-qPCR assay to a reverse transcription droplet digital PCR (RT-ddPCR) format. RT-ddPCR is an absolute quantification method, where a calibration curve is not needed, and it is less prone to inhibitors. The RT-ddPCR developed and validated in this study achieved a dynamic range of quantification over five orders of magnitude, and in terms of its sensitivity, it was comparable to, or even better than, RT-qPCR. RT-ddPCR showed lower measurement variability. We have shown that RT-ddPCR can be used as a reference tool for the evaluation of different RT-qPCR assays. In addition, it can be used for quantification of RNA based on in-house reference materials that can then be used as calibrators in diagnostic laboratories.
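
    Droplet digital PCR converts the fraction of positive droplets into an absolute concentration with a standard Poisson correction; the sketch below shows that generic calculation (the droplet volume and counts are illustrative, not values from this study):

        import math

        def ddpcr_concentration(positive_droplets, total_droplets, droplet_volume_nl=0.85):
            # Poisson correction: mean copies per droplet lambda = -ln(1 - p),
            # where p is the fraction of positive droplets; the concentration follows
            # by dividing by the droplet volume (returned here as copies per uL).
            p = positive_droplets / total_droplets
            lam = -math.log(1.0 - p)
            return lam / (droplet_volume_nl * 1e-3)

        # Hypothetical well: 3,500 positive droplets out of 15,000 accepted droplets.
        print(f"{ddpcr_concentration(3500, 15000):.0f} copies/uL")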

  16. Automated Photoreceptor Cell Identification on Nonconfocal Adaptive Optics Images Using Multiscale Circular Voting.

    PubMed

    Liu, Jianfei; Jung, HaeWon; Dubra, Alfredo; Tam, Johnny

    2017-09-01

    Adaptive optics scanning light ophthalmoscopy (AOSLO) has enabled quantification of the photoreceptor mosaic in the living human eye using metrics such as cell density and average spacing. These rely on the identification of individual cells. Here, we demonstrate a novel approach for computer-aided identification of cone photoreceptors on nonconfocal split detection AOSLO images. Algorithms for identification of cone photoreceptors were developed, based on multiscale circular voting (MSCV) in combination with a priori knowledge that split detection images resemble Nomarski differential interference contrast images, in which dark and bright regions are present on the two sides of each cell. The proposed algorithm locates dark and bright region pairs, iteratively refining the identification across multiple scales. Identification accuracy was assessed in data from 10 subjects by comparing automated identifications with manual labeling, followed by computation of density and spacing metrics for comparison to histology and published data. There was good agreement between manual and automated cone identifications with overall recall, precision, and F1 score of 92.9%, 90.8%, and 91.8%, respectively. On average, computed density and spacing values using automated identification were within 10.7% and 11.2% of the expected histology values across eccentricities ranging from 0.5 to 6.2 mm. There was no statistically significant difference between MSCV-based and histology-based density measurements (P = 0.96, Kolmogorov-Smirnov 2-sample test). MSCV can accurately detect cone photoreceptors on split detection images across a range of eccentricities, enabling quick, objective estimation of photoreceptor mosaic metrics, which will be important for future clinical trials utilizing adaptive optics.

  17. Quantifying HER-2 expression on circulating tumor cells by ACCEPT.

    PubMed

    Zeune, Leonie; van Dalum, Guus; Decraene, Charles; Proudhon, Charlotte; Fehm, Tanja; Neubauer, Hans; Rack, Brigitte; Alunni-Fabbroni, Marianna; Terstappen, Leon W M M; van Gils, Stephan A; Brune, Christoph

    2017-01-01

    Circulating tumor cells (CTCs) isolated from blood can be probed for the expression of treatment targets. Immunofluorescence is often used for both the enumeration of CTC and the determination of protein expression levels related to treatment targets. Accurate and reproducible assessment of such treatment target expression levels is essential for their use in the clinic. To enable this, an open source image analysis program named ACCEPT was developed in the EU-FP7 CTCTrap and CANCER-ID programs. Here its application is shown on a retrospective cohort of 132 metastatic breast cancer patients from which blood samples were processed by CellSearch® and stained for HER-2 expression as additional marker. Images were digitally stored and reviewers identified a total of 4084 CTCs. CTC's HER-2 expression was determined in the thumbnail images by ACCEPT. 150 of these images were selected and sent to six independent investigators to score the HER-2 expression with and without ACCEPT. Concordance rate of the operators' scoring results for HER-2 on CTCs was 30% and could be increased using the ACCEPT tool to 51%. Automated assessment of HER-2 expression by ACCEPT on 4084 CTCs of 132 patients showed 8 (6.1%) patients with all CTCs expressing HER-2, 14 (10.6%) patients with no CTC expressing HER-2 and 110 (83.3%) patients with CTCs showing a varying HER-2 expression level. In total 1576 CTCs were determined HER-2 positive. We conclude that the use of image analysis enables a more reproducible quantification of treatment targets on CTCs and leads the way to fully automated and reproducible approaches.

  18. Bayesian analysis of rare events

    NASA Astrophysics Data System (ADS)

    Straub, Daniel; Papaioannou, Iason; Betz, Wolfgang

    2016-06-01

    In many areas of engineering and science there is an interest in predicting the probability of rare events, in particular in applications related to safety and security. Increasingly, such predictions are made through computer models of physical systems in an uncertainty quantification framework. Additionally, with advances in IT, monitoring and sensor technology, an increasing amount of data on the performance of the systems is collected. This data can be used to reduce uncertainty, improve the probability estimates and consequently enhance the management of rare events and associated risks. Bayesian analysis is the ideal method to include the data into the probabilistic model. It ensures a consistent probabilistic treatment of uncertainty, which is central in the prediction of rare events, where extrapolation from the domain of observation is common. We present a framework for performing Bayesian updating of rare event probabilities, termed BUS. It is based on a reinterpretation of the classical rejection-sampling approach to Bayesian analysis, which enables the use of established methods for estimating probabilities of rare events. By drawing upon these methods, the framework makes use of their computational efficiency. These methods include the First-Order Reliability Method (FORM), tailored importance sampling (IS) methods and Subset Simulation (SuS). In this contribution, we briefly review these methods in the context of the BUS framework and investigate their applicability to Bayesian analysis of rare events in different settings. We find that, for some applications, FORM can be highly efficient and is surprisingly accurate, enabling Bayesian analysis of rare events with just a few model evaluations. In a general setting, BUS implemented through IS and SuS is more robust and flexible.
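
    As a toy illustration of the rejection-sampling view of Bayesian updating that BUS builds on (not the FORM, importance sampling or Subset Simulation implementations discussed above), the sketch below updates a small failure probability with one synthetic observation; the prior, likelihood and failure threshold are all invented for the example:

        import numpy as np

        rng = np.random.default_rng(0)

        # Invented example: capacity R ~ N(5, 1) a priori; failure when R < 3.
        prior_samples = rng.normal(5.0, 1.0, size=200_000)

        # Synthetic measurement of R (value 4.2) with Gaussian noise, sd = 0.5.
        def likelihood(r):
            return np.exp(-0.5 * ((4.2 - r) / 0.5) ** 2)   # kernel bounded by 1

        # Classical rejection sampling: accept each prior sample with probability
        # L(r)/c, where c = 1 bounds the likelihood kernel used here.
        u = rng.uniform(size=prior_samples.size)
        accepted = prior_samples[u < likelihood(prior_samples)]

        print(f"P(failure) prior     = {np.mean(prior_samples < 3.0):.2e}")
        print(f"P(failure) posterior = {np.mean(accepted < 3.0):.2e} "
              f"({accepted.size} accepted samples)")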

  19. Quality Concerns in Technical Education in India: A Quantifiable Quality Enabled Model

    ERIC Educational Resources Information Center

    Gambhir, Victor; Wadhwa, N. C.; Grover, Sandeep

    2016-01-01

    Purpose: The paper aims to discuss current Technical Education scenarios in India. It proposes modelling the factors affecting quality in a technical institute and then applying a suitable technique for assessment, comparison and ranking. Design/methodology/approach: The paper chose graph theoretic approach for quantification of quality-enabled…

  20. Three Dimensional Characterization of Tin Crystallography and Cu6Sn5 Intermetallics in Solder Joints by Multiscale Tomography

    NASA Astrophysics Data System (ADS)

    Kirubanandham, A.; Lujan-Regalado, I.; Vallabhaneni, R.; Chawla, N.

    2016-11-01

    Decreasing pitch size in electronic packaging has resulted in a drastic decrease in solder volumes. The Sn grain crystallography and fraction of intermetallic compounds (IMCs) in small-scale solder joints evolve much differently at the smaller length scales. A cross-sectional study limits the morphological analysis of microstructural features to two dimensions. This study utilizes a serial sectioning technique in conjunction with electron backscatter diffraction to investigate the crystallographic orientation of both Sn grains and Cu6Sn5 IMCs in Cu/pure Sn/Cu solder joints in three dimensions (3D). Quantification of grain aspect ratio is affected by local cooling rate differences within the solder volume. Backscatter electron imaging and focused ion beam serial sectioning enabled the visualization of the morphology of both nanosized Cu6Sn5 IMCs and the hollow hexagonal-morphology Cu6Sn5 IMCs in 3D. Quantification and visualization of microstructural features in 3D thus enable us to better understand the microstructure and deformation mechanics within these small-scale solder joints.

  1. Quantification of Uncertainty in Extreme Scale Computations (QUEST)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ghanem, Roger

    QUEST was a SciDAC Institute comprising Sandia National Laboratories, Los Alamos National Laboratory, the University of Southern California, the Massachusetts Institute of Technology, the University of Texas at Austin, and Duke University. The mission of QUEST is to: (1) develop a broad class of uncertainty quantification (UQ) methods/tools, and (2) provide UQ expertise and software to other SciDAC projects, thereby enabling/guiding their UQ activities. The USC effort centered on the development of reduced models and efficient algorithms for implementing various components of the UQ pipeline. USC personnel were responsible for the development of adaptive bases, adaptive quadrature, and reduced models to be used in estimation and inference.

  2. Highly multiplexed simultaneous detection of RNAs and proteins in single cells.

    PubMed

    Frei, Andreas P; Bava, Felice-Alessio; Zunder, Eli R; Hsieh, Elena W Y; Chen, Shih-Yu; Nolan, Garry P; Gherardini, Pier Federico

    2016-03-01

    To enable the detection of expression signatures specific to individual cells, we developed PLAYR (proximity ligation assay for RNA), a method for highly multiplexed transcript quantification by flow and mass cytometry that is compatible with standard antibody staining. When used with mass cytometry, PLAYR allowed for the simultaneous quantification of more than 40 different mRNAs and proteins. In primary cells, we quantified multiple transcripts, with the identity and functional state of each analyzed cell defined on the basis of the expression of a separate set of transcripts or proteins. By expanding high-throughput deep phenotyping of cells beyond protein epitopes to include RNA expression, PLAYR opens a new avenue for the characterization of cellular metabolism.

  3. Recommendations for adaptation and validation of commercial kits for biomarker quantification in drug development.

    PubMed

    Khan, Masood U; Bowsher, Ronald R; Cameron, Mark; Devanarayan, Viswanath; Keller, Steve; King, Lindsay; Lee, Jean; Morimoto, Alyssa; Rhyne, Paul; Stephen, Laurie; Wu, Yuling; Wyant, Timothy; Lachno, D Richard

    2015-01-01

    Increasingly, commercial immunoassay kits are used to support drug discovery and development. Longitudinally consistent kit performance is crucial, but the degree to which kits and reagents are characterized by manufacturers is not standardized, nor are the approaches by users to adapt them and evaluate their performance through validation prior to use. These factors can negatively impact data quality. This paper offers a systematic approach to assessment, method adaptation and validation of commercial immunoassay kits for quantification of biomarkers in drug development, expanding upon previous publications and guidance. These recommendations aim to standardize and harmonize user practices, contributing to reliable biomarker data from commercial immunoassays, thus, enabling properly informed decisions during drug development.

  4. High-Throughput Thermodynamic Modeling and Uncertainty Quantification for ICME

    NASA Astrophysics Data System (ADS)

    Otis, Richard A.; Liu, Zi-Kui

    2017-05-01

    One foundational component of the integrated computational materials engineering (ICME) and Materials Genome Initiative is the computational thermodynamics based on the calculation of phase diagrams (CALPHAD) method. The CALPHAD method pioneered by Kaufman has enabled the development of thermodynamic, atomic mobility, and molar volume databases of individual phases in the full space of temperature, composition, and sometimes pressure for technologically important multicomponent engineering materials, along with sophisticated computational tools for using the databases. In this article, our recent efforts will be presented in terms of developing new computational tools for high-throughput modeling and uncertainty quantification based on high-throughput, first-principles calculations and the CALPHAD method along with their potential propagations to downstream ICME modeling and simulations.

  5. Space Geodesy: The Cross-Disciplinary Earth science (Vening Meinesz Medal Lecture)

    NASA Astrophysics Data System (ADS)

    Shum, C. K.

    2012-04-01

    Geodesy during the onset of the 21st Century is evolving into a transformative cross-disciplinary Earth science field. Pioneers from before and after the discipline of geodesy was defined include Galileo, Descartes, Kepler, Newton, Euler, Bernoulli, Kant, Laplace, Airy, Kelvin, Jeffreys, Chandler, Meinesz, Kaula, and others. The complicated dynamic processes of the Earth system, manifested by interactions between the solid Earth and its fluid layers (ocean, atmosphere, cryosphere and hydrosphere) and their feedbacks, are linked with scientific problems such as global sea-level rise resulting from natural and anthropogenic climate change. Advances in the precision and stability of geodetic and fundamental instrumentation, including clocks, satellite or quasar tracking sensors, altimetry and lidars, synthetic aperture radar interferometry (InSAR), InSAR altimetry, gravimetry and gradiometry, have enabled substantial and transformative progress in cross-disciplinary Earth sciences. In particular, advances in the measurement of gravity with modern free-fall methods have reached accuracies of 10^-9 g (~1 μGal or 10 nm/s^2) or better, allowing accurate measurement of height changes at the ~3 mm level relative to the Earth's center of mass, and of mass transport within the Earth's interior and its geophysical fluids, enabling global quantification of climate-change signals. These contemporary space geodetic and in situ sensors include, but are not limited to, satellite radar and laser altimetry/lidars, GNSS/SLR/VLBI/DORIS, InSAR, spaceborne gravimetry from GRACE (the Gravity Recovery And Climate Experiment twin-satellite mission) and gradiometry from GOCE (the Gravity field and steady-state Ocean Circulation Explorer), tide gauges, and hydrographic data (XBT/MBT/Argo). The 2007 Intergovernmental Panel on Climate Change (IPCC) study, the Fourth Assessment Report (AR4), substantially narrowed the discrepancy between observation and the known geophysical causes of sea-level rise, but significant uncertainties remain, notably in the contributions of the ice reservoirs (ice sheets and mountain glaciers/ice caps) to present-day and 20th Century global sea-level rise, and in our knowledge of solid Earth glacial isostatic adjustment (GIA). Here we report our use of contemporary space geodetic observations and novel methodologies to address a few of the open Earth science questions, including the potential quantification of the major geophysical contributions to present-day global sea-level rise and the subsequent narrowing of the current sea-level budget discrepancy.

  6. Lowering the quantification limit of the QubitTM RNA HS assay using RNA spike-in.

    PubMed

    Li, Xin; Ben-Dov, Iddo Z; Mauro, Maurizio; Williams, Zev

    2015-05-06

    RNA quantification is often a prerequisite for most RNA analyses such as RNA sequencing. However, the relatively low sensitivity and large sample consumption of traditional RNA quantification methods such as UV spectrophotometry, and even of the much more sensitive fluorescence-based RNA quantification assays such as the Qubit™ RNA HS Assay, are often inadequate for measuring minute levels of RNA isolated from limited cell and tissue samples and biofluids. Thus, there is a pressing need for a more sensitive method to reliably and robustly detect trace levels of RNA without interference from DNA. To improve the quantification limit of the Qubit™ RNA HS Assay, we spiked in a known quantity of RNA to achieve the minimum reading required by the assay. Samples containing trace amounts of RNA were then added to the spike-in and measured as a reading increase over the RNA spike-in baseline. We determined the accuracy and precision of reading increases between 1 and 20 pg/μL, as well as the RNA specificity in this range, and compared them to those of RiboGreen®, another sensitive fluorescence-based RNA quantification assay. We then applied the Qubit™ Assay with RNA spike-in to quantify plasma RNA samples. The RNA spike-in improved the quantification limit of the Qubit™ RNA HS Assay 5-fold, from 25 pg/μL down to 5 pg/μL, while maintaining high specificity to RNA. This enabled quantification of RNA with original concentrations as low as 55.6 pg/μL, compared to 250 pg/μL for the standard assay, and decreased sample consumption from 5 to 1 ng. Plasma RNA samples that were not measurable by the Qubit™ RNA HS Assay were measurable by our modified method. The Qubit™ RNA HS Assay with RNA spike-in is able to quantify RNA with high specificity at a 5-fold lower concentration and uses 5-fold less sample than the standard Qubit™ Assay.
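
    The arithmetic behind the spike-in approach is a simple background subtraction; a minimal sketch follows, with illustrative concentrations, volumes and a hypothetical assay-tube volume (not values prescribed by the assay protocol):

        def sample_rna_concentration(reading_with_sample, spikein_baseline_reading,
                                     sample_volume_ul, assay_volume_ul=200.0):
            # The sample contributes the reading increase over the spike-in-only
            # baseline (both in pg/uL in the assay tube); scaling by the dilution
            # factor recovers the concentration of the original sample.
            increase = reading_with_sample - spikein_baseline_reading
            return increase * assay_volume_ul / sample_volume_ul

        # Hypothetical readings: 25.0 pg/uL from the spike-in alone, 27.8 pg/uL after
        # adding 1 uL of sample to the tube.
        print(f"{sample_rna_concentration(27.8, 25.0, sample_volume_ul=1.0):.0f} pg/uL")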

  7. From Pore to Core: Do Engineered Nanoparticles Violate Upscaling Assumptions? A Microtomographic Investigation

    NASA Astrophysics Data System (ADS)

    Molnar, I. L.; O'Carroll, D. M.; Gerhard, J.; Willson, C. S.

    2014-12-01

    The recent success in using Synchrotron X-ray Computed Microtomography (SXCMT) for the quantification of nanoparticle concentrations within real, three-dimensional pore networks [1] has opened up new opportunities for collecting experimental data of pore-scale flow and transport processes. One opportunity is coupling SXCMT with nanoparticle/soil transport experiments to provide unique insights into how pore-scale processes influence transport at larger scales. Understanding these processes is a key step in accurately upscaling micron-scale phenomena to the continuum-scale. Upscaling phenomena from the micron-scale to the continuum-scale typically involves the assumption that the pore space is well mixed. Using this 'well mixed assumption' it is implicitly assumed that the distribution of nanoparticles within the pore does not affect its retention by soil grains. This assumption enables the use of volume-averaged parameters in calculating transport and retention rates. However, in some scenarios, the well mixed assumption will likely be violated by processes such as deposition and diffusion. These processes can alter the distribution of the nanoparticles in the pore space and impact retention behaviour, leading to discrepancies between theoretical predictions and experimental observations. This work investigates the well mixed assumption by employing SXCMT to experimentally examine pore-scale mixing of silver nanoparticles during transport through sand packed columns. Silver nanoparticles were flushed through three different sands to examine the impact of grain distribution and nanoparticle retention rates on mixing: uniform silica (low retention), well graded silica sand (low retention) and uniform iron oxide coated silica sand (high retention). The SXCMT data identified diffusion-limited retention as responsible for violations of the well mixed assumption. A mathematical description of the diffusion-limited retention process was created and compared to the experimental data at the pore and column-scale. The mathematical description accurately predicted trends observed within the SXCMT-datasets such as concentration gradients away from grain surfaces and also accurately predicted total retention of nanoparticles at the column scale. 1. ES&T 2014, 48, (2), 1114-1122.

  8. Parsing and Quantification of Raw Orbitrap Mass Spectrometer Data Using RawQuant.

    PubMed

    Kovalchik, Kevin A; Moggridge, Sophie; Chen, David D Y; Morin, Gregg B; Hughes, Christopher S

    2018-06-01

    Effective analysis of protein samples by mass spectrometry (MS) requires careful selection and optimization of a range of experimental parameters. As the output from the primary detection device, the "raw" MS data file can be used to gauge the success of a given sample analysis. However, the closed-source nature of the standard raw MS file can complicate effective parsing of the data contained within. To ease and increase the range of analyses possible, the RawQuant tool was developed to enable parsing of raw MS files derived from Thermo Orbitrap instruments to yield meta and scan data in an openly readable text format. RawQuant can be commanded to export user-friendly files containing MS1, MS2, and MS3 metadata as well as matrices of quantification values based on isobaric tagging approaches. In this study, the utility of RawQuant is demonstrated in several scenarios: (1) reanalysis of shotgun proteomics data for the identification of the human proteome, (2) reanalysis of experiments utilizing isobaric tagging for whole-proteome quantification, and (3) analysis of a novel bacterial proteome and synthetic peptide mixture for assessing quantification accuracy when using isobaric tags. Together, these analyses successfully demonstrate RawQuant for the efficient parsing and quantification of data from raw Thermo Orbitrap MS files acquired in a range of common proteomics experiments. In addition, the individual analyses using RawQuant highlight parametric considerations in the different experimental sets and suggest targetable areas to improve depth of coverage in identification-focused studies and quantification accuracy when using isobaric tags.

  9. Optically transmitted and inductively coupled electric reference to access in vivo concentrations for quantitative proton-decoupled ¹³C magnetic resonance spectroscopy.

    PubMed

    Chen, Xing; Pavan, Matteo; Heinzer-Schweizer, Susanne; Boesiger, Peter; Henning, Anke

    2012-01-01

    This report describes our efforts on quantification of tissue metabolite concentrations in mM by nuclear Overhauser enhanced and proton decoupled 13C magnetic resonance spectroscopy and the Electric Reference To access In vivo Concentrations (ERETIC) method. Previous work showed that a calibrated synthetic magnetic resonance spectroscopy-like signal transmitted through an optical fiber and inductively coupled into a transmit/receive coil represents a reliable reference standard for in vivo 1H magnetic resonance spectroscopy quantification on a clinical platform. In this work, we introduce a related implementation that enables simultaneous proton decoupling and ERETIC-based metabolite quantification and hence extends the applicability of the ERETIC method to nuclear Overhauser enhanced and proton decoupled in vivo 13C magnetic resonance spectroscopy. In addition, ERETIC signal stability under the influence of simultaneous proton decoupling is investigated. The proposed quantification method was cross-validated against internal and external reference standards on human skeletal muscle. The ERETIC signal intensity stability was 100.65 ± 4.18% over 3 months including measurements with and without proton decoupling. Glycogen and unsaturated fatty acid concentrations measured with the ERETIC method were in excellent agreement with internal creatine and external phantom reference methods, showing a difference of 1.85 ± 1.21% for glycogen and 1.84 ± 1.00% for unsaturated fatty acid between ERETIC and creatine-based quantification, whereas the deviations between external reference and creatine-based quantification are 6.95 ± 9.52% and 3.19 ± 2.60%, respectively. Copyright © 2011 Wiley Periodicals, Inc.

  10. Accurate quantification of chromosomal lesions via short tandem repeat analysis using minimal amounts of DNA

    PubMed Central

    Jann, Johann-Christoph; Nowak, Daniel; Nolte, Florian; Fey, Stephanie; Nowak, Verena; Obländer, Julia; Pressler, Jovita; Palme, Iris; Xanthopoulos, Christina; Fabarius, Alice; Platzbecker, Uwe; Giagounidis, Aristoteles; Götze, Katharina; Letsch, Anne; Haase, Detlef; Schlenk, Richard; Bug, Gesine; Lübbert, Michael; Ganser, Arnold; Germing, Ulrich; Haferlach, Claudia; Hofmann, Wolf-Karsten; Mossner, Maximilian

    2017-01-01

    Background Cytogenetic aberrations such as deletion of chromosome 5q (del(5q)) represent key elements in routine clinical diagnostics of haematological malignancies. Currently established methods such as metaphase cytogenetics, FISH or array-based approaches have limitations due to their dependency on viable cells, high costs or semi-quantitative nature. Importantly, they cannot be used on low abundance DNA. We therefore aimed to establish a robust and quantitative technique that overcomes these shortcomings. Methods For precise determination of del(5q) cell fractions, we developed an inexpensive multiplex-PCR assay requiring only nanograms of DNA that simultaneously measures allelic imbalances of 12 independent short tandem repeat markers. Results Application of this method to n=1142 samples from n=260 individuals revealed strong intermarker concordance (R²=0.77–0.97) and reproducibility (mean SD: 1.7%). Notably, the assay showed accurate quantification via standard curve assessment (R²>0.99) and high concordance with paired FISH measurements (R²=0.92) even with subnanogram amounts of DNA. Moreover, cytogenetic response was reliably confirmed in del(5q) patients with myelodysplastic syndromes treated with lenalidomide. While the assay demonstrated good diagnostic accuracy in receiver operating characteristic analysis (area under the curve: 0.97), we further observed robust correlation between bone marrow and peripheral blood samples (R²=0.79), suggesting its potential suitability for less-invasive clonal monitoring. Conclusions In conclusion, we present an adaptable tool for quantification of chromosomal aberrations, particularly in problematic samples, which should be easily applicable to further tumour entities. PMID:28600436
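
    One generic way to turn an allelic imbalance at a heterozygous STR marker into a deleted-clone fraction is sketched below, assuming the deleted clone has lost one of the two alleles and that amplification of both alleles is otherwise balanced; this illustrates the underlying idea only, not the published assay, which combines 12 markers with standard-curve calibration:

        def del_clone_fraction(lost_allele_signal, retained_allele_signal,
                               balanced_ratio=1.0):
            # With a deleted-clone fraction f, the lost allele is present in (1 - f)
            # of cells and the retained allele in all cells, so the observed peak
            # ratio is approximately (1 - f) times the ratio seen in normal cells.
            ratio = (lost_allele_signal / retained_allele_signal) / balanced_ratio
            return min(1.0, max(0.0, 1.0 - ratio))

        # Hypothetical fragment-analysis peak heights at one informative marker.
        print(f"{del_clone_fraction(6200, 10000):.0%} cells carrying the deletion")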

  11. Magnetic Susceptibility as a B0 Field Strength Independent MRI Biomarker of Liver Iron Overload

    PubMed Central

    Hernando, Diego; Cook, Rachel J.; Diamond, Carol; Reeder, Scott B.

    2013-01-01

    Purpose MR-based quantification of liver magnetic susceptibility may enable field strength-independent measurement of liver iron concentration (LIC). However, susceptibility quantification is challenging, due to non-local effects of susceptibility on the B0 field. The purpose of this work is to demonstrate feasibility of susceptibility-based LIC quantification using a fat-referenced approach. Methods Phantoms consisting of vials with increasing iron concentrations immersed between oil/water layers, and twenty-seven subjects (9 controls/18 subjects with liver iron overload) were scanned. Ferriscan (1.5T) provided R2-based reference LIC. Multi-echo 3D-SPGR (1.5T/3T) enabled fat-water, B0- and R2*-mapping. Phantom iron concentration (mg Fe/l) was estimated from B0 differences (ΔB0) between vials and neighboring oil. Liver susceptibility and LIC (mg Fe/g dry tissue) was estimated from ΔB0 between the lateral right lobe of the liver and adjacent subcutaneous adipose tissue (SAT). Results Estimated phantom iron concentrations had good correlation with true iron concentrations (1.5T:slope=0.86, intercept=0.72, r2=0.98; 3T:slope=0.85, intercept=1.73, r2=0.98). In liver, ΔB0 correlated strongly with R2* (1.5T:r2=0.86; 3T:r2=0.93) and B0-LIC had good agreement with Ferriscan-LIC (slopes/intercepts nearly 1.0/0.0, 1.5T:r2=0.67, slope=0.93±0.13, p≈0.50, intercept=1.93±0.78, p≈0.02; 3T:r2=0.84, slope=1.01±0.09, p≈0.90, intercept=0.23±0.52, p≈0.68). Discussion Fat-referenced, susceptibility-based LIC estimation is feasible at both field strengths. This approach may enable improved susceptibility mapping in the abdomen. PMID:23801540

  12. Use of electric field sensors for recording respiration, heart rate, and stereotyped motor behaviors in the rodent home cage.

    PubMed

    Noble, Donald J; MacDowell, Camden J; McKinnon, Michael L; Neblett, Tamra I; Goolsby, William N; Hochman, Shawn

    2017-02-01

    Numerous environmental and genetic factors can contribute significantly to behavioral and cardiorespiratory variability observed experimentally. Affordable technologies that allow for noninvasive home cage capture of physio-behavioral variables should enhance understanding of inter-animal variability including after experimental interventions. We assessed whether EPIC electric field sensors (Plessey Semiconductors) embedded within or attached externally to a rodent's home cage could accurately record respiration, heart rate, and motor behaviors. Current systems for quantification of behavioral variables require expensive specialty equipment, while measures of respiratory and heart rate are often provided by surgically implanted or chronically affixed devices. Sensors accurately encoded imposed sinusoidal changes in electric field tested at frequencies ranging from 0.5-100Hz. Mini-metronome arm movements were easily detected, but response magnitude was highly distance dependent. Sensors accurately reported respiration during whole-body plethysmography. In anesthetized rodents, PVC tube-embedded sensors provided accurate mechanical detection of both respiratory and heart rate. Comparable success was seen in naturally behaving animals at rest or sleeping when sensors were attached externally. Video-verified motor behaviors (sniffing, grooming, chewing, and rearing) were detectable and largely separable by their characteristic voltage fluctuations. Larger movement-related events had comparably larger voltage dynamics that easily allowed for a broad approximation of overall motor activity. Spectrograms were used to quickly depict characteristic frequencies in long-lasting recordings, while filtering and thresholding software allowed for detection and quantification of movement-related physio-behavioral events. EPIC electric field sensors provide a means for affordable non-contact home cage detection of physio-behavioral variables. Copyright © 2016 Elsevier B.V. All rights reserved.

  13. Critical points of DNA quantification by real-time PCR – effects of DNA extraction method and sample matrix on quantification of genetically modified organisms

    PubMed Central

    Cankar, Katarina; Štebih, Dejan; Dreo, Tanja; Žel, Jana; Gruden, Kristina

    2006-01-01

    Background Real-time PCR is the technique of choice for nucleic acid quantification. In the field of detection of genetically modified organisms (GMOs), quantification of biotech products may be required to fulfil legislative requirements. However, successful quantification depends crucially on the quality of the sample DNA analyzed. Methods for GMO detection are generally validated on certified reference materials that are in the form of powdered grain material, while detection in routine laboratories must be performed on a wide variety of sample matrixes. Due to food processing, the DNA in sample matrixes can be present in low amounts and also degraded. In addition, molecules of plant origin or from other sources that affect PCR amplification of samples will influence the reliability of the quantification. Further, the wide variety of sample matrixes presents a challenge for detection laboratories. The extraction method must ensure high yield and quality of the DNA obtained and must be carefully selected, since even components of DNA extraction solutions can influence PCR reactions. GMO quantification is based on a standard curve; therefore, similarity of PCR efficiency for the sample and standard reference material is a prerequisite for exact quantification. Little information on the performance of real-time PCR on samples of different matrixes is available. Results Five commonly used DNA extraction techniques were compared and their suitability for quantitative analysis was assessed. The effect of sample matrix on nucleic acid quantification was assessed by comparing 4 maize and 4 soybean matrixes. In addition, 205 maize and soybean samples from routine analysis were analyzed for PCR efficiency to assess variability of PCR performance within each sample matrix. Together with the amount of DNA needed for reliable quantification, PCR efficiency is the crucial parameter determining the reliability of quantitative results; therefore, it was chosen as the primary criterion by which to evaluate performance on different matrixes and with different extraction techniques. The effect of PCR efficiency on the resulting GMO content is demonstrated. Conclusion The crucial influence of extraction technique and sample matrix properties on the results of GMO quantification is demonstrated. Appropriate extraction techniques for each matrix need to be determined to achieve accurate DNA quantification. Nevertheless, as it is shown that in the area of food and feed testing a matrix with certain specificities is impossible to define, strict quality controls need to be introduced to monitor PCR. The results of our study are also applicable to other fields of quantitative testing by real-time PCR. PMID:16907967
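    As a rough illustration of the efficiency effect discussed above, the sketch below reads a copy number off a conventional standard curve and shows how a sample amplifying at lower efficiency than the standards is under-quantified. The standard-curve parameters, copy numbers and efficiencies are assumed values, not data from the study.

      # Illustration (not from the paper) of how a mismatch between the PCR
      # efficiency of a sample and that of the standard curve biases the estimate.
      import numpy as np

      SLOPE, INTERCEPT = -3.32, 38.0          # assumed standard-curve parameters (~100% efficiency)
      LOG10_NTH = INTERCEPT / 3.3219          # implied log10 of the threshold copy number

      def cq_for(copies, efficiency):
          """Quantification cycle for a template amplified at the given efficiency."""
          return (LOG10_NTH - np.log10(copies)) / np.log10(1 + efficiency)

      def copies_from_cq(cq):
          """Read a copy number off the standard curve (built at ~100% efficiency)."""
          return 10 ** ((cq - INTERCEPT) / SLOPE)

      true_copies = 10_000
      for eff in (1.00, 0.90):
          est = copies_from_cq(cq_for(true_copies, eff))
          print(f"sample efficiency {eff:.0%}: estimated {est:,.0f} copies (true {true_copies:,})")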

  14. Implementation of a smartphone as a wireless gyroscope application for the quantification of reflex response.

    PubMed

    LeMoyne, Robert; Mastroianni, Timothy

    2014-01-01

    The patellar tendon reflex constitutes a fundamental aspect of the conventional neurological evaluation. Dysfunctional characteristics of the reflex response can augment the diagnostic acuity of a clinician for subsequent referral to more advanced medical resources. The capacity to quantify the reflex response while alleviating the growing strain on specialized medical resources is a topic of interest. Quantification of the tendon reflex response has previously been demonstrated with considerable accuracy and consistency by using a potential-energy impact pendulum attached to a reflex hammer to evoke the tendon reflex, with a smartphone application (e.g., on an iPhone) serving as a wireless accelerometer platform to quantify the response. Another sensor integrated into the smartphone is the gyroscope, which measures the rate of angular rotation. A smartphone application enables wireless transmission of the gyroscope recording of the reflex response through Internet connectivity as an email attachment. The smartphone wireless gyroscope application demonstrates considerable accuracy and consistency for the quantification of the tendon reflex response.
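    At its simplest, a gyroscope-based reading of the reflex response reduces to extracting features such as the peak angular rate of the lower-leg swing and its latency. The sketch below illustrates that on a synthetic trace; the sampling rate and waveform are assumptions, not the authors' data.

      # Toy feature extraction from a smartphone gyroscope trace of a reflex response.
      import numpy as np

      fs = 100.0                                               # gyroscope sampling rate (Hz), assumed
      t = np.arange(0, 1.0, 1 / fs)
      angular_rate = 2.5 * np.exp(-((t - 0.25) / 0.05) ** 2)   # rad/s, mock knee-extension swing

      peak = angular_rate.max()
      t_peak = t[angular_rate.argmax()]
      print(f"peak angular rate: {peak:.2f} rad/s at {t_peak*1000:.0f} ms after hammer strike")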

  15. High sensitivity mass spectrometric quantification of serum growth hormone by amphiphilic peptide conjugation

    NASA Astrophysics Data System (ADS)

    Arsene, Cristian G.; Schulze, Dirk; Kratzsch, Jürgen; Henrion, André

    2012-12-01

    Amphiphilic peptide conjugation affords a significant increase in sensitivity for protein quantification by electrospray-ionization mass spectrometry. This has been demonstrated here for human growth hormone in serum using N-(3-iodopropyl)-N,N,N-dimethyloctylammonium iodide (IPDOA-iodide) as the derivatizing reagent. The signal enhancement achieved in comparison to the method without derivatization enables extension of the applicable concentration range down to the very low concentrations encountered in clinical glucose suppression tests for patients with acromegaly. The method has been validated using a set of serum samples spiked with known amounts of recombinant 22 kDa growth hormone in the range of 0.48 to 7.65 μg/L. The coefficient of variation (CV), calculated from the deviation of results from the expected concentrations, was 3.5%, and the limit of quantification (LoQ) was determined as 0.4 μg/L. The potential of the method as a tool in clinical practice has been demonstrated with patient samples of about 1 μg/L.
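    One way to reproduce the accuracy metric quoted above is to compute a CV from the relative deviations of measured spike-recovery results against their expected concentrations, as sketched below with illustrative numbers (not the study's data).

      # Hedged sketch of a CV based on deviations from expected spike concentrations.
      import numpy as np

      expected = np.array([0.48, 1.0, 2.0, 4.0, 7.65])        # spiked rhGH, µg/L (assumed grid)
      measured = np.array([0.47, 1.03, 1.95, 4.12, 7.50])     # hypothetical assay results

      rel_dev = (measured - expected) / expected
      cv_percent = 100 * np.sqrt(np.mean(rel_dev ** 2))        # RMS relative deviation
      print(f"CV from deviation to expected values: {cv_percent:.1f}%")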

  16. Design and application of a synthetic DNA standard for real-time PCR analysis of microbial communities in a biogas digester.

    PubMed

    May, T; Koch-Singenstreu, M; Ebling, J; Stantscheff, R; Müller, L; Jacobi, F; Polag, D; Keppler, F; König, H

    2015-08-01

    A synthetic DNA fragment containing primer binding sites for the quantification of ten different microbial groups was constructed and evaluated as a reliable enumeration standard for quantitative real-time PCR (qPCR) analyses. This approach was verified, as an example, for the quantification of several methanogenic orders and families in a series of samples drawn from a mesophilic biogas plant. Furthermore, the total amounts of bacteria as well as the numbers of sulfate-reducing and propionic acid bacteria as potential methanogenic interaction partners were successfully determined. The obtained results indicated a highly dynamic microbial community structure which was distinctly affected by the organic loading rate, the substrate selection, and the amount of free volatile fatty acids in the fermenter. Methanosarcinales was the most predominant methanogenic order during the 3 months of observation despite fluctuating process conditions. During all trials, the modified quantification standard showed maximal reproducibility and efficiency, opening up a wide range of novel applications for this method.

  17. A TaqMan-PCR protocol for quantification and differentiation of the phytopathogenic Clavibacter michiganensis subspecies.

    PubMed

    Bach, H-J; Jessen, I; Schloter, M; Munch, J C

    2003-01-01

    Real-time TaqMan-PCR assays were developed for detection, differentiation and absolute quantification of the pathogenic subspecies of Clavibacter michiganensis (Cm) in a single PCR run. The designed primer pair, targeting intergenic sequences of the rRNA operon (ITS) common to all subspecies, was suitable for the amplification of the expected 223-nt DNA fragments of all subspecies. Closely related bacteria were completely discriminated, except for Rathayibacter iranicus, for which weak PCR product bands appeared on agarose gels after 35 PCR cycles. Sufficient specificity of PCR detection was reached by introduction of the additional subspecies-specific probes used in TaqMan-PCR. Only Cm was detected, and there was clear differentiation among the subspecies C. michiganensis sepedonicus (Cms), C. michiganensis michiganensis (Cmm), C. michiganensis nebraskensis (Cmn), C. michiganensis insidiosus (Cmi) and C. michiganensis tessellarius (Cmt). The TaqMan assays were optimized to enable simultaneous quantification of each subspecies. Validity was shown by comparison with cell counts.

  18. Microvolume Protein Concentration Determination using the NanoDrop 2000c Spectrophotometer

    PubMed Central

    Desjardins, Philippe; Hansen, Joel B.; Allen, Michael

    2009-01-01

    Traditional spectrophotometry requires placing samples into cuvettes or capillaries. This is often impractical due to the limited sample volumes typically available for protein analysis. The Thermo Scientific NanoDrop 2000c Spectrophotometer solves this issue with an innovative sample retention system that holds microvolume samples between two measurement surfaces using the surface tension properties of liquids, enabling the quantification of samples in volumes as low as 0.5-2 μL. The elimination of cuvettes or capillaries allows real-time changes in path length, which reduces the measurement time while greatly increasing the dynamic range of protein concentrations that can be measured. The need for dilutions is also eliminated, and preparations for sample quantification are relatively easy, as the measurement surfaces can simply be wiped with a laboratory wipe. This video article presents modifications to traditional protein concentration determination methods for quantification of microvolume amounts of protein using A280 absorbance readings or the BCA colorimetric assay. PMID:19890248
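    For reference, an A280 reading converts to a protein concentration through the Beer-Lambert law once an extinction coefficient and path length are chosen. The short sketch below uses illustrative values; the extinction coefficient and molecular weight shown are approximate figures for BSA, assumed here purely for the example.

      # Beer-Lambert sketch of an A280-based protein concentration estimate.
      A280 = 0.75            # measured absorbance at 280 nm (assumed)
      path_cm = 0.1          # effective path length in cm (microvolume instruments vary this automatically)
      ext_coeff = 43824.0    # molar extinction coefficient, M^-1 cm^-1 (approximate value for BSA, assumed)
      mw = 66430.0           # molecular weight, g/mol (approximate value for BSA, assumed)

      molar = A280 / (ext_coeff * path_cm)    # concentration in mol/L
      mg_per_ml = molar * mw                  # convert to mg/mL
      print(f"protein concentration ~ {mg_per_ml:.2f} mg/mL")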

  19. Human genomic DNA quantitation system, H-Quant: development and validation for use in forensic casework.

    PubMed

    Shewale, Jaiprakash G; Schneida, Elaine; Wilson, Jonathan; Walker, Jerilyn A; Batzer, Mark A; Sinha, Sudhir K

    2007-03-01

    The human DNA quantification (H-Quant) system, developed for use in human identification, enables quantitation of human genomic DNA in biological samples. The assay is based on real-time amplification of AluYb8 insertions in hominoid primates. The relatively high copy number of subfamily-specific Alu repeats in the human genome enables quantification of very small amounts of human DNA. The oligonucleotide primers present in H-Quant are specific for human DNA and closely related great apes. During real-time PCR, the SYBR Green I dye binds to the DNA that is synthesized from the human-specific AluYb8 oligonucleotide primers. The fluorescence of the bound SYBR Green I dye is measured at the end of each PCR cycle. The cycle at which the fluorescence crosses the chosen threshold correlates with the quantity of amplifiable DNA in that sample. The minimal sensitivity of the H-Quant system is 7.6 pg/μL of human DNA. The amplicon generated in the H-Quant assay is 216 bp, which is within the same size range as common amplifiable short tandem repeat (STR) amplicons. This amplicon size enables quantitation of amplifiable DNA, as opposed to quantitation of degraded or nonamplifiable DNA of smaller sizes. Development and validation studies were performed on the 7500 real-time PCR system following the Quality Assurance Standards for Forensic DNA Testing Laboratories.

  20. Quantification of pelvic floor muscle strength in female urinary incontinence: A systematic review and comparison of contemporary methodologies.

    PubMed

    Deegan, Emily G; Stothers, Lynn; Kavanagh, Alex; Macnab, Andrew J

    2018-01-01

    There remains no gold standard for quantification of voluntary pelvic floor muscle (PFM) strength, despite international guidelines that recommend PFM assessment in females with urinary incontinence (UI). Methods currently reported for quantification of skeletal muscle strength across disciplines are systematically reviewed, and their relevance for clinical and academic use related to the pelvic floor is described. A systematic review was conducted via Medline, PubMed, CINAHL, and the Cochrane database; key terms for pelvic floor anatomy and function were cross-referenced with skeletal muscle strength quantification from 1946 to 2016. Full-text, peer-reviewed articles in English with female subjects with incontinence were identified. Each study was analyzed for use of controls, type of methodology (direct or indirect measures), and benefits and limitations of the technique. A total of 1586 articles were identified, of which 50 met the inclusion criteria. Nine methodologies for determining PFM strength were described, including: digital palpation, perineometry, dynamometry, EMG, vaginal cones, ultrasonography, magnetic resonance imaging, the urine stream interruption test, and the Colpexin pull test. Thirty-two percent lacked a control group. Technical refinements in both direct and indirect instrumentation for PFM strength measurement are allowing for greater sensitivity. However, the most common methods of quantification remain digital palpation and perineometry, techniques that pose limitations and yield subjective or indirect measures of muscular strength. Dynamometry has potential as an accurate and sensitive tool, but is limited by an inability to assess PFM strength during dynamic movements. © 2017 Wiley Periodicals, Inc.

  1. An Optimized Informatics Pipeline for Mass Spectrometry-Based Peptidomics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wu, Chaochao; Monroe, Matthew E.; Xu, Zhe

    2015-12-26

    Comprehensive MS analysis of the peptidome, the intracellular and intercellular products of protein degradation, has the potential to provide novel insights into endogenous proteolytic processing and its utility in disease diagnosis and prognosis. Along with the advances in MS instrumentation, a plethora of proteomics data analysis tools have been applied for direct use in peptidomics; however, an evaluation of the currently available informatics pipelines for peptidomics data analysis has yet to be reported. In this study, we began by evaluating the results of several popular MS/MS database search engines, including MS-GF+, SEQUEST and MS-Align+, for peptidomics data analysis, followed by identification and label-free quantification using the well-established accurate mass and time (AMT) tag and newly developed informed quantification (IQ) approaches, both based on direct LC-MS analysis. Our results demonstrated that MS-GF+ outperformed both SEQUEST and MS-Align+ in identifying peptidome peptides. Using a database established from the MS-GF+ peptide identifications, both the AMT tag and IQ approaches provided significantly deeper peptidome coverage and fewer missing values for each individual data set than the MS/MS methods, while achieving robust label-free quantification. Besides having an excellent correlation with the AMT tag quantification results, IQ also provided slightly higher peptidome coverage than AMT. Taken together, we propose an optimal informatics pipeline combining MS-GF+ for initial database searching with IQ (or AMT) for identification and label-free quantification for high-throughput, comprehensive and quantitative peptidomics analysis.

  2. Development and validation of a liquid chromatography isotope dilution mass spectrometry method for the reliable quantification of alkylphenols in environmental water samples by isotope pattern deconvolution.

    PubMed

    Fabregat-Cabello, Neus; Sancho, Juan V; Vidal, Andreu; González, Florenci V; Roig-Navarro, Antoni Francesc

    2014-02-07

    We present here a new measurement method for the rapid extraction and accurate quantification of technical nonylphenol (NP) and 4-t-octylphenol (OP) in complex-matrix water samples by UHPLC-ESI-MS/MS. The extraction of both compounds is achieved in 30 min by means of hollow fiber liquid phase microextraction (HF-LPME) using 1-octanol as the acceptor phase, which provides an enrichment (preconcentration) factor of 800. In addition, we have developed a quantification method based on isotope dilution mass spectrometry (IDMS) and singly (13)C1-labeled compounds. To this end, the minimally labeled (13)C1-4-(3,6-dimethyl-3-heptyl)-phenol and (13)C1-t-octylphenol isomers were synthesized; these coelute with the natural compounds and allow compensation of the matrix effect. The quantification was carried out using isotope pattern deconvolution (IPD), which makes it possible to obtain the concentration of both compounds without building any calibration graph, reducing the total analysis time. The combination of these extraction and determination techniques has allowed, for the first time, validation of an HF-LPME methodology at the levels required by legislation, achieving limits of quantification of 0.1 ng/mL and recoveries within 97-109%. Due to the low cost and short total analysis time of HF-LPME, this methodology is ready for implementation in routine analytical laboratories. Copyright © 2013 Elsevier B.V. All rights reserved.
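    The core idea of isotope pattern deconvolution is to model the measured isotope cluster of the sample-plus-spike mixture as a linear combination of the pure natural-abundance and labeled isotopologue patterns and to solve for their molar fractions, typically by least squares. The sketch below uses made-up patterns and intensities purely to illustrate the arithmetic, not the compounds studied here.

      # Minimal isotope pattern deconvolution (IPD) sketch via least squares.
      import numpy as np

      # Columns: normalized isotopologue distributions (M, M+1, M+2, M+3) of the
      # natural compound and of a 13C1-labeled analogue (assumed values).
      A = np.array([
          [0.83, 0.01],
          [0.14, 0.83],
          [0.03, 0.14],
          [0.00, 0.03],
      ])
      measured = np.array([0.45, 0.47, 0.08, 0.01])   # observed pattern after mixing sample and spike

      fractions, *_ = np.linalg.lstsq(A, measured, rcond=None)
      x_nat, x_lab = fractions / fractions.sum()
      print(f"molar fraction natural: {x_nat:.2f}, labeled: {x_lab:.2f}")
      # With a known spike amount, the analyte amount follows from the ratio x_nat/x_lab.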

  3. Comparison of colorimetric assays with quantitative amino acid analysis for protein quantification of Generalized Modules for Membrane Antigens (GMMA).

    PubMed

    Rossi, Omar; Maggiore, Luana; Necchi, Francesca; Koeberling, Oliver; MacLennan, Calman A; Saul, Allan; Gerke, Christiane

    2015-01-01

    Genetically induced outer membrane particles from Gram-negative bacteria, called Generalized Modules for Membrane Antigens (GMMA), are being investigated as vaccines. Rapid methods are required for estimating the protein content for in-process assays during production. Since GMMA are complex biological structures containing lipid and polysaccharide as well as protein, protein determinations are not necessarily straightforward. We compared protein quantification by Bradford, Lowry, and Non-Interfering assays using bovine serum albumin (BSA) as the standard with quantitative amino acid (AA) analysis, the most accurate currently available method for protein quantification. The Lowry assay has the lowest inter- and intra-assay variation and gives the best linearity between protein amount and absorbance. In all three assays, the color yield (optical density per mass of protein) of GMMA was markedly different from that of BSA: approximately 4-fold for the Bradford assay (and highly variable between different GMMA) and approximately 0.7-fold for the Lowry and Non-Interfering assays, highlighting the need to calibrate the standard used in the colorimetric assay against GMMA quantified by AA analysis. In terms of a combination of ease, reproducibility, and proportionality of protein measurement, and comparability between samples, the Lowry assay was superior to the Bradford and Non-Interfering assays for GMMA quantification.
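    In practice, the calibration described above amounts to rescaling a BSA-equivalent colorimetric result by the GMMA-to-BSA color-yield ratio established against amino acid analysis. A minimal sketch with illustrative numbers follows; the ratio of 0.7 is taken from the order of magnitude given in the abstract, while the measured value is hypothetical.

      # Rescale a Lowry result read against a BSA standard curve by the color-yield ratio.
      bsa_equivalent_mg_ml = 1.20     # Lowry result read off a BSA standard curve (hypothetical)
      color_yield_ratio = 0.7         # OD per mass, GMMA relative to BSA (approximate, from the abstract)

      gmma_protein_mg_ml = bsa_equivalent_mg_ml / color_yield_ratio
      print(f"GMMA protein ~ {gmma_protein_mg_ml:.2f} mg/mL after color-yield correction")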

  4. Direct quantification of fatty acids in wet microalgal and yeast biomass via a rapid in situ fatty acid methyl ester derivatization approach.

    PubMed

    Dong, Tao; Yu, Liang; Gao, Difeng; Yu, Xiaochen; Miao, Chao; Zheng, Yubin; Lian, Jieni; Li, Tingting; Chen, Shulin

    2015-12-01

    Accurate determination of fatty acid contents is routinely required in microalgal and yeast biofuel studies. A method of rapid in situ fatty acid methyl ester (FAME) derivatization directly from wet fresh microalgal and yeast biomass was developed in this study. This method does not require prior solvent extraction or dehydration. FAMEs were prepared with a sequential alkaline hydrolysis (15 min at 85 °C) and acidic esterification (15 min at 85 °C) process. The resulting FAMEs were extracted into n-hexane and analyzed using gas chromatography. The effects of each processing parameter (temperature, reaction time, and water content) upon lipid quantification in the alkaline hydrolysis step were evaluated with a full factorial design. This method could tolerate water content up to 20% (v/v) of the total reaction volume, which equaled up to 1.2 mL of water in the biomass slurry (with 0.05-25 mg of fatty acid). There were no significant differences in FAME quantification (p>0.05) between the standard AOAC 991.39 method and the proposed wet in situ FAME preparation method. This fatty acid quantification method is applicable to fresh wet biomass of a wide range of microalgae and yeast species.

  5. Quantification of neutral human milk oligosaccharides by graphitic carbon HPLC with tandem mass spectrometry

    PubMed Central

    Bao, Yuanwu; Chen, Ceng; Newburg, David S.

    2012-01-01

    Defining the biologic roles of human milk oligosaccharides (HMOS) requires an efficient, simple, reliable, and robust analytical method for simultaneous quantification of oligosaccharide profiles from multiple samples. The HMOS fraction of milk is a complex mixture of polar, highly branched, isomeric structures that contain no intrinsic facile chromophore, making their resolution and quantification challenging. A liquid chromatography-mass spectrometry (LC-MS) method was devised to resolve and quantify 11 major neutral oligosaccharides of human milk simultaneously. Crude HMOS fractions are reduced, resolved by porous graphitic carbon HPLC with a water/acetonitrile gradient, detected by mass spectrometric specific ion monitoring, and quantified. The HPLC separates isomers of identical molecular weights, allowing 11 peaks to be fully resolved and quantified by monitoring mass-to-charge (m/z) ratios of the deprotonated negative ions. The standard curve for each of the 11 oligosaccharides is linear from 0.078 or 0.156 to 20 μg/mL (R2 > 0.998). Precision (CV) ranges from 1% to 9%. Accuracy is from 86% to 104%. This analytical technique provides sensitive, precise, accurate quantification for each of the 11 milk oligosaccharides and allows measurement of differences in milk oligosaccharide patterns between individuals and at different stages of lactation. PMID:23068043

  6. Overcoming biofluid protein complexity during targeted mass spectrometry detection and quantification of protein biomarkers by MRM cubed (MRM3).

    PubMed

    Jeudy, Jeremy; Salvador, Arnaud; Simon, Romain; Jaffuel, Aurore; Fonbonne, Catherine; Léonard, Jean-François; Gautier, Jean-Charles; Pasquier, Olivier; Lemoine, Jerome

    2014-02-01

    Targeted mass spectrometry in the so-called multiple reaction monitoring (MRM) mode is certainly a promising way for the precise, accurate, and multiplexed measurement of proteins and their genetic or posttranslationally modified isoforms. MRM carried out on a low-resolution triple quadrupole instrument faces a lack of specificity when addressing the quantification of weakly concentrated proteins. In this case, extensive sample fractionation or immunoenrichment alleviates signal contamination by interferences, but in turn decreases assay performance and throughput. Recently, MRM(3) was introduced as an alternative to MRM to improve the limit of quantification of weakly concentrated protein biomarkers. In the present work, we compare MRM and MRM(3) modes for the detection of biomarkers in plasma and urine. Calibration curves drawn with MRM and MRM(3) showed a similar range of linearity (R(2) > 0.99 for both methods) with protein concentrations above 1 μg/mL in plasma and a few nanograms per milliliter in urine. In contrast, optimized MRM(3) methods improve the limits of quantification by a factor of 2 to 4, depending on the targeted peptide. This gain arises from the additional MS(3) fragmentation step, which significantly removes or decreases interfering signals within the targeted transition channels.

  7. Multiplex quantification of four DNA targets in one reaction with Bio-Rad droplet digital PCR system for GMO detection

    PubMed Central

    Dobnik, David; Štebih, Dejan; Blejec, Andrej; Morisset, Dany; Žel, Jana

    2016-01-01

    The advantages of digital PCR technology are already well documented. One way to achieve better cost efficiency of the technique is to use it in a multiplexing strategy. Droplet digital PCR platforms, which include two fluorescence filters, support at least duplex reactions, and with some development and optimization, higher-order multiplexing is possible. The present study not only shows the development of multiplex assays in droplet digital PCR, but also presents a first thorough evaluation of several parameters of such multiplex digital PCR. Two 4-plex assays were developed for quantification of 8 different DNA targets (7 genetically modified maize events and a maize endogene). Per assay, two of the targets were labelled with one fluorophore and two with another. As current analysis software does not support analysis of more than duplex reactions, a new R- and Shiny-based web application analysis tool (http://bit.ly/ddPCRmulti) was developed that automates the analysis of 4-plex results. In conclusion, the two developed multiplex assays are suitable for quantification of GMO maize events, and the same approach can be used in any other field with a need for accurate and reliable quantification of multiple DNA targets. PMID:27739510
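    Regardless of the plexity, the underlying droplet digital PCR arithmetic per target is a Poisson correction of the positive-droplet fraction. The sketch below shows that step with assumed droplet counts and a nominal droplet volume; these are illustrative figures, not values from the study.

      # Poisson-based copy-number estimate for a single ddPCR target.
      import math

      total_droplets = 15000
      positive_droplets = 4200
      droplet_volume_ul = 0.00085        # ~0.85 nL per droplet, an assumed nominal value

      neg_fraction = (total_droplets - positive_droplets) / total_droplets
      copies_per_droplet = -math.log(neg_fraction)              # mean copies per droplet (Poisson)
      copies_per_ul = copies_per_droplet / droplet_volume_ul
      print(f"{copies_per_ul:,.0f} copies/µL of reaction")

      # GMO content in copy-number terms is then the ratio of a transgene target
      # to the endogene target quantified in the same way.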

  8. Near-infrared spectroscopy for the detection and quantification of bacterial contaminations in pharmaceutical products.

    PubMed

    Quintelas, Cristina; Mesquita, Daniela P; Lopes, João A; Ferreira, Eugénio C; Sousa, Clara

    2015-08-15

    Accurate detection and quantification of microbiological contaminations remain an issue, mainly due to the lack of rapid and precise analytical techniques. Standard methods are expensive and time-consuming, and are associated with high economic losses and public health threats. In the context of the pharmaceutical industry, the development of fast analytical techniques able to overcome these limitations is crucial, and spectroscopic techniques might constitute a reliable alternative. In this work we proved the ability of Fourier transform near-infrared spectroscopy (FT-NIRS) to detect and quantify bacteria (Bacillus subtilis, Escherichia coli, Pseudomonas fluorescens, Salmonella enterica, Staphylococcus epidermidis) from 10 to 10^8 CFU/mL in sterile saline solutions (NaCl 0.9%). Partial least squares discriminant analysis (PLS-DA) models showed that FT-NIRS was able to discriminate between sterile and contaminated solutions for all bacteria as well as to identify the contaminating bacteria. Partial least squares (PLS) models allowed bacterial quantification with limits of detection ranging from 5.1 to 9 CFU/mL for E. coli and B. subtilis, respectively. This methodology was successfully validated in three pharmaceutical preparations (contact lens solution, cough syrup and a topical anti-inflammatory solution), proving that this technique possesses a high potential to be routinely used for the detection and quantification of bacterial contaminations. Copyright © 2015 Elsevier B.V. All rights reserved.
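    A hedged sketch of the chemometric side of such a workflow (PLS regression of NIR spectra against log10 CFU/mL, evaluated by cross-validation) is shown below. The spectra and reference values are random placeholders standing in for preprocessed FT-NIRS data, and the number of latent variables is an assumption.

      # PLS regression sketch for spectra-to-concentration calibration.
      import numpy as np
      from sklearn.cross_decomposition import PLSRegression
      from sklearn.model_selection import cross_val_predict

      rng = np.random.default_rng(0)
      X = rng.normal(size=(60, 500))        # 60 samples x 500 wavelength points (mock spectra)
      y = rng.uniform(1, 8, size=60)        # log10 CFU/mL reference values (mock)

      pls = PLSRegression(n_components=8)   # latent variables (assumed)
      y_cv = cross_val_predict(pls, X, y, cv=10).ravel()
      rmsecv = np.sqrt(np.mean((y - y_cv) ** 2))
      print(f"RMSECV = {rmsecv:.2f} log10 CFU/mL")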

  9. Real-time PCR based on SYBR-Green I fluorescence: an alternative to the TaqMan assay for a relative quantification of gene rearrangements, gene amplifications and micro gene deletions.

    PubMed

    Ponchel, Frederique; Toomes, Carmel; Bransfield, Kieran; Leong, Fong T; Douglas, Susan H; Field, Sarah L; Bell, Sandra M; Combaret, Valerie; Puisieux, Alain; Mighell, Alan J; Robinson, Philip A; Inglehearn, Chris F; Isaacs, John D; Markham, Alex F

    2003-10-13

    Real-time PCR is increasingly being adopted for RNA quantification and genetic analysis. At present the most popular real-time PCR assay is based on the hybridisation of a dual-labelled probe to the PCR product, and the development of a signal by loss of fluorescence quenching as PCR degrades the probe. Though this so-called 'TaqMan' approach has proved easy to optimise in practice, the dual-labelled probes are relatively expensive. We have designed a new assay based on SYBR-Green I binding that is quick, reliable, easily optimised and compares well with the published assay. Here we demonstrate its general applicability by measuring copy number in three different genetic contexts: the quantification of a gene rearrangement (T-cell receptor excision circles (TREC) in peripheral blood mononuclear cells); the detection and quantification of GLI, MYC-C and MYC-N gene amplification in cell lines and cancer biopsies; and the detection of deletions in the OPA1 gene in dominant optic atrophy. Our assay has important clinical applications, providing accurate diagnostic results in less time, from less biopsy material and at less cost than assays currently employed such as FISH or Southern blotting.
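    For relative copy-number measurements of this kind, the quantification step commonly reduces to comparative-Ct arithmetic: the target is normalized to a reference gene and then to a normal (two-copy) control sample. The sketch below illustrates this with assumed Ct values and roughly 100% PCR efficiency; it is a generic illustration of the approach, not the paper's exact calculation.

      # 2^-ΔΔCt sketch for relative copy number from SYBR-Green Ct values.
      ct_target_sample, ct_ref_sample = 22.1, 25.0      # e.g. an amplified gene and a reference gene in a tumour sample (assumed)
      ct_target_normal, ct_ref_normal = 26.0, 25.1      # the same assays in normal DNA (assumed)

      delta_sample = ct_target_sample - ct_ref_sample
      delta_normal = ct_target_normal - ct_ref_normal
      relative_copy_number = 2 ** -(delta_sample - delta_normal)   # 2^-ΔΔCt
      print(f"copy number relative to normal: {relative_copy_number:.1f}-fold")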

  10. An alternative method for irones quantification in iris rhizomes using headspace solid-phase microextraction.

    PubMed

    Roger, B; Fernandez, X; Jeannot, V; Chahboun, J

    2010-01-01

    The essential oil obtained from iris rhizomes is one of the most precious raw materials for the perfume industry. Its fragrance is due to irones, which are gradually formed by oxidative degradation of iridals during rhizome ageing. The objective was the development of an alternative method allowing irone quantification in iris rhizomes using HS-SPME-GC. The method using HS-SPME-GC was developed using the results obtained from a conventional method, i.e. solid-liquid extraction (SLE) followed by irone quantification by GC. Among several calibration methods tested, internal calibration gave the best results and was the least sensitive to the matrix effect. The proposed method using HS-SPME-GC is as accurate and reproducible as the conventional one using SLE. These two methods were used to monitor and compare irone concentrations in iris rhizomes that had been stored for 6 months to 9 years. Irone quantification in iris rhizomes can be achieved using HS-SPME-GC. This method can thus be used for the quality control of iris rhizomes. It offers the advantage of combining extraction and analysis with an automated device and thus allows a large number of rhizome batches to be analysed and compared in a limited amount of time. Copyright © 2010 John Wiley & Sons, Ltd.
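    Internal calibration of the kind favoured above works by normalizing analyte peak areas to an internal standard, fitting a calibration line to the area ratios, and reading unknowns off that line. The sketch below uses placeholder concentrations and area ratios, not the study's calibration data.

      # Internal-standard calibration sketch for GC peak areas.
      import numpy as np

      conc_std = np.array([5, 10, 25, 50, 100])                    # irone standards, mg/kg (assumed)
      area_ratio_std = np.array([0.11, 0.21, 0.54, 1.05, 2.10])    # analyte/IS peak-area ratios (assumed)

      slope, intercept = np.polyfit(conc_std, area_ratio_std, 1)   # calibration line

      area_ratio_sample = 0.80                                     # unknown rhizome extract (hypothetical)
      conc_sample = (area_ratio_sample - intercept) / slope
      print(f"irone content ~ {conc_sample:.1f} mg/kg")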

  11. Direct Quantification of Methane Emissions Across the Supply Chain: Identification of Mitigation Targets

    NASA Astrophysics Data System (ADS)

    Darzi, M.; Johnson, D.; Heltzel, R.; Clark, N.

    2017-12-01

    Researchers at West Virginia University's Center for Alternative Fuels, Engines, and Emissions have recently participated in a variety of studies targeted at direct quantification of methane emissions from across the natural gas supply chain. These studies included assessing methane emissions from heavy-duty vehicles and their fuel stations, active unconventional well sites (during both development and production), natural gas compression and storage facilities, natural gas engines (both large and small, two- and four-stroke), and low-throughput equipment associated with coal bed methane wells. Engine emissions were sampled using conventional instruments such as Fourier transform infrared spectrometers and heated flame ionization detection analyzers. However, to accurately quantify a wide range of other sources beyond the tailpipe (both leaks and losses), a full flow sampling system was developed, which included an integrated cavity-enhanced absorption spectrometer. Through these direct quantification efforts and analyses, major sources of methane emissions were identified. Technological solutions and best practices exist or could be developed to reduce methane emissions by focusing on the "lowest-hanging fruit." For example, engine crankcases from across the supply chain should employ vent mitigation systems to reduce methane and other emissions. An overview of the direct quantification system and various campaign measurement results will be presented along with the identification of other targets for additional mitigation.

  12. Rapid Quantification of 25-Hydroxyvitamin D3 in Human Serum by Matrix-Assisted Laser Desorption/Ionization Mass Spectrometry

    NASA Astrophysics Data System (ADS)

    Qi, Yulin; Müller, Miriam; Stokes, Caroline S.; Volmer, Dietrich A.

    2018-04-01

    LC-MS/MS is widely utilized today for quantification of vitamin D in biological fluids. However, mass spectrometric assays for vitamin D require very careful method optimization for precise, accurate, and interference-free analyses. Here, we explore chemical derivatization and matrix-assisted laser desorption/ionization (MALDI) as a rapid alternative for quantitative measurement of 25-hydroxyvitamin D3 in human serum, and compare it to results from LC-MS/MS. The method implemented an automated imaging step of each MALDI spot to locate areas of high intensity, avoid sweet-spot phenomena, and thus improve precision. There was no statistically significant difference in vitamin D quantification between MALDI-MS/MS and LC-MS/MS: mean ± standard deviation of 29.4 ± 10.3 ng/mL for MALDI-MS versus 30.3 ± 11.2 ng/mL for LC-MS/MS (P = 0.128) for the sum of the 25-hydroxyvitamin D epimers. The MALDI-based assay avoided time-consuming chromatographic separation steps and was thus much faster than the LC-MS/MS assay. It also consumed less sample, required no organic solvents, and was readily automated. In this proof-of-concept study, MALDI-MS readily demonstrated its potential for mass spectrometric quantification of vitamin D compounds in biological fluids.
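    The comparison statistic quoted above (a per-method mean ± SD with a P value) can be reproduced with a paired comparison of per-sample results from the two assays. The sketch below uses a paired t-test on hypothetical values as one common choice; the study's actual test and data are not assumed.

      # Paired method comparison sketch for two assays measuring the same samples.
      import numpy as np
      from scipy import stats

      maldi = np.array([18.2, 25.4, 31.0, 40.5, 22.8, 35.1])   # ng/mL, hypothetical
      lcms  = np.array([19.0, 26.1, 30.2, 42.0, 23.5, 36.4])   # ng/mL, same samples, hypothetical

      t_stat, p_value = stats.ttest_rel(maldi, lcms)
      print(f"MALDI {maldi.mean():.1f} ± {maldi.std(ddof=1):.1f} vs "
            f"LC-MS/MS {lcms.mean():.1f} ± {lcms.std(ddof=1):.1f} ng/mL, P = {p_value:.3f}")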

  13. Volumetric adsorptive microsampling-liquid chromatography tandem mass spectrometry assay for the simultaneous quantification of four antibiotics in human blood: Method development, validation and comparison with dried blood spot.

    PubMed

    Barco, Sebastiano; Castagnola, Elio; Moscatelli, Andrea; Rudge, James; Tripodi, Gino; Cangemi, Giuliana

    2017-10-25

    In this paper we describe the development and validation of a volumetric absorptive microsampling (VAMS™)-LC-MS/MS method for the simultaneous quantification of four antibiotics (piperacillin-tazobactam, meropenem, linezolid and ceftazidime) in 10 μL of human blood. The novel VAMS-LC-MS/MS method has been compared with a dried blood spot (DBS)-based method in terms of the impact of hematocrit (HCT) on accuracy, reproducibility, recovery and matrix effect. Antibiotics were extracted from VAMS and DBS by protein precipitation with methanol after a re-hydration step at 37°C for 10 min. LC-MS/MS was carried out on a Thermo Scientific™ TSQ Quantum™ Access MAX triple quadrupole coupled to an Accela™ UHPLC system. The VAMS-LC-MS/MS method is selective, precise and reproducible. In contrast to DBS, it allows accurate quantification without any HCT influence. It has been applied to samples derived from pediatric patients under therapy. VAMS is a valid alternative sampling strategy for the quantification of antibiotics and is valuable in support of clinical PK/PD studies and consequently therapeutic drug monitoring (TDM) in pediatrics. Copyright © 2017 Elsevier B.V. All rights reserved.

  14. Comparing aboveground biomass predictions for an uneven-aged pine-dominated stand using local, regional, and national models

    Treesearch

    D.C. Bragg; K.M. McElligott

    2013-01-01

    Sequestration by Arkansas forests removes carbon dioxide from the atmosphere, storing this carbon in biomass that fills a number of critical ecological and socioeconomic functions. We need a better understanding of the contribution of forests to the carbon cycle, including the accurate quantification of tree biomass. Models have long been developed to predict...

  15. Detection of the Coupling between Vegetation Leaf Area and Climate in a Multifunctional Watershed, Northwestern China

    Treesearch

    Lu Hao; Cen Pan; Peilong Liu; Decheng Zhou; Liangxia Zhang; Zhe Xiong; Yongqiang Liu; Ge Sun

    2016-01-01

    Accurate detection and quantification of vegetation dynamics and drivers of observed climatic and anthropogenic change in space and time is fundamental for our understanding of the atmosphere–biosphere interactions at local and global scales. This case study examined the coupled spatial patterns of vegetation dynamics and climatic variabilities during the past...

  16. ARPA-E Program: Advanced Management Protection of Energy Storage Devices (AMPED) - Fifth Quarterly Project Report - FY14 Q1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Farmer, Joseph

    Technology has been developed that enables monitoring of individual cells in high-capacity lithium-ion battery packs, with a distributed array of wireless Bluetooth 4.0 tags and sensors, and without proliferation of extensive wiring harnesses. Given the safety challenges facing lithium-ion batteries in electric vehicle, civilian aviation and defense applications, these wireless sensors may be particularly important to these emerging markets. These wireless sensors will enhance the performance, reliability and safety of such energy storage systems. Specific accomplishments to date include, but are not limited to: (1) the development of wireless tags using the Bluetooth 4.0 standard to monitor a large array of sensors in a battery pack; (2) sensor suites enabling the simultaneous monitoring of cell voltage, cell current, cell temperature, and package strain, indicative of swelling and increased internal pressure; (3) small receivers compatible with USB ports on portable computers; (4) software drivers and logging software; (5) a 7S2P battery simulator, enabling the safe development of wireless BMS hardware in the laboratory; (6) demonstrated data transmission out of metal enclosures, including a battery box with a small variable-aperture opening; (7) test data demonstrating the accurate and reliable operation of sensors, with transmission of terminal voltage, cell temperature and package strain at distances up to 110 feet; (8) quantification of the data transmission error as a function of distance, in both indoor and outdoor operation; (9) electromagnetic interference testing during operation with a live, high-capacity battery management system at Yardney Technical Products; (10) demonstrated operation with a live high-capacity lithium-ion battery pack during charge-discharge cycling; (11) development of special polymer-gel lithium-ion batteries with embedded temperature sensors, capable of measuring the core temperature of individual cells during charge-discharge cycling at various temperatures, thereby enabling earlier warning of thermal runaway than is possible with external sensors. Ultimately, the team plans to extend this work to include: (12) flexible wireless controllers, also using the Bluetooth 4.0 standard, essential for balancing large-scale battery packs. LLNL received $925K for this project, and has $191K remaining after accomplishing these objectives.

  17. ARPA-E Program: Advanced Management Protection of Energy Storage Devices (AMPED) - Monthly Report - November 2013

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Farmer, J.

    Technology has been developed that enables monitoring of individual cells in high-capacity lithium-ion battery packs, with a distributed array of wireless Bluetooth 4.0 tags and sensors, and without proliferation of extensive wiring harnesses. Given the safety challenges facing lithium-ion batteries in electric vehicle, civilian aviation and defense applications, these wireless sensors may be particularly important to these emerging markets. These wireless sensors will enhance the performance, reliability and safety of such energy storage systems. Specific accomplishments to date include, but are not limited to: (1) the development of wireless tags using the Bluetooth 4.0 standard to monitor a large array of sensors in a battery pack; (2) sensor suites enabling the simultaneous monitoring of cell voltage, cell current, cell temperature, and package strain, indicative of swelling and increased internal pressure; (3) small receivers compatible with USB ports on portable computers; (4) software drivers and logging software; (5) a 7S2P battery simulator, enabling the safe development of wireless BMS hardware in the laboratory; (6) demonstrated data transmission out of metal enclosures, including a battery box with a small variable-aperture opening; (7) test data demonstrating the accurate and reliable operation of sensors, with transmission of terminal voltage, cell temperature and package strain at distances up to 110 feet; (8) quantification of the data transmission error as a function of distance, in both indoor and outdoor operation; (9) electromagnetic interference testing during operation with a live, high-capacity battery management system at Yardney Technical Products; (10) demonstrated operation with a live high-capacity lithium-ion battery pack during charge-discharge cycling; (11) development of special polymer-gel lithium-ion batteries with embedded temperature sensors, capable of measuring the core temperature of individual cells during charge-discharge cycling at various temperatures, thereby enabling earlier warning of thermal runaway than is possible with external sensors. Ultimately, the team plans to extend this work to include: (12) flexible wireless controllers, also using the Bluetooth 4.0 standard, essential for balancing large-scale battery packs. LLNL received $925K for this project, and has $191K remaining after accomplishing these objectives.

  18. Quantification of hyaluronan (HA) using a simplified fluorophore-assisted carbohydrate electrophoresis (FACE) procedure.

    PubMed

    Midura, Ronald J; Cali, Valbona; Lauer, Mark E; Calabro, Anthony; Hascall, Vincent C

    2018-01-01

    Hyaluronan (HA) plays numerous important roles in physiology and pathology, which necessitates an ability to accurately and reproducibly measure its quantities in tissues and cell cultures. Our group previously reported a rigorous analytical procedure to quantify HA (and chondroitin sulfate, CS) using reductive amination chemistry and separation of the fluorophore-conjugated, unsaturated disaccharides unique to HA and CS on high-concentration acrylamide gels. This procedure is known as fluorophore-assisted carbohydrate electrophoresis (FACE) and has been adapted for the detection and quantification of all glycosaminoglycan types. While this FACE procedure is relatively straightforward for carbohydrate research investigators to implement, many nonglycoscience laboratories now studying HA biology might have difficulty establishing it as a routine assay for HA. To address this need, we have greatly simplified our prior FACE procedure for accurate and reproducible assessment of HA in tissues and cell cultures. This chapter describes this simplified FACE procedure in detail and, because it uses an enzyme that degrades both HA and CS, investigators will also gain additional insight into the quantities of CS in the same samples dedicated for HA analysis. © 2018 Elsevier Inc. All rights reserved.

  19. Calculating forces on thin flat plates with incomplete vorticity-field data

    NASA Astrophysics Data System (ADS)

    Limacher, Eric; Morton, Chris; Wood, David

    2016-11-01

    Optical experimental techniques such as particle image velocimetry (PIV) permit detailed quantification of velocities in the wakes of bluff bodies. Patterns in the wake development are significant to force generation, but it is not trivial to quantitatively relate changes in the wake to changes in measured forces. Key difficulties in this regard include: (i) accurate quantification of velocities close to the body, and (ii) the effect of missing velocity or vorticity data in regions where optical access is obscured. In the present work, we consider force formulations based on the vorticity field, wherein mathematical manipulation eliminates the need for accurate near-body velocity information. Attention is restricted to nominally two-dimensional problems, namely (i) a linearly accelerating flat plate, investigated using PIV in a water tunnel, and (ii) a pitching plate in a freestream flow, as investigated numerically by Wang & Eldredge (2013). Missing vorticity data on the pressure side of the plate has a significant impact on the calculated force for the pitching-plate test case. Fortunately, if the vorticity on the pressure side remains confined to a thin boundary layer, simple corrections can be applied to recover a force estimate.
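    For orientation, one classical vorticity-based (impulse) expression for the force on a body in an unbounded, incompressible, nominally two-dimensional flow is quoted below in LaTeX. It is shown only to indicate the general form such formulations take (notation assumed: rho the fluid density, x the position vector, omega the scalar vorticity, A the fluid area), and it is exact only when all bound and free vorticity is captured, which is precisely why missing pressure-side data matters.

      \mathbf{F} \;=\; -\,\rho\,\frac{\mathrm{d}}{\mathrm{d}t}\int_{A}\mathbf{x}\times\big(\omega\,\hat{\mathbf{z}}\big)\,\mathrm{d}A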

  20. Quantification of intensity variations in functional MR images using rotated principal components

    NASA Astrophysics Data System (ADS)

    Backfrieder, W.; Baumgartner, R.; Sámal, M.; Moser, E.; Bergmann, H.

    1996-08-01

    In functional MRI (fMRI), the changes in cerebral haemodynamics related to stimulated neural brain activity are measured using standard clinical MR equipment. Small intensity variations in fMRI data have to be detected and distinguished from non-neural effects by careful image analysis. Based on multivariate statistics, we describe an algorithm involving oblique rotation of the most significant principal components to estimate the temporal and spatial distribution of the stimulated neural activity over the whole image matrix. This algorithm takes advantage of strong local signal variations. A mathematical phantom was designed to generate simulated data for the evaluation of the method. In simulation experiments, the potential of the method to quantify small intensity changes, especially when processing data sets containing multiple sources of signal variation, was demonstrated. In vivo fMRI data collected in both visual and motor stimulation experiments were analysed, showing a proper location of the activated cortical regions within well-known neural centres and an accurate extraction of the activation time profile. The suggested method yields accurate absolute quantification of in vivo brain activity without the need for extensive prior knowledge or user interaction.
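    The general shape of such an analysis (decompose the voxel-by-time matrix, keep the most significant components, rotate them so each concentrates on a distinct activation pattern, then recover the time courses) is sketched below on random placeholder data. An orthogonal varimax rotation is used here as a simpler stand-in for the oblique rotation the authors describe, so this is an illustration of the approach, not their algorithm.

      # PCA of an fMRI-like matrix followed by a varimax rotation of the retained components.
      import numpy as np

      def varimax(loadings, gamma=1.0, max_iter=100, tol=1e-6):
          """Standard varimax rotation of a (variables x components) loading matrix."""
          p, k = loadings.shape
          R = np.eye(k)
          d = 0.0
          for _ in range(max_iter):
              L = loadings @ R
              u, s, vt = np.linalg.svd(
                  loadings.T @ (L ** 3 - (gamma / p) * L @ np.diag((L ** 2).sum(axis=0)))
              )
              R = u @ vt
              d_new = s.sum()
              if d_new < d * (1 + tol):
                  break
              d = d_new
          return loadings @ R

      rng = np.random.default_rng(0)
      data = rng.normal(size=(4096, 120))                # voxels x time points (mock data)
      data -= data.mean(axis=1, keepdims=True)           # remove each voxel's mean signal

      U, S, Vt = np.linalg.svd(data, full_matrices=False)
      k = 5                                              # number of retained components (assumed)
      spatial_maps = varimax(U[:, :k] * S[:k])           # rotated spatial loadings
      time_courses = np.linalg.pinv(spatial_maps) @ data # activation time profiles
      print(spatial_maps.shape, time_courses.shape)      # (4096, 5) (5, 120)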
