Sample records for allowed reliable quantification

  1. Surface smoothness: cartilage biomarkers for knee OA beyond the radiologist

    NASA Astrophysics Data System (ADS)

    Tummala, Sudhakar; Dam, Erik B.

    2010-03-01

    Fully automatic imaging biomarkers may allow quantification of patho-physiological processes that a radiologist would not be able to assess reliably. This can introduce new insight but is problematic to validate due to lack of meaningful ground truth expert measurements. Rather than quantification accuracy, such novel markers must therefore be validated against clinically meaningful end-goals such as the ability to allow correct diagnosis. We present a method for automatic cartilage surface smoothness quantification in the knee joint. The quantification is based on a curvature flow method used on tibial and femoral cartilage compartments resulting from an automatic segmentation scheme. These smoothness estimates are validated for their ability to diagnose osteoarthritis and compared to smoothness estimates based on manual expert segmentations and to conventional cartilage volume quantification. We demonstrate that the fully automatic markers eliminate the time required for radiologist annotations, and in addition provide a diagnostic marker superior to the evaluated semi-manual markers.

  2. Deep learning approach for the detection and quantification of intraretinal cystoid fluid in multivendor optical coherence tomography.

    PubMed

    Venhuizen, Freerk G; van Ginneken, Bram; Liefers, Bart; van Asten, Freekje; Schreur, Vivian; Fauser, Sascha; Hoyng, Carel; Theelen, Thomas; Sánchez, Clara I

    2018-04-01

    We developed a deep learning algorithm for the automatic segmentation and quantification of intraretinal cystoid fluid (IRC) in spectral domain optical coherence tomography (SD-OCT) volumes independent of the device used for acquisition. A cascade of neural networks was introduced to include prior information on the retinal anatomy, boosting performance significantly. The proposed algorithm approached human performance reaching an overall Dice coefficient of 0.754 ± 0.136 and an intraclass correlation coefficient of 0.936, for the task of IRC segmentation and quantification, respectively. The proposed method allows for fast quantitative IRC volume measurements that can be used to improve patient care, reduce costs, and allow fast and reliable analysis in large population studies.
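
    The Dice coefficient and intraclass correlation quoted above are standard agreement measures for this kind of validation. As an illustrative sketch only (not the authors' code; array names are hypothetical), a voxel-wise Dice score for two binary segmentation masks can be computed as follows:

      import numpy as np

      def dice_coefficient(pred_mask, ref_mask):
          """Voxel-wise Dice overlap between two binary segmentation masks."""
          pred = np.asarray(pred_mask, dtype=bool)
          ref = np.asarray(ref_mask, dtype=bool)
          intersection = np.logical_and(pred, ref).sum()
          denominator = pred.sum() + ref.sum()
          if denominator == 0:
              return 1.0  # both masks empty: treat as perfect agreement
          return 2.0 * intersection / denominator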

  3. Deep learning approach for the detection and quantification of intraretinal cystoid fluid in multivendor optical coherence tomography

    PubMed Central

    Venhuizen, Freerk G.; van Ginneken, Bram; Liefers, Bart; van Asten, Freekje; Schreur, Vivian; Fauser, Sascha; Hoyng, Carel; Theelen, Thomas; Sánchez, Clara I.

    2018-01-01

    We developed a deep learning algorithm for the automatic segmentation and quantification of intraretinal cystoid fluid (IRC) in spectral domain optical coherence tomography (SD-OCT) volumes independent of the device used for acquisition. A cascade of neural networks was introduced to include prior information on the retinal anatomy, boosting performance significantly. The proposed algorithm approached human performance reaching an overall Dice coefficient of 0.754 ± 0.136 and an intraclass correlation coefficient of 0.936, for the task of IRC segmentation and quantification, respectively. The proposed method allows for fast quantitative IRC volume measurements that can be used to improve patient care, reduce costs, and allow fast and reliable analysis in large population studies. PMID:29675301

  4. Development and validation of a fast and simple multi-analyte procedure for quantification of 40 drugs relevant to emergency toxicology using GC-MS and one-point calibration.

    PubMed

    Meyer, Golo M J; Weber, Armin A; Maurer, Hans H

    2014-05-01

    Diagnosis and prognosis of poisonings should be confirmed by comprehensive screening and reliable quantification of xenobiotics, for example by gas chromatography-mass spectrometry (GC-MS) or liquid chromatography-mass spectrometry (LC-MS). The turnaround time should be short enough to have an impact on clinical decisions. In emergency toxicology, quantification using full-scan acquisition is preferable because this allows screening and quantification of expected and unexpected drugs in one run. Therefore, a multi-analyte full-scan GC-MS approach was developed and validated with liquid-liquid extraction and one-point calibration for quantification of 40 drugs relevant to emergency toxicology. Validation showed that 36 drugs could be determined quickly, accurately, and reliably in the range of upper therapeutic to toxic concentrations. Daily one-point calibration with calibrators stored for up to four weeks reduced workload and turnaround time to less than 1 h. In summary, the multi-analyte approach with simple liquid-liquid extraction, GC-MS identification, and quantification via fast one-point calibration could successfully be applied to proficiency tests and real case samples. Copyright © 2013 John Wiley & Sons, Ltd.
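
    One-point calibration, as used in this approach, estimates a concentration from a single calibrator by assuming a response proportional to concentration. A minimal sketch of that calculation (hypothetical names and values; the published method additionally relies on full-scan GC-MS data and validated calibrator stability):

      def one_point_concentration(sample_response, calibrator_response, calibrator_conc):
          """Estimate analyte concentration assuming the detector response is proportional
          to concentration (the response may be a peak area or an area ratio)."""
          return calibrator_conc * sample_response / calibrator_response

      # Example: calibrator at 2.0 mg/L gives area ratio 0.85, the sample gives 1.20
      print(one_point_concentration(1.20, 0.85, 2.0))  # ~2.8 mg/L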

  5. Semi-automated quantification and neuroanatomical mapping of heterogeneous cell populations.

    PubMed

    Mendez, Oscar A; Potter, Colin J; Valdez, Michael; Bello, Thomas; Trouard, Theodore P; Koshy, Anita A

    2018-07-15

    Our group studies the interactions between cells of the brain and the neurotropic parasite Toxoplasma gondii. Using an in vivo system that allows us to permanently mark and identify brain cells injected with Toxoplasma protein, we have identified that Toxoplasma-injected neurons (TINs) are heterogeneously distributed throughout the brain. Unfortunately, standard methods to quantify and map heterogeneous cell populations onto a reference brain atlas are time-consuming and prone to user bias. We developed a novel MATLAB-based semi-automated quantification and mapping program to allow the rapid and consistent mapping of heterogeneously distributed cells onto the Allen Institute Mouse Brain Atlas. The system uses two-threshold background subtraction to identify and quantify cells of interest. We demonstrate that we reliably quantify and neuroanatomically localize TINs with low intra- or inter-observer variability. In a follow-up experiment, we show that specific regions of the mouse brain are enriched with TINs. The procedure we use takes advantage of simple immunohistochemistry labeling techniques, a standard microscope with a motorized stage, and low-cost computing that can be readily obtained at a research institute. To our knowledge there is no other program that uses such readily available techniques and equipment for mapping heterogeneous populations of cells across the whole mouse brain. The quantification method described here allows reliable visualization, quantification, and mapping of heterogeneous cell populations in immunolabeled sections across whole mouse brains. Copyright © 2018 Elsevier B.V. All rights reserved.
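
    The two-threshold background subtraction mentioned above can be read as keeping only objects that contain pixels well above background while extending them down to a lower threshold. The following is one plausible illustration using scikit-image hysteresis thresholding (hypothetical thresholds; the published MATLAB program may differ in detail):

      from skimage.filters import apply_hysteresis_threshold
      from skimage.measure import label

      def count_cells_two_threshold(image, low, high):
          """Keep regions containing pixels above `high`, extended down to `low`,
          then count the connected objects that remain."""
          mask = apply_hysteresis_threshold(image, low, high)
          return int(label(mask).max())

      # Hypothetical usage on a background-corrected fluorescence image:
      # n_cells = count_cells_two_threshold(img - background_estimate, low=50, high=120)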

  6. An automated system using spatial oversampling for optical mapping in murine atria. Development and validation with monophasic and transmembrane action potentials.

    PubMed

    Yu, Ting Yue; Syeda, Fahima; Holmes, Andrew P; Osborne, Benjamin; Dehghani, Hamid; Brain, Keith L; Kirchhof, Paulus; Fabritz, Larissa

    2014-08-01

    We developed and validated a new optical mapping system for quantification of electrical activation and repolarisation in murine atria. The system makes use of a novel 2nd generation complementary metal-oxide-semiconductor (CMOS) camera with deliberate oversampling to allow both assessment of electrical activation with high spatial and temporal resolution (128 × 2048 pixels) and reliable assessment of murine atrial repolarisation using post-processing of signals. Optical recordings were taken from isolated, superfused and electrically stimulated murine left atria. The system reliably describes activation sequences, identifies areas of functional block, and allows quantification of conduction velocities and vectors. Furthermore, the system records murine atrial action potentials with comparable duration to both monophasic and transmembrane action potentials in murine atria. Copyright © 2014 The Authors. Published by Elsevier Ltd. All rights reserved.

  7. Digital PCR as a tool to measure HIV persistence.

    PubMed

    Rutsaert, Sofie; Bosman, Kobus; Trypsteen, Wim; Nijhuis, Monique; Vandekerckhove, Linos

    2018-01-30

    Although antiretroviral therapy is able to suppress HIV replication in infected patients, the virus persists and rebounds when treatment is stopped. In order to find a cure that can eradicate the latent reservoir, one must be able to quantify the persisting virus. Traditionally, HIV persistence studies have used real-time PCR (qPCR) to measure the viral reservoir represented by HIV DNA and RNA. More recently, digital PCR has been gaining popularity as a novel approach to nucleic acid quantification as it allows for absolute target quantification. Several commercial platforms implementing the principle of digital PCR are now available, of which Bio-Rad's QX200 ddPCR is currently the most widely used in HIV research. Quantification of HIV by digital PCR is proving to be a valuable improvement over qPCR, as it is argued to be more robust to mismatches between the primer-probe set and heterogeneous HIV and it removes the need for a standard curve, both of which are known to complicate reliable quantification. However, currently available digital PCR platforms occasionally struggle with unexplained false-positive partitions, and reliable segregation between positive and negative droplets remains disputed. Future developments and advancements of the digital PCR technology are promising to aid in the accurate quantification and characterization of the persistent HIV reservoir.
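
    Absolute quantification in digital PCR rests on Poisson statistics: the fraction of negative partitions gives the mean number of target copies per partition without a standard curve. A minimal sketch of that calculation (the 0.85 nL droplet volume is an assumed nominal value, not taken from this article):

      import math

      def ddpcr_copies_per_ul(positive, total, partition_volume_nl=0.85):
          """Estimate target concentration (copies/uL) from droplet counts via Poisson correction."""
          negative_fraction = (total - positive) / total
          lam = -math.log(negative_fraction)          # mean copies per partition
          return lam / (partition_volume_nl * 1e-3)   # convert nL to uL

      # Example: 2,500 positive droplets out of 15,000 accepted droplets
      print(ddpcr_copies_per_ul(2500, 15000))  # ~214 copies/uL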

  8. Droplet Digital PCR for Minimal Residual Disease Detection in Mature Lymphoproliferative Disorders.

    PubMed

    Drandi, Daniela; Ferrero, Simone; Ladetto, Marco

    2018-01-01

    Minimal residual disease (MRD) detection has powerful prognostic relevance for response evaluation and prediction of relapse in hematological malignancies. Real-time quantitative PCR (qPCR) has become the established and standardized method for MRD assessment in lymphoid disorders. However, qPCR is a relative quantification approach, since it requires a reference standard curve. Droplet Digital™ PCR (ddPCR™) allows reliable absolute quantification of tumor burden, removing the need to prepare a tumor-specific standard curve for each experiment. We have recently shown that ddPCR has a good concordance with qPCR and could be a feasible and reliable tool for MRD monitoring in mature lymphoproliferative disorders. In this chapter we describe the experimental workflow, from the detection of the clonal molecular marker to the MRD monitoring by ddPCR, in patients affected by multiple myeloma, mantle cell lymphoma and follicular lymphoma. However, standardization programs among different laboratories are needed in order to ensure the reliability and reproducibility of ddPCR-based MRD results.

  9. Overall Key Performance Indicator to Optimizing Operation of High-Pressure Homogenizers for a Reliable Quantification of Intracellular Components in Pichia pastoris.

    PubMed

    Garcia-Ortega, Xavier; Reyes, Cecilia; Montesinos, José Luis; Valero, Francisco

    2015-01-01

    The most commonly used cell disruption procedures may lack reproducibility, which introduces significant errors into the quantification of intracellular components. In this work, an approach consisting of the definition of an overall key performance indicator (KPI) was implemented for a lab-scale high-pressure homogenizer (HPH) in order to determine the disruption settings that allow reliable quantification of a wide range of intracellular components. This innovative KPI was based on the combination of three independent reporting indicators: decrease of absorbance, release of total protein, and release of alkaline phosphatase activity. The yeast Pichia pastoris growing on methanol was selected as the model microorganism because it presents considerable thickening of the cell wall, requiring more severe methods and operating conditions than Escherichia coli and Saccharomyces cerevisiae. From the outcome of the reporting indicators, the cell disruption efficiency achieved using HPH was about fourfold higher than that of other standard lab-scale cell disruption methodologies, such as bead milling and cell permeabilization. This approach was also applied to a pilot-plant-scale HPH, validating the methodology in a scale-up of the disruption process. This innovative, straightforward approach for evaluating the efficacy of a disruption procedure or equipment can be easily applied to optimize the most common disruption processes, in order to achieve not only reliable quantification but also recovery of intracellular components from cell factories of interest.
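
    The abstract does not give the exact formula used to combine the three reporting indicators, so the sketch below shows one plausible aggregation only: each indicator is scaled to its maximum observed value and the scaled values are averaged into a single 0-1 score per disruption setting (hypothetical data and aggregation, not the published definition).

      import numpy as np

      def overall_kpi(abs_decrease, protein_release, alp_release):
          """Combine three disruption indicators into one score by scaling each
          to its maximum observed value and averaging the scaled values."""
          indicators = [np.asarray(abs_decrease, float),
                        np.asarray(protein_release, float),
                        np.asarray(alp_release, float)]
          scaled = [x / x.max() for x in indicators]
          return np.mean(scaled, axis=0)   # one KPI per disruption setting, 0-1 scale

      # Hypothetical data for four homogenizer pressure settings
      kpi = overall_kpi([0.30, 0.50, 0.70, 0.72], [12, 25, 38, 39], [4, 9, 15, 15.5])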

  10. Overall Key Performance Indicator to Optimizing Operation of High-Pressure Homogenizers for a Reliable Quantification of Intracellular Components in Pichia pastoris

    PubMed Central

    Garcia-Ortega, Xavier; Reyes, Cecilia; Montesinos, José Luis; Valero, Francisco

    2015-01-01

    The most commonly used cell disruption procedures may lack reproducibility, which introduces significant errors into the quantification of intracellular components. In this work, an approach consisting of the definition of an overall key performance indicator (KPI) was implemented for a lab-scale high-pressure homogenizer (HPH) in order to determine the disruption settings that allow reliable quantification of a wide range of intracellular components. This innovative KPI was based on the combination of three independent reporting indicators: decrease of absorbance, release of total protein, and release of alkaline phosphatase activity. The yeast Pichia pastoris growing on methanol was selected as the model microorganism because it presents considerable thickening of the cell wall, requiring more severe methods and operating conditions than Escherichia coli and Saccharomyces cerevisiae. From the outcome of the reporting indicators, the cell disruption efficiency achieved using HPH was about fourfold higher than that of other standard lab-scale cell disruption methodologies, such as bead milling and cell permeabilization. This approach was also applied to a pilot-plant-scale HPH, validating the methodology in a scale-up of the disruption process. This innovative, straightforward approach for evaluating the efficacy of a disruption procedure or equipment can be easily applied to optimize the most common disruption processes, in order to achieve not only reliable quantification but also recovery of intracellular components from cell factories of interest. PMID:26284241

  11. quantGenius: implementation of a decision support system for qPCR-based gene quantification.

    PubMed

    Baebler, Špela; Svalina, Miha; Petek, Marko; Stare, Katja; Rotter, Ana; Pompe-Novak, Maruša; Gruden, Kristina

    2017-05-25

    Quantitative molecular biology remains a challenge for researchers due to inconsistent approaches for control of errors in the final results. Due to several factors that can influence the final result, quantitative analysis and interpretation of qPCR data are still not trivial. Together with the development of high-throughput qPCR platforms, there is a need for a tool allowing for robust, reliable and fast nucleic acid quantification. We have developed "quantGenius" ( http://quantgenius.nib.si ), an open-access web application for reliable qPCR-based quantification of nucleic acids. The quantGenius workflow interactively guides the user through data import, quality control (QC) and calculation steps. The input is machine- and chemistry-independent. Quantification is performed using the standard curve approach, with normalization to one or several reference genes. The special feature of the application is the implementation of a user-guided, QC-based decision support system, based on qPCR standards, that takes into account pipetting errors, assay amplification efficiencies, limits of detection and quantification of the assays as well as the control of PCR inhibition in individual samples. The intermediate calculations and final results are exportable in a data matrix suitable for further statistical analysis or visualization. We additionally compare the most important features of quantGenius with similar advanced software tools and illustrate the importance of a proper QC system in the analysis of qPCR data in two use cases. To our knowledge, quantGenius is the only qPCR data analysis tool that integrates QC-based decision support and will help scientists to obtain reliable results, which are the basis for biologically meaningful data interpretation.
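
    Standard-curve quantification with reference-gene normalization, the core calculation behind quantGenius, maps a sample Cq onto a dilution series of standards and divides the target quantity by the reference-gene quantity. A stripped-down sketch with hypothetical Cq values (the web application adds the QC-based decision logic described above):

      import numpy as np

      def fit_standard_curve(log10_copies, cq_values):
          """Fit Cq = slope * log10(copies) + intercept over a dilution series of standards."""
          slope, intercept = np.polyfit(log10_copies, cq_values, 1)
          efficiency = 10 ** (-1.0 / slope) - 1.0   # ~1.0 corresponds to 100% efficiency
          return slope, intercept, efficiency

      def quantity_from_cq(cq, slope, intercept):
          """Interpolate a sample quantity (copies) from its Cq using the fitted curve."""
          return 10 ** ((cq - intercept) / slope)

      # Hypothetical target and reference-gene assays
      slope_t, int_t, _ = fit_standard_curve([6, 5, 4, 3], [16.1, 19.4, 22.8, 26.1])
      target = quantity_from_cq(21.0, slope_t, int_t)
      slope_r, int_r, _ = fit_standard_curve([6, 5, 4, 3], [15.8, 19.2, 22.5, 25.9])
      reference = quantity_from_cq(18.5, slope_r, int_r)
      normalized_expression = target / reference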

  12. Simple and Inexpensive Quantification of Ammonia in Whole Blood

    PubMed Central

    Ayyub, Omar B.; Behrens, Adam M.; Heligman, Brian T.; Natoli, Mary E.; Ayoub, Joseph J.; Cunningham, Gary; Summar, Marshall; Kofinas, Peter

    2015-01-01

    Quantification of ammonia in whole blood has applications in the diagnosis and management of many hepatic diseases, including cirrhosis and rare urea cycle disorders, amounting to more than 5 million patients in the United States. Current techniques for ammonia measurement suffer from limited range, poor resolution, false positives or large, complex sensor set-ups. Here we demonstrate a technique utilizing inexpensive reagents and simple methods for quantifying ammonia in 100 μl of whole blood. The sensor comprises a modified form of the indophenol reaction, which resists sources of destructive interference in blood, in conjunction with a cation-exchange membrane. The presented sensing scheme is selective against other amine-containing molecules such as amino acids and has a shelf life of at least 50 days. Additionally, the resulting system has high sensitivity and allows for the accurate, reliable quantification of ammonia in whole human blood samples over a range of at least 25 to 500 μM, which is clinically relevant for rare hyperammonemic disorders and liver disease. Furthermore, concentrations of 50 and 100 μM ammonia could be reliably discerned with p=0.0001. PMID:25936660

  13. Normalized Quantitative Western Blotting Based on Standardized Fluorescent Labeling.

    PubMed

    Faden, Frederik; Eschen-Lippold, Lennart; Dissmeyer, Nico

    2016-01-01

    Western blot (WB) analysis is the most widely used method to monitor expression of proteins of interest in protein extracts of high complexity derived from diverse experimental setups. WB allows the rapid and specific detection of a target protein, such as non-tagged endogenous proteins as well as protein-epitope tag fusions, depending on the availability of specific antibodies. To generate quantitative data from independent samples within one experiment and to allow accurate inter-experimental quantification, a reliable and reproducible method to standardize and normalize WB data is indispensable. To date, it is a standard procedure to normalize individual bands of immunodetected proteins of interest from a WB lane to other individual bands of so-called housekeeping proteins of the same sample lane. These are usually detected by an independent antibody or colorimetric detection and do not reflect the real total protein of a sample. Housekeeping proteins, assumed to be constitutively expressed largely independently of developmental and environmental state, can differ greatly in their expression under these various conditions. Therefore, they actually do not represent a reliable reference to normalize the target protein's abundance to the total amount of protein contained in each lane of a blot. Here, we demonstrate the Smart Protein Layers (SPL) technology, a combination of fluorescent standards and a stain-free fluorescence-based visualization of total protein in gels and after transfer via WB. SPL allows rapid and highly sensitive protein visualization and quantification, with a sensitivity comparable to conventional silver staining but a 1000-fold higher dynamic range. For normalization, standardization and quantification of protein gels and WBs, a sample-dependent bi-fluorescent standard reagent is applied and, for accurate quantification of data derived from different experiments, a second calibration standard is used. Together, the precise quantification of protein expression by lane-to-lane, gel-to-gel, and blot-to-blot comparisons is facilitated, especially for experiments in the area of proteostasis that deal with highly variable protein levels and involve protein degradation mutants and treatments modulating protein abundance.
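
    Normalization to total protein rather than to a single housekeeping band amounts, per lane, to dividing the target signal by the total-protein signal and rescaling to a calibration lane. A minimal sketch with hypothetical signal values (the SPL reagents supply the fluorescent standards this assumes):

      def normalized_abundance(target_signal, total_protein_signal,
                               calibrator_target, calibrator_total):
          """Target band signal normalized to total protein in the lane, then scaled to a
          calibration lane so values are comparable across gels and blots."""
          lane_ratio = target_signal / total_protein_signal
          calibrator_ratio = calibrator_target / calibrator_total
          return lane_ratio / calibrator_ratio

      # Example: one sample lane compared against a calibration standard lane
      print(normalized_abundance(1.8e5, 3.1e6, 2.0e5, 3.0e6))  # ~0.87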

  14. Automated Pilot Performance Assessment in the T-37: A Feasibility Study. Final Report (May 1968-April 1971).

    ERIC Educational Resources Information Center

    Knoop, Patricia A.; Welde, William L.

    Air Force investigators conducted a three year program to develop a capability for automated quantification and assessment of in-flight pilot performance. Such a capability enhances pilot training by making ratings more objective, valid, reliable and sensitive, and by freeing instructors from rating responsibilities, allowing them to concentrate…

  15. Optimizing total reflection X-ray fluorescence for direct trace element quantification in proteins I: Influence of sample homogeneity and reflector type

    NASA Astrophysics Data System (ADS)

    Wellenreuther, G.; Fittschen, U. E. A.; Achard, M. E. S.; Faust, A.; Kreplin, X.; Meyer-Klaucke, W.

    2008-12-01

    Total reflection X-ray fluorescence (TXRF) is a very promising method for the direct, quick and reliable multi-elemental quantification of trace elements in protein samples. With the introduction of an internal standard consisting of two reference elements, scandium and gallium, a wide range of proteins can be analyzed, regardless of their salt content, buffer composition, additives and amino acid composition. This strategy also enables quantification of matrix effects. Two potential issues associated with drying have been considered in this study: (1) Formation of heterogeneous residues of varying thickness and/or density; and (2) separation of the internal standard and protein during drying (which has to be prevented to allow accurate quantification). These issues were investigated by microbeam X-ray fluorescence (μXRF) with special emphasis on (I) the influence of sample support and (II) the protein / buffer system used. In the first part, a model protein was studied on well established sample supports used in TXRF, PIXE and XRF (Mylar, siliconized quartz, Plexiglas and silicon). In the second part we imaged proteins of different molecular weight, oligomerization state, bound metals and solubility. A partial separation of protein and internal standard was only observed with untreated silicon, suggesting it may not be an adequate support material. Siliconized quartz proved to be the least prone to heterogeneous drying of the sample and yielded the most reliable results.
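
    Internal-standard quantification in TXRF compares the analyte and reference-element intensities after correcting for their relative sensitivities. A minimal sketch of that relation (hypothetical counts and sensitivity factors; the two-element Sc/Ga standard described above additionally lets matrix effects be checked):

      def txrf_concentration(analyte_counts, standard_counts,
                             analyte_sensitivity, standard_sensitivity,
                             standard_conc_mg_l):
          """Analyte concentration from TXRF intensities relative to an internal standard."""
          return (analyte_counts / standard_counts) \
              * (standard_sensitivity / analyte_sensitivity) * standard_conc_mg_l

      # Example: Zn signal relative to a Ga internal standard spiked at 5 mg/L
      print(txrf_concentration(12400, 18800, analyte_sensitivity=1.10,
                               standard_sensitivity=1.00, standard_conc_mg_l=5.0))  # ~3.0 mg/L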

  16. Quantification of neutral human milk oligosaccharides by graphitic carbon HPLC with tandem mass spectrometry

    PubMed Central

    Bao, Yuanwu; Chen, Ceng; Newburg, David S.

    2012-01-01

    Defining the biologic roles of human milk oligosaccharides (HMOS) requires an efficient, simple, reliable, and robust analytical method for simultaneous quantification of oligosaccharide profiles from multiple samples. The HMOS fraction of milk is a complex mixture of polar, highly branched, isomeric structures that contain no intrinsic facile chromophore, making their resolution and quantification challenging. A liquid chromatography-mass spectrometry (LC-MS) method was devised to resolve and quantify 11 major neutral oligosaccharides of human milk simultaneously. Crude HMOS fractions are reduced, resolved by porous graphitic carbon HPLC with a water/acetonitrile gradient, detected by mass spectrometric specific ion monitoring, and quantified. The HPLC separates isomers of identical molecular weights allowing 11 peaks to be fully resolved and quantified by monitoring mass to charge (m/z) ratios of the deprotonated negative ions. The standard curve for each of the 11 oligosaccharides is linear from 0.078 or 0.156 to 20 μg/mL (R² > 0.998). Precision (CV) ranges from 1% to 9%. Accuracy is from 86% to 104%. This analytical technique provides sensitive, precise, accurate quantification for each of the 11 milk oligosaccharides and allows measurement of differences in milk oligosaccharide patterns between individuals and at different stages of lactation. PMID:23068043

  17. Matrix suppression as a guideline for reliable quantification of peptides by matrix-assisted laser desorption ionization.

    PubMed

    Ahn, Sung Hee; Bae, Yong Jin; Moon, Jeong Hee; Kim, Myung Soo

    2013-09-17

    We propose to divide matrix suppression in matrix-assisted laser desorption ionization into two parts, normal and anomalous. In quantification of peptides, the normal effect can be accounted for by constructing the calibration curve in the form of peptide-to-matrix ion abundance ratio versus concentration. The anomalous effect forbids reliable quantification and is noticeable when matrix suppression is larger than 70%. With this 70% rule, matrix suppression becomes a guideline for reliable quantification, rather than a nuisance. A peptide in a complex mixture can be quantified even in the presence of large amounts of contaminants, as long as matrix suppression is below 70%. The theoretical basis for the quantification method using a peptide as an internal standard is presented together with its weaknesses. A systematic method to improve quantification of high concentration analytes has also been developed.
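
    The guideline proposed here combines a calibration built on the peptide-to-matrix ion abundance ratio with a cutoff that rejects spectra whose matrix suppression exceeds 70%. A sketch of how such a check and calibration might be applied (hypothetical signals and calibration parameters):

      def matrix_suppression(matrix_signal_with_analyte, matrix_signal_blank):
          """Fractional loss of matrix ion signal caused by the analyte/contaminants."""
          return 1.0 - matrix_signal_with_analyte / matrix_signal_blank

      def quantify_if_reliable(peptide_signal, matrix_signal, matrix_signal_blank,
                               slope, intercept, suppression_limit=0.70):
          """Return a concentration from the peptide/matrix ratio, or None above the 70% rule."""
          if matrix_suppression(matrix_signal, matrix_signal_blank) > suppression_limit:
              return None   # anomalous suppression region: quantification not reliable
          ratio = peptide_signal / matrix_signal
          return (ratio - intercept) / slope   # calibration: ratio = slope*conc + intercept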

  18. Development and validation of a liquid chromatography isotope dilution mass spectrometry method for the reliable quantification of alkylphenols in environmental water samples by isotope pattern deconvolution.

    PubMed

    Fabregat-Cabello, Neus; Sancho, Juan V; Vidal, Andreu; González, Florenci V; Roig-Navarro, Antoni Francesc

    2014-02-07

    We present a new measurement method for the rapid extraction and accurate quantification of technical nonylphenol (NP) and 4-t-octylphenol (OP) in complex-matrix water samples by UHPLC-ESI-MS/MS. The extraction of both compounds is achieved in 30 min by means of hollow fiber liquid phase microextraction (HF-LPME) using 1-octanol as the acceptor phase, which provides an enrichment (preconcentration) factor of 800. In addition, we developed a quantification method based on isotope dilution mass spectrometry (IDMS) and singly ¹³C₁-labeled compounds. To this end, the minimally labeled ¹³C₁-4-(3,6-dimethyl-3-heptyl)-phenol and ¹³C₁-t-octylphenol isomers were synthesized; these coelute with the natural compounds and allow compensation of the matrix effect. Quantification was carried out using isotope pattern deconvolution (IPD), which yields the concentration of both compounds without the need to build any calibration graph, reducing the total analysis time. The combination of both extraction and determination techniques allowed, for the first time, validation of an HF-LPME methodology at the levels required by legislation, achieving limits of quantification of 0.1 ng mL⁻¹ and recoveries within 97-109%. Given the low cost and short total analysis time of HF-LPME, this methodology is ready for implementation in routine analytical laboratories. Copyright © 2013 Elsevier B.V. All rights reserved.

  19. Apparent exchange rate for breast cancer characterization.

    PubMed

    Lasič, Samo; Oredsson, Stina; Partridge, Savannah C; Saal, Lao H; Topgaard, Daniel; Nilsson, Markus; Bryskhe, Karin

    2016-05-01

    Although diffusion MRI has shown promise for the characterization of breast cancer, it has low specificity to malignant subtypes. Higher specificity might be achieved if the effects of cell morphology and molecular exchange across cell membranes could be disentangled. The quantification of exchange might thus allow the differentiation of different types of breast cancer cells. Based on differences in diffusion rates between the intra- and extracellular compartments, filter exchange spectroscopy/imaging (FEXSY/FEXI) provides non-invasive quantification of the apparent exchange rate (AXR) of water between the two compartments. To test the feasibility of FEXSY for the differentiation of different breast cancer cells, we performed experiments on several breast epithelial cell lines in vitro. Furthermore, we performed the first in vivo FEXI measurement of water exchange in the human breast. In cell suspensions, pulsed gradient spin-echo experiments with large b values and variable pulse duration allow the characterization of the intracellular compartment, whereas FEXSY provides a quantification of AXR. These experiments are very sensitive to the physiological state of cells and can be used to establish reliable protocols for the culture and harvesting of cells. Our results suggest that different breast cancer subtypes can be distinguished on the basis of their AXR values in cell suspensions. Time-resolved measurements allow the monitoring of the physiological state of cells in suspensions over the time-scale of hours, and reveal an abrupt disintegration of the intracellular compartment. In vivo, exchange can be detected in a tumor, whereas, in normal tissue, the exchange rate is outside the range experimentally accessible for FEXI. At present, low signal-to-noise ratio and limited scan time allow the quantification of AXR only in a region of interest of relatively large tumors. © 2016 The Authors. NMR in Biomedicine published by John Wiley & Sons Ltd.

  20. A new and reliable method for live imaging and quantification of reactive oxygen species in Botrytis cinerea: technological advancement.

    PubMed

    Marschall, Robert; Tudzynski, Paul

    2014-10-01

    Reactive oxygen species (ROS) are produced in conserved cellular processes either as by-products of the cellular respiration in mitochondria, or purposefully for defense mechanisms, signaling cascades or cell homeostasis. ROS have two diametrically opposed attributes due to their highly damaging potential for DNA, lipids and other molecules and due to their indispensability for signaling and developmental processes. In filamentous fungi, the role of ROS in growth and development has been studied in detail, but these analyses were often hampered by the lack of reliable and specific techniques to monitor different activities of ROS in living cells. Here, we present a new method for live cell imaging of ROS in filamentous fungi. We demonstrate that by use of a mixture of two fluorescent dyes it is possible to monitor H2O2 and superoxide specifically and simultaneously in distinct cellular structures during various hyphal differentiation processes. In addition, the method allows for reliable fluorometric quantification of ROS. We demonstrate that this can be used to characterize different mutants with respect to their ROS production/scavenging potential. Copyright © 2014 Elsevier Inc. All rights reserved.

  1. HCV-RNA quantification in liver bioptic samples and extrahepatic compartments, using the abbott RealTime HCV assay.

    PubMed

    Antonucci, FrancescoPaolo; Cento, Valeria; Sorbo, Maria Chiara; Manuelli, Matteo Ciancio; Lenci, Ilaria; Sforza, Daniele; Di Carlo, Domenico; Milana, Martina; Manzia, Tommaso Maria; Angelico, Mario; Tisone, Giuseppe; Perno, Carlo Federico; Ceccherini-Silberstein, Francesca

    2017-08-01

    We evaluated the performance of a rapid method to quantify HCV-RNA in the hepatic and extrahepatic compartments, using for the first time the Abbott RealTime HCV assay. Non-tumoral (NT) and tumoral (TT) liver samples, lymph nodes and ascitic fluid from patients undergoing orthotopic liver transplantation (N=18) or liver resection (N=4) were used for HCV-RNA quantification; 5/22 patients were tested after or during direct-acting antiviral (DAA) treatment. Total RNA and DNA quantification from tissue biopsies allowed normalization of HCV-RNA concentrations in IU/μg of total RNA and IU/10⁶ liver cells, respectively. HCV-RNA was successfully quantified with high reliability in liver biopsies, lymph nodes and ascitic fluid samples. Among the 17 untreated patients, a positive and significant HCV-RNA correlation between serum and NT liver samples was observed (Pearson: rho=0.544, p=0.024). Three DAA-treated patients were HCV-RNA "undetectable" in serum, but still "detectable" in all tested liver tissues. In contrast, only one DAA-treated patient, tested after sustained virological response, showed HCV-RNA "undetectability" in liver tissue. HCV-RNA was successfully quantified with high reliability in liver bioptic samples and extrahepatic compartments, even when HCV-RNA was "undetectable" in serum. The Abbott RealTime HCV assay is a good diagnostic tool for HCV quantification in intra- and extra-hepatic compartments, whenever a bioptic sample is available. Copyright © 2017 Elsevier B.V. All rights reserved.

  2. An Excel‐based implementation of the spectral method of action potential alternans analysis

    PubMed Central

    Pearman, Charles M.

    2014-01-01

    Action potential (AP) alternans has been well established as a mechanism of arrhythmogenesis and sudden cardiac death. Proper interpretation of AP alternans requires a robust method of alternans quantification. Traditional methods of alternans analysis neglect higher order periodicities that may have greater pro-arrhythmic potential than classical 2:1 alternans. The spectral method of alternans analysis, already widely used in the related study of microvolt T-wave alternans, has also been used to study AP alternans. Software to meet the specific needs of AP alternans analysis is not currently available in the public domain. An AP analysis tool is implemented here, written in Visual Basic for Applications and using Microsoft Excel as a shell. This performs a sophisticated analysis of alternans behavior allowing reliable distinction of alternans from random fluctuations, quantification of alternans magnitude, and identification of which phases of the AP are most affected. In addition, the spectral method has been adapted to allow detection and quantification of higher order regular oscillations. Analysis of action potential morphology is also performed. A simple user interface enables easy import, analysis, and export of collated results. PMID:25501439
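
    The spectral method summarized above takes a beat-aligned series (for example APD per beat), computes its power spectrum, and scores alternans as the power at 0.5 cycles/beat relative to a spectral noise band (a k-score). A compact Python sketch of that calculation, offered for illustration alongside the Excel/VBA tool (the noise band limits are assumptions; the tool also analyzes AP phases and higher-order periodicities):

      import numpy as np

      def alternans_k_score(apd_per_beat, noise_band=(0.40, 0.46)):
          """k-score: power at 0.5 cycles/beat compared with a spectral noise band."""
          x = np.asarray(apd_per_beat, float)
          x = x - x.mean()
          spectrum = np.abs(np.fft.rfft(x)) ** 2
          freqs = np.fft.rfftfreq(len(x), d=1.0)        # cycles per beat
          alternans_power = spectrum[np.argmin(np.abs(freqs - 0.5))]
          noise = spectrum[(freqs >= noise_band[0]) & (freqs <= noise_band[1])]
          return (alternans_power - noise.mean()) / noise.std()

      # Hypothetical APD series with a small 2:1 alternation superimposed on noise
      rng = np.random.default_rng(0)
      apd = 50 + 1.5 * (-1) ** np.arange(128) + rng.normal(0, 0.5, 128)
      print(alternans_k_score(apd))   # well above 3, i.e. significant alternans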

  3. Multivariate Analysis for Quantification of Plutonium(IV) in Nitric Acid Based on Absorption Spectra

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lines, Amanda M.; Adami, Susan R.; Sinkov, Sergey I.

    Development of more effective, reliable, and fast methods for monitoring process streams is a growing opportunity for analytical applications. Many fields can benefit from on-line monitoring, including the nuclear fuel cycle where improved methods for monitoring radioactive materials will facilitate maintenance of proper safeguards and ensure safe and efficient processing of materials. On-line process monitoring with a focus on optical spectroscopy can provide a fast, non-destructive method for monitoring chemical species. However, identification and quantification of species can be hindered by the complexity of the solutions if bands overlap or show condition-dependent spectral features. Plutonium (IV) is one example of a species which displays significant spectral variation with changing nitric acid concentration. Single variate analysis (i.e. Beer's Law) is difficult to apply to the quantification of Pu(IV) unless the nitric acid concentration is known and separate calibration curves have been made for all possible acid strengths. Multivariate, or chemometric, analysis is an approach that allows for the accurate quantification of Pu(IV) without a priori knowledge of nitric acid concentration.
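
    Chemometric quantification of this kind is commonly implemented as partial least squares (PLS) regression on the full absorption spectra, so that acid-dependent band shifts are modeled rather than assumed away. A generic sketch with scikit-learn (hypothetical arrays; the report does not state that this exact implementation was used):

      import numpy as np
      from sklearn.cross_decomposition import PLSRegression
      from sklearn.model_selection import cross_val_predict

      def build_pls_model(spectra, pu_conc, n_components=5):
          """Fit a PLS model on training spectra (n_samples x n_wavelengths) recorded at
          known Pu(IV) concentrations and varying nitric acid strengths."""
          y = np.asarray(pu_conc, float)
          model = PLSRegression(n_components=n_components)
          model.fit(spectra, y)
          # quick cross-validated check of prediction quality
          predicted = cross_val_predict(model, spectra, y, cv=5)
          rmsecv = np.sqrt(np.mean((predicted.ravel() - y) ** 2))
          return model, rmsecv

      # For an unknown: model.predict(new_spectrum.reshape(1, -1)) gives the Pu(IV) estimate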

  4. A fast, reliable, ultra high performance liquid chromatography method for the simultaneous determination of amino acids, biogenic amines and ammonium ions in cheese, using diethyl ethoxymethylenemalonate as a derivatising agent.

    PubMed

    Redruello, Begoña; Ladero, Victor; Cuesta, Isabel; Álvarez-Buylla, Jorge R; Martín, María Cruz; Fernández, María; Alvarez, Miguel A

    2013-08-15

    Derivatisation treatment with diethyl ethoxymethylenemalonate followed by ultra-HPLC allowed the simultaneous quantification of 22 amino acids, 7 biogenic amines and ammonium ions in cheese samples in under 10 min. This is the fastest elution time ever reported for such a resolution. The proposed method shows good linearity (R² > 0.995) and sensitivity (detection limit 0.08-3.91 μM; quantification limit <13.02 μM). Intra- and inter-day repeatability ranged from 0.35% to 1.25% and from 0.85% to 5.2%, respectively. No significant effect of the cheese matrix was observed. Copyright © 2013 Elsevier Ltd. All rights reserved.

  5. Comparison of Sample and Detection Quantification Methods for Salmonella Enterica from Produce

    NASA Technical Reports Server (NTRS)

    Hummerick, M. P.; Khodadad, C.; Richards, J. T.; Dixit, A.; Spencer, L. M.; Larson, B.; Parrish, C., II; Birmele, M.; Wheeler, Raymond

    2014-01-01

    The purpose of this study was to identify and optimize fast and reliable sampling and detection methods for pathogens that may be present on produce grown in small vegetable production units on the International Space Station (ISS), that is, in a field setting. Microbiological testing is necessary before astronauts are allowed to consume produce grown on the ISS, where two vegetable production units, Lada and Veggie, are currently deployed.

  6. Molecules and elements for quantitative bioanalysis: The allure of using electrospray, MALDI, and ICP mass spectrometry side-by-side.

    PubMed

    Linscheid, Michael W

    2018-03-30

    To understand biological processes, not only reliable identification but also quantification of their constituents plays a pivotal role. This is especially true for the proteome: protein quantification must follow protein identification, since sometimes minute changes in abundance tell the real tale. To obtain quantitative data, many sophisticated strategies using electrospray and MALDI mass spectrometry (MS) have been developed in recent years. All of them have advantages and limitations. Several years ago, we started to work on strategies that are, in principle, capable of overcoming some of these limits. The fundamental idea is to use elemental signals as a measure of quantities. We began by replacing the radioactive ³²P with the "cold" natural ³¹P to quantify modified nucleotides and phosphorylated peptides and proteins, and later used tagging strategies for quantification of proteins more generally. To do this, we introduced Inductively Coupled Plasma Mass Spectrometry (ICP-MS) into bioanalytical workflows, allowing not only reliable and sensitive detection but also quantification based on absolute isotope dilution measurements using poly-isotopic elements. The detection capability of ICP-MS becomes particularly attractive with heavy metals. The covalently bound protein tags developed in our group are based on the well-known DOTA chelate complex (1,4,7,10-tetraazacyclododecane-N,N',N″,N‴-tetraacetic acid) carrying lanthanoide ions as the metal core. In this review, I outline the development of this mutual assistance between molecular and elemental mass spectrometry and discuss its scope and limitations, particularly for peptide and protein quantification. The lanthanoide tags provide low detection limits and offer multiplexing capabilities owing to the number of very similar lanthanoides and their isotopes. With isotope dilution comes previously unknown accuracy. Separation techniques such as electrophoresis and HPLC were used, with only slight adaptation of workflows already in use for quantification in bioanalysis. Imaging mass spectrometry (MSI) with MALDI and laser ablation ICP-MS have complemented the range of applications in recent years. © 2018 Wiley Periodicals, Inc.
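
    Isotope dilution recovers the analyte amount from the isotope ratio measured in a sample blended with an isotopically enriched spike. A minimal sketch of the standard single-ratio relation (hypothetical abundances; real workflows add mass-bias, blank and dead-time corrections):

      def idms_amount(n_spike, r_measured, a_sample, b_sample, a_spike, b_spike):
          """Amount of analyte (same unit as n_spike) from the isotope ratio a/b
          measured in the sample + spike blend, given the isotope abundances of
          isotopes a and b in the sample and in the spike."""
          return n_spike * (a_spike - r_measured * b_spike) / (r_measured * b_sample - a_sample)

      # Hypothetical example: natural abundances a/b = 0.90/0.10, spike enriched to 0.05/0.95,
      # 10 nmol of spike added, blend ratio a/b measured as 1.20
      print(idms_amount(10.0, 1.20, 0.90, 0.10, 0.05, 0.95))  # ~14 nmol of analyte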

  7. Quantification of Pulmonary Fibrosis in a Bleomycin Mouse Model Using Automated Histological Image Analysis.

    PubMed

    Gilhodes, Jean-Claude; Julé, Yvon; Kreuz, Sebastian; Stierstorfer, Birgit; Stiller, Detlef; Wollin, Lutz

    2017-01-01

    Current literature on pulmonary fibrosis induced in animal models highlights the need for accurate, reliable and reproducible histological quantitative analysis. One of the major limitations of histological scoring is that it is observer-dependent and consequently subject to variability, which may preclude comparative studies between different laboratories. To achieve reliable and observer-independent quantification of lung fibrosis, we developed automated software for histological image analysis of digital images of entire lung sections. This automated analysis was compared to standard evaluation methods with regard to its validation as an end-point measure of fibrosis. Lung fibrosis was induced in mice by intratracheal administration of bleomycin (BLM) at 0.25, 0.5, 0.75 and 1 mg/kg. A detailed characterization of BLM-induced fibrosis was performed 14 days after BLM administration using lung function testing, micro-computed tomography and Ashcroft scoring analysis. Quantification of fibrosis by automated analysis was based on pulmonary tissue density measured from thousands of micro-tiles processed from digital images of entire lung sections. Prior to analysis, large bronchi and vessels were manually excluded from the original images. Fibrosis was expressed by two indexes: the mean pulmonary tissue density and the high pulmonary tissue density frequency. We showed that the tissue density indexes provided very accurate and reliable quantification of the morphological changes induced by BLM, even at the lowest concentration used (0.25 mg/kg). A reconstructed 2D image of the entire lung section at high resolution (3.6 μm/pixel) was generated from the tissue density values, allowing visualization of their distribution throughout fibrotic and non-fibrotic regions. A significant correlation (p<0.0001) was found between the automated analysis and the above standard evaluation methods. This correlation establishes automated analysis as a novel end-point measure of BLM-induced lung fibrosis in mice, which will be very valuable for future preclinical drug explorations.
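
    Both indexes described above, the mean pulmonary tissue density and the high-density tile frequency, can be computed from a grid of tissue-area fractions. A schematic sketch with hypothetical tile size and threshold (tissue segmentation and bronchus/vessel exclusion are assumed to have been done beforehand, as in the paper):

      import numpy as np

      def tissue_density_indexes(tissue_mask, tile=64, high_density_threshold=0.5):
          """Mean tissue density and fraction of high-density tiles over a lung section.

          tissue_mask: 2-D boolean array, True where stained tissue was segmented
          (large bronchi and vessels already excluded; tiles outside the section
          would normally also be discarded)."""
          h, w = tissue_mask.shape
          densities = []
          for i in range(0, h - tile + 1, tile):
              for j in range(0, w - tile + 1, tile):
                  densities.append(tissue_mask[i:i + tile, j:j + tile].mean())
          densities = np.array(densities)
          mean_density = densities.mean()
          high_density_frequency = (densities > high_density_threshold).mean()
          return mean_density, high_density_frequency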

  8. Quantification of Pulmonary Fibrosis in a Bleomycin Mouse Model Using Automated Histological Image Analysis

    PubMed Central

    Gilhodes, Jean-Claude; Kreuz, Sebastian; Stierstorfer, Birgit; Stiller, Detlef; Wollin, Lutz

    2017-01-01

    Current literature on pulmonary fibrosis induced in animal models highlights the need for accurate, reliable and reproducible histological quantitative analysis. One of the major limitations of histological scoring is that it is observer-dependent and consequently subject to variability, which may preclude comparative studies between different laboratories. To achieve reliable and observer-independent quantification of lung fibrosis, we developed automated software for histological image analysis of digital images of entire lung sections. This automated analysis was compared to standard evaluation methods with regard to its validation as an end-point measure of fibrosis. Lung fibrosis was induced in mice by intratracheal administration of bleomycin (BLM) at 0.25, 0.5, 0.75 and 1 mg/kg. A detailed characterization of BLM-induced fibrosis was performed 14 days after BLM administration using lung function testing, micro-computed tomography and Ashcroft scoring analysis. Quantification of fibrosis by automated analysis was based on pulmonary tissue density measured from thousands of micro-tiles processed from digital images of entire lung sections. Prior to analysis, large bronchi and vessels were manually excluded from the original images. Fibrosis was expressed by two indexes: the mean pulmonary tissue density and the high pulmonary tissue density frequency. We showed that the tissue density indexes provided very accurate and reliable quantification of the morphological changes induced by BLM, even at the lowest concentration used (0.25 mg/kg). A reconstructed 2D image of the entire lung section at high resolution (3.6 μm/pixel) was generated from the tissue density values, allowing visualization of their distribution throughout fibrotic and non-fibrotic regions. A significant correlation (p<0.0001) was found between the automated analysis and the above standard evaluation methods. This correlation establishes automated analysis as a novel end-point measure of BLM-induced lung fibrosis in mice, which will be very valuable for future preclinical drug explorations. PMID:28107543

  9. The Challenges of Credible Thermal Protection System Reliability Quantification

    NASA Technical Reports Server (NTRS)

    Green, Lawrence L.

    2013-01-01

    The paper discusses several of the challenges associated with developing a credible reliability estimate for a human-rated crew capsule thermal protection system. The process of developing such a credible estimate is subject to the quantification, modeling and propagation of numerous uncertainties within a probabilistic analysis. The development of specific investment recommendations among various potential testing and programmatic options, aimed at improving the reliability prediction, is then accomplished through Bayesian analysis.
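
    Bayesian updating of a reliability estimate from pass/fail evidence is the mechanism behind the investment comparison described. A toy sketch with a Beta-Binomial model (hypothetical prior and test counts, not the program's actual TPS model):

      from scipy import stats

      # Prior belief about per-mission TPS success probability (hypothetical)
      prior_a, prior_b = 9.0, 1.0          # Beta(9, 1): optimistic but uncertain

      # Hypothetical new evidence: 40 successful tests, 0 failures
      successes, failures = 40, 0
      post_a, post_b = prior_a + successes, prior_b + failures

      posterior = stats.beta(post_a, post_b)
      print(posterior.mean())              # updated reliability estimate (~0.98)
      print(posterior.ppf(0.05))           # 5th percentile: a conservative lower bound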

  10. Imaging-based quantification of hepatic fat: methods and clinical applications.

    PubMed

    Ma, Xiaozhou; Holalkere, Nagaraj-Setty; Kambadakone R, Avinash; Mino-Kenudson, Mari; Hahn, Peter F; Sahani, Dushyant V

    2009-01-01

    Fatty liver disease comprises a spectrum of conditions (simple hepatic steatosis, steatohepatitis with inflammatory changes, and end-stage liver disease with fibrosis and cirrhosis). Hepatic steatosis is often associated with diabetes and obesity and may be secondary to alcohol and drug use, toxins, viral infections, and metabolic diseases. Detection and quantification of liver fat have many clinical applications, and early recognition is crucial to institute appropriate management and prevent progression. Histopathologic analysis is the reference standard to detect and quantify fat in the liver, but results are vulnerable to sampling error. Moreover, it can cause morbidity and complications and cannot be repeated often enough to monitor treatment response. Imaging can be repeated regularly and allows assessment of the entire liver, thus avoiding sampling error. Selection of appropriate imaging methods demands understanding of their advantages and limitations and the suitable clinical setting. Ultrasonography is effective for detecting moderate or severe fatty infiltration but is limited by lack of interobserver reliability and intraobserver reproducibility. Computed tomography allows quantitative and qualitative evaluation and is generally highly accurate and reliable; however, the results may be confounded by hepatic parenchymal changes due to cirrhosis or depositional diseases. Magnetic resonance (MR) imaging with appropriate sequences (eg, chemical shift techniques) has similarly high sensitivity, and MR spectroscopy provides unique advantages for some applications. However, both are expensive and too complex to be used to monitor steatosis. (c) RSNA, 2009.

  11. Label-free measurement of histone lysine methyltransferases activity by matrix-assisted laser desorption/ionization time-of-flight mass spectrometry.

    PubMed

    Guitot, Karine; Scarabelli, Silvia; Drujon, Thierry; Bolbach, Gérard; Amoura, Mehdi; Burlina, Fabienne; Jeltsch, Albert; Sagan, Sandrine; Guianvarc'h, Dominique

    2014-07-01

    Histone lysine methyltransferases (HKMTs) are enzymes that play an essential role in epigenetic regulation. Thus, identification of inhibitors specifically targeting these enzymes represents a challenge for the development of new antitumor therapeutics. Several methods for measuring HKMT activity are already available. Most of them use indirect measurement of the enzymatic reaction through radioactive labeling or antibody-recognized products or coupled enzymatic assays. Mass spectrometry (MS) represents an interesting alternative approach because it allows direct detection and quantification of enzymatic reactions and can be used to determine kinetics and to screen small molecules as potential inhibitors. Application of mass spectrometry to the study of HKMTs has not been fully explored yet. We describe here the development of a simple, reliable, label-free MALDI-TOF MS-based assay for the detection and quantification of peptide methylation, using SET7/9 as a model enzyme. Importantly, the use of an expensive internal standard, often required in quantitative mass spectrometry analysis, is not necessary in this assay. This MS assay allowed us to determine enzyme kinetic parameters as well as the IC50 for a known inhibitor of this enzyme. Furthermore, a comparative study with an antibody-based immunosorbent assay showed that the MS assay is more reliable and suitable for the screening of inhibitors. Copyright © 2014 Elsevier Inc. All rights reserved.

  12. An Excel-based implementation of the spectral method of action potential alternans analysis.

    PubMed

    Pearman, Charles M

    2014-12-01

    Action potential (AP) alternans has been well established as a mechanism of arrhythmogenesis and sudden cardiac death. Proper interpretation of AP alternans requires a robust method of alternans quantification. Traditional methods of alternans analysis neglect higher order periodicities that may have greater pro-arrhythmic potential than classical 2:1 alternans. The spectral method of alternans analysis, already widely used in the related study of microvolt T-wave alternans, has also been used to study AP alternans. Software to meet the specific needs of AP alternans analysis is not currently available in the public domain. An AP analysis tool is implemented here, written in Visual Basic for Applications and using Microsoft Excel as a shell. This performs a sophisticated analysis of alternans behavior allowing reliable distinction of alternans from random fluctuations, quantification of alternans magnitude, and identification of which phases of the AP are most affected. In addition, the spectral method has been adapted to allow detection and quantification of higher order regular oscillations. Analysis of action potential morphology is also performed. A simple user interface enables easy import, analysis, and export of collated results. © 2014 The Author. Physiological Reports published by Wiley Periodicals, Inc. on behalf of the American Physiological Society and The Physiological Society.

  13. Quantification of video-taped images in microcirculation research using inexpensive imaging software (Adobe Photoshop).

    PubMed

    Brunner, J; Krummenauer, F; Lehr, H A

    2000-04-01

    Study end-points in microcirculation research are usually video-taped images rather than numeric computer print-outs. Analysis of these video-taped images for the quantification of microcirculatory parameters usually requires computer-based image analysis systems. Most software programs for image analysis are custom-made, expensive, and limited in their applicability to selected parameters and study end-points. We demonstrate herein that an inexpensive, commercially available computer software package (Adobe Photoshop), run on a Macintosh G3 computer with an inbuilt graphic capture board, provides versatile, easy-to-use tools for the quantification of digitized video images. Using images obtained by intravital fluorescence microscopy from the pre- and postischemic muscle microcirculation in the skinfold chamber model in hamsters, Photoshop allows simple and rapid quantification (i) of microvessel diameters, (ii) of the functional capillary density and (iii) of postischemic leakage of FITC-labeled high molecular weight dextran from postcapillary venules. We present evidence of the technical accuracy of the software tools and of a high degree of interobserver reliability. Inexpensive commercially available imaging programs (i.e., Adobe Photoshop) provide versatile tools for image analysis with a wide range of potential applications in microcirculation research.

  14. The Infeasibility of Experimental Quantification of Life-Critical Software Reliability

    NASA Technical Reports Server (NTRS)

    Butler, Ricky W.; Finelli, George B.

    1991-01-01

    This paper affirms that quantification of life-critical software reliability is infeasible using statistical methods, whether applied to standard software or fault-tolerant software. The key assumption of software fault tolerance, that separately programmed versions fail independently, is shown to be problematic. This assumption cannot be justified by experimentation in the ultra-reliability region, and subjective arguments in its favor are not sufficiently strong to justify it as an axiom. Also, the implications of the recent multi-version software experiments support this affirmation.
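
    The infeasibility argument is ultimately arithmetic: demonstrating an ultra-low failure rate purely by testing requires an impractical amount of failure-free operation. A one-line illustration of that bound for a constant failure rate (the standard exponential zero-failure test relation, not the paper's full analysis):

      import math

      def required_failure_free_hours(target_failure_rate, confidence=0.90):
          """Test time with zero observed failures needed to claim the failure rate is
          below the target at the given confidence, assuming exponential failure times."""
          return -math.log(1.0 - confidence) / target_failure_rate

      # Demonstrating 1e-9 failures/hour at 90% confidence:
      print(required_failure_free_hours(1e-9))      # ~2.3e9 hours (over 260,000 years)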

  15. Use of multiple competitors for quantification of human immunodeficiency virus type 1 RNA in plasma.

    PubMed

    Vener, T; Nygren, M; Andersson, A; Uhlén, M; Albert, J; Lundeberg, J

    1998-07-01

    Quantification of human immunodeficiency virus type 1 (HIV-1) RNA in plasma has rapidly become an important tool in basic HIV research and in the clinical care of infected individuals. Here, a quantitative HIV assay based on competitive reverse transcription-PCR with multiple competitors was developed. Four RNA competitors containing identical PCR primer binding sequences as the viral HIV-1 RNA target were constructed. One of the PCR primers was fluorescently labeled, which facilitated discrimination between the viral RNA and competitor amplicons by fragment analysis with conventional automated sequencers. The coamplification of known amounts of the RNA competitors provided the means to establish internal calibration curves for the individual reactions resulting in exclusion of tube-to-tube variations. Calibration curves were created from the peak areas, which were proportional to the starting amount of each competitor. The fluorescence detection format was expanded to provide a dynamic range of more than 5 log units. This quantitative assay allowed for reproducible analysis of samples containing as few as 40 viral copies of HIV-1 RNA per reaction. The within- and between-run coefficients of variation were <24% (range, 10 to 24) and <36% (range, 27 to 36), respectively. The high reproducibility (standard deviation, <0.13 log) of the overall procedure for quantification of HIV-1 RNA in plasma, including sample preparation, amplification, and detection variations, allowed reliable detection of a 0.5-log change in RNA viral load. The assay could be a useful tool for monitoring HIV-1 disease progression and antiviral treatment and can easily be adapted to the quantification of other pathogens.

  16. Quantitative PCR estimates Angiostrongylus cantonensis (rat lungworm) infection levels in semi-slugs (Parmarion martensi)

    PubMed Central

    Jarvi, Susan I.; Farias, Margaret E.M.; Howe, Kay; Jacquier, Steven; Hollingsworth, Robert; Pitt, William

    2013-01-01

    The life cycle of the nematode Angiostrongylus cantonensis involves rats as the definitive host and slugs and snails as intermediate hosts. Humans can become infected upon ingestion of intermediate or paratenic (passive carrier) hosts containing stage L3 A. cantonensis larvae. Here, we report a quantitative PCR (qPCR) assay that provides a reliable, relative measure of parasite load in intermediate hosts. Quantification of the levels of infection of intermediate hosts is critical for determining A. cantonensis intensity on the Island of Hawaii. The identification of high intensity infection ‘hotspots’ will allow for more effective targeted rat and slug control measures. qPCR appears more efficient and sensitive than microscopy and provides a new tool for quantification of larvae from intermediate hosts, and potentially from other sources as well. PMID:22902292

  17. A Fast, Reliable, and Sensitive Method for Detection and Quantification of Listeria monocytogenes and Escherichia coli O157:H7 in Ready-to-Eat Fresh-Cut Products by MPN-qPCR

    PubMed Central

    Russo, Pasquale; Botticella, Giuseppe; Capozzi, Vittorio; Massa, Salvatore; Spano, Giuseppe; Beneduce, Luciano

    2014-01-01

    In the present work we developed an MPN quantitative real-time PCR (MPN-qPCR) method for fast and reliable detection and quantification of Listeria monocytogenes and Escherichia coli O157:H7 in minimally processed vegetables. In order to validate the proposed technique, the results were compared with conventional MPN followed by phenotypic and biochemical assay methods. When L. monocytogenes and E. coli O157:H7 were artificially inoculated in fresh-cut vegetables, a concentration as low as 1 CFU g−1 could be detected in 48 hours for both pathogens. qPCR alone allowed a limit of detection of 10 CFU g−1 after 2 hours of enrichment for L. monocytogenes and E. coli O157:H7. Since minimally processed ready-to-eat vegetables are characterized by a very short shelf life, our method can considerably shorten the time required for microbial analysis, allowing better management of quality control. Moreover, the occurrence of both pathogenic bacteria in mixed salad samples and fresh-cut melons was monitored in two production plants from the receipt of the raw materials to the early stages of shelf life. No sample was found to be contaminated by L. monocytogenes. One sample of raw mixed salad was found positive for an H7 enterohemorrhagic serotype. PMID:24949460
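
    The MPN part of an MPN-qPCR result rests on the classical most-probable-number calculation. The sketch below (not the authors' implementation) solves the standard maximum-likelihood MPN equation numerically from the number of qPCR-positive replicates at each dilution; the dilution scheme and counts are hypothetical.

```python
import numpy as np
from scipy.optimize import brentq

def mpn_per_gram(amounts_g, tubes, positives):
    """Maximum-likelihood most probable number per gram of sample.

    amounts_g : sample amount (g) inoculated per replicate at each dilution
    tubes     : number of replicates at each dilution
    positives : number of positive (here: qPCR-positive) replicates at each dilution
    """
    m = np.asarray(amounts_g, float)
    n = np.asarray(tubes, float)
    x = np.asarray(positives, float)

    # Root of the MPN score equation: sum(x*m / (1 - exp(-lam*m))) = sum(n*m)
    def score(lam):
        return np.sum(x * m / (1.0 - np.exp(-lam * m))) - np.sum(n * m)

    return brentq(score, 1e-9, 1e9)

# Hypothetical three-dilution, three-replicate series (0.1, 0.01, 0.001 g per replicate)
print(round(mpn_per_gram([0.1, 0.01, 0.001], [3, 3, 3], [3, 2, 0]), 1))
```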

  18. Comprehensive Design Reliability Activities for Aerospace Propulsion Systems

    NASA Technical Reports Server (NTRS)

    Christenson, R. L.; Whitley, M. R.; Knight, K. C.

    2000-01-01

    This technical publication describes the methodology, model, software tool, input data, and analysis results that support aerospace design reliability studies. The focus of these activities is on propulsion system mechanical design reliability. The goal of these activities is to support design from a reliability perspective. Paralleling performance analyses in schedule and method, this requires the proper use of metrics in a validated reliability model useful for design, sensitivity, and trade studies. Design reliability analysis in this view is one of several critical design functions. A design reliability method is detailed and two example analyses are provided: one qualitative and the other quantitative. The use of aerospace and commercial data sources for quantification is discussed and sources are listed. A tool that was developed to support both types of analyses is presented. Finally, special topics discussed include the development of design criteria, issues of reliability quantification, quality control, and reliability verification.

  19. The Qiagen Investigator® Quantiplex HYres as an alternative kit for DNA quantification.

    PubMed

    Frégeau, Chantal J; Laurin, Nancy

    2015-05-01

    The Investigator® Quantiplex HYres kit was evaluated as a potential replacement for dual DNA quantification of casework samples. This kit was determined to be highly sensitive, with a limit of quantification and limit of detection of 0.0049 ng/μL and 0.0003 ng/μL, respectively, for both human and male DNA, using full or half reaction volumes. It was also accurate in assessing the amount of male DNA present in 96 mock and actual casework male:female mixtures (various ratios) processed in this exercise. The close correlation between the male/human DNA ratios (expressed in percentages) derived from the Investigator® Quantiplex HYres quantification results and the male DNA proportion calculated in mixed AmpFlSTR® Profiler® Plus or AmpFlSTR® Identifiler® Plus profiles, using the Amelogenin Y peak and STR loci, allowed guidelines to be developed to facilitate decisions regarding when to submit samples to Y-STR rather than autosomal STR profiling. The internal control (IC) target was shown to be more sensitive to inhibitors than the human and male DNA targets included in the Investigator® Quantiplex HYres kit, making it a good quality indicator for DNA extracts. The new kit met our criteria of enhanced sensitivity, accuracy, consistency, reliability and robustness for casework DNA quantification. Crown Copyright © 2015. Published by Elsevier Ireland Ltd. All rights reserved.

  20. Automated solid-phase extraction coupled online with HPLC-FLD for the quantification of zearalenone in edible oil.

    PubMed

    Drzymala, Sarah S; Weiz, Stefan; Heinze, Julia; Marten, Silvia; Prinz, Carsten; Zimathies, Annett; Garbe, Leif-Alexander; Koch, Matthias

    2015-05-01

    Established maximum levels for the mycotoxin zearalenone (ZEN) in edible oil require monitoring by reliable analytical methods. Therefore, an automated SPE-HPLC online system based on dynamic covalent hydrazine chemistry has been developed. The SPE step comprises a reversible hydrazone formation between ZEN and a hydrazine moiety covalently attached to a solid phase. Seven hydrazine materials with different properties regarding the resin backbone, pore size, particle size, specific surface area, and loading have been evaluated. As a result, a hydrazine-functionalized silica gel was chosen. The final automated online method was validated and applied to the analysis of three maize germ oil samples including a provisionally certified reference material. Important performance criteria for the recovery (70-120 %) and precision (RSDr <25 %) as set by Commission Regulation EC 401/2006 were fulfilled: the mean recovery was 78 % and the RSDr did not exceed 8 %. The results of the SPE-HPLC online method were further compared to results obtained by liquid-liquid extraction with stable isotope dilution analysis by LC-MS/MS and were found to be in good agreement. The developed SPE-HPLC online system with fluorescence detection allows a reliable, accurate, and sensitive quantification (limit of quantification, 30 μg/kg) of ZEN in edible oils while significantly reducing the workload. To our knowledge, this is the first report on an automated SPE-HPLC method based on a covalent SPE approach.

  1. Integrated protocol for reliable and fast quantification and documentation of electrophoresis gels.

    PubMed

    Rehbein, Peter; Schwalbe, Harald

    2015-06-01

    Quantitative analysis of electrophoresis gels is an important part of molecular cloning, as well as of protein expression and purification. Parallel quantifications of yield and purity can be most conveniently obtained from densitometric analysis. This communication reports a comprehensive, reliable and simple protocol for gel quantification and documentation, applicable to single samples and with special features for protein expression screens. As a major component of the protocol, the fully annotated code of a proprietary open source computer program for semi-automatic densitometric quantification of digitized electrophoresis gels is disclosed. The program ("GelQuant") is implemented in the C-based macro-language of the widespread integrated development environment of IGOR Pro. Copyright © 2014 Elsevier Inc. All rights reserved.
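
    For context, densitometric band quantification of the kind such protocols automate boils down to integrating background-corrected pixel intensities over each band of a lane profile. The following sketch is a generic illustration of that principle, not GelQuant's algorithm; the profile, band windows, and baseline rule are hypothetical.

```python
import numpy as np

def band_intensities(lane_profile, band_windows):
    """Integrate a 1-D lane intensity profile over each band window after
    subtracting a crude flat background taken from the profile's baseline."""
    profile = np.asarray(lane_profile, float)
    background = np.percentile(profile, 10)
    corrected = np.clip(profile - background, 0.0, None)
    return [float(corrected[a:b].sum()) for a, b in band_windows]

# Hypothetical lane profile (summed pixel columns) containing two Gaussian bands
rows = np.arange(200)
profile = (50
           + 400 * np.exp(-0.5 * ((rows - 60) / 5) ** 2)
           + 900 * np.exp(-0.5 * ((rows - 140) / 6) ** 2))

bands = band_intensities(profile, [(45, 75), (120, 160)])
print([round(b) for b in bands], "purity of band 2:", round(bands[1] / sum(bands), 2))
```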

  2. Methods for quantification of soil-transmitted helminths in environmental media: current techniques and recent advances

    PubMed Central

    Collender, Philip A.; Kirby, Amy E.; Addiss, David G.; Freeman, Matthew C.; Remais, Justin V.

    2015-01-01

    Limiting the environmental transmission of soil-transmitted helminths (STH), which infect 1.5 billion people worldwide, will require sensitive, reliable, and cost effective methods to detect and quantify STH in the environment. We review the state of the art of STH quantification in soil, biosolids, water, produce, and vegetation with respect to four major methodological issues: environmental sampling; recovery of STH from environmental matrices; quantification of recovered STH; and viability assessment of STH ova. We conclude that methods for sampling and recovering STH require substantial advances to provide reliable measurements for STH control. Recent innovations in the use of automated image identification and developments in molecular genetic assays offer considerable promise for improving quantification and viability assessment. PMID:26440788

  3. MSblender: A probabilistic approach for integrating peptide identifications from multiple database search engines.

    PubMed

    Kwon, Taejoon; Choi, Hyungwon; Vogel, Christine; Nesvizhskii, Alexey I; Marcotte, Edward M

    2011-07-01

    Shotgun proteomics using mass spectrometry is a powerful method for protein identification but suffers limited sensitivity in complex samples. Integrating peptide identifications from multiple database search engines is a promising strategy to increase the number of peptide identifications and reduce the volume of unassigned tandem mass spectra. Existing methods pool statistical significance scores such as p-values or posterior probabilities of peptide-spectrum matches (PSMs) from multiple search engines after high scoring peptides have been assigned to spectra, but these methods lack reliable control of identification error rates as data are integrated from different search engines. We developed a statistically coherent method for integrative analysis, termed MSblender. MSblender converts raw search scores from search engines into a probability score for every possible PSM and properly accounts for the correlation between search scores. The method reliably estimates false discovery rates and identifies more PSMs than any single search engine at the same false discovery rate. Increased identifications increment spectral counts for most proteins and allow quantification of proteins that would not have been quantified by individual search engines. We also demonstrate that enhanced quantification contributes to improve sensitivity in differential expression analyses.
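
    The following sketch illustrates, in generic form, how a set of posterior probabilities for PSMs can be turned into an estimated false discovery rate and an acceptance cutoff; it is not MSblender's actual model, and the probabilities shown are hypothetical.

```python
import numpy as np

def fdr_at_cutoff(probabilities, cutoff):
    """Estimated FDR of the PSM set accepted at a posterior-probability cutoff.

    If p_i is the posterior probability that PSM i is correct, the expected
    number of false identifications among the accepted PSMs is sum(1 - p_i),
    so the estimated FDR is that sum divided by the number of accepted PSMs.
    """
    p = np.asarray(probabilities, float)
    accepted = p[p >= cutoff]
    if accepted.size == 0:
        return 0.0
    return float(np.sum(1.0 - accepted) / accepted.size)

def cutoff_for_fdr(probabilities, target):
    """Lowest probability cutoff whose accepted set meets the target FDR."""
    for c in sorted(set(probabilities)):
        if fdr_at_cutoff(probabilities, c) <= target:
            return c
    return 1.0

# Hypothetical posterior probabilities pooled across several search engines
probs = [0.999, 0.98, 0.95, 0.90, 0.70, 0.40, 0.10]
print(cutoff_for_fdr(probs, target=0.05))
```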

  4. MSblender: a probabilistic approach for integrating peptide identifications from multiple database search engines

    PubMed Central

    Kwon, Taejoon; Choi, Hyungwon; Vogel, Christine; Nesvizhskii, Alexey I.; Marcotte, Edward M.

    2011-01-01

    Shotgun proteomics using mass spectrometry is a powerful method for protein identification but suffers limited sensitivity in complex samples. Integrating peptide identifications from multiple database search engines is a promising strategy to increase the number of peptide identifications and reduce the volume of unassigned tandem mass spectra. Existing methods pool statistical significance scores such as p-values or posterior probabilities of peptide-spectrum matches (PSMs) from multiple search engines after high scoring peptides have been assigned to spectra, but these methods lack reliable control of identification error rates as data are integrated from different search engines. We developed a statistically coherent method for integrative analysis, termed MSblender. MSblender converts raw search scores from search engines into a probability score for all possible PSMs and properly accounts for the correlation between search scores. The method reliably estimates false discovery rates and identifies more PSMs than any single search engine at the same false discovery rate. Increased identifications increment spectral counts for all detected proteins and allow quantification of proteins that would not have been quantified by individual search engines. We also demonstrate that enhanced quantification contributes to improve sensitivity in differential expression analyses. PMID:21488652

  5. Photoacoustic-fluorescence in vitro flow cytometry for quantification of absorption, scattering and fluorescence properties of the cells

    NASA Astrophysics Data System (ADS)

    Nedosekin, D. A.; Sarimollaoglu, M.; Foster, S.; Galanzha, E. I.; Zharov, V. P.

    2013-03-01

    Fluorescence flow cytometry is a well-established analytical tool that provides quantification of multiple biological parameters of cells at molecular levels, including their functional states, morphology, composition, proliferation, and protein expression. However, only the fluorescence and scattering parameters of the cells or labels are available for detection. Cell pigmentation and the presence of non-fluorescent dyes or nanoparticles cannot be reliably quantified. Herewith, we present a novel photoacoustic (PA) flow cytometry design for simple integration of absorbance measurements into the schematics of conventional in vitro flow cytometers. The integrated system allows simultaneous measurement of light absorbance, scattering, and multicolor fluorescence from single cells in flow at rates up to 2 m/s. We compared various combinations of excitation laser sources for multicolor detection, including simultaneous excitation of PA and fluorescence using a single 500 kHz pulsed nanosecond laser. The multichannel detection scheme allows simultaneous detection of up to 8 labels, including 4 fluorescent tags and 4 PA colors. The in vitro PA-fluorescence flow cytometer was used for studies of nanoparticle uptake and for the analysis of cell line pigmentation, including genetically encoded melanin expression in a breast cancer cell line. We demonstrate that this system can be used for direct nanotoxicity studies with simultaneous quantification of nanoparticle content and assessment of cell viability using conventional fluorescent apoptosis assays.

  6. Near-infrared spectroscopy for the detection and quantification of bacterial contaminations in pharmaceutical products.

    PubMed

    Quintelas, Cristina; Mesquita, Daniela P; Lopes, João A; Ferreira, Eugénio C; Sousa, Clara

    2015-08-15

    Accurate detection and quantification of microbiological contaminations remains an issue, mainly due to the lack of rapid and precise analytical techniques. Standard methods are expensive and time-consuming and are associated with high economic losses and public health threats. In the context of the pharmaceutical industry, the development of fast analytical techniques able to overcome these limitations is crucial, and spectroscopic techniques might constitute a reliable alternative. In this work we proved the ability of Fourier transform near infrared spectroscopy (FT-NIRS) to detect and quantify bacteria (Bacillus subtilis, Escherichia coli, Pseudomonas fluorescens, Salmonella enterica, Staphylococcus epidermidis) from 10 to 10(8) CFUs/mL in sterile saline solutions (NaCl 0.9%). Partial least squares discriminant analysis (PLSDA) models showed that FT-NIRS was able to discriminate between sterile and contaminated solutions for all bacteria as well as to identify the contaminant bacteria. Partial least squares (PLS) models allowed bacterial quantification with limits of detection ranging from 5.1 to 9 CFU/mL for E. coli and B. subtilis, respectively. This methodology was successfully validated in three pharmaceutical preparations (contact lens solution, cough syrup and topical anti-inflammatory solution), proving that this technique possesses high potential to be routinely used for the detection and quantification of bacterial contaminations. Copyright © 2015 Elsevier B.V. All rights reserved.
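
    As a hedged illustration of the chemometric step, the sketch below calibrates a partial least squares regression (scikit-learn's PLSRegression) on synthetic stand-in spectra to predict log bacterial load; the spectral signature, noise level, and component count are all invented for the example and do not reproduce the FT-NIRS models of the study.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(0)

# Synthetic stand-in for FT-NIR spectra: 40 samples x 200 wavelengths whose
# absorbance grows weakly with the log bacterial load, plus noise.
log_cfu = rng.uniform(1, 8, size=40)                # log10 CFU/mL, i.e. 10 to 10^8
signature = np.sin(np.linspace(0, 3 * np.pi, 200))  # invented spectral signature
spectra = rng.normal(0.0, 0.01, size=(40, 200)) + 0.005 * np.outer(log_cfu, signature)

# Calibrate a PLS model (a common chemometric choice for NIR quantification)
pls = PLSRegression(n_components=5)
pls.fit(spectra[:30], log_cfu[:30])

# Predict the bacterial load of held-out "samples" and show the errors
predicted = pls.predict(spectra[30:]).ravel()
print(np.round(predicted - log_cfu[30:], 2))        # errors in log10 units
```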

  7. Methods for Quantification of Soil-Transmitted Helminths in Environmental Media: Current Techniques and Recent Advances.

    PubMed

    Collender, Philip A; Kirby, Amy E; Addiss, David G; Freeman, Matthew C; Remais, Justin V

    2015-12-01

    Limiting the environmental transmission of soil-transmitted helminths (STHs), which infect 1.5 billion people worldwide, will require sensitive, reliable, and cost-effective methods to detect and quantify STHs in the environment. We review the state-of-the-art of STH quantification in soil, biosolids, water, produce, and vegetation with regard to four major methodological issues: environmental sampling; recovery of STHs from environmental matrices; quantification of recovered STHs; and viability assessment of STH ova. We conclude that methods for sampling and recovering STHs require substantial advances to provide reliable measurements for STH control. Recent innovations in the use of automated image identification and developments in molecular genetic assays offer considerable promise for improving quantification and viability assessment. Copyright © 2015 Elsevier Ltd. All rights reserved.

  8. Automated quantification of pancreatic β-cell mass

    PubMed Central

    Golson, Maria L.; Bush, William S.

    2014-01-01

    β-Cell mass is a parameter commonly measured in studies of islet biology and diabetes. However, the rigorous quantification of pancreatic β-cell mass using conventional histological methods is a time-consuming process. Rapidly evolving virtual slide technology with high-resolution slide scanners and newly developed image analysis tools has the potential to transform β-cell mass measurement. To test the effectiveness and accuracy of this new approach, we assessed pancreata from normal C57Bl/6J mice and from mouse models of β-cell ablation (streptozotocin-treated mice) and β-cell hyperplasia (leptin-deficient mice), using a standardized systematic sampling of pancreatic specimens. Our data indicate that automated analysis of virtual pancreatic slides is highly reliable and yields results consistent with those obtained by conventional morphometric analysis. This new methodology will allow investigators to dramatically reduce the time required for β-cell mass measurement by automating high-resolution image capture and analysis of entire pancreatic sections. PMID:24760991
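
    For reference, the conventional morphometric quantity that such automated pipelines reproduce is beta-cell mass, computed as the insulin-positive area fraction of the sampled sections multiplied by the pancreas weight. The sketch below is a minimal illustration with hypothetical measurements, not the authors' analysis code.

```python
def beta_cell_mass_mg(insulin_area_mm2, tissue_area_mm2, pancreas_weight_mg):
    """Beta-cell mass (mg) from morphometry of systematically sampled sections:
    the insulin-positive area fraction multiplied by the pancreas wet weight."""
    area_fraction = sum(insulin_area_mm2) / sum(tissue_area_mm2)
    return area_fraction * pancreas_weight_mg

# Hypothetical measurements from three sampled sections of one mouse pancreas
print(beta_cell_mass_mg([0.8, 1.1, 0.9], [95.0, 120.0, 105.0], pancreas_weight_mg=250))
```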

  9. Customized Consensus Spectral Library Building for Untargeted Quantitative Metabolomics Analysis with Data Independent Acquisition Mass Spectrometry and MetaboDIA Workflow.

    PubMed

    Chen, Gengbo; Walmsley, Scott; Cheung, Gemmy C M; Chen, Liyan; Cheng, Ching-Yu; Beuerman, Roger W; Wong, Tien Yin; Zhou, Lei; Choi, Hyungwon

    2017-05-02

    Data independent acquisition-mass spectrometry (DIA-MS) coupled with liquid chromatography is a promising approach for rapid, automatic sampling of MS/MS data in untargeted metabolomics. However, wide isolation windows in DIA-MS generate MS/MS spectra containing a mixed population of fragment ions together with their precursor ions. This precursor-fragment ion map in a comprehensive MS/MS spectral library is crucial for relative quantification of fragment ions uniquely representative of each precursor ion. However, existing reference libraries are not sufficient for this purpose since the fragmentation patterns of small molecules can vary in different instrument setups. Here we developed a bioinformatics workflow called MetaboDIA to build customized MS/MS spectral libraries using a user's own data dependent acquisition (DDA) data and to perform MS/MS-based quantification with DIA data, thus complementing conventional MS1-based quantification. MetaboDIA also allows users to build a spectral library directly from DIA data in studies of a large sample size. Using a marine algae data set, we show that quantification of fragment ions extracted with a customized MS/MS library can provide as reliable quantitative data as the direct quantification of precursor ions based on MS1 data. To test its applicability in complex samples, we applied MetaboDIA to a clinical serum metabolomics data set, where we built a DDA-based spectral library containing consensus spectra for 1829 compounds. We then performed fragment ion quantification on the DIA data using this library, yielding sensitive differential expression analysis.

  10. On the short-term uncertainty in performance of a point absorber wave energy converter

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Coe, Ryan Geoffrey; Michelen, Carlos; Manuel, Lance

    2016-03-01

    Of interest in this study is the quantification of uncertainty in the performance of a two-body wave point absorber (Reference Model 3 or RM3), which serves as a wave energy converter (WEC). We demonstrate how simulation tools may be used to establish short-term relationships between any performance parameter of the WEC device and wave height in individual sea states. We demonstrate this methodology for two sea states. Efficient structural reliability methods, validated using more expensive Monte Carlo sampling, allow the estimation of uncertainty in performance of the device. Such methods, when combined with metocean data quantifying the likelihood of different sea states, can be useful in long-term studies and in reliability-based design.
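
    A crude stand-in for the short-term uncertainty estimate described above is a Monte Carlo calculation of the probability that a performance parameter exceeds a threshold within one sea state. The sketch below assumes a lognormal short-term response model with invented parameters; it illustrates the sampling approach the structural reliability methods are validated against, not the RM3 model itself.

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical short-term model: within one sea state, the peak PTO force of the
# WEC (in MN) is lognormally distributed. Every parameter below is illustrative.
n = 200_000
peak_force = rng.lognormal(mean=np.log(1.2), sigma=0.35, size=n)

threshold = 2.5                                    # MN, hypothetical design value
p_exceed = np.mean(peak_force > threshold)         # Monte Carlo exceedance probability
std_err = np.sqrt(p_exceed * (1 - p_exceed) / n)   # binomial sampling uncertainty

print(f"P(peak force > {threshold} MN) = {p_exceed:.4%} +/- {std_err:.4%}")
```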

  11. Use of Multiple Competitors for Quantification of Human Immunodeficiency Virus Type 1 RNA in Plasma

    PubMed Central

    Vener, Tanya; Nygren, Malin; Andersson, AnnaLena; Uhlén, Mathias; Albert, Jan; Lundeberg, Joakim

    1998-01-01

    Quantification of human immunodeficiency virus type 1 (HIV-1) RNA in plasma has rapidly become an important tool in basic HIV research and in the clinical care of infected individuals. Here, a quantitative HIV assay based on competitive reverse transcription-PCR with multiple competitors was developed. Four RNA competitors containing the same PCR primer binding sequences as the viral HIV-1 RNA target were constructed. One of the PCR primers was fluorescently labeled, which facilitated discrimination between the viral RNA and competitor amplicons by fragment analysis with conventional automated sequencers. The coamplification of known amounts of the RNA competitors provided the means to establish internal calibration curves for the individual reactions, resulting in the exclusion of tube-to-tube variations. Calibration curves were created from the peak areas, which were proportional to the starting amount of each competitor. The fluorescence detection format was expanded to provide a dynamic range of more than 5 log units. This quantitative assay allowed for reproducible analysis of samples containing as few as 40 viral copies of HIV-1 RNA per reaction. The within- and between-run coefficients of variation were <24% (range, 10 to 24) and <36% (range, 27 to 36), respectively. The high reproducibility (standard deviation, <0.13 log) of the overall procedure for quantification of HIV-1 RNA in plasma, including sample preparation, amplification, and detection variations, allowed reliable detection of a 0.5-log change in RNA viral load. The assay could be a useful tool for monitoring HIV-1 disease progression and antiviral treatment and can easily be adapted to the quantification of other pathogens. PMID:9650926

  12. Quantification of Endospore-Forming Firmicutes by Quantitative PCR with the Functional Gene spo0A

    PubMed Central

    Bueche, Matthieu; Wunderlin, Tina; Roussel-Delif, Ludovic; Junier, Thomas; Sauvain, Loic; Jeanneret, Nicole

    2013-01-01

    Bacterial endospores are highly specialized cellular forms that allow endospore-forming Firmicutes (EFF) to tolerate harsh environmental conditions. EFF are considered ubiquitous in natural environments, in particular those subjected to stress conditions. In addition to natural habitats, EFF are often the cause of contamination problems in anthropogenic environments, such as industrial production plants or hospitals. It is therefore desirable to assess their prevalence in environmental and industrial fields. To this end, a high-sensitivity detection method is still needed. The aim of this study was to develop and evaluate an approach based on quantitative PCR (qPCR). For this, the suitability of functional genes specific for and common to all EFF was evaluated. Seven genes were considered, but only spo0A was retained to identify conserved regions for qPCR primer design. An approach based on multivariate analysis was developed for primer design. Two primer sets were obtained and evaluated with 16 pure cultures, including representatives of the genera Bacillus, Paenibacillus, Brevibacillus, Geobacillus, Alicyclobacillus, Sulfobacillus, Clostridium, and Desulfotomaculum, as well as with environmental samples. The primer sets developed gave a reliable quantification when tested on laboratory strains, with the exception of Sulfobacillus and Desulfotomaculum. A test using sediment samples with a diverse EFF community also gave a reliable quantification compared to 16S rRNA gene pyrosequencing. A detection limit of about 10(4) cells (or spores) per gram of initial material was calculated, indicating that this method has promising potential for the detection of EFF over a wide range of applications. PMID:23811505

  13. Global Relative Quantification with Liquid Chromatography–Matrix-assisted Laser Desorption Ionization Time-of-flight (LC-MALDI-TOF)—Cross–validation with LTQ-Orbitrap Proves Reliability and Reveals Complementary Ionization Preferences*

    PubMed Central

    Hessling, Bernd; Büttner, Knut; Hecker, Michael; Becher, Dörte

    2013-01-01

    Quantitative LC-MALDI is an underrepresented method, especially in large-scale experiments. The additional fractionation step that is needed for most MALDI-TOF-TOF instruments, the comparatively long analysis time, and the very limited number of established software tools for the data analysis render LC-MALDI a niche application for large quantitative analyses beside the widespread LC–electrospray ionization workflows. Here, we used LC-MALDI in a relative quantification analysis of Staphylococcus aureus for the first time on a proteome-wide scale. Samples were analyzed in parallel with an LTQ-Orbitrap, which allowed cross-validation with a well-established workflow. With nearly 850 proteins identified in the cytosolic fraction and quantitative data for more than 550 proteins obtained with the MASCOT Distiller software, we were able to prove that LC-MALDI is able to process highly complex samples. The good correlation of quantities determined via this method and the LTQ-Orbitrap workflow confirmed the high reliability of our LC-MALDI approach for global quantification analysis. Because the existing literature reports differences for MALDI and electrospray ionization preferences and the respective experimental work was limited by technical or methodological constraints, we systematically compared biochemical attributes of peptides identified with either instrument. This genome-wide, comprehensive study revealed biases toward certain peptide properties for both MALDI-TOF-TOF- and LTQ-Orbitrap-based approaches. These biases are based on almost 13,000 peptides and result in a general complementarity of the two approaches that should be exploited in future experiments. PMID:23788530

  14. Global relative quantification with liquid chromatography-matrix-assisted laser desorption ionization time-of-flight (LC-MALDI-TOF)--cross-validation with LTQ-Orbitrap proves reliability and reveals complementary ionization preferences.

    PubMed

    Hessling, Bernd; Büttner, Knut; Hecker, Michael; Becher, Dörte

    2013-10-01

    Quantitative LC-MALDI is an underrepresented method, especially in large-scale experiments. The additional fractionation step that is needed for most MALDI-TOF-TOF instruments, the comparatively long analysis time, and the very limited number of established software tools for the data analysis render LC-MALDI a niche application for large quantitative analyses beside the widespread LC-electrospray ionization workflows. Here, we used LC-MALDI in a relative quantification analysis of Staphylococcus aureus for the first time on a proteome-wide scale. Samples were analyzed in parallel with an LTQ-Orbitrap, which allowed cross-validation with a well-established workflow. With nearly 850 proteins identified in the cytosolic fraction and quantitative data for more than 550 proteins obtained with the MASCOT Distiller software, we were able to prove that LC-MALDI is able to process highly complex samples. The good correlation of quantities determined via this method and the LTQ-Orbitrap workflow confirmed the high reliability of our LC-MALDI approach for global quantification analysis. Because the existing literature reports differences for MALDI and electrospray ionization preferences and the respective experimental work was limited by technical or methodological constraints, we systematically compared biochemical attributes of peptides identified with either instrument. This genome-wide, comprehensive study revealed biases toward certain peptide properties for both MALDI-TOF-TOF- and LTQ-Orbitrap-based approaches. These biases are based on almost 13,000 peptides and result in a general complementarity of the two approaches that should be exploited in future experiments.

  15. Separation and quantification of 15 carotenoids by reversed phase high performance liquid chromatography coupled to diode array detection with isosbestic wavelength approach.

    PubMed

    Mitrowska, Kamila; Vincent, Ursula; von Holst, Christoph

    2012-04-13

    The manuscript presents the development of a new reversed-phase high performance liquid chromatography (RP-HPLC) method with photodiode array detection allowing the separation and quantification of 15 carotenoids (adonirubin, adonixanthin, astaxanthin, astaxanthin dimethyl disuccinate, asteroidenone, beta-apo-8'-carotenal, beta-apo-8'-carotenoic acid ethyl ester, beta-carotene, canthaxanthin, capsanthin, citranaxanthin, echinenone, lutein, lycopene, and zeaxanthin), 10 of which are feed additives authorised within the European Union. The developed method allows for the reliable determination of the total carotenoid content in one run using the corresponding E-isomer as the calibration standard while taking into account the E/Z-isomer composition. This is a key criterion for the application of the method, since for most of the analytes included in this study analytical standards are only available for the E-isomers. This goal was achieved by applying the isosbestic concept in order to identify specific wavelengths at which the absorption coefficients are identical for all stereoisomers concerned. The second target referred to the optimisation of the LC conditions. By means of an experimental design, an optimised RP-HPLC method was developed allowing for a sufficient chromatographic separation of all carotenoids. The selected method uses a Suplex pKb-100 HPLC column and applies a gradient with a mixture of acetonitrile, tert-butyl methyl ether and water as mobile phases. The limits of detection and limits of quantification ranged from 0.06 mg L(-1) to 0.14 mg L(-1) and from 0.20 mg L(-1) to 0.48 mg L(-1), respectively. Copyright © 2012 Elsevier B.V. All rights reserved.
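
    Limits of detection and quantification of the kind quoted above are conventionally derived from the calibration function, e.g. as 3.3 and 10 times the residual standard deviation divided by the slope. The sketch below applies that generic (ICH-style) rule to hypothetical calibration data; it does not reproduce the validation data of the study.

```python
import numpy as np

def lod_loq(concentrations, responses):
    """ICH-style limits from a linear calibration: LOD = 3.3*s/slope and
    LOQ = 10*s/slope, with s the residual standard deviation of the fit."""
    x = np.asarray(concentrations, float)
    y = np.asarray(responses, float)
    slope, intercept = np.polyfit(x, y, 1)
    residuals = y - (slope * x + intercept)
    s = np.std(residuals, ddof=2)          # two fitted parameters
    return 3.3 * s / slope, 10 * s / slope

# Hypothetical carotenoid calibration (mg/L vs. detector peak area)
lod, loq = lod_loq([0.1, 0.25, 0.5, 1.0, 2.0, 4.0],
                   [1.9, 5.2, 10.4, 20.6, 41.5, 82.3])
print(f"LOD = {lod:.3f} mg/L, LOQ = {loq:.3f} mg/L")
```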

  16. Confidence in outcome estimates from systematic reviews used in informed consent.

    PubMed

    Fritz, Robert; Bauer, Janet G; Spackman, Sue S; Bains, Amanjyot K; Jetton-Rangel, Jeanette

    2016-12-01

    Evidence-based dentistry now guides informed consent, in which clinicians are obliged to provide patients with the most current, best evidence, or best estimates of outcomes of regimens, therapies, treatments, procedures, materials, and equipment or devices when developing personal oral health care treatment plans. Yet clinicians require that the estimates provided by systematic reviews be verified as to their validity and reliability, and contextualized as to performance competency, so that they may have confidence in explaining outcomes to patients in clinical practice. The purpose of this paper was to describe types of informed estimates from which clinicians may have confidence in their capacity to assist patients in competent decision-making, one of the most important concepts of informed consent. Using systematic review methodology, researchers provide clinicians with valid best estimates of outcomes regarding a subject of interest from best evidence. Best evidence is verified through critical appraisals using acceptable sampling methodology, either by scoring instruments (Timmer analysis) or by checklist (GRADE), a Cochrane Collaboration standard that allows transparency in open reviews. These valid best estimates are then tested for reliability using large databases. Finally, valid and reliable best estimates are assessed for meaning using quantification of margins and uncertainties. Through manufacturer and researcher specifications, quantification of margins and uncertainties develops a performance competency continuum by which valid, reliable best estimates may be contextualized for their performance competency: at a lowest margin performance competency (structural failure), high margin performance competency (estimated true value of success), or clinically determined critical values (clinical failure). Informed consent may be achieved when clinicians are confident of their ability to provide useful and accurate best estimates of outcomes regarding regimens, therapies, treatments, and equipment or devices to patients in their clinical practices and when developing personal oral health care treatment plans. Copyright © 2016 Elsevier Inc. All rights reserved.

  17. Automation of a Nile red staining assay enables high throughput quantification of microalgal lipid production.

    PubMed

    Morschett, Holger; Wiechert, Wolfgang; Oldiges, Marco

    2016-02-09

    Within the context of microalgal lipid production for biofuels and bulk chemical applications, specialized higher throughput devices for small scale parallelized cultivation are expected to boost the time efficiency of phototrophic bioprocess development. However, the increasing number of possible experiments is directly coupled to the demand for lipid quantification protocols that enable reliably measuring large sets of samples within a short time and that can deal with the reduced sample volume typically generated at screening scale. To meet these demands, a dye-based assay was established using a liquid handling robot to provide reproducible high throughput quantification of lipids with minimized hands-on time. Lipid production was monitored using the fluorescent dye Nile red with dimethyl sulfoxide as solvent facilitating dye permeation. The staining kinetics of cells at different concentrations and physiological states were investigated to successfully down-scale the assay to 96-well microtiter plates. Gravimetric calibration against a well-established extractive protocol enabled absolute quantification of intracellular lipids, improving precision from ±8 to ±2 % on average. Implementation into an automated liquid handling platform allows for measuring up to 48 samples within 6.5 h, reducing hands-on time to a third compared to manual operation. Moreover, it was shown that automation enhances accuracy and precision compared to manual preparation. It was revealed that established protocols relying on optical density or cell number for biomass adjustment prior to staining may suffer from errors due to significant changes of the cells' optical and physiological properties during cultivation. Alternatively, the biovolume was used as a measure for biomass concentration so that errors from morphological changes can be excluded. The newly established assay proved to be applicable for absolute quantification of algal lipids, avoiding limitations of currently established protocols, namely biomass adjustment and limited throughput. Automation was shown to improve data reliability as well as experimental throughput, simultaneously minimizing the needed hands-on time to a third. Thereby, the presented protocol meets the demands for the analysis of samples generated by the upcoming generation of devices for higher throughput phototrophic cultivation and thereby contributes to boosting the time efficiency of setting up algae lipid production processes.

  18. Assessing the impact of land use change on hydrology by ensemble modeling (LUCHEM) III: Scenario analysis

    USGS Publications Warehouse

    Huisman, J.A.; Breuer, L.; Bormann, H.; Bronstert, A.; Croke, B.F.W.; Frede, H.-G.; Graff, T.; Hubrechts, L.; Jakeman, A.J.; Kite, G.; Lanini, J.; Leavesley, G.; Lettenmaier, D.P.; Lindstrom, G.; Seibert, J.; Sivapalan, M.; Viney, N.R.; Willems, P.

    2009-01-01

    An ensemble of 10 hydrological models was applied to the same set of land use change scenarios. There was general agreement about the direction of changes in the mean annual discharge and 90% discharge percentile predicted by the ensemble members, although a considerable range in the magnitude of predictions for the scenarios and catchments under consideration was obvious. Differences in the magnitude of the increase were attributed to the different mean annual actual evapotranspiration rates for each land use type. The ensemble of model runs was further analyzed with deterministic and probabilistic ensemble methods. The deterministic ensemble method based on a trimmed mean resulted in a single somewhat more reliable scenario prediction. The probabilistic reliability ensemble averaging (REA) method allowed a quantification of the model structure uncertainty in the scenario predictions. It was concluded that the use of a model ensemble has greatly increased our confidence in the reliability of the model predictions. © 2008 Elsevier Ltd.
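
    The deterministic ensemble step described above reduces the member predictions to a trimmed mean, which damps the influence of outlying model structures. A minimal sketch with hypothetical member predictions is shown below; the probabilistic REA weighting is not reproduced here.

```python
import numpy as np
from scipy.stats import trim_mean

# Hypothetical scenario predictions (change in mean annual discharge, %) from
# ten hydrological models for one land use scenario in one catchment.
member_predictions = np.array([4.1, 5.0, 4.7, 6.2, 3.9, 5.4, 4.9, 9.8, 4.4, 5.1])

# Deterministic ensemble: trim 10% of the members from each tail before
# averaging, which damps the influence of outlying model structures.
ensemble_estimate = trim_mean(member_predictions, proportiontocut=0.1)
spread = member_predictions.max() - member_predictions.min()

print(f"trimmed-mean change: {ensemble_estimate:.2f}% (raw spread: {spread:.1f}%)")
```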

  19. A liquid chromatography-tandem mass spectrometry-based targeted proteomics assay for monitoring P-glycoprotein levels in human breast tissue.

    PubMed

    Yang, Ting; Chen, Fei; Xu, Feifei; Wang, Fengliang; Xu, Qingqing; Chen, Yun

    2014-09-25

    P-glycoprotein (P-gp) can efflux drugs from cancer cells, and its overexpression is commonly associated with multi-drug resistance (MDR). Thus, the accurate quantification of P-gp would help predict the response to chemotherapy and the prognosis of breast cancer patients. An advanced liquid chromatography-tandem mass spectrometry (LC/MS/MS)-based targeted proteomics assay was developed and validated for monitoring P-gp levels in breast tissue. The tryptic peptide IIDNKPSIDSYSK (residues 368-380) was selected as a surrogate analyte for quantification, and immuno-depleted tissue extract was used as a surrogate matrix. Matched pairs of breast tissue samples from 60 patients who were suspected to have drug resistance were subjected to analysis. The levels of P-gp were quantified. Using data from normal tissue, we suggested a P-gp reference interval. The experimental values of tumor tissue samples were compared with those obtained from Western blotting and immunohistochemistry (IHC). The result indicated that the targeted proteomics approach was comparable to IHC but provided a lower limit of quantification (LOQ) and could afford more reliable results at low concentrations than the other two methods. LC/MS/MS-based targeted proteomics may allow the quantification of P-gp in breast tissue in a more accurate manner. Copyright © 2014 Elsevier B.V. All rights reserved.

  20. Quantification of chitinase and thaumatin-like proteins in grape juices and wines.

    PubMed

    Le Bourse, D; Conreux, A; Villaume, S; Lameiras, P; Nuzillard, J-M; Jeandet, P

    2011-09-01

    Chitinases and thaumatin-like proteins are important grape proteins as they have a great influence on wine quality. The quantification of these proteins in grape juices and wines, along with their purification, is therefore crucial to study their intrinsic characteristics and the exact role they play in wines. The main isoforms of these two proteins from Chardonnay grape juice were thus purified by liquid chromatography. Two fast protein liquid chromatography (FPLC) steps allowed the fractionation and purification of the juice proteins, using cation exchange and hydrophobic interaction media. A further high-performance liquid chromatography (HPLC) step was used to achieve higher purity levels. Fraction assessment was achieved by mass spectrometry. Fraction purity was determined by HPLC to detect the presence of protein contaminants, and by nuclear magnetic resonance (NMR) spectroscopy to detect the presence of organic contaminants. Once pure fractions of lyophilized chitinase and thaumatin-like protein were obtained, ultra-HPLC (UHPLC) and enzyme-linked immunosorbent assay (ELISA) calibration curves were constructed. The quantification of these proteins in different grape juice and wine samples was thus achieved for the first time with both techniques through comparison with the purified protein calibration curve. UHPLC and ELISA showed very consistent results (less than 16% deviation for both proteins) and either could be considered to provide an accurate and reliable quantification of proteins in the oenology field.

  1. Evaluation of the reliability of maize reference assays for GMO quantification.

    PubMed

    Papazova, Nina; Zhang, David; Gruden, Kristina; Vojvoda, Jana; Yang, Litao; Buh Gasparic, Meti; Blejec, Andrej; Fouilloux, Stephane; De Loose, Marc; Taverniers, Isabel

    2010-03-01

    A reliable PCR reference assay for relative genetically modified organism (GMO) quantification must be specific for the target taxon and amplify uniformly across the commercialised varieties within the considered taxon. Different reference assays for maize (Zea mays L.) are used in official methods for GMO quantification. In this study, we evaluated the reliability of eight existing maize reference assays, four of which are used in combination with an event-specific polymerase chain reaction (PCR) assay validated and published by the Community Reference Laboratory (CRL). We analysed the nucleotide sequence variation in the target genomic regions in a broad range of transgenic and conventional varieties and lines: MON 810 varieties cultivated in Spain and conventional varieties from various geographical origins and breeding history. In addition, the reliability of the assays was evaluated based on their PCR amplification performance. A single base pair substitution, corresponding to a single nucleotide polymorphism (SNP) reported in an earlier study, was observed in the forward primer of one of the studied alcohol dehydrogenase 1 (Adh1) (70) assays in a large number of varieties. The SNP presence is consistent with the poor PCR performance observed for this assay across the tested varieties. The obtained data show that the Adh1 (70) assay used in the official CRL NK603 assay is unreliable. Based on our results from both the nucleotide stability study and the PCR performance test, we can conclude that the Adh1 (136) reference assay (T25 and Bt11 assays) as well as the tested high mobility group protein gene assay, which also form parts of CRL methods for quantification, are highly reliable. Despite the observed uniformity in the nucleotide sequence of the invertase gene assay, the PCR performance test reveals that this target sequence might occur in more than one copy. Finally, although currently not forming a part of official quantification methods, zein and SSIIb assays are found to be highly reliable in terms of nucleotide stability and PCR performance and are proposed as good alternative targets for a reference assay for maize.

  2. Development of a real-time PCR method for the differential detection and quantification of four solanaceae in GMO analysis: potato (Solanum tuberosum), tomato (Solanum lycopersicum), eggplant (Solanum melongena), and pepper (Capsicum annuum).

    PubMed

    Chaouachi, Maher; El Malki, Redouane; Berard, Aurélie; Romaniuk, Marcel; Laval, Valérie; Brunel, Dominique; Bertheau, Yves

    2008-03-26

    The labeling of products containing genetically modified organisms (GMO) is linked to their quantification since a threshold for the presence of fortuitous GMOs in food has been established. This threshold is calculated from a combination of two absolute quantification values: one for the specific GMO target and the second for an endogenous reference gene specific to the taxon. Thus, the development of reliable methods to quantify GMOs using endogenous reference genes in complex matrixes such as food and feed is needed. Plant identification can be difficult in the case of closely related taxa, which moreover are subject to introgression events. Based on the homology of beta-fructosidase sequences obtained from public databases, two couples of consensus primers were designed for the detection, quantification, and differentiation of four Solanaceae: potato (Solanum tuberosum), tomato (Solanum lycopersicum), pepper (Capsicum annuum), and eggplant (Solanum melongena). Sequence variability was studied first using lines and cultivars (intraspecies sequence variability), then using taxa involved in gene introgressions, and finally, using taxonomically close taxa (interspecies sequence variability). This study allowed us to design four highly specific TaqMan-MGB probes. A duplex real time PCR assay was developed for simultaneous quantification of tomato and potato. For eggplant and pepper, only simplex real time PCR tests were developed. The results demonstrated the high specificity and sensitivity of the assays. We therefore conclude that beta-fructosidase can be used as an endogenous reference gene for GMO analysis.

  3. Detection and Quantification of Human Fecal Pollution with Real-Time PCR

    EPA Science Inventory

    ABSTRACT Assessment of health risk and fecal bacteria loads associated with human fecal pollution requires a reliable host-specific genetic marker and a rapid quantification method. We report the development of quantitative PCR assays for enumeration of two recently described ...

  4. Critical points of DNA quantification by real-time PCR – effects of DNA extraction method and sample matrix on quantification of genetically modified organisms

    PubMed Central

    Cankar, Katarina; Štebih, Dejan; Dreo, Tanja; Žel, Jana; Gruden, Kristina

    2006-01-01

    Background Real-time PCR is the technique of choice for nucleic acid quantification. In the field of detection of genetically modified organisms (GMOs), quantification of biotech products may be required to fulfil legislative requirements. However, successful quantification depends crucially on the quality of the sample DNA analyzed. Methods for GMO detection are generally validated on certified reference materials that are in the form of powdered grain material, while detection in routine laboratories must be performed on a wide variety of sample matrixes. Due to food processing, the DNA in sample matrixes can be present in low amounts and also degraded. In addition, molecules of plant origin or from other sources that affect PCR amplification of samples will influence the reliability of the quantification. Further, the wide variety of sample matrixes presents a challenge for detection laboratories. The extraction method must ensure high yield and quality of the DNA obtained and must be carefully selected, since even components of DNA extraction solutions can influence PCR reactions. GMO quantification is based on a standard curve, therefore similarity of PCR efficiency for the sample and standard reference material is a prerequisite for exact quantification. Little information on the performance of real-time PCR on samples of different matrixes is available. Results Five commonly used DNA extraction techniques were compared and their suitability for quantitative analysis was assessed. The effect of sample matrix on nucleic acid quantification was assessed by comparing 4 maize and 4 soybean matrixes. In addition, 205 maize and soybean samples from routine analysis were analyzed for PCR efficiency to assess variability of PCR performance within each sample matrix. Together with the amount of DNA needed for reliable quantification, PCR efficiency is the crucial parameter determining the reliability of quantitative results, therefore it was chosen as the primary criterion by which to evaluate the quality and performance on different matrixes and extraction techniques. The effect of PCR efficiency on the resulting GMO content is demonstrated. Conclusion The crucial influence of extraction technique and sample matrix properties on the results of GMO quantification is demonstrated. Appropriate extraction techniques for each matrix need to be determined to achieve accurate DNA quantification. Nevertheless, as it is shown that in the area of food and feed testing a matrix with certain specificities is impossible to define, strict quality controls need to be introduced to monitor PCR. The results of our study are also applicable to other fields of quantitative testing by real-time PCR. PMID:16907967
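
    Two quantities recur throughout such real-time PCR work: the amplification efficiency derived from a standard-curve slope, and the GMO content expressed as the ratio of event-specific to taxon-reference copy numbers. The sketch below shows both calculations on hypothetical numbers; it is a generic illustration, not the study's analysis.

```python
import numpy as np

def pcr_efficiency(log10_copies, cq_values):
    """Amplification efficiency from a dilution series: E = 10**(-1/slope) - 1,
    where slope is from the Cq vs. log10(copies) regression (a slope of about
    -3.32 corresponds to 100% efficiency)."""
    slope, _ = np.polyfit(np.asarray(log10_copies, float),
                          np.asarray(cq_values, float), 1)
    return 10 ** (-1.0 / slope) - 1.0

# Hypothetical 10-fold dilution series of a reference standard
print(f"E = {pcr_efficiency([5, 4, 3, 2, 1], [18.1, 21.5, 24.9, 28.3, 31.7]):.1%}")

# GMO content is then reported as event-specific copies relative to the
# taxon-specific reference-gene copies, each read off its own standard curve.
event_copies, reference_copies = 1250.0, 48000.0
print(f"GMO content = {100 * event_copies / reference_copies:.2f}%")
```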

  5. Quantification of the Effects of Salt Stress and Physiological State on Thermotolerance of Bacillus cereus ATCC 10987 and ATCC 14579

    PubMed Central

    den Besten, Heidy M. W.; Mataragas, Marios; Moezelaar, Roy; Abee, Tjakko; Zwietering, Marcel H.

    2006-01-01

    The food-borne pathogen Bacillus cereus can acquire enhanced thermal resistance through multiple mechanisms. Two Bacillus cereus strains, ATCC 10987 and ATCC 14579, were used to quantify the effects of salt stress and physiological state on thermotolerance. Cultures were exposed to increasing concentrations of sodium chloride for 30 min, after which their thermotolerance was assessed at 50°C. Linear and nonlinear microbial survival models, which cover a wide range of known inactivation curvatures for vegetative cells, were fitted to the inactivation data and evaluated. Based on statistical indices and model characteristics, biphasic models with a shoulder were selected and used for quantification. Each model parameter reflected a survival characteristic, and both models were flexible, allowing a reduction of parameters when certain phenomena were not present. Both strains showed enhanced thermotolerance after preexposure to (non)lethal salt stress conditions in the exponential phase. The maximum adaptive stress response due to salt preexposure demonstrated for exponential-phase cells was comparable to the effect of physiological state on thermotolerance in both strains. However, the adaptive salt stress response was less pronounced for transition- and stationary-phase cells. The distinct tailing of strain ATCC 10987 was attributed to the presence of a subpopulation of spores. The existence of a stable heat-resistant subpopulation of vegetative cells could not be demonstrated for either of the strains. Quantification of the adaptive stress response might be instrumental in understanding adaptation mechanisms and will allow the food industry to develop more accurate and reliable stress-integrated predictive modeling to optimize minimal processing conditions. PMID:16957208
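
    One common way to parameterize a biphasic survival curve with a shoulder, of the general kind selected above, is a sensitive and a resistant subpopulation that both start to decline only after a shoulder period. The sketch below fits such a form to hypothetical inactivation data with scipy; the exact model equations of the study may differ.

```python
import numpy as np
from scipy.optimize import curve_fit

def biphasic_shoulder(t, log10_n0, f, k1, k2, shoulder):
    """log10 survivors: a sensitive fraction f and a resistant fraction 1-f,
    inactivated at first-order rates k1 > k2 only after the shoulder period."""
    te = np.clip(t - shoulder, 0.0, None)   # no inactivation before the shoulder
    return log10_n0 + np.log10(f * np.exp(-k1 * te) + (1.0 - f) * np.exp(-k2 * te))

# Hypothetical survival data at 50 degrees C (time in min, log10 CFU/mL)
t = np.array([0, 2, 4, 6, 8, 10, 15, 20, 30], float)
log_n = np.array([7.0, 7.0, 6.6, 5.8, 5.0, 4.3, 3.4, 3.0, 2.6])

popt, _ = curve_fit(biphasic_shoulder, t, log_n,
                    p0=[7.0, 0.999, 1.0, 0.05, 2.0],
                    bounds=([0, 0, 0, 0, 0], [10, 1, 10, 10, 10]))
print(dict(zip(["log10_N0", "f", "k1", "k2", "shoulder"], np.round(popt, 3))))
```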

  6. Stirling Convertor Fasteners Reliability Quantification

    NASA Technical Reports Server (NTRS)

    Shah, Ashwin R.; Korovaichuk, Igor; Kovacevich, Tiodor; Schreiber, Jeffrey G.

    2006-01-01

    Onboard Radioisotope Power Systems (RPS) being developed for NASA's deep-space science and exploration missions require reliable operation for up to 14 years and beyond. Stirling power conversion is a candidate for use in an RPS because it offers a multifold increase in the conversion efficiency of heat to electric power and a reduced inventory of radioactive material. Structural fasteners are responsible for maintaining the structural integrity of the Stirling power convertor, which is critical to ensure reliable performance during the entire mission. The design of fasteners involves variables related to fabrication, manufacturing, the behavior of the fastener and joining-part materials, the structural geometry of the joining components, the size and spacing of fasteners, mission loads, boundary conditions, etc. These variables have inherent uncertainties, which need to be accounted for in the reliability assessment. This paper describes these uncertainties along with a methodology to quantify the reliability, and provides results of the analysis in terms of quantified reliability and sensitivity of Stirling power conversion reliability to the design variables. Quantification of the reliability includes both structural and functional aspects of the joining components. Based on the results, the paper also describes guidelines to improve the reliability and verification testing.

  7. Quantification of in vivo short echo-time proton magnetic resonance spectra at 14.1 T using two different approaches of modelling the macromolecule spectrum

    NASA Astrophysics Data System (ADS)

    Cudalbu, C.; Mlynárik, V.; Xin, L.; Gruetter, Rolf

    2009-10-01

    Reliable quantification of the macromolecule signals in short echo-time 1H MRS spectra is particularly important at high magnetic fields for an accurate quantification of metabolite concentrations (the neurochemical profile) due to effectively increased spectral resolution of the macromolecule components. The purpose of the present study was to assess two approaches of quantification, which take the contribution of macromolecules into account in the quantification step. 1H spectra were acquired on a 14.1 T/26 cm horizontal scanner on five rats using the ultra-short echo-time SPECIAL (spin echo full intensity acquired localization) spectroscopy sequence. Metabolite concentrations were estimated using LCModel, combined with a simulated basis set of metabolites using published spectral parameters and either the spectrum of macromolecules measured in vivo, using an inversion recovery technique, or baseline simulated by the built-in spline function. The fitted spline function resulted in a smooth approximation of the in vivo macromolecules, but in accordance with previous studies using Subtract-QUEST could not reproduce completely all features of the in vivo spectrum of macromolecules at 14.1 T. As a consequence, the measured macromolecular 'baseline' led to a more accurate and reliable quantification at higher field strengths.

  8. Monitoring the Wobbe Index of Natural Gas Using Fiber-Enhanced Raman Spectroscopy.

    PubMed

    Sandfort, Vincenz; Trabold, Barbara M; Abdolvand, Amir; Bolwien, Carsten; Russell, Philip St. J; Wöllenstein, Jürgen; Palzer, Stefan

    2017-11-24

    The fast and reliable analysis of the natural gas composition requires the simultaneous quantification of numerous gaseous components. To this end, fiber-enhanced Raman spectroscopy is a powerful tool to detect most components in a single measurement using a single laser source. However, practical issues such as detection limit, gas exchange time and background Raman signals from the fiber material still pose obstacles to utilizing the scheme in real-world settings. This paper compares the performance of two types of hollow-core photonic crystal fiber (PCF), namely photonic bandgap PCF and kagomé-style PCF, and assesses their potential for online determination of the Wobbe index. In contrast to bandgap PCF, kagomé-PCF allows for reliable detection of Raman-scattered photons even below 1200 cm(-1), which in turn enables fast and comprehensive assessment of the natural gas quality of arbitrary mixtures.
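
    Once the composition is known from the Raman measurement, the Wobbe index itself is the heating value divided by the square root of the gas's relative density. The sketch below computes it from mole fractions using approximate, illustrative pure-component values; neither the property table nor the composition comes from the paper.

```python
import math

# Approximate pure-component properties (illustrative values, not reference data):
# gross heating value in MJ/m3 at standard conditions and molar mass in g/mol.
COMPONENTS = {
    "CH4":  {"hhv": 39.8, "molar_mass": 16.04},
    "C2H6": {"hhv": 70.3, "molar_mass": 30.07},
    "C3H8": {"hhv": 99.0, "molar_mass": 44.10},
    "N2":   {"hhv": 0.0,  "molar_mass": 28.01},
    "CO2":  {"hhv": 0.0,  "molar_mass": 44.01},
}
MOLAR_MASS_AIR = 28.96  # g/mol

def wobbe_index(mole_fractions):
    """Wobbe index = heating value / sqrt(relative density) for a gas mixture."""
    hhv = sum(x * COMPONENTS[c]["hhv"] for c, x in mole_fractions.items())
    molar_mass = sum(x * COMPONENTS[c]["molar_mass"] for c, x in mole_fractions.items())
    return hhv / math.sqrt(molar_mass / MOLAR_MASS_AIR)

# Hypothetical composition as it might come out of the Raman analysis
mix = {"CH4": 0.92, "C2H6": 0.04, "C3H8": 0.01, "N2": 0.02, "CO2": 0.01}
print(f"Wobbe index: {wobbe_index(mix):.1f} MJ/m3")
```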

  9. Monitoring the Wobbe Index of Natural Gas Using Fiber-Enhanced Raman Spectroscopy

    PubMed Central

    Sandfort, Vincenz; Trabold, Barbara M.; Abdolvand, Amir; Bolwien, Carsten; Russell, Philip St. J.; Wöllenstein, Jürgen

    2017-01-01

    The fast and reliable analysis of the natural gas composition requires the simultaneous quantification of numerous gaseous components. To this end, fiber-enhanced Raman spectroscopy is a powerful tool to detect most components in a single measurement using a single laser source. However, practical issues such as detection limit, gas exchange time and background Raman signals from the fiber material still pose obstacles to utilizing the scheme in real-world settings. This paper compares the performance of two types of hollow-core photonic crystal fiber (PCF), namely photonic bandgap PCF and kagomé-style PCF, and assesses their potential for online determination of the Wobbe index. In contrast to bandgap PCF, kagomé-PCF allows for reliable detection of Raman-scattered photons even below 1200 cm−1, which in turn enables fast and comprehensive assessment of the natural gas quality of arbitrary mixtures. PMID:29186768

  10. DETECTION AND QUANTIFICATION OF COW FECAL POLLUTION WITH REAL-TIME PCR

    EPA Science Inventory

    Assessment of health risk and fecal bacteria loads associated with cow fecal pollution requires a reliable host-specific genetic marker and a rapid quantification method. We report the development of quantitative PCR assays for enumeration of two recently described cow-specific g...

  11. FineSplice, enhanced splice junction detection and quantification: a novel pipeline based on the assessment of diverse RNA-Seq alignment solutions.

    PubMed

    Gatto, Alberto; Torroja-Fungairiño, Carlos; Mazzarotto, Francesco; Cook, Stuart A; Barton, Paul J R; Sánchez-Cabo, Fátima; Lara-Pezzi, Enrique

    2014-04-01

    Alternative splicing is the main mechanism governing protein diversity. The recent developments in RNA-Seq technology have enabled the study of the global impact and regulation of this biological process. However, the lack of standardized protocols constitutes a major bottleneck in the analysis of alternative splicing. This is particularly important for the identification of exon-exon junctions, which is a critical step in any analysis workflow. Here we performed a systematic benchmarking of alignment tools to dissect the impact of design and method on the mapping, detection and quantification of splice junctions from multi-exon reads. Accordingly, we devised a novel pipeline based on TopHat2 combined with a splice junction detection algorithm, which we have named FineSplice. FineSplice allows effective elimination of spurious junction hits arising from artefactual alignments, achieving up to 99% precision in both real and simulated data sets and yielding superior F1 scores under most tested conditions. The proposed strategy conjugates an efficient mapping solution with a semi-supervised anomaly detection scheme to filter out false positives and allows reliable estimation of expressed junctions from the alignment output. Ultimately this provides more accurate information to identify meaningful splicing patterns. FineSplice is freely available at https://sourceforge.net/p/finesplice/.

  12. Droplet digital PCR technology promises new applications and research areas.

    PubMed

    Manoj, P

    2016-01-01

    Digital polymerase chain reaction (dPCR) is used to quantify nucleic acids; its applications include the detection and precise quantification of low-level pathogens and rare genetic sequences, quantification of copy number variants and rare mutations, and relative gene expression. The PCR is performed in a large number of reaction chambers or partitions, and the reaction is carried out in each partition individually. This separation allows a more reliable collection and sensitive measurement of nucleic acid. Results are calculated by counting the partitions containing amplified target sequence (positive droplets) and the partitions in which there is no amplification (negative droplets). The mean number of target sequences is calculated using a Poisson algorithm; the Poisson correction compensates for the presence of more than one copy of the target gene in a droplet. The method provides accurate and precise information that is highly reproducible and less susceptible to inhibitors than qPCR. It has been applied to studying variations in gene sequences, such as copy number variants and point mutations, to distinguishing differences in expression between nearly identical alleles, and to the assessment of clinically relevant genetic variations, and it is routinely used for clonal amplification of samples for NGS methods. dPCR enables more reliable prediction of tumor status and patient prognosis through absolute quantification using reference normalizations. Rare mitochondrial DNA deletions associated with a range of diseases and disorders as well as aging can also be accurately detected with droplet digital PCR.
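    The Poisson correction mentioned above has a compact closed form: with p the fraction of positive droplets, the mean number of copies per droplet is λ = -ln(1 - p), and dividing by the droplet volume gives the target concentration. A minimal sketch, assuming a typical droplet volume of about 0.85 nL (not a value from this article):

    ```python
    import math

    def ddpcr_concentration(positive, total, droplet_volume_nl=0.85):
        """Poisson-corrected target concentration in copies per microliter.

        The correction accounts for partitions that received more than one
        target copy: lambda = -ln(1 - p), with p the fraction of positive
        droplets. The ~0.85 nL droplet volume is a typical assumed figure.
        """
        p = positive / total
        lam = -math.log(1.0 - p)                 # mean copies per droplet
        return lam / (droplet_volume_nl * 1e-3)  # copies per microliter

    print(ddpcr_concentration(positive=4200, total=15000))
    ```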

  13. A liquid chromatography-tandem mass spectrometry assay for the detection and quantification of trehalose in biological samples.

    PubMed

    Kretschmer, Philip M; Bannister, Austin M; O Brien, Molly K; MacManus-Spencer, Laura A; Paulick, Margot G

    2016-10-15

    Trehalose is an important disaccharide that is used as a cellular protectant by many different organisms, helping these organisms better survive extreme conditions, such as dehydration, oxidative stress, and freezing temperatures. Methods to detect and accurately measure trehalose from different organisms will help us gain a better understanding of the mechanisms behind trehalose's ability to act as a cellular protectant. A liquid chromatography-tandem mass spectrometry (LC-MS/MS) assay using selected reaction monitoring mode for the detection and quantification of trehalose using maltose as an internal standard has been developed. This assay uses a commercially available LC column for trehalose separation and a standard triple quadrupole mass spectrometer, thus allowing many scientists to take advantage of this simple assay. The calibration curve from 3 to 100 μM trehalose was best fit by a single polynomial. This LC-MS/MS assay directly detects and accurately quantifies trehalose, with an instrument limit of detection (LOD) that is 2-1000 times more sensitive than the most commonly used assays for trehalose detection and quantification. Furthermore, this assay was used to detect and quantify endogenous trehalose produced by Escherichia coli (E. coli) cells, which were found to have an intracellular concentration of 8.5 ± 0.9 mM trehalose. This method thus shows promise for the reliable detection and quantification of trehalose from different biological sources. Copyright © 2016 Elsevier B.V. All rights reserved.

  14. Intracavity absorption with a continuous wave dye laser - Quantification for a narrowband absorber

    NASA Technical Reports Server (NTRS)

    Brobst, William D.; Allen, John E., Jr.

    1987-01-01

    An experimental investigation of the dependence of intracavity absorption on factors including transition strength, concentration, absorber path length, and pump power is presented for a CW dye laser with a narrow-band absorber (NO2). A Beer-Lambert type relationship is found over a small but useful range of these parameters. Quantitative measurement of intracavity absorption from the dye laser spectral profiles showed enhancements up to 12,000 (for pump powers near lasing threshold) when compared to extracavity measurements. The definition of an intracavity absorption coefficient allowed the determination of accurate transition strength ratios, demonstrating the reliability of the method.

  15. Quantification of major flavonoids in carnation tissues (Dianthus caryophyllus) as a tool for cultivar discrimination.

    PubMed

    Galeotti, Francesco; Barile, Elisa; Lanzotti, Virginia; Dolci, Marcello; Curir, Paolo

    2008-01-01

    One flavone-C-glycoside and two flavonol-O-glycosides were recognized and isolated as the main flavonoid components in nine different carnation cultivars, and their chemical structures have been determined by spectroscopic methods, including UV detection, MS and NMR. The distribution of these three compounds in flowers, leaves, stems, young sprouts, and roots of each cultivar was evaluated by a simple HPLC-UV method: the graphic representation of their content in the different tissues allows each of the carnation cultivars considered to be identified and characterized unambiguously. The presented method could be an easy, inexpensive and reliable tool for carnation cultivar discrimination.

  16. Quantification of Neural Ethanol and Acetaldehyde Using Headspace GC-MS

    PubMed Central

    Heit, Claire; Eriksson, Peter; Thompson, David C; Fritz, Kristofer S; Vasiliou, Vasilis

    2016-01-01

    BACKGROUND There is controversy regarding the active agent responsible for alcohol addiction. The theory that ethanol itself was the agent in alcohol drinking behavior was widely accepted until acetaldehyde was found in the brain. The importance of acetaldehyde formation in the brain is still subject to speculation due to the lack of a method to assay acetaldehyde levels directly and accurately. A highly sensitive GC-MS method to determine acetaldehyde concentration reliably is needed to address whether neural acetaldehyde is indeed responsible for increased alcohol consumption. METHODS A headspace gas chromatograph coupled to selected ion monitoring mass spectrometry was utilized to develop a quantitative assay for acetaldehyde and ethanol. Our GC-MS approach was carried out using a Bruker Scion 436-GC SQ MS. RESULTS Our approach yields limits of detection of acetaldehyde in the nanomolar range and limits of quantification in the low micromolar range. Our linear calibration includes 5 concentrations, with a least-squares regression coefficient greater than 0.99 for both acetaldehyde and ethanol. Tissue analyses using this method revealed the capacity to quantify ethanol and acetaldehyde in blood, brain, and liver tissue from mice. CONCLUSIONS By allowing quantification of very low concentrations, this method may be used to examine the formation of ethanol metabolites, specifically acetaldehyde, in murine brain tissue in alcohol research. PMID:27501276

  17. Molecular Approaches for High Throughput Detection and Quantification of Genetically Modified Crops: A Review

    PubMed Central

    Salisu, Ibrahim B.; Shahid, Ahmad A.; Yaqoob, Amina; Ali, Qurban; Bajwa, Kamran S.; Rao, Abdul Q.; Husnain, Tayyab

    2017-01-01

    As genetically modified crops gain attention globally, their proper approval and commercialization require accurate and reliable diagnostic methods for transgenic content. These diagnostic techniques fall into two major groups: identification of (1) transgenic DNA and (2) transgenic proteins in GMOs and their products. Conventional methods such as the polymerase chain reaction (PCR) and the enzyme-linked immunosorbent assay (ELISA) are routinely employed for DNA- and protein-based quantification, respectively. Although these techniques are convenient and productive, more advanced technologies that allow high-throughput detection and quantification of GM events are needed as increasingly complex GMOs are produced. Recent approaches such as microarrays, capillary gel electrophoresis, digital PCR and next-generation sequencing are therefore more promising due to their accuracy and precise detection of transgenic content. The present article is a brief comparative study of these detection techniques on the basis of their advent, feasibility, accuracy, and cost effectiveness. Detection of a specific event, contamination by different events, and determination of fusion as well as stacked-gene proteins remain critical issues for these emerging technologies to address in the future. PMID:29085378

  18. Novel method to detect microRNAs using chip-based QuantStudio 3D digital PCR.

    PubMed

    Conte, Davide; Verri, Carla; Borzi, Cristina; Suatoni, Paola; Pastorino, Ugo; Sozzi, Gabriella; Fortunato, Orazio

    2015-10-23

    Research efforts for the management of cancer, in particular for lung cancer, are directed to identify new strategies for its early detection. MicroRNAs (miRNAs) are a new promising class of circulating biomarkers for cancer detection, but lack of consensus on data normalization methods has affected the diagnostic potential of circulating miRNAs. There is a growing interest in techniques that allow an absolute quantification of miRNAs which could be useful for early diagnosis. Recently, digital PCR, mainly based on droplet generation, emerged as an affordable technology for precise and absolute quantification of nucleic acids. In this work, we describe a new approach for profiling circulating miRNAs in plasma samples using a chip-based platform, the QuantStudio 3D digital PCR. The proposed method was validated using synthetic oligonucleotides at serial dilutions in plasma samples of lung cancer patients and in lung tissues and cell lines. Given its reproducibility and reliability, our approach could be potentially applied for the identification and quantification of miRNAs in other biological samples such as circulating exosomes or protein complexes. As chip-digital PCR becomes more established, it would be a robust tool for quantitative assessment of miRNA copy number for diagnosis of lung cancer and other diseases.

  19. Simultaneous determination of carbohydrates and simmondsins in jojoba seed meal (Simmondsia chinensis) by gas chromatography.

    PubMed

    Lein, Sabine; Van Boven, Maurits; Holser, Ron; Decuypere, Eddy; Flo, Gerda; Lievens, Sylvia; Cokelaere, Marnix

    2002-11-22

    Separate methods for the analyses of soluble carbohydrates in different plants and simmondsins in jojoba seed meal are described. A reliable gas chromatographic procedure for the simultaneous quantification of D-pinitol, myo-inositol, sucrose, 5-alpha-D-galactopyranosyl-D-pinitol, 2-alpha-D-galactopyranosyl-D-pinitol, simmondsin, 4-demethylsimmondsin, 5-demethylsimmondsin and 4,5-didemethylsimmondsin as trimethylsilyl derivatives in jojoba seed meal has been developed. The study of different extraction mixtures allowed for the quantitative recovery of the 9 analytes by a mixture of methanol-water (80:20, v/v) in the concentration range between 0.1 and 4%. Comparison of the separation parameters on three different capillary stationary phases with MS detection allowed for the choice of the optimal gas chromatographic conditions for baseline separation of the analytes.

  20. Uncertainty quantification in nanomechanical measurements using the atomic force microscope

    Treesearch

    Ryan Wagner; Robert Moon; Jon Pratt; Gordon Shaw; Arvind Raman

    2011-01-01

    Quantifying uncertainty in measured properties of nanomaterials is a prerequisite for the manufacture of reliable nanoengineered materials and products. Yet, rigorous uncertainty quantification (UQ) is rarely applied for material property measurements with the atomic force microscope (AFM), a widely used instrument that can measure properties at nanometer scale...

  1. Single Color Multiplexed ddPCR Copy Number Measurements and Single Nucleotide Variant Genotyping.

    PubMed

    Wood-Bouwens, Christina M; Ji, Hanlee P

    2018-01-01

    Droplet digital PCR (ddPCR) allows for accurate quantification of genetic events such as copy number variation and single nucleotide variants. Probe-based assays represent the current "gold-standard" for detection and quantification of these genetic events. Here, we introduce a cost-effective single color ddPCR assay that allows for single genome resolution quantification of copy number and single nucleotide variation.

  2. Quantification of uncertainties for application in detonation simulation

    NASA Astrophysics Data System (ADS)

    Zheng, Miao; Ma, Zhibo

    2016-06-01

    Numerical simulation has become an important means of designing detonation systems, and the quantification of its uncertainty is necessary for reliability certification. To quantify this uncertainty, it is most important to analyze how the uncertainties arise and propagate, and how the simulations evolve from benchmark models to new models. Based on the practical needs of engineering and the technology of verification & validation, a framework of QU (quantification of uncertainty) is brought forward for the case in which simulation is used on a detonation system for scientific prediction. An example is offered to describe the general idea of quantifying simulation uncertainties.

  3. Quantification of biofilm in microtiter plates: overview of testing conditions and practical recommendations for assessment of biofilm production by staphylococci.

    PubMed

    Stepanović, Srdjan; Vuković, Dragana; Hola, Veronika; Di Bonaventura, Giovanni; Djukić, Slobodanka; Cirković, Ivana; Ruzicka, Filip

    2007-08-01

    The details of all steps involved in the quantification of biofilm formation in microtiter plates are described. The presented protocol incorporates information on the assessment of biofilm production by staphylococci, gained both from direct experience and from analysis of methods for assaying biofilm production. The obtained results should simplify the quantification of biofilm formation in microtiter plates, and make it more reliable and comparable among different laboratories.

  4. Surface Enhanced Raman Spectroscopy (SERS) methods for endpoint and real-time quantification of miRNA assays

    NASA Astrophysics Data System (ADS)

    Restaino, Stephen M.; White, Ian M.

    2017-03-01

    Surface Enhanced Raman spectroscopy (SERS) provides significant improvements over conventional methods for single and multianalyte quantification. Specifically, the spectroscopic fingerprint provided by Raman scattering allows for a direct multiplexing potential far beyond that of fluorescence and colorimetry. Additionally, SERS has a low financial and spatial footprint compared with common fluorescence-based systems. Despite the advantages of SERS, it has remained largely an academic pursuit. In the field of biosensing, techniques to apply SERS to molecular diagnostics are constantly under development but, most often, assay protocols are redesigned around the use of SERS as a quantification method and ultimately complicate existing protocols. Our group has sought to rethink common SERS methodologies in order to produce translational technologies capable of allowing SERS to compete in the evolving, yet often inflexible, biosensing field. This work will discuss the development of two techniques for quantification of microRNA, a promising biomarker for homeostatic and disease conditions ranging from cancer to HIV. First, an inkjet-printed paper SERS sensor has been developed to allow on-demand production of a customizable and multiplexable single-step lateral flow assay for miRNA quantification. Second, because miRNAs commonly exist at relatively low concentrations, amplification methods (e.g., PCR) are required to facilitate quantification. This work presents a novel miRNA assay alongside a novel technique for quantification of nuclease-driven nucleic acid amplification strategies that will allow SERS to be used directly with common amplification strategies for quantification of miRNA and other nucleic acid biomarkers.

  5. Simultaneous quantification of various retinoids by high performance liquid chromatography: its relevance to alcohol research.

    PubMed

    Yokoyama, H; Matsumoto, M; Shiraishi, H; Ishii, H

    2000-04-01

    We established a high performance liquid chromatography system that allowed simultaneous quantification of various retinoids. We applied the retinoids to a high performance liquid chromatography system with a silica gel absorption column. Samples were separated by the system with a binary multistep gradient with two kinds of solvent that contained n-hexane, 2-propanol, and glacial acetic acid in different ratios. Each retinoid was detected at a wavelength of 350 nm. This condition allowed separation of 13-cis-retinoic acid, 9-cis-retinoic acid, all-trans-retinoic acid, 13-cis-retinol, all-trans-retinol, all-trans-4-oxo-retinoic acid, and 13-cis-4-oxo-retinoic acid as distinct single peaks. Each retinoid was also analyzed separately and its retention time determined. To ascertain the reliability of this system for retinoid quantification, retinoids at various concentrations were applied to the system. We observed linearity between the concentration and the area under the curve of the peak for each retinoid by linear least-squares regression analysis up to 2.5 ng/ml for all retinoic acids and up to 5 ng/ml for all retinols. There was no significant scattering in tests of within-day reproducibility or day-to-day reproducibility. Using this system, we examined the effects of light exposure on isomerization of retinoids. When retinoids were exposed to room light for 2 hr, the amounts of all but 13-cis-retinol changed significantly. In particular, the amounts of all-trans-retinoic acid and 9-cis-retinoic acid were reduced by 40% and 60%, respectively. The HPLC system established in this study should be useful for studying the oxidation pathway of retinol to retinoic acid. A light-shielded condition is required when particular retinoic acids are analyzed.

  6. Validation of an enzyme-linked immunosorbent assay for the quantification of citrullinated histone H3 as a marker for neutrophil extracellular traps in human plasma.

    PubMed

    Thålin, Charlotte; Daleskog, Maud; Göransson, Sophie Paues; Schatzberg, Daphne; Lasselin, Julie; Laska, Ann-Charlotte; Kallner, Anders; Helleday, Thomas; Wallén, Håkan; Demers, Mélanie

    2017-06-01

    There is an emerging interest in the diverse functions of neutrophil extracellular traps (NETs) in a variety of disease settings. However, data on circulating NETs rely largely upon surrogate NET markers such as cell-free DNA, nucleosomes, and NET-associated enzymes. Citrullination of histone H3 by peptidyl arginine deiminase 4 (PAD4) is central for NET formation, and citrullinated histone H3 (H3Cit) is considered a NET-specific biomarker. We therefore aimed to optimize and validate a new enzyme-linked immunosorbent assay (ELISA) to quantify the levels of H3Cit in human plasma. A standard curve made of in vitro PAD4-citrullinated histones H3 allows for the quantification of H3Cit in plasma using an anti-histone antibody as capture antibody and an anti-histone H3 citrulline antibody for detection. The assay was evaluated for linearity, stability, specificity, and precision on plasma samples obtained from a human model of inflammation before and after lipopolysaccharide injection. The results revealed linearity and high specificity demonstrated by the inability of detecting non-citrullinated histone H3. Coefficients of variation for intra- and inter-assay variability ranged from 2.1 to 5.1% and from 5.8 to 13.5%, respectively, allowing for a high precision. Furthermore, our results support an inflammatory induction of a systemic NET burden by showing, for the first time, clear intra-individual elevations of plasma H3Cit in a human model of lipopolysaccharide-induced inflammation. Taken together, our work demonstrates the development of a new method for the quantification of H3Cit by ELISA that can reliably be used for the detection of NETs in human plasma.

  7. In situ DNA hybridized chain reaction (FISH-HCR) as a better method for quantification of bacteria and archaea within marine sediment

    NASA Astrophysics Data System (ADS)

    Buongiorno, J.; Lloyd, K. G.; Shumaker, A.; Schippers, A.; Webster, G.; Weightman, A.; Turner, S.

    2015-12-01

    Nearly 75% of the Earth's surface is covered by marine sediment that is home to an estimated 2.9 × 10²⁹ microbial cells. A substantial impediment to understanding the abundance and distribution of cells within marine sediment is the lack of a consistent and reliable method for their taxon-specific quantification. Catalyzed reporter deposition fluorescence in situ hybridization (CARD-FISH) provides taxon-specific enumeration, but this process requires passing a large enzyme through cell membranes, decreasing its precision relative to general cell counts using a small DNA stain. In 2015, Yamaguchi et al. developed FISH hybridization chain reaction (FISH-HCR) as an in situ whole cell detection method for environmental microorganisms. FISH-HCR amplifies the fluorescent signal, as does CARD-FISH, but it allows for milder cell permeation methods that might prevent yield loss. To compare FISH-HCR to CARD-FISH, we examined bacteria and archaea cell counts within two sediment cores, Lille Belt (~78 meters deep) and Landsort Deep (90 meters deep), which were retrieved from the Baltic Sea Basin during IODP Expedition 347. Preliminary analysis shows that CARD-FISH counts are below the quantification limit for most depths across both cores. By contrast, quantification of cells was possible with FISH-HCR in all examined depths. When quantification with CARD-FISH was above the limit of detection, counts with FISH-HCR were up to 11-fold higher for Bacteria and 3-fold higher for Archaea from the same sediment sample. Further, FISH-HCR counts closely follow the trends of the onboard counts, indicating that FISH-HCR may better reflect the cellular abundance within marine sediment than other quantification methods, including qPCR. Using FISH-HCR, we found that archaeal cell counts were on average greater than bacterial cell counts, but within the same order of magnitude.

  8. The effect of applied transducer force on acoustic radiation force impulse quantification within the left lobe of the liver.

    PubMed

    Porra, Luke; Swan, Hans; Ho, Chien

    2015-08-01

    Introduction: Acoustic Radiation Force Impulse (ARFI) Quantification measures shear wave velocities (SWVs) within the liver. It is a reliable method for predicting the severity of liver fibrosis and has the potential to assess fibrosis in any part of the liver, but previous research has found ARFI quantification in the right lobe more accurate than in the left lobe. A lack of standardised applied transducer force when performing ARFI quantification in the left lobe of the liver may account for some of this inaccuracy. The research hypothesis of this present study predicted that an increase in applied transducer force would result in an increase in SWVs measured. Methods: ARFI quantification within the left lobe of the liver was performed within a group of healthy volunteers (n = 28). During each examination, each participant was subjected to ARFI quantification at six different levels of transducer force applied to the epigastric abdominal wall. Results: A repeated measures ANOVA test showed that ARFI quantification was significantly affected by applied transducer force (p = 0.002). Significant pairwise comparisons using Bonferroni correction for multiple comparisons showed that with an increase in applied transducer force, there was a decrease in SWVs. Conclusion: Applied transducer force has a significant effect on SWVs within the left lobe of the liver and it may explain some of the less accurate and less reliable results in previous studies where transducer force was not taken into consideration. Future studies in the left lobe of the liver should take this into account and control for applied transducer force.

  9. Multiplex Real-Time qPCR Assay for Simultaneous and Sensitive Detection of Phytoplasmas in Sesame Plants and Insect Vectors

    PubMed Central

    Ikten, Cengiz; Ustun, Rustem; Catal, Mursel; Yol, Engin; Uzun, Bulent

    2016-01-01

    Phyllody, a destructive and economically important disease worldwide caused by phytoplasma infections, is characterized by the abnormal development of floral structures into stunted leafy parts and contributes to serious losses in crop plants, including sesame (Sesamum indicum L.). Accurate identification, differentiation, and quantification of phyllody-causing phytoplasmas are essential for effective management of this plant disease and for selection of resistant sesame varieties. In this study, a diagnostic multiplex qPCR assay was developed using TaqMan® chemistry based on detection of the 16S ribosomal RNA gene of phytoplasmas and the 18S ribosomal gene of sesame. Phytoplasma and sesame specific primers and probes labeled with different fluorescent dyes were used for simultaneous amplification of 16SrII and 16SrIX phytoplasmas in a single tube. The multiplex real-time qPCR assay allowed accurate detection, differentiation, and quantification of 16SrII and 16SrIX groups in 109 sesame plant and 92 insect vector samples tested. The assay was found to have a detection sensitivity of 1.8 × 10² and 1.6 × 10² DNA copies for absolute quantification of 16SrII and 16SrIX group phytoplasmas, respectively. Relative quantification was effective and reliable for determination of phyllody phytoplasma DNA amounts normalized to sesame DNA in infected plant tissues. The development of this qPCR assay provides a method for the rapid measurement of infection loads to identify resistance levels of sesame genotypes against phyllody phytoplasma disease. PMID:27195795
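    For the relative quantification step, normalizing the phytoplasma signal to the host 18S signal can be expressed as an efficiency-corrected ratio of the two qPCR measurements. The snippet below is a generic Pfaffl-style sketch of that normalization, assumed for illustration rather than reproduced from the published assay protocol.

    ```python
    def relative_load(ct_target, ct_reference, eff_target=2.0, eff_reference=2.0):
        """Efficiency-corrected ratio of phytoplasma target to host reference gene.

        Generic Pfaffl-style calculation (an assumption, not the authors' exact
        formula); eff = 2.0 corresponds to 100% amplification efficiency.
        """
        return (eff_target ** -ct_target) / (eff_reference ** -ct_reference)

    # Illustrative Ct values: phytoplasma 16S Ct = 24.1, sesame 18S Ct = 18.6
    print(relative_load(24.1, 18.6))
    ```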

  10. Simultaneous quantification of protein phosphorylation sites using liquid chromatography-tandem mass spectrometry-based targeted proteomics: a linear algebra approach for isobaric phosphopeptides.

    PubMed

    Xu, Feifei; Yang, Ting; Sheng, Yuan; Zhong, Ting; Yang, Mi; Chen, Yun

    2014-12-05

    As one of the most studied post-translational modifications (PTM), protein phosphorylation plays an essential role in almost all cellular processes. Current methods are able to predict and determine thousands of phosphorylation sites, whereas stoichiometric quantification of these sites is still challenging. Liquid chromatography coupled with tandem mass spectrometry (LC-MS/MS)-based targeted proteomics is emerging as a promising technique for site-specific quantification of protein phosphorylation using proteolytic peptides as surrogates of proteins. However, several issues may limit its application, one of which relates to the phosphopeptides with different phosphorylation sites and the same mass (i.e., isobaric phosphopeptides). While employment of site-specific product ions allows for these isobaric phosphopeptides to be distinguished and quantified, site-specific product ions are often absent or weak in tandem mass spectra. In this study, linear algebra algorithms were employed as an add-on to targeted proteomics to retrieve information on individual phosphopeptides from their common spectra. To achieve this simultaneous quantification, a LC-MS/MS-based targeted proteomics assay was first developed and validated for each phosphopeptide. Given the slope and intercept of calibration curves of phosphopeptides in each transition, linear algebraic equations were developed. Using a series of mock mixtures prepared with varying concentrations of each phosphopeptide, the reliability of the approach to quantify isobaric phosphopeptides containing multiple phosphorylation sites (≥ 2) was discussed. Finally, we applied this approach to determine the phosphorylation stoichiometry of heat shock protein 27 (HSP27) at Ser78 and Ser82 in breast cancer cells and tissue samples.
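    The linear-algebra step described above amounts to solving a small system in which each transition's observed signal is modeled as the calibration intercept plus a weighted sum of the isobaric phosphopeptide concentrations. A minimal sketch with invented slopes, intercepts, and peak areas (not data from the study):

    ```python
    # Recover concentrations of two isobaric phosphopeptides from shared
    # transitions; slopes/intercepts stand in for per-transition calibrations.
    import numpy as np

    slopes = np.array([[1.8, 0.4],    # transition 1: response per unit of peptide A, B
                       [0.3, 2.1],    # transition 2
                       [1.1, 1.0]])   # transition 3
    intercepts = np.array([120.0, 80.0, 95.0])
    observed = np.array([5.2e3, 7.9e3, 6.4e3])   # measured peak areas

    # observed ≈ slopes @ conc + intercepts  ->  least-squares solve for conc
    conc, *_ = np.linalg.lstsq(slopes, observed - intercepts, rcond=None)
    print(conc)   # estimated concentrations of the two isobaric phosphopeptides
    ```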

  11. Immunochemical Detection Methods for Gluten in Food Products: Where Do We Go from Here?

    PubMed

    Slot, I D Bruins; van der Fels-Klerx, H J; Bremer, M G E G; Hamer, R J

    2016-11-17

    Accurate and reliable quantification methods for gluten in food are necessary to ensure proper product labeling and thus safeguard the gluten sensitive consumer against exposure. Immunochemical detection is the method of choice, as it is sensitive, rapid and relatively easy to use. Although a wide range of detection kits are commercially available, there are still many difficulties in gluten detection that have not yet been overcome. This review gives an overview of the currently commercially available immunochemical detection methods, and discusses the problems that still exist in gluten detection in food. The largest problems are encountered in the extraction of gluten from food matrices, the choice of epitopes targeted by the detection method, and the use of a standardized reference material. By comparing the available techniques with the unmet needs in gluten detection, the possible benefit of a new multiplex immunoassay is investigated. This detection method would allow for the detection and quantification of multiple harmful gluten peptides at once and would, therefore, be a logical advancement in gluten detection in food.

  12. Objective quantification of the tinnitus decompensation by synchronization measures of auditory evoked single sweeps.

    PubMed

    Strauss, Daniel J; Delb, Wolfgang; D'Amelio, Roberto; Low, Yin Fen; Falkai, Peter

    2008-02-01

    Large-scale neural correlates of the tinnitus decompensation might be used for an objective evaluation of therapies and neurofeedback based therapeutic approaches. In this study, we try to identify large-scale neural correlates of the tinnitus decompensation using wavelet phase stability criteria of single sweep sequences of late auditory evoked potentials as synchronization stability measure. The extracted measure provided an objective quantification of the tinnitus decompensation and allowed for a reliable discrimination between a group of compensated and decompensated tinnitus patients. We provide an interpretation for our results by a neural model of top-down projections based on the Jastreboff tinnitus model combined with the adaptive resonance theory which has not been applied to model tinnitus so far. Using this model, our stability measure of evoked potentials can be linked to the focus of attention on the tinnitus signal. It is concluded that the wavelet phase stability of late auditory evoked potential single sweeps might be used as objective tinnitus decompensation measure and can be interpreted in the framework of the Jastreboff tinnitus model and adaptive resonance theory.

  13. PET imaging of cardiac hypoxia: Opportunities and challenges

    PubMed Central

    Handley, M.G.; Medina, R.A.; Nagel, E.; Blower, P.J.; Southworth, R.

    2012-01-01

    Myocardial hypoxia is a major factor in the pathology of cardiac ischemia and myocardial infarction. Hypoxia also occurs in microvascular disease and cardiac hypertrophy, and is thought to be a prime determinant of the progression to heart failure, as well as the driving force for compensatory angiogenesis. The non-invasive delineation and quantification of hypoxia in cardiac tissue therefore has the potential to be an invaluable experimental, diagnostic and prognostic biomarker for applications in cardiology. However, at this time there are no validated methodologies sufficiently sensitive or reliable for clinical use. PET imaging provides real-time spatial information on the biodistribution of injected radiolabeled tracer molecules. Its inherent high sensitivity allows quantitative imaging of these tracers, even when injected at sub-pharmacological (≥pM) concentrations, allowing the non-invasive investigation of biological systems without perturbing them. PET is therefore an attractive approach for the delineation and quantification of cardiac hypoxia and ischemia. In this review we discuss the key concepts which must be considered when imaging hypoxia in the heart. We summarize the PET tracers which are currently available, and we look forward to the next generation of hypoxia-specific PET imaging agents currently being developed. We describe their potential advantages and shortcomings compared to existing imaging approaches, and what is needed in terms of validation and characterization before these agents can be exploited clinically. PMID:21781973

  14. Guided Wave Delamination Detection and Quantification With Wavefield Data Analysis

    NASA Technical Reports Server (NTRS)

    Tian, Zhenhua; Campbell Leckey, Cara A.; Seebo, Jeffrey P.; Yu, Lingyu

    2014-01-01

    Unexpected damage can occur in aerospace composites due to impact events or material stress during off-nominal loading events. In particular, laminated composites are susceptible to delamination damage due to weak transverse tensile and inter-laminar shear strengths. Developments of reliable and quantitative techniques to detect delamination damage in laminated composites are imperative for safe and functional optimally-designed next-generation composite structures. In this paper, we investigate guided wave interactions with delamination damage and develop quantification algorithms by using wavefield data analysis. The trapped guided waves in the delamination region are observed from the wavefield data and further quantitatively interpreted by using different wavenumber analysis methods. The frequency-wavenumber representation of the wavefield shows that new wavenumbers are present and correlate to trapped waves in the damage region. These new wavenumbers are used to detect and quantify the delamination damage through the wavenumber analysis, which can show how the wavenumber changes as a function of wave propagation distance. The location and spatial duration of the new wavenumbers can be identified, providing a useful means not only for detecting the presence of delamination damage but also allowing for estimation of the delamination size. Our method has been applied to detect and quantify real delamination damage with complex geometry (grown using a quasi-static indentation technique). The detection and quantification results show the location, size, and shape of the delamination damage.
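    The frequency-wavenumber representation referred to above is commonly obtained by Fourier-transforming the scanned wavefield over space and time. The following is a generic sketch of a 2-D FFT of a space-time slice; the scan step, sampling interval, and data array are placeholders, not the authors' measurement parameters or exact algorithm.

    ```python
    # Generic frequency-wavenumber sketch: 2-D FFT of a space-time wavefield
    # slice u(x, t) giving amplitude versus wavenumber k and frequency f.
    import numpy as np

    dx, dt = 0.5e-3, 1.0e-7          # scan step (m) and sample interval (s), illustrative
    u = np.random.rand(256, 1024)    # wavefield u[x_index, t_index]; replace with scan data

    U = np.fft.fftshift(np.fft.fft2(u))
    k = np.fft.fftshift(np.fft.fftfreq(u.shape[0], d=dx))   # cycles per meter
    f = np.fft.fftshift(np.fft.fftfreq(u.shape[1], d=dt))   # Hz

    # New wavenumber components appearing only over the delamination can then
    # be located by comparing |U| in damaged versus pristine regions.
    print(np.abs(U).max(), k.shape, f.shape)
    ```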

  15. Quantification issues of trace metal contaminants on silicon wafers by means of TOF-SIMS, ICP-MS, and TXRF

    NASA Astrophysics Data System (ADS)

    Rostam-Khani, P.; Hopstaken, M. J. P.; Vullings, P.; Noij, G.; O'Halloran, O.; Claassen, W.

    2004-06-01

    Measurement of surface metal contamination on silicon wafers is essential for yield enhancement in IC manufacturing. Vapor phase decomposition coupled with either inductively coupled plasma mass spectrometry (VPD-ICP-MS) or total reflection X-ray fluorescence (VPD-TXRF), direct TXRF, and more recently time-of-flight secondary ion mass spectrometry (TOF-SIMS) are used to monitor surface metal contamination. These techniques complement each other in their respective strengths and weaknesses. For reliable and accurate quantification, so-called relative sensitivity factors (RSFs) are required for TOF-SIMS analysis. For quantification purposes in VPD, the collection efficiency (CE) is important to ensure complete collection of the contamination. A standard procedure has been developed that combines the determination of these RSFs as well as the collection efficiency using all the analytical techniques mentioned above. To this end, sample wafers were intentionally contaminated and analyzed (by TOF-SIMS) directly after preparation. After VPD-ICP-MS, several scanned surfaces were analyzed again by TOF-SIMS. Comparing the intensities of the specific metals before and after the VPD-DC procedure on the scanned surface allows the determination of the so-called removal efficiency (RE). In general, very good agreement was obtained comparing the four analytical techniques after updating the RSFs for TOF-SIMS. Progress has been achieved concerning the CE evaluation as well as in determining the RSFs more precisely for TOF-SIMS.
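    For the TOF-SIMS quantification itself, a relative sensitivity factor converts the measured intensity ratio of an analyte to a reference signal (typically the substrate) into an areal concentration. A generic sketch with made-up numbers, not calibration data from this work:

    ```python
    def surface_concentration(i_analyte, i_reference, rsf):
        """Convert a TOF-SIMS intensity ratio into an areal concentration.

        Generic RSF relation (atoms/cm^2 = RSF * I_analyte / I_reference);
        the RSF value itself must come from intentionally contaminated
        reference wafers, as described in the abstract.
        """
        return rsf * i_analyte / i_reference

    # Illustrative numbers only: metal signal, Si substrate signal, metal RSF
    print(surface_concentration(i_analyte=1.2e4, i_reference=3.5e6, rsf=2.0e13))
    ```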

  16. Living cell dry mass measurement using quantitative phase imaging with quadriwave lateral shearing interferometry: an accuracy and sensitivity discussion.

    PubMed

    Aknoun, Sherazade; Savatier, Julien; Bon, Pierre; Galland, Frédéric; Abdeladim, Lamiae; Wattellier, Benoit; Monneret, Serge

    2015-01-01

    Single-cell dry mass measurement is used in biology to follow cell cycle, to address effects of drugs, or to investigate cell metabolism. Quantitative phase imaging technique with quadriwave lateral shearing interferometry (QWLSI) allows measuring cell dry mass. The technique is very simple to set up, as it is integrated in a camera-like instrument. It simply plugs onto a standard microscope and uses a white light illumination source. Its working principle is first explained, from image acquisition to automated segmentation algorithm and dry mass quantification. Metrology of the whole process, including its sensitivity, repeatability, reliability, sources of error, over different kinds of samples and under different experimental conditions, is developed. We show that there is no influence of magnification or spatial light coherence on dry mass measurement; effect of defocus is more critical but can be calibrated. As a consequence, QWLSI is a well-suited technique for fast, simple, and reliable cell dry mass study, especially for live cells.
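    Dry mass is derived from the quantitative phase image through the standard relation m = (1/α)∫OPD dA, where α is the specific refraction increment (commonly taken as about 0.18 µm³/pg). The sketch below applies this relation to a synthetic optical path difference map; the value of α and the pixel size are assumptions for illustration, not parameters from the paper.

    ```python
    import numpy as np

    def dry_mass_pg(opd_map_um, pixel_area_um2, alpha_um3_per_pg=0.18):
        """Cell dry mass (pg) from a quantitative-phase OPD map (micrometers).

        Standard relation m = (1/alpha) * integral(OPD dA); alpha ~0.18 um^3/pg
        is the commonly used specific refraction increment, assumed here.
        """
        return opd_map_um.sum() * pixel_area_um2 / alpha_um3_per_pg

    # Example: synthetic 100x100 OPD map over a segmented cell, 0.2 um pixels
    opd = np.full((100, 100), 0.05)
    print(dry_mass_pg(opd, pixel_area_um2=0.04))
    ```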

  17. Reliability of recurrence quantification analysis measures of the center of pressure during standing in individuals with musculoskeletal disorders.

    PubMed

    Mazaheri, Masood; Negahban, Hossein; Salavati, Mahyar; Sanjari, Mohammad Ali; Parnianpour, Mohamad

    2010-09-01

    Although the application of nonlinear tools, including recurrence quantification analysis (RQA), has grown in recent years, especially in balance-disordered populations, few studies have determined their measurement properties. Therefore, a methodological study was performed to estimate the intersession and intrasession reliability of some dynamic features provided by RQA for nonlinear analysis of center of pressure (COP) signals recorded during quiet standing in a sample of patients with musculoskeletal disorders (MSDs), including low back pain (LBP), anterior cruciate ligament (ACL) injury and functional ankle instability (FAI). The subjects completed postural measurements with three levels of difficulty (rigid surface-eyes open, rigid surface-eyes closed, and foam surface-eyes closed). Four RQA measures (% recurrence, % determinism, entropy, and trend) were extracted from the recurrence plot. Relative reliability of these measures was assessed using the intraclass correlation coefficient, and absolute reliability using the standard error of measurement and the coefficient of variation. % Determinism and entropy were the most reliable features of RQA for both the intersession and intrasession reliability measures. The high reliability of % determinism and entropy in this preliminary investigation suggests their clinical promise for discriminative and evaluative assessment of balance performance. 2010 IPEM. Published by Elsevier Ltd. All rights reserved.
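    % recurrence and % determinism are computed from a thresholded distance matrix of the time-delay-embedded COP signal: the former is the density of recurrent points, the latter the fraction of those points that fall on diagonal lines of a minimum length. A compact sketch with assumed embedding parameters (not those used in the study):

    ```python
    import numpy as np

    def rqa_measures(x, dim=3, lag=5, radius_frac=0.1, lmin=2):
        """Recurrence rate and %determinism of a 1-D signal (e.g. a COP trace)."""
        x = np.asarray(x, dtype=float)
        n = len(x) - (dim - 1) * lag
        emb = np.column_stack([x[i * lag: i * lag + n] for i in range(dim)])
        dist = np.linalg.norm(emb[:, None, :] - emb[None, :, :], axis=-1)
        rp = dist < radius_frac * np.std(x)       # thresholded recurrence plot
        np.fill_diagonal(rp, False)               # drop the trivial line of identity
        rec_rate = rp.mean()

        diag_points = 0                           # recurrent points on diagonals >= lmin
        for k in range(1, n):                     # plot is symmetric; scan upper diagonals
            run = 0
            for v in np.append(np.diagonal(rp, offset=k), False):
                if v:
                    run += 1
                else:
                    if run >= lmin:
                        diag_points += run
                    run = 0
        det = 2 * diag_points / max(rp.sum(), 1)  # factor 2 restores the lower triangle
        return rec_rate, det

    print(rqa_measures(np.sin(np.linspace(0, 20 * np.pi, 500))))
    ```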

  18. Developing safety performance functions incorporating reliability-based risk measures.

    PubMed

    Ibrahim, Shewkar El-Bassiouni; Sayed, Tarek

    2011-11-01

    Current geometric design guides provide deterministic standards where the safety margin of the design output is generally unknown and there is little knowledge of the safety implications of deviating from these standards. Several studies have advocated probabilistic geometric design where reliability analysis can be used to account for the uncertainty in the design parameters and to provide a risk measure of the implication of deviation from design standards. However, there is currently no link between measures of design reliability and the quantification of safety using collision frequency. The analysis presented in this paper attempts to bridge this gap by incorporating a reliability-based quantitative risk measure such as the probability of non-compliance (P(nc)) in safety performance functions (SPFs). Establishing this link will allow admitting reliability-based design into traditional benefit-cost analysis and should lead to a wider application of the reliability technique in road design. The present application is concerned with the design of horizontal curves, where the limit state function is defined in terms of the available (supply) and stopping (demand) sight distances. A comprehensive collision and geometric design database of two-lane rural highways is used to investigate the effect of the probability of non-compliance on safety. The reliability analysis was carried out using the First Order Reliability Method (FORM). Two Negative Binomial (NB) SPFs were developed to compare models with and without the reliability-based risk measures. It was found that models incorporating the P(nc) provided a better fit to the data set than the traditional (without risk) NB SPFs for total, injury and fatality (I+F) and property damage only (PDO) collisions. Copyright © 2011 Elsevier Ltd. All rights reserved.
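    With the limit state defined as available minus stopping sight distance, the probability of non-compliance is P(nc) = P(demand > supply). The study evaluates this with FORM; the Monte Carlo sketch below merely illustrates the quantity, with all distributions and parameter values assumed for the example rather than taken from the paper.

    ```python
    # Illustrative Monte Carlo estimate of the probability of non-compliance
    # P(nc) = P(stopping sight distance demand > available sight distance supply).
    import numpy as np

    rng = np.random.default_rng(0)
    n = 200_000

    v = rng.normal(90.0, 8.0, n) / 3.6          # operating speed, m/s
    t_pr = rng.lognormal(np.log(1.5), 0.3, n)   # perception-reaction time, s
    decel = rng.normal(3.4, 0.6, n)             # deceleration, m/s^2
    ssd_demand = v * t_pr + v**2 / (2 * decel)  # stopping sight distance, m

    asd_supply = rng.normal(160.0, 12.0, n)     # available sight distance on the curve, m

    p_nc = np.mean(ssd_demand > asd_supply)
    print(f"P(nc) ~ {p_nc:.4f}")
    ```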

  19. Detection and quantification of beef and pork materials in meat products by duplex droplet digital PCR.

    PubMed

    Cai, Yicun; He, Yuping; Lv, Rong; Chen, Hongchao; Wang, Qiang; Pan, Liangwen

    2017-01-01

    Meat products often consist of meat from multiple animal species, and food product adulteration and inaccurate labeling can negatively affect consumers. Therefore, a cost-effective and reliable method for identification and quantification of animal species in meat products is required. In this study, we developed a duplex droplet digital PCR (dddPCR) detection and quantification system to simultaneously identify and quantify the source of meat in samples containing a mixture of beef (Bos taurus) and pork (Sus scrofa) in a single digital PCR reaction tube. Mixed meat samples of known composition were used to test the accuracy and applicability of this method. The limit of detection (LOD) and the limit of quantification (LOQ) of this detection and quantification system were also determined. We conclude that our dddPCR detection and quantification system is suitable for quality control and routine analyses of meat products.

  20. Development and validation of a harmonized TaqMan-based triplex real-time RT-PCR protocol for the quantitative detection of normalized gene expression profiles of seven porcine cytokines.

    PubMed

    Petrov, Anja; Beer, Martin; Blome, Sandra

    2014-01-01

    Dysregulation of cytokine responses plays a major role in the pathogenesis of severe and life-threatening infectious diseases like septicemia or viral hemorrhagic fevers. In pigs, diseases like African and classical swine fever are known to show exaggerated cytokine releases. To study these responses and their impact on disease severity and outcome in detail, reliable, highly specific and sensitive methods are needed. For cytokine research on the molecular level, real-time RT-PCRs have been proven to be suitable. Yet, the currently available and most commonly used SYBR Green I assays or heterogeneous gel-based RT-PCRs for swine show a significant lack of specificity and sensitivity. The latter is however absolutely essential for an accurate quantification of rare cytokine transcripts as well as for detection of small changes in gene expressions. For this reason, a harmonized TaqMan-based triplex real-time RT-PCR protocol for the quantitative detection of normalized gene expression profiles of seven porcine cytokines was designed and validated within the presented study. Cytokines were chosen to represent different immunological pathways and targets known to be involved in the pathogenesis of the above mentioned porcine diseases, namely interleukin (IL)-1β, IL-2, IL-4, IL-6, IL-8, tumor necrosis factor (TNF)-α and interferon (IFN)-α. Beta-Actin and glyceraldehyde 3-phosphate dehydrogenase (GAPDH) served as reference genes for normalization. For absolute quantification a synthetic standard plasmid was constructed comprising all target cytokines and reference genes within a single molecule allowing the generation of positive control RNA. The standard as well as positive RNAs from samples, and additionally more than 400 clinical samples, which were collected from animal trials, were included in the validation process to assess analytical sensitivity and applicability under routine conditions. The resulting assay allows the reliable assessment of gene expression profiles and provides a broad applicability to any kind of immunological research in swine.

  1. A UPLC-ESI-Q-TOF method for rapid and reliable identification and quantification of major indole alkaloids in Catharanthus roseus.

    PubMed

    Jeong, Won Tae; Lim, Heung Bin

    2018-03-30

    We developed a novel ultra performance liquid chromatography-quadrupole time-of-flight (UPLC-Q-TOF) mass spectrometry method that allows sensitive, rapid, and reliable detection and identification of six representative indole alkaloids (vincristine, vinblastine, ajmalicine, catharanthine, serpentine, and vindoline) that exhibit physiological activity in Catharanthus roseus (C. roseus). The alkaloids were eluted on a C18 column with acetonitrile and water containing 0.1% formic acid and 10 mM ammonium acetate, and separated with good resolution within 13 min. Electrospray ionization-Q-TOF (ESI-Q-TOF) analysis was performed to characterize the molecules and their fragment ions, and the characteristic ions and fragmentation patterns were used to identify the alkaloids. The proposed analytical method was verified in reference to the ICH guidelines and the results showed excellent linearity (R² > 0.9988), limit of detection (1 ng/mL to 10 ng/mL), limit of quantification (3 ng/mL to 30 ng/mL), intra-day and inter-day precisions, and extraction recovery rates (92.8% to 104.1%) for all components. The validated UPLC-Q-TOF method was applied to the analysis of extracts from the root, stem, and leaves of C. roseus, allowing the identification of six alkaloids by comparison of retention times, molecular ions, and fragmentation patterns with those of reference compounds. Sixteen additional indole alkaloids were tentatively identified by comparison of chromatograms to chemical databases and literature reports. The contents of bis-indole alkaloids (vincristine and vinblastine) were high in the aerial parts, while the contents of mono-indole alkaloids (ajmalicine, catharanthine, serpentine, and vindoline) were high in the roots. The present results demonstrate that the proposed UPLC-Q-TOF method can be useful for the investigation of phytochemical constituents of medicinal plants. Copyright © 2018 Elsevier B.V. All rights reserved.
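    For context, one common ICH-style way to derive such detection and quantification limits from a calibration line is LOD = 3.3σ/S and LOQ = 10σ/S, with σ the residual standard deviation and S the slope; whether the authors used this exact estimator is not stated, and the data points below are invented for illustration.

    ```python
    # ICH-style LOD/LOQ estimate from a calibration line with invented data.
    import numpy as np

    conc = np.array([1, 5, 10, 25, 50, 100], dtype=float)       # ng/mL
    area = np.array([210, 1040, 2110, 5180, 10350, 20710], dtype=float)

    slope, intercept = np.polyfit(conc, area, 1)
    residuals = area - (slope * conc + intercept)
    sigma = residuals.std(ddof=2)                               # 2 fitted parameters

    print("LOD ~", 3.3 * sigma / slope, "ng/mL")
    print("LOQ ~", 10 * sigma / slope, "ng/mL")
    ```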

  2. Development of a method for measuring femoral torsion using real-time ultrasound.

    PubMed

    Hafiz, Eliza; Hiller, Claire E; Nicholson, Leslie L; Nightingale, E Jean; Clarke, Jillian L; Grimaldi, Alison; Eisenhuth, John P; Refshauge, Kathryn M

    2014-07-01

    Excessive femoral torsion has been associated with various musculoskeletal and neurological problems. To explore this relationship, it is essential to be able to measure femoral torsion in the clinic accurately. Computerized tomography (CT) and magnetic resonance imaging (MRI) are thought to provide the most accurate measurements, but CT involves significant radiation exposure and MRI is expensive. The aim of this study was to design a method for measuring femoral torsion in the clinic, and to determine the reliability of this method. Details of the design process, including construction of a jig, the protocol developed and the reliability of the method are presented. The protocol developed used ultrasound to image a ridge on the greater trochanter, and a customized jig placed on the femoral condyles as reference points. An inclinometer attached to the customized jig allowed quantification of the degree of femoral torsion. Measurements taken with this protocol had excellent intra- and inter-rater reliability (ICC(2,1) = 0.98 and 0.97, respectively). The method also permitted measurement of femoral torsion with a high degree of accuracy. This method is applicable to the research setting and, with minor adjustments, will be applicable to the clinical setting.

  3. Toward improved peptide feature detection in quantitative proteomics using stable isotope labeling.

    PubMed

    Nilse, Lars; Sigloch, Florian Christoph; Biniossek, Martin L; Schilling, Oliver

    2015-08-01

    Reliable detection of peptides in LC-MS data is a key algorithmic step in the analysis of quantitative proteomics experiments. While highly abundant peptides can be detected reliably by most modern software tools, there is much less agreement on medium and low-intensity peptides in a sample. The choice of software tools can have a big impact on the quantification of proteins, especially for proteins that appear in lower concentrations. However, in many experiments, it is precisely this region of less abundant but substantially regulated proteins that holds the biggest potential for discoveries. This is particularly true for discovery proteomics in the pharmacological sector with a specific interest in key regulatory proteins. In this viewpoint article, we discuss how the development of novel software algorithms allows us to study this region of the proteome with increased confidence. Reliable results are one of many aspects to be considered when deciding on a bioinformatics software platform. Deployment into existing IT infrastructures, compatibility with other software packages, scalability, automation, flexibility, and support need to be considered and are briefly addressed in this viewpoint article. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  4. A method for limiting data acquisition in a high-resolution gamma-ray spectrometer during On-Site Inspection activities under the Comprehensive Nuclear-Test-Ban Treaty

    NASA Astrophysics Data System (ADS)

    Aviv, O.; Lipshtat, A.

    2018-05-01

    On-Site Inspection (OSI) activities under the Comprehensive Nuclear-Test-Ban Treaty (CTBT) allow limitations to be placed on measurement equipment. Thus, certain detectors require modifications to be operated in a restricted mode. The accuracy and reliability of results obtained by a restricted device may be impaired. We present here a method for limiting data acquisition during OSI. Limitations are applied to a high-resolution high-purity germanium detector system, where the vast majority of the acquired data that is not relevant to the inspection is filtered out. The limited spectrum is displayed to the user and allows analysis using standard gamma spectrometry procedures. The proposed method can be incorporated into commercial gamma-ray spectrometers, including both stationary and mobile-based systems. By applying this procedure to more than 1000 spectra, representing various scenarios, we show that partial data are sufficient for reaching reliable conclusions. A comprehensive survey of potential false-positive identifications of various radionuclides is presented as well. It is evident from the results that the analysis of a limited spectrum is practically identical to that of a standard spectrum in terms of detection and quantification of OSI-relevant radionuclides. A future limited system can be developed making use of the principles outlined by the suggested method.

  5. Plasma protein absolute quantification by nano-LC Q-TOF UDMSE for clinical biomarker verification

    PubMed Central

    ILIES, MARIA; IUGA, CRISTINA ADELA; LOGHIN, FELICIA; DHOPLE, VISHNU MUKUND; HAMMER, ELKE

    2017-01-01

    Background and aims Proteome-based biomarker studies are targeting proteins that could serve as diagnostic, prognosis, and prediction molecules. In the clinical routine, immunoassays are currently used for the absolute quantification of such biomarkers, with the major limitation that only one molecule can be targeted per assay. The aim of our study was to test a mass spectrometry based absolute quantification method for the verification of plasma protein sets which might serve as reliable biomarker panels for the clinical practice. Methods Six EDTA plasma samples were analyzed after tryptic digestion using a high throughput data independent acquisition nano-LC Q-TOF UDMSE proteomics approach. Synthetic Escherichia coli standard peptides were spiked in each sample for the absolute quantification. Data analysis was performed using ProgenesisQI v2.0 software (Waters Corporation). Results Our method ensured absolute quantification of 242 non redundant plasma proteins in a single run analysis. The dynamic range covered was 10⁵. 86% were represented by classical plasma proteins. The overall median coefficient of variation was 0.36, while a set of 63 proteins was found to be highly stable. Absolute protein concentrations strongly correlated with values reviewed in the literature. Conclusions Nano-LC Q-TOF UDMSE proteomic analysis can be used for a simple and rapid determination of absolute amounts of plasma proteins. A large number of plasma proteins could be analyzed, while a wide dynamic range was covered with low coefficient of variation at protein level. The method proved to be a reliable tool for the quantification of protein panel for biomarker verification in the clinical practice. PMID:29151793

  6. Direct qPCR quantification using the Quantifiler(®) Trio DNA quantification kit.

    PubMed

    Liu, Jason Yingjie

    2014-11-01

    The effectiveness of a direct quantification assay is essential to the adoption of the combined direct quantification/direct STR workflow. In this paper, the feasibility of using the Quantifiler® Trio DNA quantification kit for the direct quantification of forensic casework samples was investigated. Both low-level touch DNA samples and blood samples were collected on PE swabs and quantified directly. The increased sensitivity of the Quantifiler® Trio kit enabled the detection of less than 10 pg of DNA in unprocessed touch samples and also minimizes the stochastic effect experienced by different targets in the same sample. The DNA quantity information obtained from a direct quantification assay using the Quantifiler® Trio kit can also be used to accurately estimate the optimal input DNA quantity for a direct STR amplification reaction. The correlation between the direct quantification results (Quantifiler® Trio kit) and the direct STR results (GlobalFiler™ PCR amplification kit) for low-level touch DNA samples indicates that direct quantification using the Quantifiler® Trio DNA quantification kit is more reliable than the Quantifiler® Duo DNA quantification kit for predicting the STR results of unprocessed touch DNA samples containing less than 10 pg of DNA. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  7. Real-time PCR machine system modeling and a systematic approach for the robust design of a real-time PCR-on-a-chip system.

    PubMed

    Lee, Da-Sheng

    2010-01-01

    Chip-based DNA quantification systems are widespread, and used in many point-of-care applications. However, instruments for such applications may not be maintained or calibrated regularly. Since machine reliability is a key issue for normal operation, this study presents a system model of the real-time Polymerase Chain Reaction (PCR) machine to analyze the instrument design through numerical experiments. Based on model analysis, a systematic approach was developed to lower the variation of DNA quantification and achieve a robust design for a real-time PCR-on-a-chip system. Accelerated life testing was adopted to evaluate the reliability of the chip prototype. According to the life test plan, this proposed real-time PCR-on-a-chip system was simulated to work continuously for over three years with similar reproducibility in DNA quantification. This not only shows the robustness of the lab-on-a-chip system, but also verifies the effectiveness of our systematic method for achieving a robust design.

  8. Reliability of testicular stiffness quantification using shear wave elastography in predicting male fertility: a preliminary prospective study.

    PubMed

    Yavuz, Alpaslan; Yokus, Adem; Taken, Kerem; Batur, Abdussamet; Ozgokce, Mesut; Arslan, Harun

    2018-05-02

    To evaluate the reliability of testicular stiffness quantification using shear wave elastography in predicting the fertility potential of males and for the pre-diagnosis of disorders based upon sperm quantification. One hundred males between the ages of 19 and 49 years (mean age 28.77±6.11 years), ninety of whom had complaints of infertility, were enrolled in this prospective study. Scrotal grey-scale, Doppler ultrasound (US), and mean testicular shear wave velocity quantifications (SWVQs) were performed. The volumes of the testes, as well as the grade of varicocele if present, were recorded. The mean shear wave velocity values (SWVVs) of each testis and a mean testicular SWVV for each patient were calculated. Semen analyses of the patients were then performed. There were significant negative correlations between the mean testicular SWVVs of patients and their sperm counts or testis volumes (r=-0.399, r=-0.565; p<0.01, respectively). A positive correlation was found between testicular volumes and sperm counts (r=0.491, p<0.01). The cut-off values of mean testicular SWVV to distinguish a normal sperm count from azoospermia and oligozoospermia were 1.465 m/s (75.0% sensitivity and 75.0% specificity) and 1.328 m/s (64.3% sensitivity and 68.2% specificity), respectively, and the value to distinguish oligozoospermia from azoospermia was 1.528 m/s (66.7% sensitivity, 60.7% specificity). The mean testicular SWVQ using the ARFI shear wave technique was a reliable, non-invasive and acceptably stable method for predicting male infertility, especially related to sperm count issues.

  9. Development of an analytical method for the simultaneous analysis of MCPD esters and glycidyl esters in oil-based foodstuffs.

    PubMed

    Ermacora, Alessia; Hrnčiřík, Karel

    2014-01-01

    Substantial progress has been recently made in the development and optimisation of analytical methods for the quantification of 2-MCPD, 3-MCPD and glycidyl esters in oils and fats, and there are a few methods currently available that allow a reliable quantification of these contaminants in bulk oils and fats. On the other hand, no standard method for the analysis of foodstuffs has yet been established. The aim of this study was the development and validation of a new method for the simultaneous quantification of 2-MCPD, 3-MCPD and glycidyl esters in oil-based food products. The developed protocol includes a first step of liquid-liquid extraction and purification of the lipophilic substances of the sample, followed by the application of a previously developed procedure based on acid transesterification, for the indirect quantification of these contaminants in oils and fats. The method validation was carried out on food products (fat-based spreads, creams, margarine, mayonnaise) manufactured in-house, in order to control the manufacturing process and account for any food matrix-analyte interactions (the sample spiking was carried out on the single components used for the formulations rather than the final products). The method showed good accuracy (the recoveries ranged from 97% to 106% for bound 3-MCPD and 2-MCPD and from 88% to 115% for bound glycidol) and sensitivity (the LOD was 0.04 and 0.05 mg kg(-1) for bound MCPD and glycidol, respectively). Repeatability and reproducibility were satisfactory (RSD below 2% and 5%, respectively) for all analytes. The levels of salts and surface-active compounds in the formulation were found to have no impact on the accuracy and the other parameters of the method.

  10. Quantification of ligand density and stoichiometry on the surface of liposomes using single-molecule fluorescence imaging.

    PubMed

    Belfiore, Lisa; Spenkelink, Lisanne M; Ranson, Marie; van Oijen, Antoine M; Vine, Kara L

    2018-05-28

    Despite the longstanding existence of liposome technology in drug delivery applications, there have been no ligand-directed liposome formulations approved for clinical use to date. This lack of translation is due to several factors, one of which is the absence of molecular tools for the robust quantification of ligand density on the surface of liposomes. We report here for the first time the quantification of proteins attached to the surface of small unilamellar liposomes using single-molecule fluorescence imaging. Liposomes were surface-functionalized with fluorescently labeled human proteins, plasminogen activator inhibitor-2 (PAI-2) and trastuzumab (TZ, Herceptin®), previously validated to target cancer cell surface biomarkers. These protein-conjugated liposomes were visualized using a custom-built wide-field fluorescence microscope with single-molecule sensitivity. By counting the photobleaching steps of the fluorescently labeled proteins, we calculated the number of attached proteins per liposome, which was 11 ± 4 proteins for single-ligand liposomes. Imaging of dual-ligand liposomes revealed stoichiometries of the two attached proteins in accordance with the molar ratios of protein added during preparation. Preparation of PAI-2/TZ dual-ligand liposomes via two different methods revealed that the post-insertion method generated liposomes with a more equal representation of the two differently sized proteins, demonstrating the ability of this preparation method to enable better control of liposome protein densities. We conclude that the single-molecule imaging method presented here is an accurate and reliable quantification tool for determining ligand density and stoichiometry on the surface of liposomes. This method has the potential to allow for comprehensive characterization of novel ligand-directed liposomes that should facilitate the translation of these nanotherapies through to the clinic. Copyright © 2018 Elsevier B.V. All rights reserved.
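
    The sketch below illustrates the idea of counting photobleaching steps in a single-liposome intensity trace; the step detector is deliberately crude and the trace is synthetic, so it stands in for, rather than reproduces, the analysis used in the study.

```python
import numpy as np

def count_photobleach_steps(trace, step_size, window=5):
    """Count abrupt intensity drops of at least `step_size` in a fluorescence trace.

    Compares the median of `window` frames before and after each point and counts
    transitions larger than the expected single-fluorophore step.
    """
    trace = np.asarray(trace, dtype=float)
    steps = 0
    i = window
    while i < len(trace) - window:
        before = np.median(trace[i - window:i])
        after = np.median(trace[i:i + window])
        if before - after >= step_size:
            steps += 1
            i += window          # move past this step before searching for the next
        else:
            i += 1
    return steps

# Synthetic trace: three fluorophores bleaching one by one (arbitrary units).
rng = np.random.default_rng(0)
trace = np.concatenate([np.full(50, 300.0), np.full(50, 200.0),
                        np.full(50, 100.0), np.full(50, 0.0)])
trace += rng.normal(0, 5, trace.size)
print(count_photobleach_steps(trace, step_size=60))   # -> 3 proteins on this liposome
```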

  11. Establishing a reliable multiple reaction monitoring-based method for the quantification of obesity-associated comorbidities in serum and adipose tissue requires intensive clinical validation.

    PubMed

    Oberbach, Andreas; Schlichting, Nadine; Neuhaus, Jochen; Kullnick, Yvonne; Lehmann, Stefanie; Heinrich, Marco; Dietrich, Arne; Mohr, Friedrich Wilhelm; von Bergen, Martin; Baumann, Sven

    2014-12-05

    Multiple reaction monitoring (MRM)-based mass spectrometric quantification of peptides and their corresponding proteins has been successfully applied for biomarker validation in serum. The option of multiplexing offers the chance to analyze various proteins in parallel, which is especially important in obesity research, where biomarkers that reflect multiple comorbidities and allow monitoring of therapy outcomes are required. Besides being suitable for serum protein quantification, established MRM assays are also feasible for the analysis of tissues secreting the markers of interest. Surprisingly, studies comparing MRM data sets with established methods are rare, and therefore the biological and clinical value of most analytes remains questionable. An MRM method using nano-UPLC-MS/MS for the quantification of obesity-related surrogate markers for several comorbidities in serum, plasma, and visceral and subcutaneous adipose tissue was established. Proteotypic peptides for complement C3, adiponectin, angiotensinogen, and plasma retinol binding protein (RBP4) were quantified using isotope dilution analysis and compared to the standard ELISA method. MRM method variabilities were mainly below 10%. The comparison with other MS-based approaches showed a good correlation. However, large differences in absolute quantification for complement C3 and adiponectin were obtained compared to ELISA, while less marked differences were observed for angiotensinogen and RBP4. The verification of the MRM assay in obesity was performed, first, to discriminate lean from obese phenotypes and, second, to monitor excessive weight loss after gastric bypass surgery over a seven-month follow-up. The presented MRM assay was able to discriminate the obese phenotype from the lean one and to monitor weight-loss-related changes in the surrogate markers. However, inclusion of additional biomarkers was necessary to interpret the MRM data on the obesity phenotype properly. In summary, the development of disease-related MRMs should include a step of matching the MRM data with clinically approved standard methods and defining reference values in well-sized, representative, age-, gender-, and disease-matched cohorts.
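
    A minimal sketch of the isotope-dilution back-calculation that underlies this kind of MRM assay is given below; the peak areas, spike amount, and sample volume are invented, and real assays average several transitions per peptide and apply calibration curves.

```python
def concentration_by_isotope_dilution(light_area, heavy_area,
                                      spiked_heavy_fmol, sample_volume_ul):
    """Back-calculate the endogenous peptide amount from the light/heavy peak-area
    ratio and the known amount of spiked stable-isotope-labeled standard."""
    ratio = light_area / heavy_area
    endogenous_fmol = ratio * spiked_heavy_fmol
    return endogenous_fmol / sample_volume_ul   # fmol per microliter of serum

# Hypothetical transition areas for one proteotypic peptide (e.g. of adiponectin).
print(concentration_by_isotope_dilution(light_area=4.2e5, heavy_area=2.1e5,
                                        spiked_heavy_fmol=100.0, sample_volume_ul=10.0))
```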

  12. Simple, Fast, and Sensitive Method for Quantification of Tellurite in Culture Media▿

    PubMed Central

    Molina, Roberto C.; Burra, Radhika; Pérez-Donoso, José M.; Elías, Alex O.; Muñoz, Claudia; Montes, Rebecca A.; Chasteen, Thomas G.; Vásquez, Claudio C.

    2010-01-01

    A fast, simple, and reliable chemical method for tellurite quantification is described. The procedure is based on the NaBH₄-mediated reduction of TeO₃²⁻ followed by the spectrophotometric determination of elemental tellurium in solution. The method is highly reproducible, is stable at different pH values, and exhibits linearity over a broad range of tellurite concentrations. PMID:20525868

  13. Direct quantitative evaluation of disease symptoms on living plant leaves growing under natural light.

    PubMed

    Matsunaga, Tomoko M; Ogawa, Daisuke; Taguchi-Shiobara, Fumio; Ishimoto, Masao; Matsunaga, Sachihiro; Habu, Yoshiki

    2017-06-01

    Leaf color is an important indicator when evaluating plant growth and responses to biotic/abiotic stress. Acquisition of images with digital cameras allows analysis and long-term storage of the acquired data. However, under field conditions, where light intensity can fluctuate and other factors (shade, reflection, background, etc.) vary, stable and reproducible measurement and quantification of leaf color are hard to achieve. Digital scanners provide fixed conditions for obtaining image data, allowing stable and reliable comparison among samples, but they require detached plant material to capture images, and the destructive processes involved often induce deformation of the material (curled leaves, faded colors, etc.). In this study, by using a lightweight digital scanner connected to a mobile computer, we obtained digital image data from intact plant leaves grown in natural-light greenhouses without detaching the targets. We took images of soybean leaves infected by Xanthomonas campestris pv. glycines, and separately quantified two disease symptoms (brown lesions and yellow halos) using freely available image processing software. The image data were amenable to quantitative and statistical analyses, allowing precise and objective evaluation of disease resistance.
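
    As a rough illustration of how such image data can be quantified, the sketch below classifies leaf pixels into brown-lesion and yellow-halo classes with simple RGB thresholds; the thresholds and the synthetic test image are illustrative and are not those used by the authors.

```python
import numpy as np

def symptom_fractions(rgb, leaf_mask):
    """Fractions of leaf pixels classified as brown lesion or yellow halo.

    `rgb` is an H x W x 3 array of floats in [0, 1] from a scanned leaf and
    `leaf_mask` a boolean array marking leaf (vs. background) pixels.
    The colour thresholds are illustrative, not the published ones.
    """
    r, g, b = rgb[..., 0], rgb[..., 1], rgb[..., 2]
    brown = leaf_mask & (r < 0.5) & (g < 0.35) & (b < 0.25)            # dark brown lesions
    yellow = leaf_mask & (r > 0.6) & (g > 0.6) & (b < 0.35) & ~brown   # yellow halos
    leaf_area = max(int(leaf_mask.sum()), 1)
    return brown.sum() / leaf_area, yellow.sum() / leaf_area

# Tiny synthetic check: a 2 x 2 "leaf" with one brown and one yellow pixel.
demo = np.array([[[0.2, 0.6, 0.2], [0.3, 0.2, 0.1]],
                 [[0.8, 0.8, 0.2], [0.2, 0.6, 0.2]]])
mask = np.ones((2, 2), dtype=bool)
print(symptom_fractions(demo, mask))   # -> (0.25, 0.25)
```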

  14. Quantification of L-Citrulline and other physiologic amino acids in watermelon and selected cucurbits

    USDA-ARS?s Scientific Manuscript database

    High performance liquid chromatography of dabsyl derivatives of amino acids was employed for quantification of physiologic amino acids in cucurbits. This method is particularly useful because the dabsyl derivatives of glutamine and citrulline are sufficiently separated to allow quantification of ea...

  15. Interhemispheric Inhibition Measurement Reliability in Stroke: A Pilot Study

    PubMed Central

    Cassidy, Jessica M.; Chu, Haitao; Chen, Mo; Kimberley, Teresa J.; Carey, James R.

    2016-01-01

    Objective Reliable transcranial magnetic stimulation (TMS) measures for probing corticomotor excitability are important when assessing the physiological effects of non-invasive brain stimulation. The primary objective of this study was to examine test-retest reliability of an interhemispheric inhibition (IHI) index measurement in stroke. Materials and Methods Ten subjects with chronic stroke (≥ 6 months) completed two IHI testing sessions per week for three weeks (six testing sessions total). A single investigator measured IHI in the contra-to-ipsilesional primary motor cortex direction and in the opposite direction using bilateral paired-pulse TMS. The two sessions within each week were separated by 24 hours, with a 1-week washout period separating testing weeks. To determine whether the motor-evoked potential (MEP) quantification method affected measurement reliability, IHI indices were computed from both MEP amplitude and MEP area responses. Reliability was assessed with two-way, mixed intraclass correlation coefficients (ICC(3,k)). Standard error of measurement and minimal detectable difference statistics were also determined. Results With the exception of the initial testing week, IHI indices measured in the contra-to-ipsilesional hemisphere direction demonstrated moderate to excellent reliability (ICC = 0.725 to 0.913). Ipsi-to-contralesional IHI indices showed poor or invalid reliability estimates throughout the three-week testing duration (ICC = −1.153 to 0.105). The overlap of ICC 95% confidence intervals suggested that IHI indices based on MEP amplitude vs. area measures did not differ with respect to reliability. Conclusions IHI indices demonstrated varying magnitudes of reliability irrespective of MEP quantification method. Several strategies for improving IHI index measurement reliability are discussed. PMID:27333364
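
    The ICC(3,k) form used here (two-way mixed, average measures, consistency) can be computed directly from the subject-by-session matrix; the sketch below is a minimal implementation with made-up IHI values.

```python
import numpy as np

def icc_3k(scores):
    """ICC(3,k): two-way mixed effects, consistency, average of k measurements.

    `scores` is an (n_subjects x k_sessions) array, e.g. IHI indices per session.
    """
    x = np.asarray(scores, dtype=float)
    n, k = x.shape
    grand = x.mean()
    ss_subjects = k * ((x.mean(axis=1) - grand) ** 2).sum()
    ss_sessions = n * ((x.mean(axis=0) - grand) ** 2).sum()
    ss_total = ((x - grand) ** 2).sum()
    ms_subjects = ss_subjects / (n - 1)
    ms_error = (ss_total - ss_subjects - ss_sessions) / ((n - 1) * (k - 1))
    return (ms_subjects - ms_error) / ms_subjects

# Ten subjects x two sessions of hypothetical IHI indices (values are made up).
rng = np.random.default_rng(1)
true_level = rng.normal(0.5, 0.2, size=(10, 1))               # stable subject differences
sessions = true_level + rng.normal(0.0, 0.05, size=(10, 2))   # session-to-session noise
print(round(icc_3k(sessions), 3))
```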

  16. Identification and evaluation of reliable reference genes for quantitative real-time PCR analysis in tea plant (Camellia sinensis (L.) O. Kuntze)

    USDA-ARS?s Scientific Manuscript database

    Quantitative real-time polymerase chain reaction (qRT-PCR) is a commonly used technique for measuring gene expression levels due to its simplicity, specificity, and sensitivity. Reliable reference selection for the accurate quantification of gene expression under various experimental conditions is a...

  17. Validation of a DIXON-based fat quantification technique for the measurement of visceral fat using a CT-based reference standard.

    PubMed

    Heckman, Katherine M; Otemuyiwa, Bamidele; Chenevert, Thomas L; Malyarenko, Dariya; Derstine, Brian A; Wang, Stewart C; Davenport, Matthew S

    2018-06-27

    The purpose of the study was to determine whether a novel semi-automated DIXON-based fat quantification algorithm can reliably quantify visceral fat using a CT-based reference standard. This was an IRB-approved retrospective cohort study of 27 subjects who underwent abdominopelvic CT within 7 days of proton density fat fraction (PDFF) mapping on a 1.5 T MRI. Cross-sectional visceral fat area per slice (cm²) was measured in blinded fashion in each modality at intervertebral disc levels from T12 to L4. CT estimates were obtained using a previously published semi-automated computational image processing system that sums pixels with attenuation of −205 to −51 HU. MR estimates were obtained using two novel semi-automated DIXON-based fat quantification algorithms that measure visceral fat area by spatially regularizing non-uniform fat-only signal intensity or de-speckling 2D PDFF images and summing pixels with PDFF ≥ 50%. Pearson's correlations and Bland-Altman analyses were performed. Visceral fat area per slice ranged from 9.2 to 429.8 cm² for MR and from 1.6 to 405.5 cm² for CT. There was a strong correlation between CT and MR methods in measured visceral fat area across all studied vertebral body levels (r = 0.97; n = 101 observations); the lowest correlation (r = 0.93) was at T12. Bland-Altman analysis revealed a bias of 31.7 cm² (95% CI −27.1 to 90.4 cm²), indicating modestly higher visceral fat assessed by MR. MR- and CT-based visceral fat quantification are highly correlated and have good cross-modality reliability, indicating that visceral fat quantification by either method can yield a stable and reliable biomarker.
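
    A minimal sketch of the pixel-counting step (PDFF ≥ 50% inside a visceral mask) is shown below; the PDFF patch and pixel size are invented, and the spatial regularization and de-speckling of the published algorithms are omitted.

```python
import numpy as np

def visceral_fat_area(pdff_slice, visceral_mask, pixel_area_cm2):
    """Cross-sectional visceral fat area (cm^2) from a PDFF map with values in [0, 1].

    Pixels inside the visceral compartment with PDFF >= 0.5 count as fat, mirroring
    the >= 50% threshold described above.
    """
    fat_pixels = (pdff_slice >= 0.5) & visceral_mask
    return float(fat_pixels.sum()) * pixel_area_cm2

# Example with a synthetic 4 x 4 PDFF patch and a 1.5 x 1.5 mm pixel size.
pdff = np.array([[0.8, 0.6, 0.1, 0.0],
                 [0.7, 0.9, 0.2, 0.1],
                 [0.1, 0.3, 0.0, 0.0],
                 [0.0, 0.1, 0.0, 0.0]])
mask = np.ones_like(pdff, dtype=bool)
print(visceral_fat_area(pdff, mask, pixel_area_cm2=0.15 * 0.15))  # 4 fat pixels -> 0.09 cm^2
```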

  18. Label-free DNA quantification via a 'pipette, aggregate and blot' (PAB) approach with magnetic silica particles on filter paper.

    PubMed

    Li, Jingyi; Liu, Qian; Alsamarri, Hussein; Lounsbury, Jenny A; Haversitick, Doris M; Landers, James P

    2013-03-07

    Reliable measurement of DNA concentration is essential for a broad range of applications in biology and molecular biology, and for many of these, quantifying the nucleic acid content is inextricably linked to obtaining optimal results. In its most simplistic form, quantitative analysis of nucleic acids can be accomplished by UV-Vis absorbance and, in a more sophisticated format, by fluorimetry. A recently reported new concept, the 'pinwheel assay', involves a label-free approach for quantifying DNA through aggregation of paramagnetic beads in a rotating magnetic field. Here, we describe a simplified version of that assay adapted for execution using only a pipette and filter paper. The 'pipette, aggregate, and blot' (PAB) approach allows DNA to induce bead aggregation in a pipette tip through exposure to a magnetic field, followed by dispensing (blotting) onto filter paper. The filter paper immortalizes the extent of aggregation, and digital images of the immortalized bead conformation, acquired with either a document scanner or a cell phone camera, allow for DNA quantification using a noncomplex algorithm. Human genomic DNA samples extracted from blood are quantified with the PAB approach and the results utilized to define the volume of sample used in a PCR reaction that is sensitive to the input mass of template DNA. Integrating the PAB assay with paper-based DNA extraction and detection modalities has the potential to yield 'DNA quant-on-paper' devices that may be useful for point-of-care testing.

  19. Evaluation of municipal solid waste management performance by material flow analysis: Theoretical approach and case study.

    PubMed

    Zaccariello, Lucio; Cremiato, Raffaele; Mastellone, Maria Laura

    2015-10-01

    The main role of a waste management plan is to define the combination of waste management strategies and methods needed to collect and manage waste in such a way that a given set of targets is reached. Objectives have to be sustainable and realistic, consistent with environmental policies and regulations, and monitored to verify the progressive achievement of the given targets. To this end, the setting up and quantification of indicators allows the efficiency of a waste management system to be measured. The quantification of efficiency indicators requires the development of a material flow analysis over the system boundary, from waste collection to secondary materials selling, processing and disposal. The material flow analysis has been carried out with reference to a case study for which a reliable, time- and site-specific database was available. The material flow analysis allowed the evaluation of the amount of materials sent to recycling, to landfilling and to waste-to-energy, highlighting that the sorting of residual waste can further increase the amount of secondary materials. The utilisation of energy recovery to treat the low-grade waste allows the maximisation of waste diversion from landfill with a low production of hazardous ash. A preliminary economic balance has been carried out to define the gate fee of the waste management system, which was in the range of 84-145 € t⁻¹ without including the separate collection cost. The cost of door-to-door separate collection, designed to ensure the collection of five separate streams, resulted in 250 € t⁻¹ ±30%. © The Author(s) 2015.

  20. In-line multipoint near-infrared spectroscopy for moisture content quantification during freeze-drying.

    PubMed

    Kauppinen, Ari; Toiviainen, Maunu; Korhonen, Ossi; Aaltonen, Jaakko; Järvinen, Kristiina; Paaso, Janne; Juuti, Mikko; Ketolainen, Jarkko

    2013-02-19

    During the past decade, near-infrared (NIR) spectroscopy has been applied for in-line moisture content quantification during a freeze-drying process. However, NIR has been used as a single-vial technique and thus is not representative of the entire batch. This has been considered one of the main barriers to NIR spectroscopy becoming widely used in process analytical technology (PAT) for freeze-drying. Clearly, it would be essential to monitor samples that reliably represent the whole batch. The present study evaluated multipoint NIR spectroscopy for in-line moisture content quantification during a freeze-drying process. Aqueous sucrose solutions were used as model formulations. NIR data were calibrated to predict the moisture content using partial least-squares (PLS) regression, with Karl Fischer titration used as the reference method. PLS calibrations resulted in root-mean-square error of prediction (RMSEP) values lower than 0.13%. Three noncontact, diffuse reflectance NIR probe heads were positioned on the freeze-dryer shelf to measure the moisture content in a noninvasive manner, through the side of the glass vials. The results showed that the detection of unequal sublimation rates within a freeze-dryer shelf was possible with the multipoint NIR system in use. Furthermore, in-line moisture content quantification was reliable, especially toward the end of the process. These findings indicate that the use of multipoint NIR spectroscopy can achieve representative quantification of moisture content and hence a drying end-point determination to a desired residual moisture level.
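
    A minimal sketch of a PLS moisture calibration with an RMSEP check is given below; the spectra and reference values are synthetic, so it only mirrors the structure of the calibration, not the actual data treatment.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import train_test_split

# Synthetic stand-in data: rows are NIR spectra, y is Karl Fischer moisture content (%).
rng = np.random.default_rng(0)
spectra = rng.normal(size=(60, 200))                      # 60 vials x 200 wavelengths
moisture = rng.uniform(0.1, 3.0, size=60)                 # reference moisture values
spectra += np.outer(moisture, np.linspace(0, 1, 200))     # embed a moisture-related signal

X_cal, X_val, y_cal, y_val = train_test_split(spectra, moisture, random_state=0)
pls = PLSRegression(n_components=3).fit(X_cal, y_cal)
pred = pls.predict(X_val).ravel()
rmsep = np.sqrt(np.mean((pred - y_val) ** 2))
print(f"RMSEP = {rmsep:.3f} % moisture")
```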

  1. Integrating Internal Standards into Disposable Capillary Electrophoresis Devices To Improve Quantification

    PubMed Central

    2017-01-01

    To improve point-of-care quantification using microchip capillary electrophoresis (MCE), the chip-to-chip variabilities inherent in disposable, single-use devices must be addressed. This work proposes to integrate an internal standard (ISTD) into the microchip by adding it to the background electrolyte (BGE) instead of the sample—thus eliminating the need for additional sample manipulation, microchip redesigns, and/or system expansions required for traditional ISTD usage. Cs and Li ions were added as integrated ISTDs to the BGE, and their effects on the reproducibility of Na quantification were explored. Results were then compared to the conclusions of our previous publication which used Cs and Li as traditional ISTDs. The in-house fabricated microchips, electrophoretic protocols, and solution matrixes were kept constant, allowing the proposed method to be reliably compared to the traditional method. Using the integrated ISTDs, both Cs and Li improved the Na peak area reproducibility approximately 2-fold, to final RSD values of 2.2–4.7% (n = 900). In contrast (to previous work), Cs as a traditional ISTD resulted in final RSDs of 2.5–8.8%, while the traditional Li ISTD performed poorly with RSDs of 6.3–14.2%. These findings suggest integrated ISTDs are a viable method to improve the precision of disposable MCE devices—giving matched or superior results to the traditional method in this study while neither increasing system cost nor complexity. PMID:28192985
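
    The effect of ratioing the analyte signal to an ISTD can be illustrated with a short calculation; the Na and Cs peak areas below are invented and chosen only to show how correlated run-to-run variation cancels in the ratio.

```python
import numpy as np

def rsd(values):
    """Relative standard deviation in percent."""
    values = np.asarray(values, dtype=float)
    return 100.0 * values.std(ddof=1) / values.mean()

# Hypothetical Na peak areas from repeated runs on disposable chips, with the
# Cs internal standard present in the background electrolyte of every run.
na_area = np.array([1050., 980., 1120., 1010., 1095.])
cs_area = np.array([2100., 1975., 2230., 2040., 2180.])

print(f"raw Na RSD:         {rsd(na_area):.1f} %")
print(f"ISTD-corrected RSD: {rsd(na_area / cs_area):.1f} %")
```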

  2. Flow Cytometry: Evolution of Microbiological Methods for Probiotics Enumeration.

    PubMed

    Pane, Marco; Allesina, Serena; Amoruso, Angela; Nicola, Stefania; Deidda, Francesca; Mogna, Luca

    2018-05-14

    The purpose of this trial was to verify that the analytical method ISO 19344:2015 (E)-IDF 232:2015 (E) is valid and reliable for quantifying the concentration of the probiotic Lactobacillus rhamnosus GG (ATCC 53103) in a finished product formulation. Flow cytometry assay is emerging as an alternative rapid method for microbial detection, enumeration, and population profiling. The use of flow cytometry not only permits the determination of viable cell counts but also allows for enumeration of damaged and dead cell subpopulations. Results are expressed as TFU (Total Fluorescent Units) and AFU (Active Fluorescent Units). In December 2015, the International Standard ISO 19344-IDF 232 "Milk and milk products-Starter cultures, probiotics and fermented products-Quantification of lactic acid bacteria by flow cytometry" was published. This particular ISO standard can be applied universally, regardless of the species of interest. Analytical method validation was conducted on 3 different industrial batches of L. rhamnosus GG according to USP39<1225>/ICH Q2R1 in terms of accuracy, precision (repeatability), intermediate precision (ruggedness), specificity, limit of quantification, linearity, range, and robustness. The data obtained on the 3 batches of finished product clearly demonstrated the validity and robustness of the cytofluorimetric analysis. On the basis of the results obtained, the ISO 19344:2015 (E)-IDF 232:2015 (E) "Quantification of lactic acid bacteria by flow cytometry" method can be used for the enumeration of L. rhamnosus GG in a finished product formulation.

  3. Non-invasive, non-radiological quantification of anteroposterior knee joint ligamentous laxity

    PubMed Central

    Russell, D. F.; Deakin, A. H.; Fogg, Q. A.; Picard, F.

    2013-01-01

    Objectives We performed in vitro validation of a non-invasive skin-mounted system that could allow quantification of anteroposterior (AP) laxity in the outpatient setting. Methods A total of 12 cadaveric lower limbs were tested with a commercial image-free navigation system using trackers secured by bone screws. We then tested a non-invasive fabric-strap system. The lower limb was secured at 10° intervals from 0° to 60° of knee flexion and 100 N of force was applied perpendicular to the tibia. An acceptable coefficient of repeatability (CR) and limits of agreement (LOA) of 3 mm were set based on diagnostic criteria for anterior cruciate ligament (ACL) insufficiency. Results Reliability and precision within the individual invasive and non-invasive systems were acceptable throughout the range of flexion tested (intra-class correlation coefficient 0.88, CR 1.6 mm). Agreement between the two systems was acceptable when measuring AP laxity between full extension and 40° of knee flexion (LOA 2.9 mm). Beyond 40° of flexion, agreement between the systems was unacceptable (LOA > 3 mm). Conclusions These results indicate that from full knee extension to 40° of flexion, non-invasive navigation-based quantification of AP tibial translation is as accurate as the standard validated commercial system, particularly in the clinically and functionally important range of 20° to 30° of knee flexion. This could be useful in diagnosis and post-operative evaluation of ACL pathology. Cite this article: Bone Joint Res 2013;2:233–7. PMID:24184443
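
    A minimal sketch of the agreement statistics used here (Bland-Altman bias with 95% limits of agreement, and a coefficient of repeatability from test-retest differences) is given below with invented laxity values.

```python
import numpy as np

def bland_altman(system_a, system_b):
    """Bias and 95% limits of agreement (LOA) between two measurement systems."""
    diff = np.asarray(system_a, float) - np.asarray(system_b, float)
    bias = diff.mean()
    half_width = 1.96 * diff.std(ddof=1)
    return bias, (bias - half_width, bias + half_width)

def coefficient_of_repeatability(test, retest):
    """CR for repeated measurements with one system: 1.96 x SD of the differences."""
    diff = np.asarray(test, float) - np.asarray(retest, float)
    return 1.96 * diff.std(ddof=1)

# Hypothetical AP translation (mm) at 20 degrees of flexion, invasive vs. strap system.
invasive = [6.1, 7.4, 5.2, 8.0, 6.6, 7.1]
strap    = [6.5, 7.0, 5.8, 8.4, 6.3, 7.5]
print(bland_altman(invasive, strap))
print(coefficient_of_repeatability(invasive, [6.3, 7.2, 5.4, 7.9, 6.8, 7.0]))
```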

  4. The characterization and certification of a quantitative reference material for Legionella detection and quantification by qPCR.

    PubMed

    Baume, M; Garrelly, L; Facon, J P; Bouton, S; Fraisse, P O; Yardin, C; Reyrolle, M; Jarraud, S

    2013-06-01

    This study describes the characterization and certification of a Legionella DNA quantitative reference material as a primary measurement standard for Legionella qPCR. Twelve laboratories participated in a collaborative certification campaign. A candidate reference DNA material was analysed through PCR-based limiting dilution assays (LDAs). The validated data were used to statistically assign both a reference value and an associated uncertainty to the reference material. This LDA method allowed the direct quantification of the amount of Legionella DNA per tube in genomic units (GU) and the determination of the associated uncertainties. This method could be used for the certification of all types of microbiological standards for qPCR. The use of this primary standard will improve the accuracy of Legionella qPCR measurements and the overall consistency of these measurements among different laboratories. The extensive use of this certified reference material (CRM) has been integrated into the French standard NF T90-471 (April 2010) and into ISO Technical Specification 12869 (Anon 2012 International Standardisation Organisation) for validating qPCR methods and ensuring the reliability of these methods. © 2013 The Society for Applied Microbiology.
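
    The core of a Poisson limiting dilution calculation is sketched below; the replicate counts and dilution factor are invented, and the full certification additionally pools dilutions and propagates uncertainties across laboratories.

```python
import math

def genomic_units_per_tube(n_negative, n_replicates, dilution_factor):
    """Poisson estimate of genomic units (GU) in the undiluted tube from a
    limiting dilution assay: lambda = -ln(fraction of negative replicates)."""
    p_negative = n_negative / n_replicates
    if p_negative == 0 or p_negative == 1:
        raise ValueError("need a dilution with both positive and negative replicates")
    gu_per_reaction = -math.log(p_negative)
    return gu_per_reaction * dilution_factor

# e.g. 14 of 24 qPCR replicates negative at a hypothetical 1:100,000 dilution
print(f"{genomic_units_per_tube(14, 24, 1e5):.2e} GU per tube")
```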

  5. Uncertainty propagation of p-boxes using sparse polynomial chaos expansions

    NASA Astrophysics Data System (ADS)

    Schöbi, Roland; Sudret, Bruno

    2017-06-01

    In modern engineering, physical processes are modelled and analysed using advanced computer simulations, such as finite element models. Furthermore, concepts of reliability analysis and robust design are becoming popular, hence, making efficient quantification and propagation of uncertainties an important aspect. In this context, a typical workflow includes the characterization of the uncertainty in the input variables. In this paper, input variables are modelled by probability-boxes (p-boxes), accounting for both aleatory and epistemic uncertainty. The propagation of p-boxes leads to p-boxes of the output of the computational model. A two-level meta-modelling approach is proposed using non-intrusive sparse polynomial chaos expansions to surrogate the exact computational model and, hence, to facilitate the uncertainty quantification analysis. The capabilities of the proposed approach are illustrated through applications using a benchmark analytical function and two realistic engineering problem settings. They show that the proposed two-level approach allows for an accurate estimation of the statistics of the response quantity of interest using a small number of evaluations of the exact computational model. This is crucial in cases where the computational costs are dominated by the runs of high-fidelity computational models.
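
    A deliberately simplified, one-dimensional, non-sparse sketch of the non-intrusive least-squares PCE idea is shown below; sparsity, multi-dimensional inputs, and the two-level p-box propagation of the paper are omitted.

```python
import numpy as np
from numpy.polynomial.hermite_e import hermevander

def model(x):
    """Stand-in for an expensive computational model with a standard normal input."""
    return np.sin(x) + 0.1 * x ** 3

# Non-intrusive PCE: fit probabilists' Hermite coefficients by least squares
# on a small experimental design of exact model evaluations.
rng = np.random.default_rng(0)
x_train = rng.standard_normal(40)
y_train = model(x_train)
degree = 6
coeffs, *_ = np.linalg.lstsq(hermevander(x_train, degree), y_train, rcond=None)

# Cheap surrogate evaluations, e.g. to propagate many input samples afterwards.
x_new = rng.standard_normal(100_000)
y_surrogate = hermevander(x_new, degree) @ coeffs
print(y_surrogate.mean(), y_surrogate.std())
```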

  6. Uncertainty propagation of p-boxes using sparse polynomial chaos expansions

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schöbi, Roland, E-mail: schoebi@ibk.baug.ethz.ch; Sudret, Bruno, E-mail: sudret@ibk.baug.ethz.ch

    2017-06-15

    In modern engineering, physical processes are modelled and analysed using advanced computer simulations, such as finite element models. Furthermore, concepts of reliability analysis and robust design are becoming popular, hence, making efficient quantification and propagation of uncertainties an important aspect. In this context, a typical workflow includes the characterization of the uncertainty in the input variables. In this paper, input variables are modelled by probability-boxes (p-boxes), accounting for both aleatory and epistemic uncertainty. The propagation of p-boxes leads to p-boxes of the output of the computational model. A two-level meta-modelling approach is proposed using non-intrusive sparse polynomial chaos expansions to surrogate the exact computational model and, hence, to facilitate the uncertainty quantification analysis. The capabilities of the proposed approach are illustrated through applications using a benchmark analytical function and two realistic engineering problem settings. They show that the proposed two-level approach allows for an accurate estimation of the statistics of the response quantity of interest using a small number of evaluations of the exact computational model. This is crucial in cases where the computational costs are dominated by the runs of high-fidelity computational models.

  7. Wireless, intraoral hybrid electronics for real-time quantification of sodium intake toward hypertension management.

    PubMed

    Lee, Yongkuk; Howe, Connor; Mishra, Saswat; Lee, Dong Sup; Mahmood, Musa; Piper, Matthew; Kim, Youngbin; Tieu, Katie; Byun, Hun-Soo; Coffey, James P; Shayan, Mahdis; Chun, Youngjae; Costanzo, Richard M; Yeo, Woon-Hong

    2018-05-22

    Recent wearable devices offer portable monitoring of biopotentials, heart rate, or physical activity, allowing for active management of human health and wellness. Such systems can be inserted in the oral cavity for measuring food intake in regard to controlling eating behavior, directly related to diseases such as hypertension, diabetes, and obesity. However, existing devices using plastic circuit boards and rigid sensors are not ideal for oral insertion. A user-comfortable system for the oral cavity requires an ultrathin, low-profile, and soft electronic platform along with miniaturized sensors. Here, we introduce a stretchable hybrid electronic system that has an exceptionally small form factor, enabling a long-range wireless monitoring of sodium intake. Computational study of flexible mechanics and soft materials provides fundamental aspects of key design factors for a tissue-friendly configuration, incorporating a stretchable circuit and sensor. Analytical calculation and experimental study enables reliable wireless circuitry that accommodates dynamic mechanical stress. Systematic in vitro modeling characterizes the functionality of a sodium sensor in the electronics. In vivo demonstration with human subjects captures the device feasibility for real-time quantification of sodium intake, which can be used to manage hypertension.

  8. Methane–oxygen electrochemical coupling in an ionic liquid: a robust sensor for simultaneous quantification†

    PubMed Central

    Wang, Zhe; Guo, Min; Baker, Gary A.; Stetter, Joseph R.; Lin, Lu; Mason, Andrew J.

    2017-01-01

    Current sensor devices for the detection of methane or natural gas emission are either expensive and have high power requirements or fail to provide a rapid response. This report describes an electrochemical methane sensor utilizing a non-volatile and conductive pyrrolidinium-based ionic liquid (IL) electrolyte and an innovative internal standard method for methane and oxygen dual-gas detection with high sensitivity, selectivity, and stability. At a platinum electrode in bis(trifluoromethylsulfonyl)imide (NTf2)-based ILs, methane is electro-oxidized to produce CO2 and water when an oxygen reduction process is included. The in situ generated CO2 arising from methane oxidation was shown to provide an excellent internal standard for quantification of the electrochemical oxygen sensor signal. The simultaneous quantification of both methane and oxygen in real time strengthens the reliability of the measurements by cross-validation of two ambient gases occurring within a single sample matrix and allows for the elimination of several types of random and systematic errors in the detection. We have also validated this IL-based methane sensor employing both conventional solid macroelectrodes and flexible microfabricated electrodes using single- and double-potential step chronoamperometry. PMID:25093213

  9. High-throughput flow alignment of barcoded hydrogel microparticles†

    PubMed Central

    Chapin, Stephen C.; Pregibon, Daniel C.

    2010-01-01

    Suspension (particle-based) arrays offer several advantages over conventional planar arrays in the detection and quantification of biomolecules, including the use of smaller sample volumes, more favorable probe-target binding kinetics, and rapid probe-set modification. We present a microfluidic system for the rapid alignment of multifunctional hydrogel microparticles designed to bear one or several biomolecule probe regions, as well as a graphical code to identify the embedded probes. Using high-speed imaging, we have developed and optimized a flow-through system that (1) allows for a high particle throughput, (2) ensures proper particle alignment for decoding and target quantification, and (3) can be reliably operated continuously without clogging. A tapered channel flanked by side focusing streams is used to orient the flexible, tablet-shaped particles into a well-ordered flow in the center of the channel. The effects of channel geometry, particle geometry, particle composition, particle loading density, and barcode design are explored to determine the best combination for eventual use in biological assays. Particles in the optimized system move at velocities of ~50 cm s−1 and with throughputs of ~40 particles s−1. Simple physical models and CFD simulations have been used to investigate flow behavior in the device. PMID:19823726

  10. A Python Interface for the Dakota Iterative Systems Analysis Toolkit

    NASA Astrophysics Data System (ADS)

    Piper, M.; Hutton, E.; Syvitski, J. P.

    2016-12-01

    Uncertainty quantification is required to improve the accuracy, reliability, and accountability of Earth science models. Dakota is a software toolkit, developed at Sandia National Laboratories, that provides an interface between models and a library of analysis methods, including support for sensitivity analysis, uncertainty quantification, optimization, and calibration techniques. Dakota is a powerful tool, but its learning curve is steep: the user not only must understand the structure and syntax of the Dakota input file, but also must develop intermediate code, called an analysis driver, that allows Dakota to run a model. The CSDMS Dakota interface (CDI) is a Python package that wraps and extends Dakota's user interface. It simplifies the process of configuring and running a Dakota experiment. A user can program to the CDI, allowing a Dakota experiment to be scripted. The CDI creates Dakota input files and provides a generic analysis driver. Any model written in Python that exposes a Basic Model Interface (BMI), as well as any model componentized in the CSDMS modeling framework, automatically works with the CDI. The CDI has a plugin architecture, so models written in other languages, or those that don't expose a BMI, can be accessed by the CDI by programmatically extending a template; an example is provided in the CDI distribution. Currently, six analysis methods from the much larger Dakota library have been implemented as examples. To demonstrate the CDI, we performed an uncertainty quantification experiment with the HydroTrend hydrological water balance and transport model. In the experiment, we evaluated the response of long-term suspended sediment load at the river mouth (Qs) to uncertainty in two input parameters, annual mean temperature (T) and precipitation (P), over a series of 100-year runs, using the polynomial chaos method. Through Dakota, we calculated moments, local and global (Sobol') sensitivity indices, and probability density and cumulative distribution functions for the response.
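
    Dakota communicates with an external model through a parameters file it writes and a results file it reads back, which is what an analysis driver mediates; the sketch below is a simplified stand-alone driver in which the parameter-file parsing, the descriptors T and P, and the run_model stand-in for HydroTrend are all assumptions rather than the CDI's actual generic driver.

```python
"""Schematic Dakota analysis driver (invoked as: driver.py params.in results.out).

Assumes a simplified 'value descriptor' layout for the parameters file and a
single response; a real driver handles the full Dakota file format.
"""
import sys

def run_model(temperature, precipitation):
    # Stand-in for launching HydroTrend and extracting long-term Qs from its output.
    return 120.0 + 8.0 * temperature + 30.0 * precipitation

def main(params_path, results_path):
    values = {}
    with open(params_path) as f:
        for line in f:
            parts = line.split()
            if len(parts) == 2:
                value, descriptor = parts
                try:
                    values[descriptor] = float(value)
                except ValueError:
                    pass                      # skip header/bookkeeping lines
    qs = run_model(values.get("T", 14.0), values.get("P", 1.5))
    with open(results_path, "w") as f:
        f.write(f"{qs:.6e} Qs\n")             # one response value per line

if __name__ == "__main__":
    main(sys.argv[1], sys.argv[2])
```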

  11. Rapid quantification and sex determination of forensic evidence materials.

    PubMed

    Andréasson, Hanna; Allen, Marie

    2003-11-01

    DNA quantification of forensic evidence is very valuable for an optimal use of the available biological material. Moreover, sex determination is of great importance as additional information in criminal investigations as well as in identification of missing persons, no suspect cases, and ancient DNA studies. While routine forensic DNA analysis based on short tandem repeat markers includes a marker for sex determination, analysis of samples containing scarce amounts of DNA is often based on mitochondrial DNA, and sex determination is not performed. In order to allow quantification and simultaneous sex determination on minute amounts of DNA, an assay based on real-time PCR analysis of a marker within the human amelogenin gene has been developed. The sex determination is based on melting curve analysis, while an externally standardized kinetic analysis allows quantification of the nuclear DNA copy number in the sample. This real-time DNA quantification assay has proven to be highly sensitive, enabling quantification of single DNA copies. Although certain limitations were apparent, the system is a rapid, cost-effective, and flexible assay for analysis of forensic casework samples.
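
    Kinetic quantification against an external standard curve reduces to inverting a Ct-versus-log10(copies) fit; the sketch below uses invented Ct values and is only a schematic of that calculation.

```python
import numpy as np

def copies_from_ct(ct, standard_cts, standard_log10_copies):
    """Estimate DNA copy number from a Ct value using an external standard curve.

    Fits Ct = slope * log10(copies) + intercept on serially diluted standards,
    then inverts the fit for the unknown sample (all values are illustrative).
    """
    slope, intercept = np.polyfit(standard_log10_copies, standard_cts, 1)
    efficiency = 10 ** (-1.0 / slope) - 1.0      # amplification efficiency check
    log10_copies = (ct - intercept) / slope
    return 10 ** log10_copies, efficiency

standards_log10 = np.array([1, 2, 3, 4, 5], dtype=float)      # 10 .. 100,000 copies
standards_ct = np.array([36.5, 33.2, 29.8, 26.4, 23.1])
print(copies_from_ct(31.5, standards_ct, standards_log10))
```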

  12. Sensitivity and specificity of human brain glutathione concentrations measured using short-TE (1)H MRS at 7 T.

    PubMed

    Deelchand, Dinesh K; Marjańska, Małgorzata; Hodges, James S; Terpstra, Melissa

    2016-05-01

    Although the MR editing techniques that have traditionally been used for the measurement of glutathione (GSH) concentrations in vivo address the problem of spectral overlap, they suffer detriments associated with inherently long TEs. The purpose of this study was to characterize the sensitivity and specificity for the quantification of GSH concentrations without editing at short TE. The approach was to measure synthetically generated changes in GSH concentrations from in vivo stimulated echo acquisition mode (STEAM) spectra after in vitro GSH spectra had been added to or subtracted from them. Spectra from five test subjects were synthetically altered to mimic changes in the GSH signal. To account for different background noise between measurements, retest spectra (from the same individuals as used to generate the altered data) and spectra from five other individuals were compared with the synthetically altered spectra to investigate the reliability of the quantification of GSH concentration. Using STEAM spectroscopy at 7 T, GSH concentration differences on the order of 20% were detected between test and retest studies, as well as between differing populations in a small sample (n = 5) with high accuracy (R(2) > 0.99) and certainty (p ≤ 0.01). Both increases and decreases in GSH concentration were reliably quantified with small impact on the quantification of ascorbate and γ-aminobutyric acid. These results show the feasibility of using short-TE (1)H MRS to measure biologically relevant changes and differences in human brain GSH concentration. Although these outcomes are specific to the experimental approach used and the spectral quality achieved, this study serves as a template for the analogous scrutiny of quantification reliability for other compounds, methodologies and spectral qualities. Copyright © 2016 John Wiley & Sons, Ltd.

  13. Accuracy and Precision of Radioactivity Quantification in Nuclear Medicine Images

    PubMed Central

    Frey, Eric C.; Humm, John L.; Ljungberg, Michael

    2012-01-01

    The ability to reliably quantify activity in nuclear medicine has a number of increasingly important applications. Dosimetry for targeted therapy treatment planning or for approval of new imaging agents requires accurate estimation of the activity in organs, tumors, or voxels at several imaging time points. Another important application is the use of quantitative metrics derived from images, such as the standard uptake value commonly used in positron emission tomography (PET), to diagnose and follow treatment of tumors. These measures require quantification of organ or tumor activities in nuclear medicine images. However, there are a number of physical, patient, and technical factors that limit the quantitative reliability of nuclear medicine images. There have been a large number of improvements in instrumentation, including the development of hybrid single-photon emission computed tomography/computed tomography and PET/computed tomography systems, and reconstruction methods, including the use of statistical iterative reconstruction methods, which have substantially improved the ability to obtain reliable quantitative information from planar, single-photon emission computed tomography, and PET images. PMID:22475429

  14. Reliability and discriminatory power of methods for dental plaque quantification

    PubMed Central

    RAGGIO, Daniela Prócida; BRAGA, Mariana Minatel; RODRIGUES, Jonas Almeida; FREITAS, Patrícia Moreira; IMPARATO, José Carlos Pettorossi; MENDES, Fausto Medeiros

    2010-01-01

    Objective This in situ study evaluated the discriminatory power and reliability of methods of dental plaque quantification and the relationship between visual indices (VI) and fluorescence camera (FC) to detect plaque. Material and Methods Six volunteers used palatal appliances with six bovine enamel blocks presenting different stages of plaque accumulation. The presence of plaque with and without disclosing was assessed using VI. Images were obtained with FC and digital camera in both conditions. The area covered by plaque was assessed. Examinations were done by two independent examiners. Data were analyzed by Kruskal-Wallis and Kappa tests to compare different conditions of samples and to assess the inter-examiner reproducibility. Results Some methods presented adequate reproducibility. The Turesky index and the assessment of area covered by disclosed plaque in the FC images presented the highest discriminatory powers. Conclusions The Turesky index and images with FC with disclosing present good reliability and discriminatory power in quantifying dental plaque. PMID:20485931

  15. Real-time PCR Machine System Modeling and a Systematic Approach for the Robust Design of a Real-time PCR-on-a-Chip System

    PubMed Central

    Lee, Da-Sheng

    2010-01-01

    Chip-based DNA quantification systems are widespread, and used in many point-of-care applications. However, instruments for such applications may not be maintained or calibrated regularly. Since machine reliability is a key issue for normal operation, this study presents a system model of the real-time Polymerase Chain Reaction (PCR) machine to analyze the instrument design through numerical experiments. Based on model analysis, a systematic approach was developed to lower the variation of DNA quantification and achieve a robust design for a real-time PCR-on-a-chip system. Accelerated life testing was adopted to evaluate the reliability of the chip prototype. According to the life test plan, this proposed real-time PCR-on-a-chip system was simulated to work continuously for over three years with similar reproducibility in DNA quantification. This not only shows the robustness of the lab-on-a-chip system, but also verifies the effectiveness of our systematic method for achieving a robust design. PMID:22315563

  16. An ultra-high pressure liquid chromatography-tandem mass spectrometry method for the quantification of teicoplanin in plasma of neonates.

    PubMed

    Begou, O; Kontou, A; Raikos, N; Sarafidis, K; Roilides, E; Papadoyannis, I N; Gika, H G

    2017-03-15

    The development and validation of an ultra-high pressure liquid chromatography (UHPLC) tandem mass spectrometry (MS/MS) method was performed with the aim of applying it to the quantification of plasma teicoplanin concentrations in neonates. Pharmacokinetic data on teicoplanin in the neonatal population are very limited; therefore, a sensitive and reliable method for the determination of all isoforms of teicoplanin in a low volume of sample is of real importance. The main teicoplanin components were extracted by a simple acetonitrile precipitation step and analysed on a C18 chromatographic column by a triple quadrupole MS with electrospray ionization. The method provides quantitative data over a linear range of 25-6400 ng/mL, with an LOD of 8.5 ng/mL and an LOQ of 25 ng/mL for total teicoplanin. The method was applied to plasma samples from neonates to support pharmacokinetic data and proved to be a reliable and fast method for the quantification of teicoplanin concentration levels in the plasma of infants during therapy in the Intensive Care Unit. Copyright © 2016 Elsevier B.V. All rights reserved.

  17. Rapid Development and Validation of Improved Reversed-Phase High-performance Liquid Chromatography Method for the Quantification of Mangiferin, a Polyphenol Xanthone Glycoside in Mangifera indica

    PubMed Central

    Naveen, P.; Lingaraju, H. B.; Prasad, K. Shyam

    2017-01-01

    Mangiferin, a polyphenolic xanthone glycoside from Mangifera indica, is used as a traditional medicine for the treatment of numerous diseases. The present study aimed to develop and validate a reversed-phase high-performance liquid chromatography (RP-HPLC) method for the quantification of mangiferin from the bark extract of M. indica. RP-HPLC analysis was performed by isocratic elution with a low-pressure gradient using 0.1% formic acid:acetonitrile (87:13) as the mobile phase at a flow rate of 1.5 ml/min. The separation was carried out at 26°C using a Kinetex XB-C18 column as the stationary phase, with detection at 256 nm. The proposed method was validated for linearity, precision, accuracy, limit of detection, limit of quantification, and robustness according to the International Conference on Harmonisation guidelines. For linearity, a correlation coefficient greater than 0.999 indicated good fitting of the calibration curve. The intra- and inter-day precision showed less than 1% relative standard deviation of the peak area, indicating high reliability and reproducibility of the method. The recovery values at three different spiking levels (50%, 100%, and 150%) were 100.47%, 100.89%, and 100.99%, respectively, and the low standard deviation (below 1%) indicates the high accuracy of the method. In the robustness testing, the results remained unaffected by small variations in the analytical parameters, confirming the robustness of the method. Liquid chromatography–mass spectrometry analysis confirmed the presence of mangiferin with an M/Z value of 421. The developed HPLC assay is simple, rapid, and reliable for the determination of mangiferin from M. indica. SUMMARY The present study was intended to develop and validate an RP-HPLC method for the quantification of mangiferin from the bark extract of M. indica. The developed method was validated for linearity, precision, accuracy, limit of detection, limit of quantification, and robustness according to the International Conference on Harmonisation guidelines. This study showed that the developed HPLC assay is simple, rapid, and reliable for the quantification of mangiferin from M. indica. Abbreviations Used: M. indica: Mangifera indica, RP-HPLC: Reversed-phase high-performance liquid chromatography, M/Z: Mass-to-charge ratio, ICH: International Conference on Harmonisation, % RSD: Percentage relative standard deviation, ppm: Parts per million, LOD: Limit of detection, LOQ: Limit of quantification. PMID:28539748

  18. Rapid Development and Validation of Improved Reversed-Phase High-performance Liquid Chromatography Method for the Quantification of Mangiferin, a Polyphenol Xanthone Glycoside in Mangifera indica.

    PubMed

    Naveen, P; Lingaraju, H B; Prasad, K Shyam

    2017-01-01

    Mangiferin, a polyphenolic xanthone glycoside from Mangifera indica, is used as a traditional medicine for the treatment of numerous diseases. The present study aimed to develop and validate a reversed-phase high-performance liquid chromatography (RP-HPLC) method for the quantification of mangiferin from the bark extract of M. indica. RP-HPLC analysis was performed by isocratic elution with a low-pressure gradient using 0.1% formic acid:acetonitrile (87:13) as the mobile phase at a flow rate of 1.5 ml/min. The separation was carried out at 26°C using a Kinetex XB-C18 column as the stationary phase, with detection at 256 nm. The proposed method was validated for linearity, precision, accuracy, limit of detection, limit of quantification, and robustness according to the International Conference on Harmonisation guidelines. For linearity, a correlation coefficient greater than 0.999 indicated good fitting of the calibration curve. The intra- and inter-day precision showed less than 1% relative standard deviation of the peak area, indicating high reliability and reproducibility of the method. The recovery values at three different spiking levels (50%, 100%, and 150%) were 100.47%, 100.89%, and 100.99%, respectively, and the low standard deviation (below 1%) indicates the high accuracy of the method. In the robustness testing, the results remained unaffected by small variations in the analytical parameters, confirming the robustness of the method. Liquid chromatography-mass spectrometry analysis confirmed the presence of mangiferin with an M/Z value of 421. The developed HPLC assay is simple, rapid, and reliable for the determination of mangiferin from M. indica. The present study was intended to develop and validate an RP-HPLC method for the quantification of mangiferin from the bark extract of M. indica. The developed method was validated for linearity, precision, accuracy, limit of detection, limit of quantification, and robustness according to the International Conference on Harmonisation guidelines. This study showed that the developed HPLC assay is simple, rapid, and reliable for the quantification of mangiferin from M. indica. Abbreviations Used: M. indica: Mangifera indica, RP-HPLC: Reversed-phase high-performance liquid chromatography, M/Z: Mass-to-charge ratio, ICH: International Conference on Harmonisation, % RSD: Percentage relative standard deviation, ppm: Parts per million, LOD: Limit of detection, LOQ: Limit of quantification.

  19. Validation of a Radiography-Based Quantification Designed to Longitudinally Monitor Soft Tissue Calcification in Skeletal Muscle.

    PubMed

    Moore, Stephanie N; Hawley, Gregory D; Smith, Emily N; Mignemi, Nicholas A; Ihejirika, Rivka C; Yuasa, Masato; Cates, Justin M M; Liu, Xulei; Schoenecker, Jonathan G

    2016-01-01

    Soft tissue calcification, including both dystrophic calcification and heterotopic ossification, may occur following injury. These lesions have variable fates as they are either resorbed or persist. Persistent soft tissue calcification may result in chronic inflammation and/or loss of function of that soft tissue. The molecular mechanisms that result in the development and maturation of calcifications are uncertain. As a result, directed therapies that prevent or resorb soft tissue calcifications remain largely unsuccessful. Animal models of post-traumatic soft tissue calcification that allow for cost-effective, serial analysis of an individual animal over time are necessary to derive and test novel therapies. We have determined that a cardiotoxin-induced injury of the muscles in the posterior compartment of the lower extremity represents a useful model in which soft tissue calcification develops remote from adjacent bones, thereby allowing for serial analysis by plain radiography. The purpose of the study was to design and validate a method for quantifying soft tissue calcifications in mice longitudinally using plain radiographic techniques and an ordinal scoring system. Muscle injury was induced by injecting cardiotoxin into the posterior compartment of the lower extremity in mice susceptible to developing soft tissue calcification. Seven days following injury, radiographs were obtained under anesthesia. Multiple researchers applied methods designed to standardize post-image processing of digital radiographs (N = 4) and quantify soft tissue calcification (N = 6) in these images using an ordinal scoring system. Inter- and intra-observer agreement for both post-image processing and the scoring system used was assessed using weighted kappa statistics. Soft tissue calcification quantifications by the ordinal scale were compared to mineral volume measurements (threshold 450.7 mg HA/cm³) determined by μCT. Finally, sample-size calculations necessary to discriminate between a 25%, 50%, 75%, and 100% difference in STiCSS score 7 days following burn/CTX-induced muscle injury were determined. Precision analysis demonstrated substantial to good agreement for both post-image processing (κ = 0.73 to 0.90) and scoring (κ = 0.88 to 0.93), with low inter- and intra-observer variability. Additionally, there was a strong correlation in quantification of soft tissue calcification between the ordinal system and mineral volume quantification by μCT (Spearman r = 0.83 to 0.89). The ordinal scoring system reliably quantified soft tissue calcification in a burn/CTX-induced soft tissue calcification model compared to non-injured controls (Mann-Whitney rank test: P = 0.0002, ***). Sample size calculations revealed that 6 mice per group would be required to detect a 50% difference in STiCSS score with a power of 0.8. Finally, the STiCSS was demonstrated to reliably quantify soft tissue calcification [dystrophic calcification and heterotopic ossification] by radiographic analysis, independent of the histopathological state of the mineralization. Radiographic analysis can discriminate muscle injury-induced soft tissue calcification from adjacent bone and follow its clinical course over time without requiring the sacrifice of the animal. While the STiCSS cannot identify the specific type of soft tissue calcification present, it is still a useful and valid method by which to quantify the degree of soft tissue calcification. This methodology allows for longitudinal measurements of soft tissue calcification in a single animal, which is relatively less expensive, less time-consuming, and exposes the animal to less radiation than in vivo μCT. Therefore, this high-throughput, longitudinal analytic method for quantifying soft tissue calcification is a viable alternative for the study of soft tissue calcification.
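
    Weighted kappa, as used for the inter- and intra-observer agreement above, can be computed with standard libraries; the observer scores below are invented ordinal values on an assumed 0-4 scale.

```python
from sklearn.metrics import cohen_kappa_score

# Hypothetical ordinal STiCSS-style scores assigned by two observers to the same
# radiographs; linear weighting penalises near-misses less than large disagreements.
observer_1 = [0, 1, 3, 4, 2, 2, 0, 3, 4, 1, 2, 3]
observer_2 = [0, 1, 3, 3, 2, 1, 0, 3, 4, 1, 2, 4]

kappa_linear = cohen_kappa_score(observer_1, observer_2, weights="linear")
print(f"weighted kappa = {kappa_linear:.2f}")
```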

  20. Validation of a Radiography-Based Quantification Designed to Longitudinally Monitor Soft Tissue Calcification in Skeletal Muscle

    PubMed Central

    Moore, Stephanie N.; Hawley, Gregory D.; Smith, Emily N.; Mignemi, Nicholas A.; Ihejirika, Rivka C.; Yuasa, Masato; Cates, Justin M. M.; Liu, Xulei; Schoenecker, Jonathan G.

    2016-01-01

    Introduction Soft tissue calcification, including both dystrophic calcification and heterotopic ossification, may occur following injury. These lesions have variable fates as they are either resorbed or persist. Persistent soft tissue calcification may result in chronic inflammation and/or loss of function of that soft tissue. The molecular mechanisms that result in the development and maturation of calcifications are uncertain. As a result, directed therapies that prevent or resorb soft tissue calcifications remain largely unsuccessful. Animal models of post-traumatic soft tissue calcification that allow for cost-effective, serial analysis of an individual animal over time are necessary to derive and test novel therapies. We have determined that a cardiotoxin-induced injury of the muscles in the posterior compartment of the lower extremity represents a useful model in which soft tissue calcification develops remote from adjacent bones, thereby allowing for serial analysis by plain radiography. The purpose of the study was to design and validate a method for quantifying soft tissue calcifications in mice longitudinally using plain radiographic techniques and an ordinal scoring system. Methods Muscle injury was induced by injecting cardiotoxin into the posterior compartment of the lower extremity in mice susceptible to developing soft tissue calcification. Seven days following injury, radiographs were obtained under anesthesia. Multiple researchers applied methods designed to standardize post-image processing of digital radiographs (N = 4) and quantify soft tissue calcification (N = 6) in these images using an ordinal scoring system. Inter- and intra-observer agreement for both post-image processing and the scoring system used was assessed using weighted kappa statistics. Soft tissue calcification quantifications by the ordinal scale were compared to mineral volume measurements (threshold 450.7 mg HA/cm³) determined by μCT. Finally, the sample sizes necessary to detect a 25%, 50%, 75%, or 100% difference in STiCSS score 7 days following burn/CTX-induced muscle injury were determined. Results Precision analysis demonstrated substantial to good agreement for both post-image processing (κ = 0.73 to 0.90) and scoring (κ = 0.88 to 0.93), with low inter- and intra-observer variability. Additionally, quantification of soft tissue calcification by the ordinal scoring system correlated strongly with mineral volume quantification by μCT (Spearman r = 0.83 to 0.89). The ordinal scoring system reliably quantified soft tissue calcification in a burn/CTX-induced soft tissue calcification model compared to non-injured controls (Mann-Whitney rank test: P = 0.0002, ***). Sample size calculations revealed that 6 mice per group would be required to detect a 50% difference in STiCSS score with a power of 0.8. Finally, the STiCSS was demonstrated to reliably quantify soft tissue calcification [dystrophic calcification and heterotopic ossification] by radiographic analysis, independent of the histopathological state of the mineralization. Conclusions Radiographic analysis can discriminate muscle injury-induced soft tissue calcification from adjacent bone and follow its clinical course over time without requiring the sacrifice of the animal. While the STiCSS cannot identify the specific type of soft tissue calcification present, it is still a useful and valid method by which to quantify the degree of soft tissue calcification. This methodology allows for longitudinal measurements of soft tissue calcification in a single animal, is less expensive and less time-consuming than in vivo μCT, and exposes the animal to less radiation. Therefore, this high-throughput, longitudinal analytic method for quantifying soft tissue calcification is a viable alternative for the study of soft tissue calcification. PMID:27438007
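
    The agreement and correlation statistics reported above (weighted kappa, Spearman correlation, Mann-Whitney test) can be computed with standard libraries. The sketch below uses made-up rater scores and μCT volumes purely for illustration; the choice of linear kappa weights is also an assumption, since the abstract does not specify the weighting scheme.

```python
# Sketch: agreement and correlation statistics of the kind reported above.
# All scores and volumes are illustrative values, not data from the study.
import numpy as np
from scipy.stats import spearmanr, mannwhitneyu
from sklearn.metrics import cohen_kappa_score

rater_a = [0, 1, 2, 3, 3, 4, 2, 1, 0, 4]   # ordinal radiographic scores, rater A
rater_b = [0, 1, 2, 3, 4, 4, 2, 2, 0, 4]   # same radiographs scored by rater B

# Weighted kappa penalises larger ordinal disagreements more heavily.
kappa = cohen_kappa_score(rater_a, rater_b, weights="linear")

# Correlation between ordinal scores and uCT mineral volumes (mm^3, illustrative).
volumes = [0.0, 0.4, 1.1, 2.0, 2.6, 3.1, 1.2, 0.5, 0.1, 3.3]
rho, p_rho = spearmanr(rater_a, volumes)

# Injured vs. non-injured comparison with a Mann-Whitney rank test.
injured, controls = [2, 3, 3, 4, 2, 4], [0, 0, 1, 0, 0, 1]
u_stat, p_u = mannwhitneyu(injured, controls, alternative="two-sided")

print(f"weighted kappa={kappa:.2f}, Spearman rho={rho:.2f}, Mann-Whitney p={p_u:.4f}")
```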

  1. Powder X-ray diffraction method for the quantification of cocrystals in the crystallization mixture.

    PubMed

    Padrela, Luis; de Azevedo, Edmundo Gomes; Velaga, Sitaram P

    2012-08-01

    The solid state purity of cocrystals critically affects their performance. Thus, it is important to accurately quantify the purity of cocrystals in the final crystallization product. The aim of this study was to develop a powder X-ray diffraction (PXRD) quantification method for investigating the purity of cocrystals. The method developed was employed to study the formation of indomethacin-saccharin (IND-SAC) cocrystals by mechanochemical methods. Pure IND-SAC cocrystals were geometrically mixed with a 1:1 w/w mixture of indomethacin/saccharin in various proportions. An accurately measured amount (550 mg) of the mixture was used for the PXRD measurements. The most intense, non-overlapping, characteristic diffraction peak of IND-SAC was used to construct the calibration curve in the range 0-100% (w/w). This calibration model was validated and used to monitor the formation of IND-SAC cocrystals by liquid-assisted grinding (LAG). The IND-SAC cocrystal calibration curve showed excellent linearity (R² = 0.9996) over the entire concentration range, displaying limit of detection (LOD) and limit of quantification (LOQ) values of 1.23% (w/w) and 3.74% (w/w), respectively. Validation results showed excellent correlations between actual and predicted concentrations of IND-SAC cocrystals (R² = 0.9981). The accuracy and reliability of the PXRD quantification method depend on the methods of sample preparation and handling. The crystallinity of the IND-SAC cocrystals was higher when larger amounts of methanol were used in the LAG method. The PXRD quantification method is suitable and reliable for verifying the purity of cocrystals in the final crystallization product.
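
    A calibration curve of this kind is a straight-line fit of peak intensity against cocrystal content, with LOD and LOQ derived from the scatter about the line. The sketch below uses ICH-style formulas (LOD = 3.3·s/m, LOQ = 10·s/m) and invented intensity values; the paper's exact derivation of LOD/LOQ may differ.

```python
# Sketch: linear PXRD calibration with LOD/LOQ from ICH-style formulas.
# Peak intensities are illustrative values, not data from the study.
import numpy as np
from scipy.stats import linregress

cocrystal_frac = np.array([0, 10, 25, 50, 75, 90, 100])             # % w/w cocrystal
peak_intensity = np.array([12, 410, 1030, 2090, 3110, 3720, 4150])  # peak area (a.u.)

fit = linregress(cocrystal_frac, peak_intensity)
residuals = peak_intensity - (fit.intercept + fit.slope * cocrystal_frac)
s = residuals.std(ddof=2)            # residual standard deviation of the fit

lod = 3.3 * s / fit.slope            # % w/w
loq = 10.0 * s / fit.slope
print(f"R^2={fit.rvalue**2:.4f}, LOD={lod:.2f}% w/w, LOQ={loq:.2f}% w/w")

# An unknown mixture is quantified by inverting the calibration line.
unknown_intensity = 1550.0
print("estimated cocrystal content:",
      round((unknown_intensity - fit.intercept) / fit.slope, 1), "% w/w")
```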

  2. Masticatory muscle activity assessment and reliability of a portable electromyographic instrument.

    PubMed

    Bowley, J F; Marx, D B

    2001-03-01

    Masticatory muscle hyperactivity is thought to produce muscle pain and tension headaches and can cause excessive wear or breakage of restorative dental materials used in the treatment of prosthodontic patients. The quantification and identification of this type of activity is an important consideration in the preoperative diagnosis and treatment planning phase of prosthodontic care. This study investigated the quantification process in complete denture/overdenture patients with natural mandibular tooth abutments and explored the reliability of instrumentation used to assess this parafunctional activity. The nocturnal EMG activity in asymptomatic complete denture/overdenture subjects was assessed with and without prostheses worn during sleep. Because of the large variance within and between subjects, the investigators evaluated the reliability of the 3 instruments used to test nocturnal EMG activity in the sample. Electromyographic activity data of denture/overdenture subjects revealed no differences between prostheses worn versus not worn during sleep but demonstrated a very large variance factor. Further investigation of the instrumentation demonstrated a consistent in vitro as well as in vivo reliability in controlled laboratory studies. The portable EMG instrumentation used in this study revealed a large, uncontrollable variance factor within and between subjects that greatly complicated the diagnosis of parafunctional activity in prosthodontic patients.

  3. Improving statistical inference on pathogen densities estimated by quantitative molecular methods: malaria gametocytaemia as a case study.

    PubMed

    Walker, Martin; Basáñez, María-Gloria; Ouédraogo, André Lin; Hermsen, Cornelus; Bousema, Teun; Churcher, Thomas S

    2015-01-16

    Quantitative molecular methods (QMMs) such as quantitative real-time polymerase chain reaction (q-PCR), reverse-transcriptase PCR (qRT-PCR) and quantitative nucleic acid sequence-based amplification (QT-NASBA) are increasingly used to estimate pathogen density in a variety of clinical and epidemiological contexts. These methods are often classified as semi-quantitative, yet estimates of reliability or sensitivity are seldom reported. Here, a statistical framework is developed for assessing the reliability (uncertainty) of pathogen densities estimated using QMMs and the associated diagnostic sensitivity. The method is illustrated with quantification of Plasmodium falciparum gametocytaemia by QT-NASBA. The reliability of pathogen (e.g. gametocyte) densities, and the accompanying diagnostic sensitivity, estimated by two contrasting statistical calibration techniques, are compared; a traditional method and a mixed model Bayesian approach. The latter accounts for statistical dependence of QMM assays run under identical laboratory protocols and permits structural modelling of experimental measurements, allowing precision to vary with pathogen density. Traditional calibration cannot account for inter-assay variability arising from imperfect QMMs and generates estimates of pathogen density that have poor reliability, are variable among assays and inaccurately reflect diagnostic sensitivity. The Bayesian mixed model approach assimilates information from replica QMM assays, improving reliability and inter-assay homogeneity, providing an accurate appraisal of quantitative and diagnostic performance. Bayesian mixed model statistical calibration supersedes traditional techniques in the context of QMM-derived estimates of pathogen density, offering the potential to improve substantially the depth and quality of clinical and epidemiological inference for a wide variety of pathogens.
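
    For orientation, the sketch below shows the "traditional" calibration baseline that the Bayesian mixed-model approach improves on: fit a standard curve of assay signal against log density and invert it for unknowns. The signal values are invented, and the hierarchical Bayesian treatment itself is not implemented here.

```python
# Sketch of traditional standard-curve calibration for a quantitative molecular
# method. Signal values are illustrative, not QT-NASBA data from the study.
import numpy as np
from scipy.stats import linregress

log10_density = np.array([1, 2, 3, 4, 5, 6])            # pathogen density standards
signal = np.array([5.1, 9.8, 15.2, 20.1, 24.8, 30.3])   # assay readout (a.u.)

fit = linregress(log10_density, signal)

def estimate_density(obs_signal):
    """Invert the standard curve; returns estimated pathogens per mL."""
    return 10 ** ((obs_signal - fit.intercept) / fit.slope)

print(round(estimate_density(17.5)))
# A Bayesian mixed-model treatment would instead pool replicate assays and let
# the residual variance change with density, which this simple inversion ignores.
```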

  4. An in-advance stable isotope labeling strategy for relative analysis of multiple acidic plant hormones in sub-milligram Arabidopsis thaliana seedling and a single seed.

    PubMed

    Sun, Xiaohong; Ouyang, Yue; Chu, Jinfang; Yan, Jing; Yu, Yan; Li, Xiaoqiang; Yang, Jun; Yan, Cunyu

    2014-04-18

    A sensitive and reliable in-advance stable isotope labeling strategy was developed for simultaneous relative quantification of 8 acidic plant hormones in sub-milligram amounts of plant material. Bromocholine bromide (BETA) and its deuterated counterpart D9-BETA were used to derivatize control and sample extracts individually in advance; the extracts were then combined and subjected to solid-phase extraction (SPE) purification followed by UPLC-MS/MS analysis. Relative quantification of target compounds was obtained by calculating the peak area ratios of BETA/D9-BETA labeled plant hormones. The in-advance stable isotope labeling strategy enabled internal standard-based relative quantification of multiple classes of plant hormones without requiring an internal standard for every analyte, with sensitivity enhanced by 1-3 orders of magnitude. In addition, the in-advance labeling contributes to higher sample throughput and greater reliability. The method was successfully applied to determine 8 plant hormones in 0.8 mg DW (dry weight) of seedlings and 4 plant hormones from a single seed of Arabidopsis thaliana. The results show the potential of the method for relative quantification of multiple plant hormones in minute plant tissues or organs, which will advance knowledge of the crosstalk mechanisms of plant hormones. Copyright © 2014 Elsevier B.V. All rights reserved.
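
    The quantification step described above reduces to a ratio of light-labeled (sample) to heavy-labeled (control) peak areas per analyte. The sketch below uses hypothetical hormone names and peak areas purely to illustrate the arithmetic.

```python
# Sketch: relative quantification from paired BETA / D9-BETA labelled peak areas.
# Hormone names and peak areas are illustrative, not measurements from the study.
hormone_peaks = {
    # hormone: (sample peak area, BETA label; control peak area, D9-BETA label)
    "IAA": (152_000, 118_000),
    "ABA": (48_500, 61_200),
    "JA":  (9_300, 9_100),
}

for hormone, (light, heavy) in hormone_peaks.items():
    ratio = light / heavy          # fold change of sample relative to control
    print(f"{hormone}: sample/control = {ratio:.2f}")
```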

  5. Simultaneous ultra-high-pressure liquid chromatography-tandem mass spectrometry determination of amphetamine and amphetamine-like stimulants, cocaine and its metabolites, and a cannabis metabolite in surface water and urban wastewater.

    PubMed

    Bijlsma, Lubertus; Sancho, Juan V; Pitarch, Elena; Ibáñez, Maria; Hernández, Félix

    2009-04-10

    An ultra-high-pressure liquid chromatography-tandem mass spectrometry (UHPLC-MS/MS) method has been developed for the simultaneous quantification and confirmation of 11 basic/acidic illicit drugs and relevant metabolites in surface and urban wastewater at ng/L levels. The sample pre-treatment consisted of a solid-phase extraction using Oasis MCX cartridges. Deuterated analogues of the analytes were used as surrogate internal standards (except for norbenzoylecgonine and norcocaine) to compensate for possible errors resulting from matrix effects and those associated with the sample preparation procedure. After SPE enrichment, the selected drugs were separated within 6 min under optimized UHPLC conditions. To efficiently combine UHPLC with MS/MS, a fast-acquisition triple quadrupole mass analyzer (TQD from Waters) in positive-ion mode (ESI+) was used. The excellent selectivity and sensitivity of the TQD analyzer in selected reaction monitoring mode allowed quantification and reliable identification at the LOQ levels. Satisfactory recoveries (70-120%) and precision (RSD < 20%) were obtained for most compounds in different types of water samples, spiked at two concentration levels [limit of quantification (LOQ) and 10×LOQ]. Thus, surface water was spiked at 30 ng/L and 300 ng/L (amphetamine and amphetamine-like stimulants), 10 ng/L and 100 ng/L (cocaine and its metabolites), and 300 ng/L and 3000 ng/L (tetrahydrocannabinol-COOH). Recovery experiments in effluent and influent wastewater were performed at spiking levels three and fifteen times higher than those used in surface water, respectively. The validated method was applied to urban wastewater samples (influent and effluent). The acquisition of three selected reaction monitoring transitions per analyte allowed positive findings to be confirmed by checking the ion ratios between the quantification transition and two additional specific confirmation transitions. In general, drug consumption increased at weekends and during an important musical event. The highest concentration levels were 27.5 μg/L and 10.5 μg/L, which corresponded to 3,4-methylenedioxymethamphetamine (MDMA, or ecstasy) and to benzoylecgonine (a cocaine metabolite), respectively. The wastewater treatment plants showed good removal efficiency (>99%) for low levels of illicit drugs in water, but some difficulties were observed when high drug levels were present in wastewaters.
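
    The confirmation step described above compares the ratio of a confirmation transition to the quantification transition against the ratio measured for a reference standard. A minimal sketch of that check follows; the ±30% tolerance and the peak areas are assumptions for illustration, since the abstract does not state the tolerance used.

```python
# Sketch: confirming an identification from the ratio of a confirmation SRM
# transition to the quantification transition, within a tolerance window.
# The +/-30% tolerance and the peak areas are assumed, illustrative values.
def confirmed(quant_area, confirm_area, ref_ratio, tolerance=0.30):
    """Return True if the measured ion ratio lies within tolerance of the reference."""
    measured = confirm_area / quant_area
    return abs(measured - ref_ratio) <= tolerance * ref_ratio

# Reference ratio established from a standard; sample measured in wastewater.
print(confirmed(quant_area=84_000, confirm_area=25_600, ref_ratio=0.31))
```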

  6. Simple and Reliable Method to Quantify the Hepatitis B Viral Load and Replicative Capacity in Liver Tissue and Blood Leukocytes

    PubMed Central

    Minosse, Claudia; Coen, Sabrina; Visco Comandini, Ubaldo; Lionetti, Raffaella; Montalbano, Marzia; Cerilli, Stefano; Vincenti, Donatella; Baiocchini, Andrea; Capobianchi, Maria R.; Menzo, Stefano

    2016-01-01

    Background A functional cure of chronic hepatitis B (CHB) is feasible, but a clear view of the intrahepatic viral dynamics in each patient is needed. Intrahepatic covalently closed circular DNA (cccDNA) is the stable form of the viral genome in infected cells, and represents the ideal marker of parenchymal colonization. Its relationships with easily accessible peripheral parameters need to be elucidated in order to avoid invasive procedures in patients. Objectives The goal of this study was to design, set up, and validate a reliable and straightforward method for the quantification of the cccDNA and total DNA of the hepatitis B virus (HBV) in a variety of clinical samples. Patients and Methods Clinical samples from a cohort of CHB patients, including liver biopsies in some, were collected for the analysis of intracellular HBV molecular markers using novel molecular assays. Results A plasmid construct, including sequences from the HBV genome and from the human gene hTERT, was generated as an isomolar multi-standard for HBV quantitation and normalization to the cellular contents. The specificity of the real-time assay for the cccDNA was assessed using Dane particles isolated on a density gradient. A comparison of liver tissue from 6 untreated and 6 treated patients showed that the treatment deeply reduced the replicative capacity (total DNA/cccDNA), but had limited impact on the parenchymal colonization. The peripheral blood mononuclear cells (PBMCs) and granulocytes from the treated and untreated patients were also analyzed. Conclusions A straightforward method for the quantification of intracellular HBV molecular parameters in clinical samples was developed and validated. The widespread use of such versatile assays could better define the prognosis of CHB, and allow a more rational approach to time-limited tailored treatment strategies. PMID:27882060
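
    The normalization described above rests on simple arithmetic: viral copy numbers are divided by the number of cells inferred from the single-copy hTERT gene, and replicative capacity is the total DNA/cccDNA ratio. The sketch below assumes two hTERT alleles per diploid cell (an assumption, not stated in the abstract) and uses invented copy numbers.

```python
# Sketch: normalising intrahepatic HBV markers to cell number via hTERT copies
# (assuming 2 hTERT copies per diploid cell) and computing replicative capacity
# as total DNA / cccDNA. All copy numbers are illustrative only.
def per_cell(copies, htert_copies, alleles_per_cell=2):
    cells = htert_copies / alleles_per_cell
    return copies / cells

total_hbv_dna = 4.2e6     # total HBV DNA copies measured in the extract
cccdna = 3.5e4            # cccDNA copies in the same extract
htert = 1.8e5             # hTERT copies (cellular normaliser)

print("cccDNA per cell:", round(per_cell(cccdna, htert), 3))
print("total HBV DNA per cell:", round(per_cell(total_hbv_dna, htert), 1))
print("replicative capacity (total/ccc):", round(total_hbv_dna / cccdna, 1))
```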

  7. Reliable estimates of predictive uncertainty for an Alpine catchment using a non-parametric methodology

    NASA Astrophysics Data System (ADS)

    Matos, José P.; Schaefli, Bettina; Schleiss, Anton J.

    2017-04-01

    Uncertainty affects hydrological modelling efforts from the very measurements (or forecasts) that serve as inputs to the more or less inaccurate predictions that are produced. Uncertainty is truly inescapable in hydrology and yet, due to the theoretical and technical hurdles associated with its quantification, it is at times still neglected or estimated only qualitatively. In recent years the scientific community has made a significant effort towards quantifying this hydrologic prediction uncertainty. Despite this, most of the developed methodologies can be computationally demanding, are complex from a theoretical point of view, require substantial expertise to be employed, and are constrained by a number of assumptions about the model error distribution. These assumptions limit the reliability of many methods in case of errors that show particular cases of non-normality, heteroscedasticity, or autocorrelation. The present contribution builds on a non-parametric data-driven approach that was developed for uncertainty quantification in operational (real-time) forecasting settings. The approach is based on the concept of Pareto optimality and can be used as a standalone forecasting tool or as a postprocessor. By virtue of its non-parametric nature and a general operating principle, it can be applied directly and with ease to predictions of streamflow, water stage, or even accumulated runoff. Also, it is a methodology capable of coping with high heteroscedasticity and seasonal hydrological regimes (e.g. snowmelt and rainfall driven events in the same catchment). Finally, the training and operation of the model are very fast, making it a tool particularly adapted to operational use. To illustrate its practical use, the uncertainty quantification method is coupled with a process-based hydrological model to produce statistically reliable forecasts for an Alpine catchment located in Switzerland. Results are presented and discussed in terms of their reliability and resolution.

  8. Brown Adipose Tissue Quantification in Human Neonates Using Water-Fat Separated MRI

    PubMed Central

    Rasmussen, Jerod M.; Entringer, Sonja; Nguyen, Annie; van Erp, Theo G. M.; Guijarro, Ana; Oveisi, Fariba; Swanson, James M.; Piomelli, Daniele; Wadhwa, Pathik D.

    2013-01-01

    There is a major resurgence of interest in brown adipose tissue (BAT) biology, particularly regarding its determinants and consequences in newborns and infants. Reliable methods for non-invasive BAT measurement in human infants have yet to be demonstrated. The current study first validates methods for quantitative BAT imaging of rodents post mortem followed by BAT excision and re-imaging of excised tissues. Identical methods are then employed in a cohort of infants in vivo to establish the reliability of these measures and provide normative statistics for BAT depot volume and fat fraction. Using multi-echo water-fat MRI, fat- and water-based images of rodents and neonates were acquired and ratios of fat to the combined signal from fat and water (fat signal fraction) were calculated. Neonatal scans (n = 22) were acquired during natural sleep to quantify BAT and WAT deposits for depot volume and fat fraction. Acquisition repeatability was assessed based on multiple scans from the same neonate. Intra- and inter-rater measures of reliability in regional BAT depot volume and fat fraction quantification were determined based on multiple segmentations by two raters. Rodent BAT was characterized as having significantly higher water content than WAT in both in situ and ex vivo imaging assessments. Human neonate deposits indicative of bilateral BAT in spinal, supraclavicular and axillary regions were observed. Pairwise, WAT fat fraction was significantly greater than BAT fat fraction throughout the sample (ΔWAT−BAT = 38%, p < 10⁻⁴). Repeated scans demonstrated a high voxelwise correlation for fat fraction (R_all = 0.99). BAT depot volume and fat fraction measurements showed high intra-rater (ICC_BAT,VOL = 0.93, ICC_BAT,FF = 0.93) and inter-rater reliability (ICC_BAT,VOL = 0.86, ICC_BAT,FF = 0.93). This study demonstrates the reliability of using multi-echo water-fat MRI in human neonates for quantification of BAT depot volume and fat fraction measurements throughout the torso. PMID:24205024
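
    The core quantity here, the fat signal fraction, is the voxelwise ratio of the fat image to the sum of fat and water images. The sketch below computes it on synthetic arrays and adds a repeatability correlation between two simulated scans; the ROI mask is arbitrary and all values are invented.

```python
# Sketch: voxelwise fat signal fraction FF = F / (F + W) from water-fat MRI,
# plus a repeatability correlation between two scans. Arrays are synthetic.
import numpy as np

rng = np.random.default_rng(0)
fat = rng.uniform(50, 500, size=(64, 64))      # fat image (a.u.)
water = rng.uniform(200, 800, size=(64, 64))   # water image (a.u.)

ff_scan1 = fat / (fat + water)
ff_scan2 = ff_scan1 + rng.normal(0, 0.01, ff_scan1.shape)  # simulated repeat scan

# Mean fat fraction inside an (arbitrary, illustrative) supraclavicular-style ROI.
roi = np.zeros_like(ff_scan1, dtype=bool)
roi[10:30, 10:30] = True
print("ROI mean fat fraction:", round(float(ff_scan1[roi].mean()), 3))

# Voxelwise repeatability correlation between the two scans.
r = np.corrcoef(ff_scan1.ravel(), ff_scan2.ravel())[0, 1]
print("voxelwise correlation between scans:", round(r, 3))
```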

  9. Species identification and quantification in meat and meat products using droplet digital PCR (ddPCR).

    PubMed

    Floren, C; Wiedemann, I; Brenig, B; Schütz, E; Beck, J

    2015-04-15

    Species fraud and product mislabelling in processed food, albeit not a direct health issue, often result in consumer distrust. Therefore, methods for quantification of undeclared species are needed. Targeting mitochondrial DNA, e.g. the CYTB gene, for species quantification is unsuitable, due to a fivefold inter-tissue variation in mtDNA content per cell resulting in either an under- (-70%) or overestimation (+160%) of species DNA contents. Here, we describe a reliable two-step droplet digital PCR (ddPCR) assay targeting the nuclear F2 gene for precise quantification of cattle, horse, and pig in processed meat products. The ddPCR assay is advantageous over qPCR, showing a limit of quantification (LOQ) and limit of detection (LOD) in different meat products of 0.01% and 0.001%, respectively. The specificity was verified in 14 different species. Hence, determining F2 in food by ddPCR can be recommended for quality assurance and control in production systems. Copyright © 2014 The Authors. Published by Elsevier Ltd. All rights reserved.
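
    Droplet digital PCR converts the fraction of positive droplets into copy concentrations with a Poisson correction, after which an undeclared species can be expressed as a percentage of total meat DNA. This is standard ddPCR arithmetic rather than detail taken from the abstract, and the droplet counts below are invented.

```python
# Sketch: standard ddPCR maths (not spelled out in the abstract) converting
# positive-droplet counts to mean copies per droplet via a Poisson correction,
# then expressing one species as a percentage. Counts are illustrative only.
import math

def copies_per_droplet(n_positive, n_total):
    """Mean copies per droplet, lambda = -ln(fraction of negative droplets)."""
    return -math.log(1 - n_positive / n_total)

droplets_total = 15_000
horse_positive = 120          # droplets positive for the horse-specific F2 assay
all_species_positive = 9_800  # droplets positive summed over all species assays

horse = copies_per_droplet(horse_positive, droplets_total)
total = copies_per_droplet(all_species_positive, droplets_total)
print(f"horse DNA: {100 * horse / total:.2f}% of total meat DNA")
```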

  10. Evaluation of Cu(i) binding to the E2 domain of the amyloid precursor protein - a lesson in quantification of metal binding to proteins via ligand competition.

    PubMed

    Young, Tessa R; Wedd, Anthony G; Xiao, Zhiguang

    2018-01-24

    The extracellular domain E2 of the amyloid precursor protein (APP) features a His-rich metal-binding site (denoted as the M1 site). In conjunction with surrounding basic residues, the site participates in interactions with components of the extracellular matrix including heparins, a class of negatively charged polysaccharide molecules of varying length. This work studied the chemistry of Cu(i) binding to APP E2 with the probe ligands Bcs, Bca, Fz and Fs. APP E2 forms a stable Cu(i)-mediated ternary complex with each of these anionic ligands. The complex with Bca was selected for isolation and characterization and was demonstrated, by native ESI-MS analysis, to have the stoichiometry E2 : Cu(i) : Bca = 1 : 1 : 1. Formation of these ternary complexes is specific for the APP E2 domain and requires Cu(i) coordination to the M1 site. Mutation of the M1 site was consistent with the His ligands being part of the E2 ligand set. It is likely that interactions between the negatively charged probe ligands and a positively charged patch on the surface of APP E2 are one aspect of the generation of the stable ternary complexes. Their formation prevented meaningful quantification of the affinity of Cu(i) binding to the M1 site with these probe ligands. However, the ternary complexes are disrupted by heparin, allowing reliable determination of a picomolar Cu(i) affinity for the E2/heparin complex with the Fz or Bca probe ligands. This is the first documented example of the formation of stable ternary complexes between a Cu(i) binding protein and a probe ligand. The ready disruption of the complexes by heparin identified clear 'tell-tale' signs for diagnosis of ternary complex formation and allowed a systematic review of conditions and criteria for reliable determination of affinities for metal binding via ligand competition. This study also provides new insights into a potential correlation of APP functions regulated by copper binding and heparin interaction.
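
    As background to the ligand-competition quantification discussed above, the generic exchange relation that underlies such measurements can be written as follows. This is the textbook form with P as the protein site and L as the anionic probe ligand, not the specific expressions or formation constants used in the paper:

\[
\mathrm{P} + \mathrm{CuL_2} \;\rightleftharpoons\; \mathrm{CuP} + 2\,\mathrm{L},
\qquad
K_{\mathrm{ex}} = \frac{[\mathrm{CuP}]\,[\mathrm{L}]^{2}}{[\mathrm{P}]\,[\mathrm{CuL_2}]},
\qquad
\beta_{2} = \frac{[\mathrm{CuL_2}]}{[\mathrm{Cu}]\,[\mathrm{L}]^{2}}
\]
\[
K_{\mathrm{A}}(\mathrm{CuP}) = K_{\mathrm{ex}}\,\beta_{2},
\qquad
K_{\mathrm{D}}(\mathrm{CuP}) = \frac{1}{K_{\mathrm{ex}}\,\beta_{2}}
\]

    The study's central caveat is that this simple relation only yields a meaningful affinity once ternary protein-Cu-ligand complexes are ruled out or disrupted (here, by heparin); otherwise the measured exchange quotient does not reflect the binary binding equilibrium.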

  11. Reliable measurement of E. coli single cell fluorescence distribution using a standard microscope set-up.

    PubMed

    Cortesi, Marilisa; Bandiera, Lucia; Pasini, Alice; Bevilacqua, Alessandro; Gherardi, Alessandro; Furini, Simone; Giordano, Emanuele

    2017-01-01

    Quantifying gene expression at the single-cell level is fundamental for the complete characterization of synthetic gene circuits, due to the significant impact of noise and inter-cellular variability on the system's functionality. Commercial set-ups that allow the acquisition of fluorescent signal at the single-cell level (flow cytometers or quantitative microscopes) are expensive apparatuses that are hardly affordable by small laboratories. A protocol that makes a standard optical microscope able to acquire quantitative, single-cell fluorescent data from a bacterial population transformed with synthetic gene circuitry is presented. Single-cell fluorescence values, acquired with a microscope set-up and processed with custom-made software, are compared with results obtained with a flow cytometer in a bacterial population transformed with the same gene circuitry. The high correlation between data from the two experimental set-ups, with a correlation coefficient computed over the tested dynamic range > 0.99, proves that a standard optical microscope, when coupled with appropriate software for image processing, might be used for quantitative single-cell fluorescence measurements. The calibration of the set-up, together with its validation, is described. The experimental protocol described in this paper makes quantitative measurement of single-cell fluorescence accessible to laboratories equipped with standard optical microscope set-ups. Our method allows for affordable measurement and quantification of intercellular variability; a better understanding of this phenomenon will improve our comprehension of cellular behaviors and the design of synthetic gene circuits. All the required software is freely available to the synthetic biology community (MUSIQ: Microscope flUorescence SIngle cell Quantification).

  12. 3D visualization and quantification of bone and teeth mineralization for the study of osteo/dentinogenesis in mice models

    NASA Astrophysics Data System (ADS)

    Marchadier, A.; Vidal, C.; Ordureau, S.; Lédée, R.; Léger, C.; Young, M.; Goldberg, M.

    2011-03-01

    Research on bone and teeth mineralization in animal models is critical for understanding human pathologies. Genetically modified mice represent highly valuable models for the study of osteo/dentinogenesis defects and osteoporosis. Current investigations of mouse dental and skeletal phenotypes use destructive and time-consuming methods such as histology and scanning microscopy. Micro-CT imaging is quicker and provides a high-resolution qualitative phenotypic description. However, reliable quantification of mineralization processes in mouse bone and teeth is still lacking. We have established novel CT imaging-based software for accurate qualitative and quantitative analysis of mouse mandibular bone and molars. Data were obtained from mandibles of mice lacking the Fibromodulin gene, which is involved in mineralization processes. Mandibles were imaged with a micro-CT originally devoted to industrial applications (Viscom, X8060 NDT). Advanced 3D visualization was performed using the VoxBox software (UsefulProgress) with ray casting algorithms. Comparison between control and defective mouse mandibles was made by applying the same transfer function to each 3D dataset, thus allowing detection of shape, colour and density discrepancies. The 2D images of transverse slices of mandible and teeth were similar to, and even more accurate than, those obtained with scanning electron microscopy. Image processing of the molars allowed the 3D reconstruction of the pulp chamber, providing a unique tool for the quantitative evaluation of dentinogenesis. This new method is highly powerful for the study of oro-facial mineralization defects in mouse models, and is complementary and even competitive to current histological and scanning microscopy approaches.

  13. [18F]CFT [(18F)WIN 35,428], a radioligand to study the dopamine transporter with PET: characterization in human subjects.

    PubMed

    Laakso, A; Bergman, J; Haaparanta, M; Vilkman, H; Solin, O; Hietala, J

    1998-03-01

    We have characterized the usage of [18F]CFT (also known as [18F]WIN 35,428) as a radioligand for in vivo studies of human dopamine transporter by PET. CFT was labeled with 18F to a high specific activity, and dynamic PET scans were conducted in healthy volunteers at various time points up to 5 h from [18F]CFT injection. The regional distribution of [18F]CFT uptake correlated well with the known distribution of dopaminergic nerve terminals in the human brain and also with that of other dopamine transporter radioligands. Striatal binding peaked at 225 min after injection and declined thereafter, demonstrating the reversible nature of the binding to the dopamine transporter. Therefore, due to the relatively long half-life of 18F (109.8 min), PET scans with [18F]CFT could easily be conducted during the binding equilibrium, allowing estimation of Bmax/Kd values (i.e., binding potential). Binding potentials for putamen and caudate measured at equilibrium were 4.79+/-0.11 and 4.50+/-0.23, respectively. We were able to also visualize midbrain dopaminergic neurons (substantia nigra) with [18F]CFT in some subjects. In conclusion, the labeling of CFT with 18F allows PET scans to be conducted at binding equilibrium, and therefore a high signal-to-noise ratio and reliable quantification of binding potential can be achieved. With a high resolution 3D PET scanner, the quantification of extrastriatal dopamine transporters should become possible.
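
    At equilibrium, a binding potential of this kind is commonly estimated as the specific-to-nondisplaceable uptake ratio against a reference region with negligible dopamine transporter density. The cerebellar reference and the activity values in the sketch below are assumptions for illustration; the abstract only states that Bmax/Kd was estimated at binding equilibrium.

```python
# Sketch: binding potential (Bmax/Kd) estimated at equilibrium as the
# specific-to-nondisplaceable ratio against a reference region. The choice of
# cerebellum as reference and the activity values are assumptions, not study data.
def binding_potential(target_activity, reference_activity):
    """(C_target - C_reference) / C_reference at binding equilibrium."""
    return (target_activity - reference_activity) / reference_activity

putamen, caudate, cerebellum = 17.4, 16.5, 3.0   # illustrative kBq/mL values
print("BP putamen:", round(binding_potential(putamen, cerebellum), 2))
print("BP caudate:", round(binding_potential(caudate, cerebellum), 2))
```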

  14. Rapid quantification of iodopropynyl butylcarbamate as the preservative in cosmetic formulations using high-performance liquid chromatography-electrospray mass spectrometry.

    PubMed

    Frauen, M; Steinhart, H; Rapp, C; Hintze, U

    2001-07-01

    A simple, rapid and reproducible method for identification and quantification of iodopropynyl butylcarbamate (IPBC) in different cosmetic formulations is presented. The determination was carried out using a high-performance liquid chromatography (HPLC) procedure on a reversed phase column coupled to a single quadrupole mass spectrometer (MS) via an electrospray ionization (ESI) interface. Detection was performed in the positive selected ion-monitoring mode. In methanol/water extracts from different cosmetic formulations a detection limit between 50 and 100 ng/g could be achieved. A routine analytical procedure could be set up with good quantification reliability (relative standard deviation between 0.9 and 2.9%).

  15. A Sensitive Branched DNA HIV-1 Signal Amplification Viral Load Assay with Single Day Turnaround

    PubMed Central

    Baumeister, Mark A.; Zhang, Nan; Beas, Hilda; Brooks, Jesse R.; Canchola, Jesse A.; Cosenza, Carlo; Kleshik, Felix; Rampersad, Vinod; Surtihadi, Johan; Battersby, Thomas R.

    2012-01-01

    Branched DNA (bDNA) is a signal amplification technology used in clinical and research laboratories to quantitatively detect nucleic acids. An overnight incubation is a significant drawback of highly sensitive bDNA assays. The VERSANT® HIV-1 RNA 3.0 Assay (bDNA) (“Versant Assay”) currently used in clinical laboratories was modified to allow shorter target incubation, enabling the viral load assay to be run in a single day. To dramatically reduce the target incubation from 16–18 h to 2.5 h, composition of only the “Lysis Diluent” solution was modified. Nucleic acid probes in the assay were unchanged. Performance of the modified assay (assay in development; not commercially available) was evaluated and compared to the Versant Assay. Dilution series replicates (>950 results) were used to demonstrate that analytical sensitivity, linearity, accuracy, and precision for the shorter modified assay are comparable to the Versant Assay. HIV RNA-positive clinical specimens (n = 135) showed no significant difference in quantification between the modified assay and the Versant Assay. Equivalent relative quantification of samples of eight genotypes was demonstrated for the two assays. Elevated levels of several potentially interfering endogenous substances had no effect on quantification or specificity of the modified assay. The modified assay with drastically improved turnaround time demonstrates the viability of signal-amplifying technology, such as bDNA, as an alternative to the PCR-based assays dominating viral load monitoring in clinical laboratories. Highly sensitive bDNA assays with a single day turnaround may be ideal for laboratories with especially stringent cost, contamination, or reliability requirements. PMID:22479381

  16. A sensitive branched DNA HIV-1 signal amplification viral load assay with single day turnaround.

    PubMed

    Baumeister, Mark A; Zhang, Nan; Beas, Hilda; Brooks, Jesse R; Canchola, Jesse A; Cosenza, Carlo; Kleshik, Felix; Rampersad, Vinod; Surtihadi, Johan; Battersby, Thomas R

    2012-01-01

    Branched DNA (bDNA) is a signal amplification technology used in clinical and research laboratories to quantitatively detect nucleic acids. An overnight incubation is a significant drawback of highly sensitive bDNA assays. The VERSANT® HIV-1 RNA 3.0 Assay (bDNA) ("Versant Assay") currently used in clinical laboratories was modified to allow shorter target incubation, enabling the viral load assay to be run in a single day. To dramatically reduce the target incubation from 16-18 h to 2.5 h, composition of only the "Lysis Diluent" solution was modified. Nucleic acid probes in the assay were unchanged. Performance of the modified assay (assay in development; not commercially available) was evaluated and compared to the Versant Assay. Dilution series replicates (>950 results) were used to demonstrate that analytical sensitivity, linearity, accuracy, and precision for the shorter modified assay are comparable to the Versant Assay. HIV RNA-positive clinical specimens (n = 135) showed no significant difference in quantification between the modified assay and the Versant Assay. Equivalent relative quantification of samples of eight genotypes was demonstrated for the two assays. Elevated levels of several potentially interfering endogenous substances had no effect on quantification or specificity of the modified assay. The modified assay with drastically improved turnaround time demonstrates the viability of signal-amplifying technology, such as bDNA, as an alternative to the PCR-based assays dominating viral load monitoring in clinical laboratories. Highly sensitive bDNA assays with a single day turnaround may be ideal for laboratories with especially stringent cost, contamination, or reliability requirements.

  17. Pore network quantification of sandstones under experimental CO2 injection using image analysis

    NASA Astrophysics Data System (ADS)

    Berrezueta, Edgar; González-Menéndez, Luís; Ordóñez-Casado, Berta; Olaya, Peter

    2015-04-01

    Automated-image identification and quantification of minerals, pores and textures, together with petrographic analysis, can be applied to improve pore system characterization in sedimentary rocks. Our case study focuses on the application of these techniques to study the evolution of the rock pore network when subjected to supercritical CO2 injection. We have proposed a Digital Image Analysis (DIA) protocol that guarantees measurement reproducibility and reliability. It can be summarized in the following stages: (i) detailed description of mineralogy and texture (before and after CO2 injection) by optical and scanning electron microscopy (SEM) techniques using thin sections; (ii) adjustment and calibration of DIA tools; (iii) a data acquisition protocol based on image capture under different polarization conditions (synchronized movement of polarizers); (iv) study and quantification by DIA that allow (a) identification and isolation of pixels that belong to the same category (minerals vs. pores in each sample) and (b) measurement of changes in the pore network after the samples have been exposed to new conditions (in our case, SC-CO2 injection). Finally, the petrography and the measured data were interpreted using an automated approach. In our applied study, the DIA results highlight the changes observed by SEM and microscopic techniques, which consisted of a porosity increase after CO2 treatment. Other changes were minor: variations in the roughness and roundness of pore edges and in pore aspect ratio, observed mainly in the larger pore population. Additionally, statistical tests on the measured pore parameters were applied to verify that the differences observed between samples before and after CO2 injection were significant.
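
    Stage (iv) boils down to classifying pixels as pore or mineral and comparing the pore fraction before and after treatment. The sketch below stands in for the calibrated DIA protocol with a single global grey-level threshold on synthetic images; the threshold value and the images are illustrative only.

```python
# Sketch: classify pixels as pore vs. mineral by a grey-level threshold and
# compare porosity before/after CO2 exposure. Images and threshold are synthetic.
import numpy as np

rng = np.random.default_rng(1)
before = rng.normal(140, 30, size=(512, 512))               # grey levels, pre-injection
after = before - rng.binomial(1, 0.03, before.shape) * 80   # a few pixels turned to pore

PORE_THRESHOLD = 60   # pixels darker than this are counted as pore space (assumed)

def porosity(image, threshold=PORE_THRESHOLD):
    pore_pixels = np.count_nonzero(image < threshold)
    return 100.0 * pore_pixels / image.size

print(f"porosity before: {porosity(before):.2f}%  after: {porosity(after):.2f}%")
```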

  18. Identification and quantification of VOCs by proton transfer reaction time of flight mass spectrometry: An experimental workflow for the optimization of specificity, sensitivity, and accuracy

    PubMed Central

    Hanna, George B.

    2018-01-01

    Proton transfer reaction time of flight mass spectrometry (PTR-ToF-MS) is a direct injection MS technique, allowing for the sensitive and real-time detection, identification, and quantification of volatile organic compounds. When aiming to employ PTR-ToF-MS for targeted volatile organic compound analysis, some methodological questions must be addressed, such as the need to correctly identify product ions, or evaluating the quantitation accuracy. This work proposes a workflow for PTR-ToF-MS method development, addressing the main issues affecting the reliable identification and quantification of target compounds. We determined the fragmentation patterns of 13 selected compounds (aldehydes, fatty acids, phenols). Experiments were conducted under breath-relevant conditions (100% humid air), and within an extended range of reduced electric field values (E/N = 48-144 Td), obtained by changing the drift tube voltage. Reactivity was inspected using H3O+, NO+, and O2+ as primary ions. The results show that a relatively low (<90 Td) E/N often permits reduced fragmentation, enhancing sensitivity and identification capabilities, particularly in the case of aldehydes using NO+, where a 4-fold increase in sensitivity is obtained by means of drift voltage reduction. We developed a novel calibration methodology, relying on diffusion tubes used as gravimetric standards. For each of the tested compounds, it was possible to define suitable conditions whereby the experimental error, defined as the difference between gravimetric measurements and calculated concentrations, was 8% or lower. PMID:29336521

  19. Quantification of DNA using the luminescent oxygen channeling assay.

    PubMed

    Patel, R; Pollner, R; de Keczer, S; Pease, J; Pirio, M; DeChene, N; Dafforn, A; Rose, S

    2000-09-01

    Simplified and cost-effective methods for the detection and quantification of nucleic acid targets are still a challenge in molecular diagnostics. Luminescent oxygen channeling assay (LOCI(TM)) latex particles can be conjugated to synthetic oligodeoxynucleotides and hybridized, via linking probes, to different DNA targets. These oligomer-conjugated LOCI particles survive thermocycling in a PCR reaction and allow quantified detection of DNA targets in both real-time and endpoint formats. The endpoint DNA quantification format utilized two sensitizer bead types that are sensitive to separate illumination wavelengths. These two bead types were uniquely annealed to target or control amplicons, and separate illuminations generated time-resolved chemiluminescence, which distinguished the two amplicon types. In the endpoint method, ratios of the two signals allowed determination of the target DNA concentration over a three-log range. The real-time format allowed quantification of the DNA target over a six-log range with a linear relationship between threshold cycle and log of the number of DNA targets. This is the first report of the use of an oligomer-labeled latex particle assay capable of producing DNA quantification and sequence-specific chemiluminescent signals in a homogeneous format. It is also the first report of the generation of two signals from a LOCI assay. The methods described here have been shown to be easily adaptable to new DNA targets because of the generic nature of the oligomer-labeled LOCI particles.

  20. Inductively Coupled Plasma Mass Spectrometry (ICP-MS) Applications in Quantitative Proteomics.

    PubMed

    Chahrour, Osama; Malone, John

    2017-01-01

    Recent advances in inductively coupled plasma mass spectrometry (ICP-MS) hyphenated to different separation techniques have promoted it as a valuable tool in protein/peptide quantification. These emerging ICP-MS applications allow absolute quantification by measuring specific elemental responses. One approach quantifies elements already present in the structure of the target peptide (e.g. phosphorus and sulphur) as natural tags. Quantification of these natural tags allows the elucidation of the degree of protein phosphorylation in addition to absolute protein quantification. A separate approach is based on utilising bi-functional labelling substances (those containing ICP-MS detectable elements), that form a covalent chemical bond with the protein thus creating analogs which are detectable by ICP-MS. Based on the previously established stoichiometries of the labelling reagents, quantification can be achieved. This technique is very useful for the design of precise multiplexed quantitation schemes to address the challenges of biomarker screening and discovery. This review discusses the capabilities and different strategies to implement ICP-MS in the field of quantitative proteomics. Copyright© Bentham Science Publishers; For any queries, please email at epub@benthamscience.org.

  1. Linear Array Ultrasonic Test Results from Alkali-Silica Reaction (ASR) Specimens

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Clayton, Dwight A; Khazanovich, Dr. Lev; Salles, Lucio

    2016-04-01

    The purpose of the U.S. Department of Energy Office of Nuclear Energy’s Light Water Reactor Sustainability (LWRS) Program is to develop technologies and other solutions that can improve the reliability, sustain the safety, and extend the operating lifetimes of nuclear power plants (NPPs) beyond 60 years. Since many important safety structures in an NPP are constructed of concrete, inspection techniques must be developed and tested to evaluate the internal condition. In-service containment structures generally do not allow for the destructive measures necessary to validate the accuracy of these inspection techniques. This creates a need for comparative testing of the various nondestructive evaluation (NDE) measurement techniques on concrete specimens with known material properties, voids, internal microstructure flaws, and reinforcement locations. This report presents results of the ultrasound evaluation of four concrete slabs with varying levels of ASR damage present. This included an investigation of the experimental results, as well as a supplemental simulation considering the effect of ASR damage by elasto-dynamic wave propagation using a finite integration technique method. It was found that the Hilbert Transform Indicator (HTI), developed for quantification of freeze/thaw damage in concrete structures, could also be successfully utilized for quantification of ASR damage.
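
    The abstract does not define the HTI itself, but its building block is the Hilbert-transform envelope of an ultrasonic A-scan, from which such damage indicators are typically derived. The sketch below extracts that envelope from a synthetic signal; the signal parameters are invented and no specific HTI formula is reproduced.

```python
# Sketch: Hilbert-transform envelope of a synthetic ultrasonic A-scan, the
# building block behind indicators such as the HTI (whose exact definition
# is not given in the abstract and is not reproduced here).
import numpy as np
from scipy.signal import hilbert

fs = 10e6                                    # 10 MHz sampling rate (synthetic)
t = np.arange(0, 200e-6, 1 / fs)
echo = np.exp(-((t - 80e-6) / 10e-6) ** 2) * np.sin(2 * np.pi * 200e3 * t)
noise = 0.02 * np.random.default_rng(2).standard_normal(t.size)
a_scan = echo + noise

envelope = np.abs(hilbert(a_scan))           # instantaneous amplitude
peak_idx = int(envelope.argmax())
print("peak envelope amplitude:", round(float(envelope[peak_idx]), 3),
      "at t =", round(t[peak_idx] * 1e6, 1), "µs")
```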

  2. Simultaneous determination of eight major steroids from Polyporus umbellatus by high-performance liquid chromatography coupled with mass spectrometry detections.

    PubMed

    Zhao, Ying-yong; Cheng, Xian-long; Zhang, Yongmin; Zhao, Ye; Lin, Rui-chao; Sun, Wen-ji

    2010-02-01

    Polyporus umbellatus is a widely used diuretic herbal medicine. In this study, a high-performance liquid chromatography coupled with atmospheric pressure chemical ionization-mass spectrometric detection (HPLC-APCI-MS) method was developed for qualitative and quantitative analysis of steroids, as well as for the quality control of Polyporus umbellatus. The selectivity, reproducibility and sensitivity were compared with HPLC with photodiode array detection and evaporative light scattering detection (ELSD). Selective ion monitoring in positive mode was used for qualitative and quantitative analysis of eight major components and beta-ecdysterone was used as the internal standard. Limits of detection and quantification fell in the ranges 7-21 and 18-63 ng/mL for the eight analytes with an injection of 10 μL samples, and all calibration curves showed good linear regression (r² > 0.9919) within the test range. The quantitative results demonstrated that samples from different localities showed different qualities. Advantages, in comparison with conventional HPLC-diode array detection and HPLC-ELSD, are that reliable identification of target compounds could be achieved by accurate mass measurements along with characteristic retention time, and the great enhancement in selectivity and sensitivity allows identification and quantification of low levels of constituents in complex Polyporus umbellatus matrixes. © 2009 John Wiley & Sons, Ltd.

  3. Quantification of the effects of ocean acidification on sediment microbial communities in the environment: the importance of ecosystem approaches.

    PubMed

    Hassenrück, Christiane; Fink, Artur; Lichtschlag, Anna; Tegetmeyer, Halina E; de Beer, Dirk; Ramette, Alban

    2016-05-01

    To understand how ocean acidification (OA) influences sediment microbial communities, naturally CO2-rich sites are increasingly being used as OA analogues. However, the characterization of these naturally CO2-rich sites is often limited to OA-related variables, neglecting additional environmental variables that may confound OA effects. Here, we used an extensive array of sediment and bottom water parameters to evaluate pH effects on sediment microbial communities at hydrothermal CO2 seeps in Papua New Guinea. The geochemical composition of the sediment pore water showed variations in the hydrothermal signature at seep sites with comparable pH, allowing the identification of sites that may better represent future OA scenarios. At these sites, we detected a 60% shift in the microbial community composition compared with reference sites, mostly related to increases in Chloroflexi sequences. pH was among the factors significantly, yet not mainly, explaining changes in microbial community composition. pH variation may therefore often not be the primary cause of microbial changes when sampling is done along complex environmental gradients. Thus, we recommend an ecosystem approach when assessing OA effects on sediment microbial communities under natural conditions. This will enable a more reliable quantification of OA effects via a reduction of potential confounding effects. © FEMS 2016.

  4. Mass Median Plume Angle: A novel approach to characterize plume geometry in solution based pMDIs.

    PubMed

    Moraga-Espinoza, Daniel; Eshaghian, Eli; Smyth, Hugh D C

    2018-05-30

    High-speed laser imaging (HSLI) is the preferred technique to characterize the geometry of the plume in pressurized metered dose inhalers (pMDIs). However, current methods do not allow for simulation of inhalation airflow and do not use drug mass quantification to determine plume angles. To address these limitations, a Plume Induction Port Evaluator (PIPE) was designed to characterize the plume geometry based on mass deposition patterns. The method is easily adaptable to current pMDI characterization methodologies, uses similar calculation methods, and can be used under airflow. The effects of airflow and formulation on the plume geometry were evaluated using PIPE and HSLI. Deposition patterns in PIPE were highly reproducible and log-normally distributed. The Mass Median Plume Angle (MMPA) was introduced as a new characterization parameter describing the effective angle of the droplets deposited in the induction port. Plume angles determined by mass decreased significantly as the ethanol content increased, which correlates with the decrease in vapor pressure of the formulation. Additionally, airflow significantly decreased the angle of the plumes when the cascade impactor was operated under flow. PIPE is an alternative to laser-based characterization methods to evaluate the plume angle of pMDIs based on reliable drug quantification while simulating patient inhalation. Copyright © 2018. Published by Elsevier B.V.
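
    A mass-median statistic of this kind is found by locating the point at which the cumulative deposited mass reaches 50%. The sketch below shows that generic calculation on invented angular bins and masses; the actual PIPE geometry and data-reduction details are not given in the abstract.

```python
# Sketch: a generic mass-median calculation, interpolating the cumulative mass
# distribution at 50%. Angular bins and masses are illustrative, not study data.
import numpy as np

angle_deg = np.array([5, 10, 15, 20, 25, 30])           # angular bins of the port
mass_ug = np.array([3.0, 12.0, 28.0, 22.0, 9.0, 2.0])   # assayed drug mass per bin

cum = np.cumsum(mass_ug) / mass_ug.sum()                 # cumulative mass fraction
mmpa = np.interp(0.5, cum, angle_deg)                    # angle at 50% of the mass
print(f"mass median plume angle ≈ {mmpa:.1f} degrees")
```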

  5. Comparison of viable plate count, turbidity measurement and real-time PCR for quantification of Porphyromonas gingivalis.

    PubMed

    Clais, S; Boulet, G; Van Kerckhoven, M; Lanckacker, E; Delputte, P; Maes, L; Cos, P

    2015-01-01

    The viable plate count (VPC) is considered as the reference method for bacterial enumeration in periodontal microbiology but shows some important limitations for anaerobic bacteria. As anaerobes such as Porphyromonas gingivalis are difficult to culture, VPC becomes time-consuming and less sensitive. Hence, efficient normalization of experimental data to bacterial cell count requires alternative rapid and reliable quantification methods. This study compared the performance of VPC with that of turbidity measurement and real-time PCR (qPCR) in an experimental context using highly concentrated bacterial suspensions. Our TaqMan-based qPCR assay for P. gingivalis 16S rRNA proved to be sensitive and specific. Turbidity measurements offer a fast method to assess P. gingivalis growth, but suffer from high variability and a limited dynamic range. VPC was very time-consuming and less repeatable than qPCR. Our study concludes that qPCR provides the most rapid and precise approach for P. gingivalis quantification. Although our data were gathered in a specific research context, we believe that our conclusions on the inferior performance of VPC and turbidity measurements in comparison to qPCR can be extended to other research and clinical settings and even to other difficult-to-culture micro-organisms. Various clinical and research settings require fast and reliable quantification of bacterial suspensions. The viable plate count method (VPC) is generally seen as 'the gold standard' for bacterial enumeration. However, VPC-based quantification of anaerobes such as Porphyromonas gingivalis is time-consuming due to their stringent growth requirements and shows poor repeatability. Comparison of VPC, turbidity measurement and TaqMan-based qPCR demonstrated that qPCR possesses important advantages regarding speed, accuracy and repeatability. © 2014 The Society for Applied Microbiology.

  6. Targeted Proteomic Quantification on Quadrupole-Orbitrap Mass Spectrometer*

    PubMed Central

    Gallien, Sebastien; Duriez, Elodie; Crone, Catharina; Kellmann, Markus; Moehring, Thomas; Domon, Bruno

    2012-01-01

    There is an immediate need for improved methods to systematically and precisely quantify large sets of peptides in complex biological samples. To date protein quantification in biological samples has been routinely performed on triple quadrupole instruments operated in selected reaction monitoring mode (SRM), and two major challenges remain. Firstly, the number of peptides to be included in one survey experiment needs to be increased to routinely reach several hundreds, and secondly, the degree of selectivity should be improved so as to reliably discriminate the targeted analytes from background interferences. High resolution and accurate mass (HR/AM) analysis on the recently developed Q-Exactive mass spectrometer can potentially address these issues. This instrument presents a unique configuration: it is constituted of an orbitrap mass analyzer equipped with a quadrupole mass filter as the front-end for precursor ion mass selection. This configuration enables new quantitative methods based on HR/AM measurements, including targeted analysis in MS mode (single ion monitoring) and in MS/MS mode (parallel reaction monitoring). The ability of the quadrupole to select a restricted m/z range allows one to overcome the dynamic range limitations associated with trapping devices, and the MS/MS mode provides an additional stage of selectivity. When applied to targeted protein quantification in urine samples and benchmarked with the reference SRM technique, the quadrupole-orbitrap instrument exhibits similar or better performance in terms of selectivity, dynamic range, and sensitivity. This high performance is further enhanced by leveraging the multiplexing capability of the instrument to design novel acquisition methods and apply them to large targeted proteomic studies for the first time, as demonstrated on 770 tryptic yeast peptides analyzed in one 60-min experiment. The increased quality of quadrupole-orbitrap data has the potential to improve existing protein quantification methods in complex samples and address the pressing demand of systems biology or biomarker evaluation studies. PMID:22962056

  7. Empirically based comparisons of the reliability and validity of common quantification approaches for eyeblink startle potentiation in humans

    PubMed Central

    Bradford, Daniel E.; Starr, Mark J.; Shackman, Alexander J.

    2015-01-01

    Startle potentiation is a well-validated translational measure of negative affect. Startle potentiation is widely used in clinical and affective science, and there are multiple approaches for its quantification. The three most commonly used approaches quantify startle potentiation as the increase in startle response from a neutral to a threat condition based on (1) raw potentiation, (2) standardized potentiation, or (3) percent-change potentiation. These three quantification approaches may yield qualitatively different conclusions about effects of independent variables (IVs) on affect when within- or between-group differences exist for startle response in the neutral condition. Accordingly, we directly compared these quantification approaches in a shock-threat task using four IVs known to influence startle response in the no-threat condition: probe intensity, time (i.e., habituation), alcohol administration, and individual differences in general startle reactivity measured at baseline. We confirmed the expected effects of time, alcohol, and general startle reactivity on affect using self-reported fear/anxiety as a criterion. The percent-change approach displayed apparent artifact across all four IVs, which raises substantial concerns about its validity. Both raw and standardized potentiation approaches were stable across probe intensity and time, which supports their validity. However, only raw potentiation displayed effects that were consistent with a priori specifications and/or the self-report criterion for the effects of alcohol and general startle reactivity. Supplemental analyses of reliability and validity for each approach provided additional evidence in support of raw potentiation. PMID:26372120
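
    The three quantification approaches compared above reduce to simple arithmetic on the neutral- and threat-condition startle magnitudes. The sketch below applies them to invented per-participant values; the standardization shown is one common choice and may differ from the paper's exact definition.

```python
# Sketch: the three startle-potentiation quantification approaches applied to
# illustrative startle magnitudes (microvolts); values are not from the paper.
import numpy as np

neutral = np.array([42.0, 55.0, 31.0, 60.0])   # neutral-condition startle per participant
threat = np.array([61.0, 70.0, 49.0, 71.0])    # threat-condition startle per participant

raw = threat - neutral                          # (1) raw potentiation
standardized = raw / np.std(neutral, ddof=1)    # (2) one common standardization choice
percent_change = 100.0 * raw / neutral          # (3) percent-change potentiation

print("raw:", raw)
print("standardized:", standardized.round(2))
print("percent change:", percent_change.round(1))
```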

  8. HPLC-MRM relative quantification analysis of fatty acids based on a novel derivatization strategy.

    PubMed

    Cai, Tie; Ting, Hu; Xin-Xiang, Zhang; Jiang, Zhou; Jin-Lan, Zhang

    2014-12-07

    Fatty acids (FAs) are associated with a series of diseases including tumors, diabetes, and heart diseases. As potential biomarkers, FAs have attracted increasing attention from both biological researchers and the pharmaceutical industry. However, poor ionization efficiency, extreme diversity, strict dependence on internal standards and complicated multiple reaction monitoring (MRM) optimization protocols have challenged efforts to quantify FAs. In this work, a novel derivatization strategy based on 2,4-bis(diethylamino)-6-hydrazino-1,3,5-triazine was developed to enable quantification of FAs. The sensitivity of FA detection was significantly enhanced as a result of the derivatization procedure. FA quantities as low as 10 fg could be detected by high-performance liquid chromatography coupled with triple-quadrupole mass spectrometry. General MRM conditions were developed for any FA, which facilitated the quantification and extended the application of the method. The FA quantification strategy based on HPLC-MRM was carried out using deuterated derivatization reagents. "Heavy" derivatization reagents were used as internal standards (ISs) to minimize matrix effects. Prior to statistical analysis, amounts of each FA species were normalized by their corresponding IS, which guaranteed the accuracy and reliability of the method. FA changes in plasma induced by ageing were studied using this strategy. Several FA species were identified as potential ageing biomarkers. The sensitivity, accuracy, reliability, and full coverage of the method ensure that this strategy has strong potential for both biomarker discovery and lipidomic research.

  9. Evaluating the intra- and interobserver reliability of three-dimensional ultrasound and power Doppler angiography (3D-PDA) for assessment of placental volume and vascularity in the second trimester of pregnancy.

    PubMed

    Jones, Nia W; Raine-Fenning, Nick J; Mousa, Hatem A; Bradley, Eileen; Bugg, George J

    2011-03-01

    Three-dimensional (3-D) power Doppler angiography (3-D-PDA) allows visualisation of Doppler signals within the placenta, and these signals can be quantified as vascular indices generated by the 4D View software programme. This study aimed to investigate intra- and interobserver reproducibility of 3-D-PDA analysis of stored datasets at varying gestations, with the ultimate goal of developing a tool for predicting placental dysfunction. Women with an uncomplicated, viable singleton pregnancy were scanned at 12, 16 or 20 weeks of gestation. 3-D-PDA datasets covering the whole placenta were analysed using the VOCAL software processing tool. Each volume was analysed by three observers twice in the A plane. Intra- and interobserver reliability was assessed by intraclass correlation coefficients (ICCs) and Bland-Altman plots. At each gestational age group, 20 low-risk women were scanned, resulting in 60 datasets in total. The ICC demonstrated a high level of measurement reliability at each gestation, with intraobserver values >0.90 and interobserver values >0.6 for the vascular indices. Bland-Altman plots also showed high levels of agreement. Systematic bias was seen at 20 weeks in the vascular indices obtained by different observers. This study demonstrates that 3-D-PDA data can be measured reliably by different observers from stored datasets up to 18 weeks gestation. Measurements become less reliable as gestation advances, with bias between observers evident at 20 weeks. Copyright © 2011 World Federation for Ultrasound in Medicine & Biology. Published by Elsevier Inc. All rights reserved.
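
    The inter-observer agreement analysis described above can be reproduced in outline with a Bland-Altman calculation; the vascular-index values below are hypothetical and serve only to show the computation.

```python
import numpy as np

def bland_altman(obs1, obs2):
    """Return the bias and 95% limits of agreement for paired measurements."""
    obs1, obs2 = np.asarray(obs1, float), np.asarray(obs2, float)
    diff = obs1 - obs2
    bias = diff.mean()
    sd = diff.std(ddof=1)
    return bias, bias - 1.96 * sd, bias + 1.96 * sd

# Hypothetical vascularisation-index values from two observers on the same stored datasets.
observer_a = [12.1, 15.4, 9.8, 20.2, 17.6, 11.0]
observer_b = [11.5, 16.0, 10.4, 21.5, 16.9, 11.8]

bias, lo, hi = bland_altman(observer_a, observer_b)
print(f"bias = {bias:.2f}, 95% limits of agreement = [{lo:.2f}, {hi:.2f}]")
```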

  10. DNA microsatellite region for a reliable quantification of soft wheat adulteration in durum wheat-based foodstuffs by real-time PCR.

    PubMed

    Sonnante, Gabriella; Montemurro, Cinzia; Morgese, Anita; Sabetta, Wilma; Blanco, Antonio; Pasqualone, Antonella

    2009-11-11

    Italian industrial pasta and typical durum wheat breads must be prepared using exclusively durum wheat semolina. Previously, a microsatellite sequence specific to the wheat D-genome had been chosen for traceability of soft wheat in semolina and bread samples, using qualitative and quantitative SYBR Green-based real-time experiments. In this work, we describe an improved method based on the same soft wheat genomic region by means of a quantitative real-time PCR using a dual-labeled probe. Standard curves based on dilutions of 100% soft wheat flour, pasta, or bread were constructed. Durum wheat semolina, pasta, and bread samples were prepared with increasing amounts of soft wheat to verify the accuracy of the method. Results show that reliable quantifications were obtained, especially for samples containing a low amount of soft wheat DNA, fulfilling the need to verify the labeling of pasta and typical durum wheat breads.
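
    A minimal sketch of the dual-labelled-probe quantification workflow: fit a standard curve of Cq against log10(% soft wheat) built from dilution standards, then invert it for an unknown sample. The Cq values and dilution levels are hypothetical, not the paper's data.

```python
import numpy as np

# Hypothetical calibration: Cq values measured for dilutions of soft wheat DNA
# in a durum wheat background (percent soft wheat by DNA).
percent_soft = np.array([0.5, 1.0, 2.5, 5.0, 10.0, 20.0])
cq = np.array([33.1, 32.0, 30.7, 29.6, 28.6, 27.5])

# Linear standard curve: Cq = slope * log10(percent) + intercept
slope, intercept = np.polyfit(np.log10(percent_soft), cq, 1)
efficiency = 10 ** (-1.0 / slope) - 1.0  # amplification efficiency implied by the slope

def estimate_percent_soft(cq_unknown):
    """Invert the standard curve to estimate % soft wheat in an unknown sample."""
    return 10 ** ((cq_unknown - intercept) / slope)

print(f"slope = {slope:.2f}, PCR efficiency ~ {efficiency:.0%}")
print(f"sample with Cq = 31.2 -> ~{estimate_percent_soft(31.2):.1f}% soft wheat")
```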

  11. Sensitive and reliable multianalyte quantitation of herbal medicine in rat plasma using dynamic triggered multiple reaction monitoring.

    PubMed

    Yan, Zhixiang; Li, Tianxue; Lv, Pin; Li, Xiang; Zhou, Chen; Yang, Xinghao

    2013-06-01

    There is a growing need both clinically and experimentally to improve the determination of the blood levels of multiple chemical constituents in herbal medicines. The conventional multiple reaction monitoring (cMRM), however, is not well suited for multi-component determination and could not provide qualitative information for identity confirmation. Here we apply a dynamic triggered MRM (DtMRM) algorithm for the quantification of 20 constituents in an herbal prescription Bu-Zhong-Yi-Qi-Tang (BZYQT) in rat plasma. Dynamic MRM (DMRM) dramatically reduced the number of concurrent MRM transitions that are monitored during each MS scan. This advantage has been enhanced with the addition of triggered MRM (tMRM) for simultaneous confirmation, which maximizes the dwell time in the primary MRM quantitation phase, and also acquires sufficient MRM data to create a composite product ion spectrum. By allowing optimized collision energy for each product ion and maximizing dwell times, tMRM is significantly more sensitive and reliable than conventional product ion scanning. The DtMRM approach provides much higher sensitivity and reproducibility than cMRM. Copyright © 2013 Elsevier B.V. All rights reserved.

  12. Partial volume correction of magnetic resonance spectroscopic imaging

    NASA Astrophysics Data System (ADS)

    Lu, Yao; Wu, Dee; Magnotta, Vincent A.

    2007-03-01

    The ability to study the biochemical composition of the brain is becoming important to better understand neurodegenerative and neurodevelopmental disorders. Magnetic Resonance Spectroscopy (MRS) can non-invasively provide quantification of brain metabolites in localized regions. The reliability of MRS is limited in part due to partial volume artifacts. This results from the relatively large voxels that are required to acquire sufficient signal-to-noise ratios for the studies. Partial volume artifacts result when a MRS voxel contains a mixture of tissue types. Concentrations of metabolites vary from tissue to tissue. When a voxel contains a heterogeneous tissue composition, the spectroscopic signal acquired from this voxel will consist of the signal from different tissues making reliable measurements difficult. We have developed a novel tool for the estimation of partial volume tissue composition within MRS voxels thus allowing for the correction of partial volume artifacts. In addition, the tool can localize MR spectra to anatomical regions of interest. The tool uses tissue classification information acquired as part of a structural MR scan for the same subject. The tissue classification information is co-registered with the spectroscopic data. The user can quantify the partial volume composition of each voxel and use this information as covariates for metabolite concentrations.
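
    A rough sketch of the partial-volume estimation step, assuming the tissue classification has already been co-registered and resampled so that an integer block of structural voxels maps onto each MRS voxel (a simplification of the real co-registration workflow).

```python
import numpy as np

def voxel_tissue_fractions(segmentation, block):
    """Fraction of CSF/GM/WM within each (coarse) MRS voxel.

    segmentation : 3-D integer array of tissue labels (0=CSF, 1=GM, 2=WM),
                   assumed already co-registered to the MRS grid.
    block        : how many structural voxels fit into one MRS voxel along
                   each axis (assumes exact divisibility).
    """
    labels = (0, 1, 2)
    nz, ny, nx = (s // b for s, b in zip(segmentation.shape, block))
    fractions = np.zeros((nz, ny, nx, len(labels)))
    for k in range(nz):
        for j in range(ny):
            for i in range(nx):
                sub = segmentation[k*block[0]:(k+1)*block[0],
                                   j*block[1]:(j+1)*block[1],
                                   i*block[2]:(i+1)*block[2]]
                for t in labels:
                    fractions[k, j, i, t] = np.mean(sub == t)
    return fractions  # e.g. fractions[..., 1] is the grey-matter fraction per MRS voxel

# Toy example: random segmentation, 8x8x8 structural voxels per MRS voxel.
seg = np.random.randint(0, 3, size=(16, 16, 16))
frac = voxel_tissue_fractions(seg, (8, 8, 8))
print(frac.shape, frac[0, 0, 0])  # (2, 2, 2, 3); the three fractions sum to 1
```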

  13. Real-time fiber selection using the Wii remote

    NASA Astrophysics Data System (ADS)

    Klein, Jan; Scholl, Mike; Köhn, Alexander; Hahn, Horst K.

    2010-02-01

    In the last few years, fiber tracking tools have become popular in clinical contexts, e.g., for pre- and intraoperative neurosurgical planning. The efficient, intuitive, and reproducible selection of fiber bundles still constitutes one of the main issues. In this paper, we present a framework for a real-time selection of axonal fiber bundles using a Wii remote control, a wireless controller for Nintendo's gaming console. It enables the user to select fiber bundles without any other input devices. To achieve a smooth interaction, we propose a novel space-partitioning data structure for efficient 3D range queries in a data set consisting of precomputed fibers. The data structure, which is adapted to the special geometry of fiber tracts, allows for queries that are many times faster compared with previous state-of-the-art approaches. In order to reliably extract fibers for further processing, e.g., for quantification purposes or comparisons with preoperatively tracked fibers, we developed an expectation-maximization clustering algorithm that can refine the range queries. Our initial experiments have shown that white matter fiber bundles can be reliably selected within a few seconds by the Wii, which has been placed in a sterile plastic bag to simulate usage under surgical conditions.
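
    The paper's space-partitioning structure is specialized to fiber-tract geometry; as a generic baseline for the same kind of 3D range query, the sketch below uses a k-d tree over all precomputed fiber points (hypothetical data, and scipy's cKDTree rather than the authors' structure).

```python
import numpy as np
from scipy.spatial import cKDTree

# Hypothetical precomputed fibers: each fiber is a polyline of 3-D points.
rng = np.random.default_rng(0)
fibers = [rng.normal(loc=i, scale=0.5, size=(50, 3)) for i in range(20)]

# Flatten all fiber points into one array and remember the fiber each point belongs to.
points = np.vstack(fibers)
fiber_id = np.concatenate([np.full(len(f), i) for i, f in enumerate(fibers)])

tree = cKDTree(points)

def select_fibers(center, radius):
    """Return IDs of fibers having at least one point within `radius` of `center`."""
    idx = tree.query_ball_point(center, r=radius)
    return sorted(set(fiber_id[idx]))

print(select_fibers(center=[5.0, 5.0, 5.0], radius=1.0))
```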

  14. A hybrid anchored-ANOVA - POD/Kriging method for uncertainty quantification in unsteady high-fidelity CFD simulations

    NASA Astrophysics Data System (ADS)

    Margheri, Luca; Sagaut, Pierre

    2016-11-01

    To significantly increase the contribution of numerical computational fluid dynamics (CFD) simulation for risk assessment and decision making, it is important to quantitatively measure the impact of uncertainties to assess the reliability and robustness of the results. As unsteady high-fidelity CFD simulations are becoming the standard for industrial applications, reducing the number of required samples to perform sensitivity (SA) and uncertainty quantification (UQ) analysis is a pressing engineering challenge. The novel approach presented in this paper is based on an efficient hybridization between the anchored-ANOVA and the POD/Kriging methods, which have already been used in realistic CFD-UQ applications, and the definition of best practices to achieve global accuracy. The anchored-ANOVA method is used to efficiently reduce the UQ dimension space, while POD/Kriging is used to smooth and interpolate each anchored-ANOVA term. The main advantages of the proposed method are illustrated through four applications with increasing complexity, most of them based on Large-Eddy Simulation as a high-fidelity CFD tool: the turbulent channel flow, the flow around an isolated bluff-body, a pedestrian wind comfort study in a full-scale urban area and an application to toxic gas dispersion in a full-scale city area. The proposed c-APK method (anchored-ANOVA-POD/Kriging) inherits the advantages of each key element: interpolation through POD/Kriging precludes the use of quadrature schemes, thereby allowing for a more flexible sampling strategy, while the ANOVA decomposition allows for a better domain exploration. A comparison of the three methods is given for each application. In addition, the importance of adding flexibility to the control parameters and the choice of the quantity of interest (QoI) are discussed. As a result, global accuracy can be achieved with a reasonable number of samples, making computationally expensive CFD-UQ analysis feasible.
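
    As a minimal illustration of the Kriging building block used to interpolate each anchored-ANOVA term, the sketch below fits a Gaussian-process surrogate to a few samples of a scalar quantity of interest and propagates an input uncertainty through it; the toy QoI and parameter ranges are hypothetical, and this is not the c-APK method itself.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF, ConstantKernel

# Hypothetical scalar quantity of interest (QoI) from an expensive solver,
# sampled at a handful of values of one uncertain parameter xi.
def expensive_qoi(xi):            # stand-in for a CFD run
    return np.sin(3 * xi) + 0.5 * xi

xi_train = np.linspace(0.0, 1.0, 7).reshape(-1, 1)
y_train = expensive_qoi(xi_train).ravel()

# Kriging surrogate (Gaussian process with an RBF kernel).
gp = GaussianProcessRegressor(kernel=ConstantKernel(1.0) * RBF(length_scale=0.2),
                              normalize_y=True)
gp.fit(xi_train, y_train)

# Cheap uncertainty propagation: sample the uncertain input, evaluate the surrogate.
xi_mc = np.random.default_rng(1).uniform(0.0, 1.0, size=(10000, 1))
y_mc = gp.predict(xi_mc)
print(f"QoI mean ~ {y_mc.mean():.3f}, std ~ {y_mc.std():.3f}")
```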

  15. Clinical utility of eco-color-power Doppler ultrasonography and contrast enhanced magnetic resonance imaging for interpretation and quantification of joint synovitis: a review.

    PubMed

    Carotti, Marina; Galeazzi, Vittoria; Catucci, Francesca; Zappia, Marcello; Arrigoni, Francesco; Barile, Antonio; Giovagnoni, Andrea

    2018-01-19

    With the introduction of new biologics such as anti-TNF-alpha antibodies and other therapies in the treatment of inflammatory arthritis, capable of halting joint destruction and functional disability, there are new pressures on diagnostic and prognostic imaging. Early demonstration of pre-erosive inflammatory features and monitoring of the long-term effects of treatment are becoming increasingly important. Early detection of synovitis offers advantages in terms of allowing early instigation of therapy and may allow the identification of those patients displaying more aggressive disease who might benefit from early intervention with expensive DMARD therapy. Advanced imaging techniques such as ultrasound (US) and magnetic resonance imaging (MRI) have focussed on the demonstration and quantification of synovitis and allow early diagnosis of inflammatory arthropathies such as rheumatoid arthritis (RA) and psoriatic arthritis (PsA). Synovitis represents a potential surrogate measure of disease activity that can be monitored using either MRI or US; the techniques have, generally, focused on monitoring synovial volume or quality as assessed by its vascularity. However to achieve these goals, standardisation and validation of US and MRI are required to ensure accurate diagnosis, reproducibility and reliability. Each modality has different strengths and weaknesses and levels of validation. This article aims to increase the awareness of radiologists and rheumatologists about this field and to encourage them to participate and contribute to the ongoing development of these modalities. Without this collaboration, it is unlikely that these modalities will reach their full potential in the field of rheumatological imaging. This review is in two parts. The first part addresses the role of US and colour or power Doppler sonography (PDUS) in the detection and monitoring of synovitis in inflammatory arthropathies. The second part will look at advanced MR imaging and Dynamic contrast-enhanced MRI techniques and in particular how they are applied to the monitoring of the disease process.

  16. Overview of Probabilistic Methods for SAE G-11 Meeting for Reliability and Uncertainty Quantification for DoD TACOM Initiative with SAE G-11 Division

    NASA Technical Reports Server (NTRS)

    Singhal, Surendra N.

    2003-01-01

    The SAE G-11 RMSL Division and Probabilistic Methods Committee meeting during October 6-8 at the Best Western Sterling Inn, Sterling Heights (Detroit), Michigan is co-sponsored by US Army Tank-automotive & Armaments Command (TACOM). The meeting will provide an industry/government/academia forum to review RMSL technology; reliability and probabilistic technology; reliability-based design methods; software reliability; and maintainability standards. With over 100 members including members with national/international standing, the mission of the G-11's Probabilistic Methods Committee is to "enable/facilitate rapid deployment of probabilistic technology to enhance the competitiveness of our industries by better, faster, greener, smarter, affordable and reliable product development."

  17. QUESP and QUEST revisited - fast and accurate quantitative CEST experiments.

    PubMed

    Zaiss, Moritz; Angelovski, Goran; Demetriou, Eleni; McMahon, Michael T; Golay, Xavier; Scheffler, Klaus

    2018-03-01

    Chemical exchange saturation transfer (CEST) NMR or MRI experiments allow detection of low-concentration molecules with enhanced sensitivity via their proton exchange with the abundant water pool. Be it endogenous metabolites or exogenous contrast agents, an exact quantification of the actual exchange rate is required to design optimal pulse sequences and/or specific sensitive agents. Refined analytical expressions allow deeper insight and improvement of accuracy for common quantification techniques. The accuracy of standard quantification methodologies, such as quantification of exchange rate using varying saturation power or varying saturation time, is improved especially for the case of nonequilibrium initial conditions and weak labeling conditions, meaning the saturation amplitude is smaller than the exchange rate (γB1 < k). The improved analytical 'quantification of exchange rate using varying saturation power/time' (QUESP/QUEST) equations allow for more accurate exchange rate determination, and provide clear insights on the general principles to execute the experiments and to perform numerical evaluation. The proposed methodology was evaluated on the large-shift regime of paramagnetic chemical-exchange-saturation-transfer agents using simulated data and data of the paramagnetic Eu(III) complex of DOTA-tetraglycineamide. The refined formulas yield improved exchange rate estimation. General convergence intervals of the methods that would apply for smaller shift agents are also discussed. Magn Reson Med 79:1708-1721, 2018. © 2017 International Society for Magnetic Resonance in Medicine.
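
    The refined QUESP/QUEST expressions derived in the paper are not reproduced here; as a rough illustration of the general idea (estimating the exchange rate from the saturation-power dependence of the proton-transfer ratio), the sketch below fits a commonly quoted simplified steady-state QUESP form with labeling efficiency alpha = omega1^2 / (omega1^2 + k^2). All parameter values are hypothetical.

```python
import numpy as np
from scipy.optimize import curve_fit

GAMMA = 267.522e6  # rad s^-1 T^-1, proton gyromagnetic ratio

def quesp_ptr(omega1, k_ex, fb, t1w=3.0, tsat=5.0):
    """Simplified steady-state QUESP model (not the refined expressions of the paper).

    omega1 : saturation amplitude in rad/s (gamma * B1)
    k_ex   : exchange rate (s^-1); fb : relative proton fraction of the labile pool.
    """
    alpha = omega1**2 / (omega1**2 + k_ex**2)        # labeling efficiency
    return fb * k_ex * alpha * t1w * (1.0 - np.exp(-tsat / t1w))

# Hypothetical measurement: proton-transfer ratio at several B1 levels (microtesla).
b1_ut = np.array([0.5, 1.0, 2.0, 3.0, 4.0, 6.0])
omega1 = GAMMA * b1_ut * 1e-6
ptr_true = quesp_ptr(omega1, k_ex=500.0, fb=0.0005)
ptr_meas = ptr_true + np.random.default_rng(2).normal(0, 0.01, ptr_true.size)

(k_fit, fb_fit), _ = curve_fit(quesp_ptr, omega1, ptr_meas, p0=(1000.0, 0.001))
print(f"fitted exchange rate ~ {k_fit:.0f} s^-1, pool fraction ~ {fb_fit:.2e}")
```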

  18. Targeted methods for quantitative analysis of protein glycosylation

    PubMed Central

    Goldman, Radoslav; Sanda, Miloslav

    2018-01-01

    Quantification of proteins by LC-MS/MS-MRM has become a standard method with broad projected clinical applicability. MRM quantification of protein modifications is, however, far less utilized, especially in the case of glycoproteins. This review summarizes current methods for quantitative analysis of protein glycosylation with a focus on MRM methods. We describe advantages of this quantitative approach, analytical parameters that need to be optimized to achieve reliable measurements, and point out the limitations. Differences between major classes of N- and O-glycopeptides are described and class-specific glycopeptide assays are demonstrated. PMID:25522218

  19. Uncertainty Quantification and Statistical Engineering for Hypersonic Entry Applications

    NASA Technical Reports Server (NTRS)

    Cozmuta, Ioana

    2011-01-01

    NASA has invested significant resources in developing and validating a mathematical construct for TPS margin management: a) Tailorable for low/high reliability missions; b) Tailorable for ablative/reusable TPS; c) Uncertainty Quantification and Statistical Engineering are valuable tools not exploited enough; and d) Need to define strategies combining both Theoretical Tools and Experimental Methods. The main reason for this lecture is to give a flavor of where UQ and SE could contribute and hope that the broader community will work with us to improve in these areas.

  20. Identification of Reliable Reference Genes for Quantification of MicroRNAs in Serum Samples of Sulfur Mustard-Exposed Veterans.

    PubMed

    Gharbi, Sedigheh; Shamsara, Mehdi; Khateri, Shahriar; Soroush, Mohammad Reza; Ghorbanmehr, Nassim; Tavallaei, Mahmood; Nourani, Mohammad Reza; Mowla, Seyed Javad

    2015-01-01

    In spite of accumulating information about pathological aspects of sulfur mustard (SM), the precise mechanism responsible for its effects is not well understood. Circulating microRNAs (miRNAs) are promising biomarkers for disease diagnosis and prognosis. Accurate normalization using appropriate reference genes is a critical step in miRNA expression studies. In this study, we aimed to identify appropriate reference genes for microRNA quantification in serum samples of SM victims. In this case-control experimental study, using quantitative real-time polymerase chain reaction (qRT-PCR), we evaluated the suitability of a panel of small RNAs including SNORD38B, SNORD49A, U6, 5S rRNA, miR-423-3p, miR-191, miR-16 and miR-103 in sera of 28 SM-exposed veterans of the Iran-Iraq war (1980-1988) and 15 matched control volunteers. Different statistical algorithms, including geNorm, NormFinder, BestKeeper and the comparative delta-quantification cycle (Cq) method, were employed to find the least variable reference gene. miR-423-3p was identified as the most stably expressed reference gene, and miR-103 and miR-16 ranked after that. We demonstrate that non-miRNA reference genes have the least stability in serum samples and that some housekeeping miRNAs may be used as more reliable reference genes for miRNAs in serum. In addition, using the geometric mean of two reference genes could increase the reliability of the normalizers.
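
    A minimal sketch of normalization against the geometric mean of two reference miRNAs, as suggested above; the Cq values and the target miRNA are hypothetical, and roughly 100% amplification efficiency is assumed.

```python
import numpy as np

# Hypothetical qRT-PCR quantification cycles (Cq) for one serum sample.
cq = {
    "miR-423-3p": 27.8,   # reference gene 1
    "miR-103":    28.6,   # reference gene 2
    "miR-X":      31.2,   # hypothetical target miRNA of interest
}

# Normalization factor: geometric mean of the reference genes' relative quantities,
# which in Cq space is simply the arithmetic mean of their Cq values.
ref_cq_mean = np.mean([cq["miR-423-3p"], cq["miR-103"]])

delta_cq = cq["miR-X"] - ref_cq_mean
relative_expression = 2.0 ** (-delta_cq)   # assumes ~100% amplification efficiency
print(f"dCq = {delta_cq:.2f}, relative expression = {relative_expression:.3f}")
```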

  1. Aspect-Oriented Programming is Quantification and Implicit Invocation

    NASA Technical Reports Server (NTRS)

    Filman, Robert E.; Friedman, Daniel P.; Koga, Dennis (Technical Monitor)

    2001-01-01

    We propose that the distinguishing characteristic of Aspect-Oriented Programming (AOP) languages is that they allow programming by making quantified programmatic assertions over programs that lack local notation indicating the invocation of these assertions. This suggests that AOP systems can be analyzed with respect to three critical dimensions: the kinds of quantifications allowed, the nature of the interactions that can be asserted, and the mechanism for combining base-level actions with asserted actions. Consequences of this perspective are the recognition that certain systems are not AOP and that some mechanisms are sufficiently expressive to allow straightforwardly programming an AOP system within them.
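
    A toy illustration of "quantification plus implicit invocation" in Python: advice is attached to every function selected by a quantified predicate, while the base functions themselves carry no annotation. The module, function names, and advice used here are hypothetical.

```python
import types

def weave(module, predicate, advice):
    """Apply `advice` to every function in `module` selected by `predicate`.

    The base functions carry no local notation indicating that advice will run:
    the aspect is a single quantified statement over the whole program.
    """
    for name, obj in list(vars(module).items()):
        if isinstance(obj, types.FunctionType) and predicate(name, obj):
            setattr(module, name, advice(obj))

def tracing_advice(fn):
    def wrapper(*args, **kwargs):
        print(f"[aspect] entering {fn.__name__}{args}")
        result = fn(*args, **kwargs)
        print(f"[aspect] leaving {fn.__name__} -> {result!r}")
        return result
    return wrapper

# A tiny "base program" living in its own module-like namespace.
base = types.ModuleType("base")
exec("def deposit(amount): return amount\ndef audit_log(msg): return msg", base.__dict__)

# Quantified assertion: advise every function in `base` except the logger itself.
weave(base, predicate=lambda name, fn: name != "audit_log", advice=tracing_advice)
base.deposit(42)   # tracing advice runs although deposit() was never annotated
```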

  2. Therapeutic drug monitoring of carbamazepine and its metabolite in children from dried blood spots using liquid chromatography and tandem mass spectrometry.

    PubMed

    Shokry, Engy; Villanelli, Fabio; Malvagia, Sabrina; Rosati, Anna; Forni, Giulia; Funghini, Silvia; Ombrone, Daniela; Della Bona, Maria; Guerrini, Renzo; la Marca, Giancarlo

    2015-05-10

    Carbamazepine (CBZ) is a first-line drug for the treatment of different forms of epilepsy and the first choice drug for trigeminal neuralgia. CBZ is metabolized in the liver by oxidation into carbamazepine-10,11-epoxide (CBZE), its major metabolite which is equipotent and known to contribute to the pharmacological activity of CBZ. The aim of the present study was to develop and validate a reliable, selective and sensitive liquid chromatography-tandem mass spectrometry method for the simultaneous quantification of CBZ and its active metabolite in dried blood spots (DBS). The extraction process was carried out from DBS using methanol-water-formic acid (80:20:0.1, v/v/v). Chromatographic elution was achieved by using a linear gradient with a mobile phase consisting of acetonitrile-water-0.1% formic acid at a flow rate of 0.50mL/min. The method was linear over the range 1-40mg/L and 0.25-20mg/L for CBZ and CBZE, respectively. The limit of quantification was 0.75mg/L and 0.25mg/L for CBZ and CBZE. Intra-day and inter-day assay precisions were found to be lower than 5.13%, 6.46% and 11.76%, 4.72% with mean percentage accuracies of 102.1%, 97.5% and 99.2%, 97.8% for CBZ and CBZE. We successfully applied the method for determining DBS finger-prick samples in paediatric patients and confirmed the results with concentrations measured in matched plasma samples. This novel approach allows quantification of CBZ and its metabolite from only one 3.2mm DBS disc by LC-MS/MS thus combining advantages of DBS technique and LC-MS/MS in clinical practice. Copyright © 2015 Elsevier B.V. All rights reserved.

  3. Determination of eight nitrosamines in water at the ng L(-1) levels by liquid chromatography coupled to atmospheric pressure chemical ionization tandem mass spectrometry.

    PubMed

    Ripollés, Cristina; Pitarch, Elena; Sancho, Juan V; López, Francisco J; Hernández, Félix

    2011-09-19

    In this work, we have developed a sensitive method for detection and quantification of eight N-nitrosamines, N-nitrosodimethylamine (NDMA), N-nitrosomorpholine (NMor), N-nitrosomethylethylamine (NMEA), N-nitrosopirrolidine (NPyr), N-nitrosodiethylamine (NDEA), N-nitrosopiperidine (NPip), N-nitroso-n-dipropylamine (NDPA) and N-nitrosodi-n-butylamine (NDBA) in drinking water. The method is based on liquid chromatography coupled to tandem mass spectrometry, using atmospheric pressure chemical ionization (APCI) in positive mode with a triple quadrupole analyzer (QqQ). The simultaneous acquisition of two MS/MS transitions in selected reaction monitoring mode (SRM) for each compound, together with the evaluation of their relative intensity, allowed the simultaneous quantification and reliable identification in water at ppt levels. Empirical formula of the product ions selected was confirmed by UHPLC-(Q)TOF MS accurate mass measurements from reference standards. Prior to LC-MS/MS QqQ analysis, a preconcentration step by off-line SPE using coconut charcoal EPA 521 cartridges (by passing 500 mL of water sample) was necessary to improve the sensitivity and to meet regulation requirements. For accurate quantification, two isotope labelled nitrosamines (NDMA-d(6) and NDPA-d(14)) were added as surrogate internal standards to the samples. The optimized method was validated at two concentration levels (10 and 100 ng L(-1)) in drinking water samples, obtaining satisfactory recoveries (between 90 and 120%) and precision (RSD<20%). Limits of detection were found to be in the range of 1-8 ng L(-1). The described methodology has been applied to different types of water samples: chlorinated from drinking water and wastewater treatment plants (DWTP and WWTP, respectively), wastewaters subjected to ozonation and tap waters. Copyright © 2011 Elsevier B.V. All rights reserved.

  4. Analytical Performance of a Multiplex Real-Time PCR Assay Using TaqMan Probes for Quantification of Trypanosoma cruzi Satellite DNA in Blood Samples

    PubMed Central

    Abate, Teresa; Cayo, Nelly M.; Parrado, Rudy; Bello, Zoraida Diaz; Velazquez, Elsa; Muñoz-Calderon, Arturo; Juiz, Natalia A.; Basile, Joaquín; Garcia, Lineth; Riarte, Adelina; Nasser, Julio R.; Ocampo, Susana B.; Yadon, Zaida E.; Torrico, Faustino; de Noya, Belkisyole Alarcón; Ribeiro, Isabela; Schijman, Alejandro G.

    2013-01-01

    Background The analytical validation of sensitive, accurate and standardized Real-Time PCR methods for Trypanosoma cruzi quantification is crucial to provide a reliable laboratory tool for diagnosis of recent infections as well as for monitoring treatment efficacy. Methods/Principal Findings We have standardized and validated a multiplex Real-Time quantitative PCR assay (qPCR) based on TaqMan technology, aiming to quantify T. cruzi satellite DNA as well as an internal amplification control (IAC) in a single-tube reaction. IAC amplification allows ruling out false-negative PCR results due to inhibitory substances or loss of DNA during sample processing. The assay has a limit of detection (LOD) of 0.70 parasite equivalents/mL and a limit of quantification (LOQ) of 1.53 parasite equivalents/mL starting from non-boiled Guanidine EDTA blood spiked with T. cruzi CL-Brener stock. The method was evaluated with blood samples collected from Chagas disease patients experiencing different clinical stages and epidemiological scenarios: 1- Sixteen Venezuelan patients from an outbreak of oral transmission, 2- Sixty-three Bolivian patients suffering chronic Chagas disease, 3- Thirty-four Argentinean cases with chronic Chagas disease, 4- Twenty-seven newborns of seropositive mothers, 5- A seronegative recipient who became infected after transplantation with a cadaveric kidney explanted from an infected subject. Conclusions/Significance The performance parameters of this assay encourage its application to early assessment of T. cruzi infection in cases in which serological methods are not informative, such as recent infections by oral contamination or congenital transmission or after transplantation with organs from seropositive donors, as well as for monitoring Chagas disease patients under etiological treatment.

  5. Evaluation of peroxidative stress of cancer cells in vitro by real-time quantification of volatile aldehydes in culture headspace.

    PubMed

    Shestivska, Violetta; Rutter, Abigail V; Sulé-Suso, Josep; Smith, David; Španěl, Patrik

    2017-08-30

    Peroxidation of lipids in cellular membranes results in the release of volatile organic compounds (VOCs), including saturated aldehydes. The real-time quantification of trace VOCs produced by cancer cells during peroxidative stress presents a new challenge to non-invasive clinical diagnostics, which as described here, we have met with some success. A combination of selected ion flow tube mass spectrometry (SIFT-MS), a technique that allows rapid, reliable quantification of VOCs in humid air and liquid headspace, and electrochemistry to generate reactive oxygen species (ROS) in vitro has been used. Thus, VOCs present in the headspace of CALU-1 cancer cell line cultures exposed to ROS have been monitored and quantified in real time using SIFT-MS. The CALU-1 lung cancer cells were cultured in 3D collagen to mimic in vivo tissue. Real-time SIFT-MS analyses focused on the volatile aldehydes: propanal, butanal, pentanal, hexanal, heptanal and malondialdehyde (propanedial), that are expected to be products of cellular membrane peroxidation. All six aldehydes were identified in the culture headspace, each reaching peak concentrations during the time of exposure to ROS and eventually reducing as the reactants were depleted in the culture. Pentanal and hexanal were the most abundant, reaching concentrations of a few hundred parts-per-billion by volume, ppbv, in the culture headspace. The results of these experiments demonstrate that peroxidation of cancer cells in vitro can be monitored and evaluated by direct real-time analysis of the volatile aldehydes produced. The combination of adopted methodology potentially has value for the study of other types of VOCs that may be produced by cellular damage. Copyright © 2017 John Wiley & Sons, Ltd.

  6. Real-time polymerase chain reaction-based approach for quantification of the pat gene in the T25 Zea mays event.

    PubMed

    Weighardt, Florian; Barbati, Cristina; Paoletti, Claudia; Querci, Maddalena; Kay, Simon; De Beuckeleer, Marc; Van den Eede, Guy

    2004-01-01

    In Europe, a growing interest for reliable techniques for the quantification of genetically modified component(s) of food matrixes is arising from the need to comply with the European legislative framework on novel food products. Real-time polymerase chain reaction (PCR) is currently the most powerful technique for the quantification of specific nucleic acid sequences. Several real-time PCR methodologies based on different molecular principles have been developed for this purpose. The most frequently used approach in the field of genetically modified organism (GMO) quantification in food or feed samples is based on the 5'-3'-exonuclease activity of Taq DNA polymerase on specific degradation probes (TaqMan principle). A novel approach was developed for the establishment of a TaqMan quantification system assessing GMO contents around the 1% threshold stipulated under European Union (EU) legislation for the labeling of food products. The Zea mays T25 elite event was chosen as a model for the development of the novel GMO quantification approach. The most innovative aspect of the system is represented by the use of sequences cloned in plasmids as reference standards. In the field of GMO quantification, plasmids are an easy to use, cheap, and reliable alternative to Certified Reference Materials (CRMs), which are only available for a few of the GMOs authorized in Europe, have a relatively high production cost, and require further processing to be suitable for analysis. Strengths and weaknesses of the use of novel plasmid-based standards are addressed in detail. In addition, the quantification system was designed to avoid the use of a reference gene (e.g., a single copy, species-specific gene) as normalizer, i.e., to perform a GMO quantification based on an absolute instead of a relative measurement. In fact, experimental evidences show that the use of reference genes adds variability to the measurement system because a second independent real-time PCR-based measurement must be performed. Moreover, for some reference genes no sufficient information on copy number in and among genomes of different lines is available, making adequate quantification difficult. Once developed, the method was subsequently validated according to IUPAC and ISO 5725 guidelines. Thirteen laboratories from 8 EU countries participated in the trial. Eleven laboratories provided results complying with the predefined study requirements. Repeatability (RSDr) values ranged from 8.7 to 15.9%, with a mean value of 12%. Reproducibility (RSDR) values ranged from 16.3 to 25.5%, with a mean value of 21%. Following Codex Alimentarius Committee guidelines, both the limits of detection and quantitation were determined to be <0.1%.

  7. Improving the estimation of complete field soil water characteristic curves through field monitoring data

    NASA Astrophysics Data System (ADS)

    Bordoni, M.; Bittelli, M.; Valentino, R.; Chersich, S.; Meisina, C.

    2017-09-01

    In this work, Soil Water Characteristic Curves (SWCCs) were reconstructed through simultaneous field measurements of soil pore water pressure and water content. The objective was to evaluate whether field-based monitoring can improve the accuracy of SWCC estimation with respect to laboratory techniques. Moreover, field assessment of SWCCs made it possible to: a) quantify the hydrological hysteresis affecting SWCCs through field data; b) analyze the effect of different temporal resolutions of the field measurements; c) highlight the differences in SWCCs reconstructed for a particular soil during different hydrological years; d) evaluate the reliability of field-reconstructed SWCCs by comparing assessed and measured trends of a component of the soil water balance. These aspects were fundamental for assessing the reliability of the field-reconstructed SWCCs. Field data were measured at two Italian test-sites. These test-sites were used to evaluate the quality of field-reconstructed SWCCs for soils characterized by different geomorphological, geological, physical and pedological features. Field-measured or laboratory-measured SWCC data of 5 soil horizons (3 in a predominantly silty soil, 2 in a predominantly clayey one) were fitted with the Van Genuchten model. Different field drying and wetting periods of the various cycles were identified based on monthly meteorological conditions, in terms of rainfall and evapotranspiration amounts. This method allowed for a correct discrimination of the main drying and the main wetting paths from field data and for a more reliable quantification of soil hydrological properties with respect to laboratory methodologies. Particular patterns of change in SWCC shape with depth could also be identified. Field SWCC estimation is not affected by the temporal resolution of the acquisition (hours or days), as evidenced by similar values of the Van Genuchten fitting parameters. Instead, hourly data may offer a clearer picture of the drying and wetting paths, due to the higher number of experimental data points. Moreover, in temperate climates such as those of the test-sites, main drying curves and main wetting curves of a particular soil were substantially similar also for different hydrological cycles with peculiar meteorological conditions. SWCC parameters were implemented in a numerical code (HYDRUS-1D) to simulate soil water storage for different soil horizons. Field-reconstructed SWCCs allowed these trends to be simulated with higher precision, confirming the reliability of the reconstructed field curves from a quantitative point of view. Moreover, the best results were obtained when hysteresis was considered in the modeling.
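
    A minimal sketch of the curve-fitting step: fitting the Van Genuchten retention model to paired field readings of suction and volumetric water content with a least-squares routine. The data points, initial guess, and bounds are hypothetical.

```python
import numpy as np
from scipy.optimize import curve_fit

def van_genuchten(h, theta_r, theta_s, alpha, n):
    """Van Genuchten (1980) retention curve; h is suction head (positive, in cm)."""
    m = 1.0 - 1.0 / n
    return theta_r + (theta_s - theta_r) / (1.0 + (alpha * h) ** n) ** m

# Hypothetical paired field readings: suction (cm) from tensiometers and
# volumetric water content (-) from soil moisture probes, drying branch only.
suction = np.array([10., 30., 60., 100., 300., 600., 1000., 3000.])
theta = np.array([0.43, 0.41, 0.37, 0.33, 0.26, 0.22, 0.19, 0.15])

p0 = (0.05, 0.45, 0.01, 1.4)   # initial guess: theta_r, theta_s, alpha (1/cm), n
bounds = ([0.0, 0.2, 1e-4, 1.05], [0.2, 0.6, 1.0, 3.0])
params, _ = curve_fit(van_genuchten, suction, theta, p0=p0, bounds=bounds)
print(dict(zip(["theta_r", "theta_s", "alpha", "n"], np.round(params, 4))))
```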

  8. Quantitative PCR for Genetic Markers of Human Fecal Pollution

    EPA Science Inventory

    Assessment of health risk and fecal bacteria loads associated with human fecal pollution requires reliable host-specific analytical methods and a rapid quantification approach. We report the development of quantitative PCR assays for quantification of two recently described human-...

  9. High levels of exosomes expressing CD63 and caveolin-1 in plasma of melanoma patients.

    PubMed

    Logozzi, Mariantonia; De Milito, Angelo; Lugini, Luana; Borghi, Martina; Calabrò, Luana; Spada, Massimo; Perdicchio, Maurizio; Marino, Maria Lucia; Federici, Cristina; Iessi, Elisabetta; Brambilla, Daria; Venturi, Giulietta; Lozupone, Francesco; Santinami, Mario; Huber, Veronica; Maio, Michele; Rivoltini, Licia; Fais, Stefano

    2009-01-01

    Metastatic melanoma is an untreatable cancer lacking reliable and non-invasive markers of disease progression. Exosomes are small vesicles secreted by normal as well as tumor cells. Human tumor-derived exosomes are involved in malignant progression and we evaluated the presence of exosomes in plasma of melanoma patients as a potential tool for cancer screening and follow-up. We designed an in-house sandwich ELISA (Exotest) to capture and quantify exosomes in plasma based on expression of housekeeping proteins (CD63 and Rab-5b) and a tumor-associated marker (caveolin-1). Western blot and flow cytometry analysis of exosomes were used to confirm the Exotest-based findings. The Exotest allowed sensitive detection and quantification of exosomes purified from human tumor cell culture supernatants and plasma from SCID mice engrafted with human melanoma. Plasma levels of exosomes in melanoma-engrafted SCID mice correlated to tumor size. We evaluated the levels of plasma exosomes expressing CD63 and caveolin-1 in melanoma patients (n = 90) and healthy donors (n = 58). Consistently, plasma exosomes expressing CD63 (504+/-315) or caveolin-1 (619+/-310) were significantly increased in melanoma patients as compared to healthy donors (223+/-125 and 228+/-102, respectively). While the Exotest for CD63+ plasma exosomes had limited sensitivity (43%) the Exotest for detection of caveolin-1+ plasma exosomes showed a higher sensitivity (68%). Moreover, caveolin-1+ plasma exosomes were significantly increased with respect to CD63+ exosomes in the patients group. We describe a new non-invasive assay allowing detection and quantification of human exosomes in plasma of melanoma patients. Our results suggest that the Exotest for detection of plasma exosomes carrying tumor-associated antigens may represent a novel tool for clinical management of cancer patients.

  10. Simulating the mobility of meteoric 10Be in the landscape through a coupled soil-hillslope model (Be2D)

    NASA Astrophysics Data System (ADS)

    Campforts, Benjamin; Vanacker, Veerle; Vanderborght, Jan; Baken, Stijn; Smolders, Erik; Govers, Gerard

    2016-04-01

    Meteoric 10Be allows for the quantification of vertical and lateral soil fluxes over long time scales (10³-10⁵ yr). However, the mobility of meteoric 10Be in the soil system makes a translation of meteoric 10Be inventories into erosion and deposition rates complex. Here, we present a spatially explicit 2D model simulating the behaviour of meteoric 10Be on a hillslope. The model consists of two parts. The first component deals with advective and diffusive mobility of meteoric 10Be within the soil profile, and the second component describes lateral soil and meteoric 10Be fluxes over the hillslope. Soil depth is calculated dynamically, accounting for soil production through weathering as well as downslope fluxes of soil due to creep, water and tillage erosion. Synthetic model simulations show that meteoric 10Be inventories can be related to erosion and deposition across a wide range of geomorphological and pedological settings. Our results also show that meteoric 10Be can be used as a tracer to detect human impact on soil fluxes for soils with a high affinity for meteoric 10Be. However, the quantification of vertical mobility is essential for a correct interpretation of the observed variations in meteoric 10Be profiles and inventories. Application of the Be2D model to natural conditions using data sets from the Southern Piedmont (Bacon et al., 2012) and Appalachian Mountains (Jungers et al., 2009; West et al., 2013) makes it possible to reliably constrain parameter values. Good agreement between simulated and observed meteoric 10Be concentrations and inventories is obtained with realistic parameter values. Furthermore, our results provide detailed insights into the processes redistributing meteoric 10Be at the soil-hillslope scale.
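
    The in-profile component of such a model amounts to an advection-diffusion balance for meteoric 10Be with a surface deposition flux; the sketch below is an illustrative explicit finite-difference version with hypothetical parameter values, not the Be2D code.

```python
import numpy as np

# 1-D vertical advection-diffusion of meteoric 10Be concentration in a soil profile
# (illustrative only; parameter values are hypothetical).
nz, dz = 100, 0.01            # 1 m profile, 1 cm cells
D = 1e-4                      # effective diffusivity, m^2/yr
v = 5e-4                      # downward advection velocity, m/yr
flux = 1.5e6                  # atmospheric 10Be deposition, atoms/m^2/yr
dt = 0.25 * dz**2 / D         # explicit stability limit for the diffusive term
years = 5000.0

c = np.zeros(nz)              # 10Be concentration per unit volume (atoms/m^3)
for _ in range(int(years / dt)):
    c_new = c.copy()
    # interior nodes: central diffusion + upwind (downward) advection
    c_new[1:-1] += dt * (D * (c[2:] - 2 * c[1:-1] + c[:-2]) / dz**2
                         - v * (c[1:-1] - c[:-2]) / dz)
    # top node: surface deposition enters the first cell
    c_new[0] += dt * (flux / dz + D * (c[1] - c[0]) / dz**2 - v * c[0] / dz)
    # bottom node: zero-gradient diffusion, advective outflow
    c_new[-1] += dt * (D * (c[-2] - c[-1]) / dz**2 - v * (c[-1] - c[-2]) / dz)
    c = c_new

inventory = c.sum() * dz      # atoms/m^2 stored in the profile
print(f"profile inventory after {years:.0f} yr ~ {inventory:.2e} atoms/m^2")
```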

  11. Multiplex Droplet Digital PCR Protocols for Quantification of GM Maize Events.

    PubMed

    Dobnik, David; Spilsberg, Bjørn; Bogožalec Košir, Alexandra; Štebih, Dejan; Morisset, Dany; Holst-Jensen, Arne; Žel, Jana

    2018-01-01

    The standard-curve based simplex quantitative polymerase chain reaction (qPCR) has been the gold standard for DNA target quantification for more than a decade. The large and growing number of individual analyses needed to test for genetically modified organisms (GMOs) is reducing the cost-effectiveness of qPCR. Droplet digital PCR (ddPCR) enables absolute quantification without standard curves, avoids the amplification efficiency bias observed with qPCR, allows more accurate estimations at low target copy numbers and, in combination with multiplexing, significantly improves cost efficiency. Here we describe two protocols for multiplex quantification of GM maize events: (1) nondiscriminating, with multiplex quantification of targets as a group (12 GM maize lines) and (2) discriminating, with multiplex quantification of individual targets (events). The first enables the quantification of twelve European Union authorized GM maize events as a group with only two assays, but does not permit determination of the individual events present. The second protocol enables the quantification of four individual targets (three GM events and one endogene) in a single reaction. Both protocols can be modified for quantification of any other DNA target.
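
    Droplet digital PCR converts positive/negative droplet counts into absolute concentrations through a Poisson correction; the sketch below shows that calculation for a hypothetical duplex result, assuming a partition volume of about 0.85 nL (typical of one commercial system, stated here as an assumption).

```python
import math

def ddpcr_concentration(n_positive, n_total, droplet_volume_nl=0.85):
    """Estimate target copies per microliter of reaction from droplet counts.

    Uses the Poisson correction lambda = -ln(fraction of negative droplets);
    droplet_volume_nl (~0.85 nL) is an assumed partition volume.
    """
    n_negative = n_total - n_positive
    lam = -math.log(n_negative / n_total)        # mean copies per droplet
    return lam / (droplet_volume_nl * 1e-3)      # copies per microliter

# Hypothetical duplex result: GM event assay vs. maize endogene.
event_copies = ddpcr_concentration(n_positive=3200, n_total=15000)
endogene_copies = ddpcr_concentration(n_positive=9800, n_total=15000)
gm_content = 100.0 * event_copies / endogene_copies   # GM % as a copy-number ratio
print(f"event ~ {event_copies:.0f} cp/uL, endogene ~ {endogene_copies:.0f} cp/uL, "
      f"GM ~ {gm_content:.1f}%")
```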

  12. An overview of the mathematical and statistical analysis component of RICIS

    NASA Technical Reports Server (NTRS)

    Hallum, Cecil R.

    1987-01-01

    Mathematical and statistical analysis components of RICIS (Research Institute for Computing and Information Systems) can be used in the following problem areas: (1) quantification and measurement of software reliability; (2) assessment of changes in software reliability over time (reliability growth); (3) analysis of software-failure data; and (4) decision logic for whether to continue or stop testing software. Other areas of interest to NASA/JSC where mathematical and statistical analysis can be successfully employed include: math modeling of physical systems, simulation, statistical data reduction, evaluation methods, optimization, algorithm development, and mathematical methods in signal processing.

  13. Review of Reliability-Based Design Optimization Approach and Its Integration with Bayesian Method

    NASA Astrophysics Data System (ADS)

    Zhang, Xiangnan

    2018-03-01

    Many uncertain factors arise in practical engineering, such as the external load environment, material properties, geometrical shape, initial conditions, and boundary conditions. Reliability methods measure the structural safety condition and determine the optimal combination of design parameters based on probabilistic theory. Reliability-based design optimization (RBDO), which combines reliability theory with optimization, is the most commonly used approach to minimize structural cost or other performance measures under uncertain variables. However, it cannot handle various forms of incomplete information. The Bayesian approach can be utilized to incorporate this kind of incomplete information into the uncertainty quantification. In this paper, the RBDO approach and its integration with the Bayesian method are introduced.
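
    As a minimal illustration of the reliability-analysis ingredient of RBDO, the sketch below estimates the failure probability and reliability index of a simple R - S limit state by crude Monte Carlo sampling; the distributions and their parameters are hypothetical.

```python
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(42)

# Simple limit state g(R, S) = R - S: failure when load effect S exceeds resistance R.
n = 1_000_000
R = rng.normal(loc=300.0, scale=30.0, size=n)   # resistance (e.g. MPa), hypothetical
S = rng.normal(loc=200.0, scale=40.0, size=n)   # load effect, hypothetical

pf = np.mean(R - S < 0.0)                        # crude Monte Carlo failure probability
beta = -norm.ppf(pf)                             # equivalent reliability index
print(f"P_f ~ {pf:.2e}, beta ~ {beta:.2f}")

# Analytical check for independent normals: beta = (muR - muS) / sqrt(sigR^2 + sigS^2)
beta_exact = (300.0 - 200.0) / np.sqrt(30.0**2 + 40.0**2)
print(f"exact beta = {beta_exact:.2f}")
```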

  14. Identification of spectral regions for the quantification of red wine tannins with fourier transform mid-infrared spectroscopy.

    PubMed

    Jensen, Jacob S; Egebo, Max; Meyer, Anne S

    2008-05-28

    Accomplishment of fast tannin measurements is receiving increased interest as tannins are important for the mouthfeel and color properties of red wines. Fourier transform mid-infrared spectroscopy allows fast measurement of different wine components, but quantification of tannins is difficult due to interferences from spectral responses of other wine components. Four different variable selection tools were investigated for the identification of the most important spectral regions which would allow quantification of tannins from the spectra using partial least-squares regression. The study included the development of a new variable selection tool, iterative backward elimination of changeable size intervals PLS. The spectral regions identified by the different variable selection methods were not identical, but all included two regions (1485-1425 and 1060-995 cm(-1)), which therefore were concluded to be particularly important for tannin quantification. The spectral regions identified from the variable selection methods were used to develop calibration models. All four variable selection methods identified regions that allowed an improved quantitative prediction of tannins (RMSEP = 69-79 mg of CE/L; r = 0.93-0.94) as compared to a calibration model developed using all variables (RMSEP = 115 mg of CE/L; r = 0.87). Only minor differences in the performance of the variable selection methods were observed.
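
    A minimal sketch of the final modelling step: restricting the spectra to the two regions identified as important (1485-1425 and 1060-995 cm-1) and fitting a PLS regression. The spectra below are synthetic, and the paper's variable-selection algorithms themselves are not reproduced.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression
from sklearn.model_selection import cross_val_predict

rng = np.random.default_rng(0)

# Synthetic FT-MIR "spectra": 1500 wavenumbers from 2000 down to 900 cm^-1.
wavenumbers = np.linspace(2000, 900, 1500)
n_samples = 60
tannin = rng.uniform(200, 1200, n_samples)            # mg CE/L, hypothetical
spectra = rng.normal(0, 0.02, (n_samples, wavenumbers.size))
informative = (wavenumbers <= 1485) & (wavenumbers >= 1425)
spectra[:, informative] += np.outer(tannin / 1200.0, np.ones(informative.sum()))

# Keep only the two regions reported as important for tannin quantification.
selected = ((wavenumbers <= 1485) & (wavenumbers >= 1425)) | \
           ((wavenumbers <= 1060) & (wavenumbers >= 995))
X = spectra[:, selected]

pls = PLSRegression(n_components=3)
pred = cross_val_predict(pls, X, tannin, cv=5).ravel()
rmsecv = np.sqrt(np.mean((pred - tannin) ** 2))
print(f"selected variables: {selected.sum()}, RMSECV ~ {rmsecv:.0f} mg CE/L")
```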

  15. Quantifying NMR relaxation correlation and exchange in articular cartilage with time domain analysis

    NASA Astrophysics Data System (ADS)

    Mailhiot, Sarah E.; Zong, Fangrong; Maneval, James E.; June, Ronald K.; Galvosas, Petrik; Seymour, Joseph D.

    2018-02-01

    Measured nuclear magnetic resonance (NMR) transverse relaxation data in articular cartilage has been shown to be multi-exponential and correlated to the health of the tissue. The observed relaxation rates are dependent on experimental parameters such as solvent, data acquisition methods, data analysis methods, and alignment to the magnetic field. In this study, we show that diffusive exchange occurs in porcine articular cartilage and impacts the observed relaxation rates in T1-T2 correlation experiments. By using time domain analysis of T2-T2 exchange spectroscopy, the diffusive exchange time can be quantified by measurements that use a single mixing time. Measured characteristic times for exchange are commensurate with T1 in this material and so impacts the observed T1 behavior. The approach used here allows for reliable quantification of NMR relaxation behavior in cartilage in the presence of diffusive fluid exchange between two environments.

  16. Integration and global analysis of isothermal titration calorimetry data for studying macromolecular interactions.

    PubMed

    Brautigam, Chad A; Zhao, Huaying; Vargas, Carolyn; Keller, Sandro; Schuck, Peter

    2016-05-01

    Isothermal titration calorimetry (ITC) is a powerful and widely used method to measure the energetics of macromolecular interactions by recording a thermogram of differential heating power during a titration. However, traditional ITC analysis is limited by stochastic thermogram noise and by the limited information content of a single titration experiment. Here we present a protocol for bias-free thermogram integration based on automated shape analysis of the injection peaks, followed by combination of isotherms from different calorimetric titration experiments into a global analysis, statistical analysis of binding parameters and graphical presentation of the results. This is performed using the integrated public-domain software packages NITPIC, SEDPHAT and GUSSI. The recently developed low-noise thermogram integration approach and global analysis allow for more precise parameter estimates and more reliable quantification of multisite and multicomponent cooperative and competitive interactions. Titration experiments typically take 1-2.5 h each, and global analysis usually takes 10-20 min.

  17. Direct estimation of diffuse gaseous emissions from coal fires: current methods and future directions

    USGS Publications Warehouse

    Engle, Mark A.; Olea, Ricardo A.; O'Keefe, Jennifer M. K.; Hower, James C.; Geboy, Nicholas J.

    2013-01-01

    Coal fires occur spontaneously in nature, contribute to increases in greenhouse gases, and emit atmospheric toxicants. Increasing interest in quantifying coal fire emissions has resulted in the adaptation and development of specialized approaches and adoption of numerical modeling techniques. An overview of these methods for direct estimation of diffuse gas emissions from coal fires is presented in this paper. Here we take advantage of stochastic Gaussian simulation to interpolate CO2 fluxes measured using a dynamic closed chamber at the Ruth Mullins coal fire in Perry County, Kentucky. This approach allows for preparing a map of diffuse gas emissions, one of the two primary ways that gases emanate from coal fires, and establishing the reliability of the study both locally and for the entire fire. Future research directions include continuous and automated sampling to improve quantification of gaseous coal fire emissions.

  18. Computer-aided detection and quantification of endolymphatic hydrops within the mouse cochlea in vivo using optical coherence tomography

    NASA Astrophysics Data System (ADS)

    Liu, George S.; Kim, Jinkyung; Applegate, Brian E.; Oghalai, John S.

    2017-07-01

    Diseases that cause hearing loss and/or vertigo in humans such as Meniere's disease are often studied using animal models. The volume of endolymph within the inner ear varies with these diseases. Here, we used a mouse model of increased endolymph volume, endolymphatic hydrops, to develop a computer-aided objective approach to measure endolymph volume from images collected in vivo using optical coherence tomography. The displacement of Reissner's membrane from its normal position was measured in cochlear cross sections. We validated our computer-aided measurements with manual measurements and with trained observer labels. This approach allows for computer-aided detection of endolymphatic hydrops in mice, with test performance showing sensitivity of 91% and specificity of 87% using a running average of five measurements. These findings indicate that this approach is accurate and reliable for classifying endolymphatic hydrops and quantifying endolymph volume.

  19. Digital Quantification of Goldmann Visual Fields (GVF) as a Means for Genotype-Phenotype Comparisons and Detection of Progression in Retinal Degenerations

    PubMed Central

    Zahid, Sarwar; Peeler, Crandall; Khan, Naheed; Davis, Joy; Mahmood, Mahdi; Heckenlively, John; Jayasundera, Thiran

    2015-01-01

    Purpose To develop a reliable and efficient digital method to quantify planimetric Goldmann visual field (GVF) data to monitor disease course and treatment responses in retinal degenerative diseases. Methods A novel method to digitally quantify GVF using Adobe Photoshop CS3 was developed for comparison to traditional digital planimetry (Placom 45C digital planimeter; EngineerSupply, Lynchburg, Virginia, USA). GVFs from 20 eyes from 10 patients with Stargardt disease were quantified to assess the difference between the two methods (a total of 230 measurements per method). This quantification approach was also applied to 13 patients with X-linked retinitis pigmentosa (XLRP) with mutations in RPGR. Results Overall, measurements using Adobe Photoshop were more rapidly performed than those using conventional planimetry. Photoshop measurements also exhibited less inter- and intra-observer variability. GVF areas for the I4e isopter in patients with the same mutation in RPGR who were nearby in age had similar qualitative and quantitative areas. Conclusions Quantification of GVF using Adobe Photoshop is quicker, more reliable, and less-user dependent than conventional digital planimetry. It will be a useful tool for both retrospective and prospective studies of disease course as well as for monitoring treatment response in clinical trials for retinal degenerative diseases. PMID:24664690

  20. Digital quantification of Goldmann visual fields (GVFs) as a means for genotype-phenotype comparisons and detection of progression in retinal degenerations.

    PubMed

    Zahid, Sarwar; Peeler, Crandall; Khan, Naheed; Davis, Joy; Mahmood, Mahdi; Heckenlively, John R; Jayasundera, Thiran

    2014-01-01

    To develop a reliable and efficient digital method to quantify planimetric Goldmann visual field (GVF) data to monitor disease course and treatment responses in retinal degenerative diseases. A novel method to digitally quantify GVFs using Adobe Photoshop CS3 was developed for comparison to traditional digital planimetry (Placom 45C digital planimeter; Engineer Supply, Lynchburg, Virginia, USA). GVFs from 20 eyes from 10 patients with Stargardt disease were quantified to assess the difference between the two methods (a total of 230 measurements per method). This quantification approach was also applied to 13 patients with X-linked retinitis pigmentosa (XLRP) with mutations in RPGR. Overall, measurements using Adobe Photoshop were more rapidly performed than those using conventional planimetry. Photoshop measurements also exhibited less inter- and intraobserver variability. GVF areas for the I4e isopter in patients with the same mutation in RPGR who were nearby in age had similar qualitative and quantitative areas. Quantification of GVFs using Adobe Photoshop is quicker, more reliable, and less user dependent than conventional digital planimetry. It will be a useful tool for both retrospective and prospective studies of disease course as well as for monitoring treatment response in clinical trials for retinal degenerative diseases.

  1. Reliable quantification of phthalates in environmental matrices (air, water, sludge, sediment and soil): a review.

    PubMed

    Net, Sopheak; Delmont, Anne; Sempéré, Richard; Paluselli, Andrea; Ouddane, Baghdad

    2015-05-15

    Because of their widespread application, phthalates or phthalic acid esters (PAEs) are ubiquitous in the environment. Their presence has attracted considerable attention due to their potential impacts on ecosystem functioning and on public health, so their quantification has become a necessity. Various extraction procedures as well as gas/liquid chromatography and mass spectrometry detection techniques have been found suitable for reliable detection of such compounds. However, PAEs are also ubiquitous in the laboratory environment, including ambient air, reagents, sampling equipment, and various analytical devices, which makes the analysis of real samples with a low PAE background difficult. Therefore, accurate PAE analysis in environmental matrices is a challenging task. This paper reviews the extensive literature data on the techniques for PAE quantification in natural media. Sampling, sample extraction/pretreatment and detection for quantifying PAEs in different environmental matrices (air, water, sludge, sediment and soil) have been reviewed and compared. The concept of "green analytical chemistry" for PAE determination is also discussed. Moreover, useful information about material preparation and quality control and quality assurance procedures is presented to overcome the problems of sample contamination and of matrix effects, in order to avoid overestimating PAE concentrations in the environment. Copyright © 2015 Elsevier B.V. All rights reserved.

  2. Statistical modeling of software reliability

    NASA Technical Reports Server (NTRS)

    Miller, Douglas R.

    1992-01-01

    This working paper discusses the statistical simulation part of a controlled software development experiment being conducted under the direction of the System Validation Methods Branch, Information Systems Division, NASA Langley Research Center. The experiment uses guidance and control software (GCS) aboard a fictitious planetary landing spacecraft: real-time control software operating on a transient mission. Software execution is simulated to study the statistical aspects of reliability and other failure characteristics of the software during development, testing, and random usage. Quantification of software reliability is a major goal. Various reliability concepts are discussed. Experiments are described for performing simulations and collecting appropriate simulated software performance and failure data. This data is then used to make statistical inferences about the quality of the software development and verification processes as well as inferences about the reliability of software versions and reliability growth under random testing and debugging.
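
    One standard way to quantify software reliability growth from failure data of this kind is a non-homogeneous Poisson process model; the sketch below fits the Goel-Okumoto mean-value function to hypothetical cumulative failure counts. This is an illustration of the general technique, not the experiment's own analysis.

```python
import numpy as np
from scipy.optimize import curve_fit

def goel_okumoto(t, a, b):
    """Goel-Okumoto NHPP mean value function: expected cumulative failures by time t."""
    return a * (1.0 - np.exp(-b * t))

# Hypothetical test data: cumulative failures observed per week of random testing.
weeks = np.arange(1, 13, dtype=float)
cum_failures = np.array([8, 15, 21, 26, 30, 33, 36, 38, 40, 41, 42, 43], dtype=float)

(a_hat, b_hat), _ = curve_fit(goel_okumoto, weeks, cum_failures, p0=(50.0, 0.1))
residual_faults = a_hat - cum_failures[-1]
failure_intensity = a_hat * b_hat * np.exp(-b_hat * weeks[-1])   # failures/week at t = 12

print(f"estimated total faults a ~ {a_hat:.1f}, detection rate b ~ {b_hat:.3f}/week")
print(f"~{residual_faults:.1f} faults remaining, current intensity ~ {failure_intensity:.2f}/week")
```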

  3. A multi-center study benchmarks software tools for label-free proteome quantification

    PubMed Central

    Gillet, Ludovic C; Bernhardt, Oliver M.; MacLean, Brendan; Röst, Hannes L.; Tate, Stephen A.; Tsou, Chih-Chiang; Reiter, Lukas; Distler, Ute; Rosenberger, George; Perez-Riverol, Yasset; Nesvizhskii, Alexey I.; Aebersold, Ruedi; Tenzer, Stefan

    2016-01-01

    The consistent and accurate quantification of proteins by mass spectrometry (MS)-based proteomics depends on the performance of instruments, acquisition methods and data analysis software. In collaboration with the software developers, we evaluated OpenSWATH, SWATH2.0, Skyline, Spectronaut and DIA-Umpire, five of the most widely used software methods for processing data from SWATH-MS (sequential window acquisition of all theoretical fragment ion spectra), a method that uses data-independent acquisition (DIA) for label-free protein quantification. We analyzed high-complexity test datasets from hybrid proteome samples of defined quantitative composition acquired on two different MS instruments using different SWATH isolation windows setups. For consistent evaluation we developed LFQbench, an R-package to calculate metrics of precision and accuracy in label-free quantitative MS, and report the identification performance, robustness and specificity of each software tool. Our reference datasets enabled developers to improve their software tools. After optimization, all tools provided highly convergent identification and reliable quantification performance, underscoring their robustness for label-free quantitative proteomics. PMID:27701404

  4. A multicenter study benchmarks software tools for label-free proteome quantification.

    PubMed

    Navarro, Pedro; Kuharev, Jörg; Gillet, Ludovic C; Bernhardt, Oliver M; MacLean, Brendan; Röst, Hannes L; Tate, Stephen A; Tsou, Chih-Chiang; Reiter, Lukas; Distler, Ute; Rosenberger, George; Perez-Riverol, Yasset; Nesvizhskii, Alexey I; Aebersold, Ruedi; Tenzer, Stefan

    2016-11-01

    Consistent and accurate quantification of proteins by mass spectrometry (MS)-based proteomics depends on the performance of instruments, acquisition methods and data analysis software. In collaboration with the software developers, we evaluated OpenSWATH, SWATH 2.0, Skyline, Spectronaut and DIA-Umpire, five of the most widely used software methods for processing data from sequential window acquisition of all theoretical fragment-ion spectra (SWATH)-MS, which uses data-independent acquisition (DIA) for label-free protein quantification. We analyzed high-complexity test data sets from hybrid proteome samples of defined quantitative composition acquired on two different MS instruments using different SWATH isolation-window setups. For consistent evaluation, we developed LFQbench, an R package, to calculate metrics of precision and accuracy in label-free quantitative MS and report the identification performance, robustness and specificity of each software tool. Our reference data sets enabled developers to improve their software tools. After optimization, all tools provided highly convergent identification and reliable quantification performance, underscoring their robustness for label-free quantitative proteomics.

  5. Arkas: Rapid reproducible RNAseq analysis

    PubMed Central

    Colombo, Anthony R.; J. Triche Jr, Timothy; Ramsingh, Giridharan

    2017-01-01

    The recently introduced Kallisto pseudoaligner has radically simplified the quantification of transcripts in RNA-sequencing experiments. We offer the cloud-scale RNAseq pipelines Arkas-Quantification and Arkas-Analysis, available within Illumina's BaseSpace cloud application platform, which expedite Kallisto preparatory routines, reliably calculate differential expression, and perform gene-set enrichment of REACTOME pathways. To address inherent inefficiencies of scale, Illumina's BaseSpace computing platform offers a massively parallel distributed environment that improves data management services and data importing. Arkas-Quantification deploys Kallisto for parallel cloud computations and is conveniently integrated downstream from the BaseSpace Sequence Read Archive (SRA) import/conversion application titled SRA Import. Arkas-Analysis annotates the Kallisto results by extracting structured information directly from source FASTA files with per-contig metadata, and calculates differential expression and gene-set enrichment on both coding genes and transcripts. The Arkas cloud pipeline supports ENSEMBL transcriptomes and can be used downstream from SRA Import, facilitating raw sequencing importing, SRA FASTQ conversion, RNA quantification and analysis steps. PMID:28868134

  6. Quantitative PCR for genetic markers of human fecal pollution

    EPA Science Inventory

    Assessment of health risk and fecal bacteria loads associated with human fecal pollution requires reliable host-specific analytical methods and a rapid quantification approach. We report the development of quantitative PCR assays for enumeration of two recently described hum...

  7. High-Throughput HPLC-MS/MS Method for Quantification of Ibuprofen Enantiomers in Human Plasma: Focus on Investigation of Metabolite Interference.

    PubMed

    Nakov, Natalija; Bogdanovska, Liljana; Acevska, Jelena; Tonic-Ribarska, Jasmina; Petkovska, Rumenka; Dimitrovska, Aneta; Kasabova, Lilia; Svinarov, Dobrin

    2016-11-01

    In this research, as part of the development of a fast and reliable HPLC-MS/MS method for quantification of ibuprofen (IBP) enantiomers in human plasma, the possibility of IBP acylglucuronide (IBP-Glu) back-conversion was assessed. This involved investigation of in-source and in vitro back-conversion. The separation of IBP enantiomers, the metabolite and rac-IBP-d3 (internal standard) was achieved within 6 min using a Chiracel OJ-RH chromatographic column (150 × 2.1 mm, 5 μm). The selected reaction monitoring transitions followed for IBP-Glu (m/z 381.4 → 205.4, m/z 381.4 → 161.4 and m/z 205.4 → 161.4) indicated that, under the optimized electrospray ionization parameters, in-source back-conversion of IBP-Glu was insignificant. The results obtained after liquid-liquid extraction of plasma samples spiked with IBP-Glu revealed that the amount of IBP enantiomers generated by IBP-Glu back-conversion was far below 20% of the lower limit of quantification sample. These results indicate that the presence of IBP-Glu in real samples will not affect the quantification of the IBP enantiomers, thereby improving the reliability of the method. An additional advantage of the method is the short analysis time, which makes it suitable for large numbers of samples. The method was fully validated according to the EMA guideline and was shown to meet all requirements to be applied in a pharmacokinetic study. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  8. A rapid, ideal, and eco-friendlier protocol for quantifying proline.

    PubMed

    Shabnam, Nisha; Tripathi, Indu; Sharmila, P; Pardha-Saradhi, P

    2016-11-01

    Proline, a stress marker, is routinely quantified by a protocol that essentially uses hazardous toluene. The negative impacts of toluene on human health prompted us to develop a reliable alternative protocol for proline quantification. Absorbance of the proline-ninhydrin condensation product formed by reaction of proline with ninhydrin at 100 °C in the reaction mixture was significantly higher than that recorded after its transfer to toluene, revealing that toluene lowers the sensitivity of this assay. The λmax of the proline-ninhydrin complex in the reaction mixture and in toluene was 508 and 513 nm, respectively. Ninhydrin in glacial acetic acid yielded a higher quantity of the proline-ninhydrin condensation product than ninhydrin in a mixture of glacial acetic acid and H3PO4, indicating a negative impact of H3PO4 on proline quantification. Further, the maximum yield of the proline-ninhydrin complex with ninhydrin in glacial acetic acid and with ninhydrin in the acetic acid/H3PO4 mixture was achieved within 30 and 60 min, respectively. This revealed that H3PO4 negatively affects both the reaction rate and the quantity of the proline-ninhydrin complex formed. In brief, our proline quantification protocol involves reaction of a 1-ml proline sample with 2 ml of 1.25 % ninhydrin in glacial acetic acid at 100 °C for 30 min, followed by recording the absorbance of the proline-ninhydrin condensation product in the reaction mixture itself at 508 nm. Among the proline quantification protocols known to date, ours is the simplest, most rapid, reliable, cost-effective, and the most eco-friendly.
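
    The final read-out step lends itself to a short worked example: absorbance at 508 nm is converted to proline concentration through a linear standard curve. The Python sketch below uses hypothetical calibration points purely to illustrate the arithmetic; it is not the authors' data.

      import numpy as np

      # Hypothetical calibration: absorbance at 508 nm of proline standards.
      standards_conc = np.array([0.0, 0.05, 0.10, 0.20, 0.40])   # umol/ml
      standards_abs  = np.array([0.00, 0.11, 0.22, 0.45, 0.88])  # A508

      # Least-squares fit of A = slope * C + intercept.
      slope, intercept = np.polyfit(standards_conc, standards_abs, 1)

      def proline_conc(a508):
          """Convert a sample absorbance at 508 nm to proline concentration."""
          return (a508 - intercept) / slope

      print(f"sample A508 = 0.30 -> {proline_conc(0.30):.3f} umol/ml")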

  9. Noise Propagation and Uncertainty Quantification in Hybrid Multiphysics Models: Initiation and Reaction Propagation in Energetic Materials

    DTIC Science & Technology

    2016-05-23

    (i) the lack of a general model for heterogeneous granular media under compaction and (ii) the lack of a reliable multiscale discrete-to-continuum framework for ... dynamics. These include a continuum-discrete model of heat dissipation/diffusion and a continuum-discrete model of compaction of a granular material with ...

  10. Uncertainty Quantification of CFD Data Generated for a Model Scramjet Isolator Flowfield

    NASA Technical Reports Server (NTRS)

    Baurle, R. A.; Axdahl, E. L.

    2017-01-01

    Computational fluid dynamics is now considered to be an indispensable tool for the design and development of scramjet engine components. Unfortunately, the quantification of uncertainties is rarely addressed with anything other than sensitivity studies, so the degree of confidence associated with the numerical results remains exclusively with the subject matter expert that generated them. This practice must be replaced with a formal uncertainty quantification process for computational fluid dynamics to play an expanded role in the system design, development, and flight certification process. Given the limitations of current hypersonic ground test facilities, this expanded role is believed to be a requirement by some in the hypersonics community if scramjet engines are to be given serious consideration as a viable propulsion system. The present effort describes a simple, relatively low cost, nonintrusive approach to uncertainty quantification that includes the basic ingredients required to handle both aleatoric (random) and epistemic (lack of knowledge) sources of uncertainty. The nonintrusive nature of the approach allows the computational fluid dynamicist to perform the uncertainty quantification with the flow solver treated as a "black box". Moreover, a large fraction of the process can be automated, allowing the uncertainty assessment to be readily adapted into the engineering design and development workflow. In the present work, the approach is applied to a model scramjet isolator problem where the desire is to validate turbulence closure models in the presence of uncertainty. In this context, the relevant uncertainty sources are determined and accounted for to allow the analyst to delineate turbulence model-form errors from other sources of uncertainty associated with the simulation of the facility flow.
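
    A minimal Python sketch of the nonintrusive idea follows: the flow solver is treated as a black box and sampled input uncertainties are simply propagated through it. The input variables, their assumed distributions, and the surrogate solver function are invented for illustration and do not reflect the study's actual isolator setup.

      import numpy as np

      rng = np.random.default_rng(1)

      def cfd_black_box(mach, wall_temp):
          """Stand-in for the flow solver: returns a scalar quantity of interest."""
          return 0.8 * mach**2 + 0.001 * wall_temp + rng.normal(0.0, 0.01)

      n_samples = 200
      # Aleatoric inputs sampled from assumed distributions (illustrative values only).
      mach      = rng.normal(2.5, 0.05, n_samples)       # inflow Mach number
      wall_temp = rng.uniform(280.0, 320.0, n_samples)   # wall temperature [K]

      qoi = np.array([cfd_black_box(m, t) for m, t in zip(mach, wall_temp)])

      print(f"mean QoI: {qoi.mean():.3f}")
      print(f"95% interval: [{np.percentile(qoi, 2.5):.3f}, {np.percentile(qoi, 97.5):.3f}]")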

  11. Probabilistic risk assessment for a loss of coolant accident in McMaster Nuclear Reactor and application of reliability physics model for modeling human reliability

    NASA Astrophysics Data System (ADS)

    Ha, Taesung

    A probabilistic risk assessment (PRA) was conducted for a loss of coolant accident (LOCA) in the McMaster Nuclear Reactor (MNR). A level 1 PRA was completed, including event sequence modeling, system modeling, and quantification. To support the quantification of the accident sequences identified, data analysis using the Bayesian method and human reliability analysis (HRA) using the accident sequence evaluation procedure (ASEP) approach were performed. Since human performance in research reactors differs significantly from that in power reactors, a time-oriented HRA model (reliability physics model) was applied for the human error probability (HEP) estimation of the core relocation. This model is based on two competing random variables: phenomenological time and performance time. Response surface and direct Monte Carlo simulation with Latin hypercube sampling were applied for estimating the phenomenological time, whereas the performance time was obtained from interviews with operators. An appropriate probability distribution for the phenomenological time was assigned by statistical goodness-of-fit tests. The HEP for the core relocation was estimated from these two competing quantities, and the sensitivity of each probability distribution in human reliability estimation was investigated. To quantify the uncertainty in the predicted HEPs, a Bayesian approach was selected because of its capability to incorporate uncertainties in the model itself and in its parameters. The HEP from the current time-oriented model was compared with that from the ASEP approach, and both results were used to evaluate the sensitivity of alternative human reliability modeling for the manual core relocation in the LOCA risk model. This exercise demonstrated the applicability of a reliability physics model supplemented with a Bayesian approach for modeling human reliability, and its potential usefulness for quantifying model uncertainty through sensitivity analysis in the PRA model.
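
    The reliability physics model described above reduces to the probability that the operators' performance time exceeds the available phenomenological time. A Monte Carlo sketch of that comparison is shown below; both distributions and their parameters are chosen arbitrarily for illustration.

      import numpy as np

      rng = np.random.default_rng(2)
      n = 100_000

      # Hypothetical distributions for the two competing random variables.
      phenomenological_time = rng.lognormal(mean=np.log(30.0), sigma=0.3, size=n)  # minutes available
      performance_time      = rng.lognormal(mean=np.log(20.0), sigma=0.5, size=n)  # minutes needed

      # Human error probability: action not completed before the phenomenological deadline.
      hep = np.mean(performance_time > phenomenological_time)
      print(f"estimated HEP: {hep:.4f}")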

  12. Aspect-Oriented Programming is Quantification and Obliviousness

    NASA Technical Reports Server (NTRS)

    Filman, Robert E.; Friedman, Daniel P.; Norvig, Peter (Technical Monitor)

    2000-01-01

    This paper proposes that the distinguishing characteristic of Aspect-Oriented Programming (AOP) systems is that they allow programming by making quantified programmatic assertions over programs written by programmers oblivious to such assertions. Thus, AOP systems can be analyzed with respect to three critical dimensions: the kinds of quantifications allowed, the nature of the actions that can be asserted, and the mechanism for combining base-level actions with asserted actions. Consequences of this perspective are the recognition that certain systems are not AOP and that some mechanisms are expressive enough to allow programming an AOP system within them. A corollary is that while AOP can be applied to Object-Oriented Programming, it is an independent concept applicable to other programming styles.

  13. Uncertainty Assessment of Synthetic Design Hydrographs for Gauged and Ungauged Catchments

    NASA Astrophysics Data System (ADS)

    Brunner, Manuela I.; Sikorska, Anna E.; Furrer, Reinhard; Favre, Anne-Catherine

    2018-03-01

    Design hydrographs described by peak discharge, hydrograph volume, and hydrograph shape are essential for engineering tasks involving storage. Such design hydrographs are inherently uncertain as are classical flood estimates focusing on peak discharge only. Various sources of uncertainty contribute to the total uncertainty of synthetic design hydrographs for gauged and ungauged catchments. These comprise model uncertainties, sampling uncertainty, and uncertainty due to the choice of a regionalization method. A quantification of the uncertainties associated with flood estimates is essential for reliable decision making and allows for the identification of important uncertainty sources. We therefore propose an uncertainty assessment framework for the quantification of the uncertainty associated with synthetic design hydrographs. The framework is based on bootstrap simulations and consists of three levels of complexity. On the first level, we assess the uncertainty due to individual uncertainty sources. On the second level, we quantify the total uncertainty of design hydrographs for gauged catchments and the total uncertainty of regionalizing them to ungauged catchments but independently from the construction uncertainty. On the third level, we assess the coupled uncertainty of synthetic design hydrographs in ungauged catchments, jointly considering construction and regionalization uncertainty. We find that the most important sources of uncertainty in design hydrograph construction are the record length and the choice of the flood sampling strategy. The total uncertainty of design hydrographs in ungauged catchments depends on the catchment properties and is not negligible in our case.
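
    The first level of the framework, sampling uncertainty assessed by bootstrap, can be illustrated with a simplified Python sketch; the synthetic annual peak series and the crude Gumbel-based 100-year quantile estimator below stand in for the full design-hydrograph construction and are not the authors' method.

      import numpy as np

      rng = np.random.default_rng(3)

      # Hypothetical annual maximum peak discharges [m^3/s].
      annual_peaks = rng.gumbel(loc=150.0, scale=40.0, size=45)

      def q100(sample):
          """Crude 100-year quantile via a Gumbel fit (method of moments)."""
          scale = np.std(sample, ddof=1) * np.sqrt(6) / np.pi
          loc = np.mean(sample) - 0.5772 * scale
          return loc - scale * np.log(-np.log(1 - 1 / 100))

      boot = np.array([q100(rng.choice(annual_peaks, size=annual_peaks.size, replace=True))
                       for _ in range(2000)])
      print(f"Q100 estimate: {q100(annual_peaks):.1f} m^3/s")
      print(f"90% bootstrap interval: [{np.percentile(boot, 5):.1f}, {np.percentile(boot, 95):.1f}]")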

  14. Flow cytometry for receptor analysis from ex-vivo brain tissue in adult rat.

    PubMed

    Benoit, A; Guillamin, M; Aitken, P; Smith, P F; Philoxene, B; Sola, B; Poulain, L; Coquerel, A; Besnard, S

    2018-07-01

    Flow cytometry allows single-cell analysis of peripheral biological samples and is useful in many fields of research and clinical applications, mainly in hematology, immunology, and oncology. In the neurosciences, flow cytometric separation was first applied to stem cell extraction from healthy or cerebral tumour tissue and was more recently tested for phenotyping brain cells, assessing hippocampal neurogenesis, and detecting prion proteins. However, it remains sparsely applied to quantifying membrane receptors in relation to synaptic plasticity. We aimed to optimize a flow cytometric procedure for receptor quantification in neuronal and non-neuronal cells. A neural dissociation process, myelin separation, fixation, and membrane permeabilization procedures were optimized to maximize cell survival and analysis in hippocampal tissue obtained from adult rodents. We then aimed to quantify membrane muscarinic acetylcholine receptors (mAChRs) in rats with and without bilateral vestibular loss (BVL). mAChRs were quantified for neuronal and non-neuronal cells in the hippocampus and striatum following BVL. At day 30, but not at day 7, following BVL there was a significant increase (P ≤ 0.05) in the percentage of neurons expressing M2/4 mAChRs in both the hippocampus and the striatum. Here, we showed that flow cytometry appears to be a reliable method for membrane receptor quantification in ex-vivo brain tissue. Copyright © 2018 Elsevier B.V. All rights reserved.

  15. A simple and rapid method for optical visualization and quantification of bacteria on textiles

    PubMed Central

    Stiefel, Philipp; Schneider, Jana; Amberg, Caroline; Maniura-Weber, Katharina; Ren, Qun

    2016-01-01

    To prevent bacterial contamination on textiles and the associated undesired effects, different biocidal coatings have been investigated and applied. However, due to health and environmental concerns, anti-adhesive coatings that prevent the binding of bacteria would be favored. To develop such anti-adhesive coatings, simple assays for reliable and fast screening are beneficial. Here, an easy-to-handle, robust and rapid assay to assess bacteria on textiles utilizing a tetrazolium salt is reported. The assay allowed direct visual detection of the color change of textiles containing bacteria, facilitating fast screening. Quantification of the adhered bacteria could be done by generating standard curves that correlate the staining intensity to cell numbers. An additional advantage of the described assay is that anti-adhesive and biocidal effects can be investigated with the same detection method. The method was applied to different coatings, using Pseudomonas aeruginosa and Staphylococcus aureus as model organisms. The detection limit was found to be between 2.5 × 10^6 and 9.4 × 10^8 cells for P. aeruginosa and between 1 × 10^6 and 3.3 × 10^8 cells for S. aureus. The anti-adhesive coating PLUMA was demonstrated to reduce bacterial adhesion without killing the bacteria, whereas the biocidal coating TH22-27 caused a clear reduction in the number of viable cells. PMID:28004762

  16. The role of attention in the tinnitus decompensation: reinforcement of a large-scale neural decompensation measure.

    PubMed

    Low, Yin Fen; Trenado, Carlos; Delb, Wolfgang; Corona-Strauss, Farah I; Strauss, Daniel J

    2007-01-01

    Large-scale neural correlates of the tinnitus decompensation have been identified by using wavelet phase stability criteria of single-sweep sequences of auditory late responses (ALRs). The suggested measure provided an objective quantification of the tinnitus decompensation and allowed for a reliable discrimination between a group of compensated and a group of decompensated tinnitus patients. Interpreting our results with an oscillatory tinnitus model, our synchronization stability measure of ALRs can be linked to the focus of attention on the tinnitus signal. In the present study, we examined in detail the correlates of this attentional mechanism in healthy subjects. The results support our previous findings that the phase synchronization stability measure reflects neural correlates of the fixation of attention on the tinnitus signal, in this case enabling the differentiation between attended and unattended conditions. It is concluded that the wavelet phase synchronization stability of ALR single sweeps can be used as an objective tinnitus decompensation measure and can be interpreted in the framework of the Jastreboff tinnitus model and adaptive resonance theory. Our studies confirm that the synchronization stability in ALR sequences is linked to attention. This measure is not only able to serve as an objective quantification of the tinnitus decompensation, but can also be applied in any online and real-time neurofeedback therapeutic approach where direct stimulus-locked attention monitoring is required, as it is based on single-sweep processing.

  17. Reproducibility measurements of three methods for calculating in vivo MR-based knee kinematics.

    PubMed

    Lansdown, Drew A; Zaid, Musa; Pedoia, Valentina; Subburaj, Karupppasamy; Souza, Richard; Benjamin, C; Li, Xiaojuan

    2015-08-01

    To describe three quantification methods for magnetic resonance imaging (MRI)-based knee kinematic evaluation and to report on the reproducibility of these algorithms. T2-weighted, fast spin-echo images were obtained of the bilateral knees in six healthy volunteers. Scans were repeated for each knee after repositioning to evaluate protocol reproducibility. Semiautomatic segmentation defined regions of interest for the tibia and femur. The posterior femoral condyles and diaphyseal axes were defined using the previously segmented tibia and femur. All segmentation was performed twice to evaluate segmentation reliability. Anterior tibial translation (ATT) and internal tibial rotation (ITR) were calculated using three methods: a tibial-based registration system, a combined tibiofemoral-based registration method with all-manual segmentation, and a combined tibiofemoral-based registration method with automatic definition of condyles and axes. Intraclass correlation coefficients and standard deviations across multiple measures were determined. Reproducibility of segmentation was excellent (ATT = 0.98; ITR = 0.99) for both combined methods. ATT and ITR measurements were also reproducible across multiple scans in the combined registration measurements with manual (ATT = 0.94; ITR = 0.94) or automatic (ATT = 0.95; ITR = 0.94) condyles and axes. The combined tibiofemoral registration with automatic definition of the posterior femoral condyles and diaphyseal axes allows for improved knee kinematics quantification with excellent in vivo reproducibility. © 2014 Wiley Periodicals, Inc.

  18. Screening and confirmatory methods for the analysis of macrocyclic lactone mycotoxins by CE with amperometric detection.

    PubMed

    Arribas, Alberto Sánchez; Bermejo, Esperanza; Zapardiel, Antonio; Téllez, Helena; Rodríguez-Flores, Juana; Zougagh, Mohammed; Ríos, Angel; Chicharro, Manuel

    2009-02-01

    A simple analytical scheme for the screening and quantification of zearalenone and its metabolites, alpha-zearalenol and beta-zearalenol, is reported. Extracts from maize flour samples were collected by supercritical fluid extraction and afterwards analyzed by CE with amperometric detection. This scheme allowed rapid and reliable identification of contaminated flour samples according to the reference value established for zearalenone by Directive 2005/38/EC (200 microg/kg). The sample screening method was carried out by CZE using 25 mM borate separation buffer at pH 9.2 and 25.0 kV as the separation voltage, monitoring the amperometric signal at +700 mV with a carbon paste electrode. In this way, the total amount of mycotoxins was determined and samples were processed in 4 min with a detection limit of 12 microg/L, sufficient to discriminate between positive (more than 200 microg/L total mycotoxins) and negative samples (less than 200 microg/L total mycotoxins). Positive samples were then subjected to CZE separation, and quantification of each analyte was carried out with 50 mM borate running buffer modified with 30% methanol at pH 9.7 and 17.5 kV as the separation voltage. Under these conditions, separation was achieved in 15 min with detection limits from 20 to 35 microg/L for each analyte.

  19. Quantification of the methylation status of the PWS/AS imprinted region: comparison of two approaches based on bisulfite sequencing and methylation-sensitive MLPA.

    PubMed

    Dikow, Nicola; Nygren, Anders Oh; Schouten, Jan P; Hartmann, Carolin; Krämer, Nikola; Janssen, Bart; Zschocke, Johannes

    2007-06-01

    Standard methods used for genomic methylation analysis allow the detection of complete absence of either methylated or non-methylated alleles but are usually unable to detect changes in the proportion of methylated and unmethylated alleles. We compare two methods for quantitative methylation analysis, using the chromosome 15q11-q13 imprinted region as model. Absence of the non-methylated paternal allele in this region leads to Prader-Willi syndrome (PWS) whilst absence of the methylated maternal allele results in Angelman syndrome (AS). A proportion of AS is caused by mosaic imprinting defects which may be missed with standard methods and require quantitative analysis for their detection. Sequence-based quantitative methylation analysis (SeQMA) involves quantitative comparison of peaks generated through sequencing reactions after bisulfite treatment. It is simple, cost-effective and can be easily established for a large number of genes. However, our results support previous suggestions that methods based on bisulfite treatment may be problematic for exact quantification of methylation status. Methylation-specific multiplex ligation-dependent probe amplification (MS-MLPA) avoids bisulfite treatment. It detects changes in both CpG methylation as well as copy number of up to 40 chromosomal sequences in one simple reaction. Once established in a laboratory setting, the method is more accurate, reliable and less time consuming.

  20. AccuCopy quantification combined with pre-amplification of long-distance PCR for fast analysis of intron 22 inversion in haemophilia A.

    PubMed

    Ding, Qianlan; Wu, Xi; Lu, Yeling; Chen, Changming; Shen, Rui; Zhang, Xi; Jiang, Zhengwen; Wang, Xuefeng

    2016-07-01

    To develop a digitalized intron 22 inversion (Inv22) detection method for patients with severe haemophilia A. The design included two tests: a genotyping test comprising two multiplex pre-amplifications of LD-PCR (PLP) with two combinations of five primers to amplify wild-type and chimeric int22h alleles, and a carrier mosaicism test that was similar to the genotyping test except that only chimeric int22h alleles were amplified, by removing one primer from each of the two combinations. AccuCopy detection was used to quantify the PLP products. The PLP product patterns in the genotyping test allowed identification of all known Inv22 variants, and the quantitative patterns accurately represented the product patterns. The results of 164 samples detected by the genotyping test were consistent with those obtained by LD-PCR detection. The limit of detection (LOD) of the carrier mosaicism test was at least 2% of heterozygous cells carrying Inv22. When the test was performed in two obligate carrier mothers with negative Inv22 results from two sporadic pedigrees, the mosaic rates in blood and hair root of the mother from pedigree 1 were 8.3% and >20%, respectively, whereas negative results were obtained in pedigree 2. The AccuCopy quantification combined with PLP (AQ-PLP) method was confirmed to be rapid and reliable for genotyping Inv22 and highly sensitive for carrier mosaicism detection. Copyright © 2016 Elsevier B.V. All rights reserved.

  1. Preschoolers' Number Sense

    ERIC Educational Resources Information Center

    Moomaw, Sally; Carr, Victoria; Boat, Mary; Barnett, David

    2010-01-01

    A child's demonstration of his conceptual understanding of number bodes well for his future success in school mathematics. As youngsters' thinking becomes more logical, they apply one-to-one correspondence relationships to quantification. Yet, reliable assessment of young children's mathematical ability is difficult because of social and emotional…

  2. High-throughput hydrophilic interaction chromatography coupled to tandem mass spectrometry for the optimized quantification of the anti-Gram-negatives antibiotic colistin A/B and its pro-drug colistimethate.

    PubMed

    Mercier, Thomas; Tissot, Fréderic; Gardiol, Céline; Corti, Natascia; Wehrli, Stéphane; Guidi, Monia; Csajka, Chantal; Buclin, Thierry; Couet, William; Marchetti, Oscar; Decosterd, Laurent A

    2014-11-21

    Colistin is a last-resort antibacterial treatment in critically ill patients with multi-drug resistant Gram-negative infections. As appropriate colistin exposure is the key to maximizing efficacy while minimizing toxicity, individualized dosing optimization guided by therapeutic drug monitoring is a top clinical priority. The objective of the present work was to develop a rapid and robust HPLC-MS/MS assay for quantification of colistin plasma concentrations. This novel methodology, validated according to international standards, simultaneously quantifies the microbiologically active compounds colistin A and B, plus the pro-drug colistin methanesulfonate (colistimethate, CMS). 96-well micro-elution SPE on Oasis Hydrophilic-Lipophilic-Balanced (HLB) sorbent, followed by direct analysis by Hydrophilic Interaction Liquid Chromatography (HILIC) on an Ethylene Bridged Hybrid (BEH) Amide column coupled to tandem mass spectrometry, allows high throughput with no significant matrix effect. The technique is highly sensitive (limit of quantification 0.014 and 0.006 μg/mL for colistin A and B), precise (intra-/inter-assay CV 0.6-8.4%) and accurate (intra-/inter-assay deviation from nominal concentrations -4.4 to +6.3%) over the clinically relevant analytical range 0.05-20 μg/mL. Colistin A and B in plasma and whole blood samples are reliably quantified over 48 h at room temperature and at +4°C (<6% deviation from nominal values) and after three freeze-thaw cycles. Colistimethate acidic hydrolysis (1 M H2SO4) to colistin A and B in plasma was completed in vitro after 15 min of sonication, while the pro-drug hydrolyzed spontaneously in plasma ex vivo after 4 h at room temperature; this information is of utmost importance for the interpretation of analytical results. Quantification is precise and accurate when using serum, citrated or EDTA plasma as the biological matrix, whereas the use of heparin plasma is not appropriate. This new analytical technique, providing optimized quantification of the microbiologically active compounds colistin A and B under real-life conditions, offers a highly efficient tool for routine therapeutic drug monitoring aimed at individualizing drug dosing against life-threatening infections. Copyright © 2014 Elsevier B.V. All rights reserved.

  3. A Statistics-based Platform for Quantitative N-terminome Analysis and Identification of Protease Cleavage Products*

    PubMed Central

    auf dem Keller, Ulrich; Prudova, Anna; Gioia, Magda; Butler, Georgina S.; Overall, Christopher M.

    2010-01-01

    Terminal amine isotopic labeling of substrates (TAILS), our recently introduced platform for quantitative N-terminome analysis, enables wide dynamic range identification of original mature protein N-termini and protease cleavage products. Modifying TAILS by use of isobaric tag for relative and absolute quantification (iTRAQ)-like labels for quantification together with a robust statistical classifier derived from experimental protease cleavage data, we report reliable and statistically valid identification of proteolytic events in complex biological systems in MS2 mode. The statistical classifier is supported by a novel parameter evaluating ion intensity-dependent quantification confidences of single peptide quantifications, the quantification confidence factor (QCF). Furthermore, the isoform assignment score (IAS) is introduced, a new scoring system for the evaluation of single peptide-to-protein assignments based on high confidence protein identifications in the same sample prior to negative selection enrichment of N-terminal peptides. By these approaches, we identified and validated, in addition to known substrates, low abundance novel bioactive MMP-2 targets including the plasminogen receptor S100A10 (p11) and the proinflammatory cytokine proEMAP/p43 that were previously undescribed. PMID:20305283

  4. Normal Databases for the Relative Quantification of Myocardial Perfusion

    PubMed Central

    Rubeaux, Mathieu; Xu, Yuan; Germano, Guido; Berman, Daniel S.; Slomka, Piotr J.

    2016-01-01

    Purpose of review: Myocardial perfusion imaging (MPI) with SPECT is performed clinically worldwide to detect and monitor coronary artery disease (CAD). MPI allows an objective quantification of myocardial perfusion at stress and rest. This established technique relies on normal databases to compare patient scans against reference normal limits. In this review, we aim to introduce the process of MPI quantification with normal databases and describe the associated perfusion quantitative measures that are used. Recent findings: New equipment and new software reconstruction algorithms have been introduced which require the development of new normal limits. The appearance and regional count variations of a normal MPI scan may differ between these new scanners and standard Anger cameras. Therefore, these new systems may require the determination of new normal limits to achieve optimal accuracy in relative myocardial perfusion quantification. Accurate diagnostic and prognostic results rivaling those obtained by expert readers can be obtained by this widely used technique. Summary: Throughout this review, we emphasize the importance of the different normal databases and the need for specific databases relative to distinct imaging procedures. Use of appropriate normal limits allows optimal quantification of MPI by taking into account subtle image differences due to the hardware and software used, and the population studied. PMID:28138354

  5. Being Something: Prospects for a Property-Based Approach to Predicative Quantification

    ERIC Educational Resources Information Center

    Rieppel, Michael Olivier

    2013-01-01

    Few questions concerning the character of our talk about the world are more basic than how predicates combine with names to form truth-evaluable sentences. One particularly intriguing fact that any account of predication needs to make room for is that natural language allows for quantification into predicate position, through constructions like…

  6. USING CARBOHYDRATES AS MOLECULAR MARKERS TO DETERMINE THE CONTRIBUTION OF AGRICULTURAL SOIL TO AMBIENT FINE AND COURSE PM

    EPA Science Inventory

    Project research optimized the quantification technique for carbohydrates that also allows quantification of other non-polar molecular markers based on using an isotopically labeled internal standard (D-glucose-1,2,3,4,5,6,6-d7) to monitor extraction efficiency, extraction usi...

  7. Quantification of benign lesion regression as a function of 532-nm pulsed potassium titanyl phosphate laser parameter selection.

    PubMed

    Mallur, Pavan S; Tajudeen, Bobby A; Aaronson, Nicole; Branski, Ryan C; Amin, Milan R

    2011-03-01

    Although the potassium titanyl phosphate (KTP) laser is versatile, the variability in laser parameters for laryngeal pathologies and the lack of clinical efficacy data remain problematic. We provide preliminary data regarding these parameters for benign lesion regression. In addition, we describe a novel method for the quantification of the effects of the KTP laser on vocal fold (VF) lesions. Retrospective chart review. Images were captured from examinations before and after in-office KTP treatment in patients with a range of benign lesions. Laser settings were noted for each patient. Imaging software was then used to calculate a ratio of lesion area to VF length. Ten percent of images were requantified to determine inter-rater reliability. Thirty-two patients underwent 47 procedures for lesions including hemorrhagic polyp, nonhemorrhagic polyp, vocal process granuloma, Reinke's edema, cyst/pseudocyst, leukoplakia, and squamous cell carcinoma in situ. No statistically significant differences were observed with regard to the laser parameters used as a function of lesion type. Regardless, by 1 month following treatment, all lesions had significantly decreased in size, except nonhemorrhagic polyps. Similar data were obtained at 2-month follow-up. We then compared the pre-KTP lesion size with the smallest lesion size quantified during the 1-year follow-up period. All lesions were significantly smaller, with the exception of Reinke's edema. Inter-rater reliability was quite good. KTP laser effectively reduced VF lesion size, irrespective of the laser parameters used. In addition, our quantification method for lesion size appeared to be both viable and reliable. Copyright © 2011 The American Laryngological, Rhinological, and Otological Society, Inc.

  8. Evaluation of droplet digital PCR for characterizing plasmid reference material used for quantifying ammonia oxidizers and denitrifiers.

    PubMed

    Dong, Lianhua; Meng, Ying; Wang, Jing; Liu, Yingying

    2014-02-01

    DNA reference materials of certified value have a critical function in many analytical processes of DNA measurement. Quantification of amoA genes in ammonia oxidizing bacteria (AOB) and archaea (AOA), and of nirS and nosZ genes in the denitrifiers is very important for determining their distribution and abundance in the natural environment. A plasmid reference material containing nirS, nosZ, amoA-AOB, and amoA-AOA is developed to provide a DNA standard with copy number concentration for ensuring comparability and reliability of quantification of these genes. Droplet digital PCR (ddPCR) was evaluated for characterization of the plasmid reference material. The result revealed that restriction endonuclease digestion of plasmids can improve amplification efficiency and minimize the measurement bias of ddPCR. Compared with the conformation of the plasmid, the size of the DNA fragment containing the target sequence and the location of the restriction site relative to the target sequence are not significant factors affecting plasmid quantification by ddPCR. Liquid chromatography-isotope dilution mass spectrometry (LC-IDMS) was used to provide independent data for quantifying the plasmid reference material. The copy number concentration of the digested plasmid determined by ddPCR agreed well with that determined by LC-IDMS, improving both the accuracy and reliability of the plasmid reference material. The reference value, with its expanded uncertainty (k = 2), of the plasmid reference material was determined to be (5.19 ± 0.41) × 10^9 copies μL^-1 by averaging the results of two independent measurements. Consideration of the factors revealed in this study can improve the reliability and accuracy of ddPCR; thus, this method has the potential to accurately quantify DNA reference materials.
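
    For context, ddPCR concentrations follow from a Poisson correction of the fraction of positive droplets; the short sketch below shows that standard calculation with invented droplet counts and an assumed nominal droplet volume, independent of the plasmid material characterized in this study.

      import math

      def ddpcr_copies_per_ul(positive, total, droplet_volume_nl=0.85, dilution=1.0):
          """Poisson-corrected ddPCR concentration in copies per microliter of sample.

          droplet_volume_nl is an assumed nominal droplet volume, not a measured value.
          """
          p = positive / total
          copies_per_droplet = -math.log(1.0 - p)          # Poisson mean lambda
          copies_per_nl = copies_per_droplet / droplet_volume_nl
          return copies_per_nl * 1000.0 * dilution         # nl -> ul, undo dilution

      # Example: 8,000 of 17,000 droplets positive in a 1:1000 dilution.
      print(f"{ddpcr_copies_per_ul(8000, 17000, dilution=1000):.3e} copies/uL")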

  9. Quantitative Rapid Assessment of Leukoaraiosis in CT : Comparison to Gold Standard MRI.

    PubMed

    Hanning, Uta; Sporns, Peter Bernhard; Schmidt, Rene; Niederstadt, Thomas; Minnerup, Jens; Bier, Georg; Knecht, Stefan; Kemmling, André

    2017-10-20

    The severity of white matter lesions (WML) is a risk factor of hemorrhage and predictor of clinical outcome after ischemic stroke; however, in contrast to magnetic resonance imaging (MRI) reliable quantification for this surrogate marker is limited for computed tomography (CT), the leading stroke imaging technique. We aimed to present and evaluate a CT-based automated rater-independent method for quantification of microangiopathic white matter changes. Patients with suspected minor stroke (National Institutes of Health Stroke scale, NIHSS < 4) were screened for the analysis of non-contrast computerized tomography (NCCT) at admission and compared to follow-up MRI. The MRI-based WML volume and visual Fazekas scores were assessed as the gold standard reference. We employed a recently published probabilistic brain segmentation algorithm for CT images to determine the tissue-specific density of WM space. All voxel-wise densities were quantified in WM space and weighted according to partial probabilistic WM content. The resulting mean weighted density of WM space in NCCT, the surrogate of WML, was correlated with reference to MRI-based WML parameters. The process of CT-based tissue-specific segmentation was reliable in 79 cases with varying severity of microangiopathy. Voxel-wise weighted density within WM spaces showed a noticeable correlation (r = -0.65) with MRI-based WML volume. Particularly in patients with moderate or severe lesion load according to the visual Fazekas score the algorithm provided reliable prediction of MRI-based WML volume. Automated observer-independent quantification of voxel-wise WM density in CT significantly correlates with microangiopathic WM disease in gold standard MRI. This rapid surrogate of white matter lesion load in CT may support objective WML assessment and therapeutic decision-making during acute stroke triage.

  10. Application of PCR and real-time PCR for monitoring cyanobacteria, Microcystis spp. and Cylindrospermopsis raciborskii in Macau freshwater reservoir

    NASA Astrophysics Data System (ADS)

    Zhang, Weiying; Lou, Inchio; Ung, Wai Kin; Kong, Yijun; Mok, Kai Meng

    2014-06-01

    Freshwater algal blooms have become a growing concern worldwide. They are caused by high levels of cyanobacteria, predominantly Microcystis spp. and Cylindrospermopsis raciborskii, which can produce microcystin and cylindrospermopsin, respectively. Long-term exposure to these cyanotoxins may affect public health, so reliable detection, quantification, and enumeration of these harmful algal species has become a priority in water quality management. Traditional manual enumeration of algal bloom cells primarily involves microscopic identification, which is limited by inaccuracy and time consumption. With the development of molecular techniques and an increasing number of microbial sequences available in the GenBank database, molecular methods can be used for more rapid, reliable, and accurate detection and quantification. In this study, multiplex polymerase chain reaction (PCR) and real-time quantitative PCR (qPCR) techniques were developed and applied for monitoring the cyanobacteria Microcystis spp. and C. raciborskii in the Macau Storage Reservoir (MSR). The results showed that the techniques were successful in identifying and quantifying the species in pure and mixed cultures, and demonstrated their potential for application to water sampling in the MSR. When the target species were above 1 million cells/L, similar cell numbers were estimated by microscopic enumeration and qPCR. Further quantification in water samples indicated that the ratio of cell numbers estimated by microscopy and by qPCR was 0.4-12.9 for cyanobacteria and 0.2-3.9 for C. raciborskii. However, Microcystis spp. was not observed by manual enumeration, whereas it was detected at low levels by qPCR, suggesting that qPCR is more sensitive and accurate. Thus, the molecular approaches provide an additional reliable monitoring option to traditional microscopic enumeration for ecosystem monitoring programs.
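
    Absolute qPCR quantification of this kind rests on a log-linear standard curve relating threshold cycle (Ct) to starting copy number. The sketch below uses invented calibration values to show the back-calculation and the amplification efficiency derived from the slope; it is not the assay developed in the study.

      import numpy as np

      # Hypothetical standard curve: Ct values of 10-fold serial dilutions.
      log10_copies = np.array([7, 6, 5, 4, 3, 2])
      ct_values    = np.array([13.1, 16.5, 19.9, 23.3, 26.8, 30.2])

      slope, intercept = np.polyfit(log10_copies, ct_values, 1)
      efficiency = 10 ** (-1.0 / slope) - 1.0   # amplification efficiency from the slope

      def copies_from_ct(ct):
          """Back-calculate starting copies per reaction from a sample Ct."""
          return 10 ** ((ct - intercept) / slope)

      print(f"slope = {slope:.2f}, efficiency = {efficiency:.1%}")
      print(f"sample Ct 21.5 -> {copies_from_ct(21.5):.2e} copies/reaction")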

  11. Objective quantification of seizure frequency and treatment success via long-term outpatient video-EEG monitoring: a feasibility study.

    PubMed

    Stefan, H; Kreiselmeyer, G; Kasper, B; Graf, W; Pauli, E; Kurzbuch, K; Hopfengärtner, R

    2011-03-01

    A reliable method for the estimation of seizure frequency and severity is indispensable in assessing the efficacy of drug treatment in epilepsies. These quantities are usually deduced from subjective patient reports, which may cause considerable problems due to insufficient or false descriptions of seizures and their frequency. We present data from two difficult-to-treat patients with intractable epilepsy. Patient 1 had an unknown number of complex partial (CP) seizures. Here, a prolonged outpatient video-EEG monitoring over 160 h and 137 h (over an interval of three months) was performed with an automated seizure detection method. Patient 2 suffered exclusively from nocturnal seizures originating from the frontal lobe. In this case, an objective quantification of the efficacy of drug treatment over a time period of 22 weeks was established. For the reliable quantification of seizures, a prolonged outpatient video/video-EEG monitoring was appended after a short-term inpatient monitoring period. Patient 1: The seizure detection algorithm was capable of detecting 10 out of 11 seizures. The number of false-positive events was <0.03/h. It was clearly demonstrated that the patient showed more seizures than originally reported. Patient 2: The add-on medication of lacosamide led to a significant reduction in seizure frequency and to a marked decrease in the mean duration of seizures. The severity of seizures was reduced from numerous hypermotoric seizures to a few mild head-turning seizures. Outpatient monitoring may be helpful to guide treatment for severe epilepsies and offers the possibility to more reliably quantify the efficacy of treatment in the long term, even over several months. Copyright © 2010 British Epilepsy Association. Published by Elsevier Ltd. All rights reserved.

  12. Can skills assessment on a virtual reality trainer predict a surgical trainee's talent in laparoscopic surgery?

    PubMed

    Rosenthal, R; Gantert, W A; Scheidegger, D; Oertli, D

    2006-08-01

    A number of studies have investigated several aspects of feasibility and validity of performance assessments with virtual reality surgical simulators. However, the validity of performance assessments is limited by the reliability of such measurements, and some issues of reliability still need to be addressed. This study aimed to evaluate the hypothesis that test subjects show logarithmic performance curves on repetitive trials for a component task of laparoscopic cholecystectomy on a virtual reality simulator, and that interindividual differences in performance after considerable training are significant. According to kinesiologic theory, logarithmic performance curves are expected and an individual's learning capacity for a specific task can be extrapolated, allowing quantification of a person's innate ability to develop task-specific skills. In this study, 20 medical students at the University of Basel Medical School performed five trials of a standardized task on the LS 500 virtual reality simulator for laparoscopic surgery. Task completion time, number of errors, economy of instrument movements, and maximum speed of instrument movements were measured. The hypothesis was confirmed by the fact that the performance curves for some of the simulator measurements were very close to logarithmic curves, and there were significant interindividual differences in performance at the end of the repetitive trials. Assessment of perceptual motor skills and the innate ability of an individual with no prior experience in laparoscopic surgery to develop such skills using the LS 500 VR surgical simulator is feasible and reliable.

  13. Characterization and improvement of RNA-Seq precision in quantitative transcript expression profiling.

    PubMed

    Łabaj, Paweł P; Leparc, Germán G; Linggi, Bryan E; Markillie, Lye Meng; Wiley, H Steven; Kreil, David P

    2011-07-01

    Measurement precision determines the power of any analysis to reliably identify significant signals, such as in screens for differential expression, independent of whether the experimental design incorporates replicates or not. With the compilation of large-scale RNA-Seq datasets with technical replicate samples, however, we can now, for the first time, perform a systematic analysis of the precision of expression level estimates from massively parallel sequencing technology. This then allows considerations for its improvement by computational or experimental means. We report on a comprehensive study of target identification and measurement precision, including their dependence on transcript expression levels, read depth and other parameters. In particular, an impressive recall of 84% of the estimated true transcript population could be achieved with 331 million 50 bp reads, with diminishing returns from longer read lengths and even smaller gains from increased sequencing depths. Most of the measurement power (75%) is spent on only 7% of the known transcriptome, however, making less strongly expressed transcripts harder to measure. Consequently, <30% of all transcripts could be quantified reliably with a relative error < 20%. Based on established tools, we then introduce a new approach for mapping and analysing sequencing reads that yields substantially improved performance in gene expression profiling, increasing the number of transcripts that can reliably be quantified to over 40%. Extrapolations to higher sequencing depths highlight the need for efficient complementary steps. In the discussion, we outline possible experimental and computational strategies for further improvements in quantification precision. rnaseq10@boku.ac.at

  14. The Infeasibility of Quantifying the Reliability of Life-Critical Real-Time Software

    NASA Technical Reports Server (NTRS)

    Butler, Ricky W.; Finelli, George B.

    1991-01-01

    This paper affirms that the quantification of life-critical software reliability is infeasible using statistical methods, whether applied to standard software or fault-tolerant software. The classical methods of estimating reliability are shown to lead to exorbitant amounts of testing when applied to life-critical software. Reliability growth models are examined and also shown to be incapable of overcoming the need for excessive amounts of testing. The key assumption of software fault tolerance, that separately programmed versions fail independently, is shown to be problematic. This assumption cannot be justified by experimentation in the ultrareliability region, and subjective arguments in its favor are not sufficiently strong to justify it as an axiom. Also, the implications of the recent multiversion software experiments support this affirmation.
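
    The scale of the problem can be made concrete with the standard zero-failure demonstration calculation under an exponential failure model: showing a failure rate no worse than 1e-9 per hour at 99% confidence requires roughly -ln(0.01)/1e-9 ≈ 4.6e9 test hours. The sketch below performs this textbook calculation; the numbers are illustrative, not figures quoted from the paper.

      import math

      def required_test_hours(target_failure_rate, confidence):
          """Zero-failure demonstration time under an exponential failure model:
          exp(-lambda * T) <= 1 - confidence  =>  T >= -ln(1 - confidence) / lambda."""
          return -math.log(1.0 - confidence) / target_failure_rate

      hours = required_test_hours(1e-9, 0.99)
      print(f"{hours:.2e} test hours  (~{hours / 8766:.0f} years of continuous testing)")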

  15. Are LOD and LOQ Reliable Parameters for Sensitivity Evaluation of Spectroscopic Methods?

    PubMed

    Ershadi, Saba; Shayanfar, Ali

    2018-03-22

    The limit of detection (LOD) and the limit of quantification (LOQ) are common parameters to assess the sensitivity of analytical methods. In this study, the LOD and LOQ of previously reported terbium sensitized analysis methods were calculated by different methods, and the results were compared with sensitivity parameters [lower limit of quantification (LLOQ)] of U.S. Food and Drug Administration guidelines. The details of the calibration curve and standard deviation of blank samples of three different terbium-sensitized luminescence methods for the quantification of mycophenolic acid, enrofloxacin, and silibinin were used for the calculation of LOD and LOQ. A comparison of LOD and LOQ values calculated by various methods and LLOQ shows a considerable difference. The significant difference of the calculated LOD and LOQ with various methods and LLOQ should be considered in the sensitivity evaluation of spectroscopic methods.
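
    For reference, the most widely used formulas compute LOD = 3.3·σ/S and LOQ = 10·σ/S from the standard deviation of blank responses (σ) and the calibration slope (S). The sketch below applies these generic formulas to invented data; it is not tied to the specific terbium-sensitized methods re-evaluated in the study.

      import numpy as np

      # Hypothetical blank responses and calibration data.
      blank_signal = np.array([0.012, 0.015, 0.011, 0.014, 0.013, 0.016])
      conc   = np.array([0.1, 0.5, 1.0, 2.0, 5.0])     # ug/mL
      signal = np.array([0.09, 0.43, 0.86, 1.70, 4.30])

      sigma_blank = np.std(blank_signal, ddof=1)
      slope, _ = np.polyfit(conc, signal, 1)

      lod = 3.3 * sigma_blank / slope
      loq = 10.0 * sigma_blank / slope
      print(f"LOD = {lod:.4f} ug/mL, LOQ = {loq:.4f} ug/mL")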

  16. Lamb wave-based damage quantification and probability of detection modeling for fatigue life assessment of riveted lap joint

    NASA Astrophysics Data System (ADS)

    He, Jingjing; Wang, Dengjiang; Zhang, Weifang

    2015-03-01

    This study presents an experimental and modeling study for damage detection and quantification in riveted lap joints. Embedded lead zirconate titanate piezoelectric (PZT) ceramic wafer-type sensors are employed to perform in-situ non-destructive testing during fatigue cyclical loading. A multi-feature integration method is developed to quantify the crack size using signal features of correlation coefficient, amplitude change, and phase change. In addition, a probability of detection (POD) model is constructed to quantify the reliability of the developed sizing method. Using the developed crack size quantification method and the resulting POD curve, probabilistic fatigue life prediction can be performed to provide comprehensive information for decision-making. The effectiveness of the overall methodology is demonstrated and validated using several aircraft lap joint specimens from different manufacturers and under different loading conditions.
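
    POD curves of this kind are commonly parameterized with a log-normal (or log-logistic) link between crack size and detection probability. The sketch below evaluates such a curve, POD(a) = Φ((ln a − μ)/σ), with arbitrarily chosen parameters and locates the a90 crack size; it is a generic illustration, not the model fitted in the paper.

      import math

      def pod(a, mu=math.log(2.0), sigma=0.4):
          """Log-normal POD model: probability of detecting a crack of size a (mm)."""
          z = (math.log(a) - mu) / sigma
          return 0.5 * (1.0 + math.erf(z / math.sqrt(2.0)))

      # a90: crack size detected with 90% probability (inverse of the model).
      a90 = math.exp(math.log(2.0) + 0.4 * 1.2816)   # 1.2816 = 90th percentile of N(0,1)
      for a in (1.0, 2.0, 3.0, a90):
          print(f"a = {a:.2f} mm -> POD = {pod(a):.3f}")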

  17. Bayesian forecasting and uncertainty quantifying of stream flows using Metropolis-Hastings Markov Chain Monte Carlo algorithm

    NASA Astrophysics Data System (ADS)

    Wang, Hongrui; Wang, Cheng; Wang, Ying; Gao, Xiong; Yu, Chen

    2017-06-01

    This paper presents a Bayesian approach using the Metropolis-Hastings Markov Chain Monte Carlo algorithm and applies this method to daily river flow rate forecasting and uncertainty quantification for the Zhujiachuan River, using data collected from Qiaotoubao Gage Station and 13 other gage stations in the Zhujiachuan watershed in China. The proposed method is also compared with conventional maximum likelihood estimation (MLE) for parameter estimation and quantification of the associated uncertainties. While the Bayesian method performs similarly in estimating the mean value of the daily flow rate, it outperforms the conventional MLE method in uncertainty quantification, providing a narrower credible interval than the MLE confidence interval and thus a more precise estimate by using the related information from regional gage stations. The Bayesian MCMC method may therefore be more favorable for uncertainty analysis and risk management.
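
    The Metropolis-Hastings step itself is compact. The random-walk sampler below targets a toy one-parameter posterior purely to illustrate the algorithm; it does not reproduce the paper's hydrological model or data.

      import numpy as np

      rng = np.random.default_rng(4)

      def log_posterior(theta):
          """Toy log-posterior: standard normal target, used only for illustration."""
          return -0.5 * theta**2

      def metropolis_hastings(n_iter=10_000, step=1.0, theta0=0.0):
          samples = np.empty(n_iter)
          theta, logp = theta0, log_posterior(theta0)
          for i in range(n_iter):
              proposal = theta + rng.normal(0.0, step)          # symmetric random walk
              logp_prop = log_posterior(proposal)
              if np.log(rng.uniform()) < logp_prop - logp:      # accept/reject step
                  theta, logp = proposal, logp_prop
              samples[i] = theta
          return samples

      draws = metropolis_hastings()
      print(f"posterior mean ~ {draws.mean():.3f}, 95% interval "
            f"[{np.percentile(draws, 2.5):.2f}, {np.percentile(draws, 97.5):.2f}]")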

  18. Assessment of the interlaboratory variability and robustness of JAK2V617F mutation assays: A study involving a consortium of 19 Italian laboratories

    PubMed Central

    Perricone, Margherita; Palandri, Francesca; Ottaviani, Emanuela; Angelini, Mario; Bagli, Laura; Bellesia, Enrica; Donati, Meris; Gemmati, Donato; Zucchini, Patrizia; Mancini, Stefania; Marchica, Valentina; Trubini, Serena; Matteis, Giovanna De; Zacomo, Silvia Di; Favarato, Mosè; Fioroni, Annamaria; Bolzonella, Caterina; Maccari, Giorgia; Navaglia, Filippo; Gatti, Daniela; Toffolatti, Luisa; Orlandi, Linda; Laloux, Vèronique; Manfrini, Marco; Galieni, Piero; Giannini, Barbara; Tieghi, Alessia; Barulli, Sara; Serino, Maria Luisa; Maccaferri, Monica; Scortechini, Anna Rita; Giuliani, Nicola; Vallisa, Daniele; Bonifacio, Massimiliano; Accorsi, Patrizia; Salbe, Cristina; Fazio, Vinicio; Gusella, Milena; Toffoletti, Eleonora; Salvucci, Marzia; Svaldi, Mirija; Gherlinzoni, Filippo; Cassavia, Francesca; Orsini, Francesco; Martinelli, Giovanni

    2017-01-01

    To date, a variety of techniques for the detection of JAK2V617F are used across different laboratories, with substantial differences in specificity and sensitivity. Therefore, to provide reliable and comparable results, the standardization of molecular techniques is mandatory. A network of 19 centers was established to 1) evaluate the inter- and intra-laboratory variability in JAK2V617F quantification, 2) identify the most robust assay for the standardization of the molecular test and 3) allow consistent interpretation of individual patient analysis results. The study was conceived in 3 different rounds, in which all centers had to blindly test DNA samples with different JAK2V617F allele burdens (AB) using both quantitative and qualitative assays. The positivity of samples with an AB < 1% was not detected by qualitative assays. Conversely, laboratories performing the quantitative approach were able to determine the expected JAK2V617F AB. Quantitative results were reliable across all mutation loads, with moderate variability at low AB (0.1 and 1%; CV = 0.46 and 0.77, respectively). Remarkably, all laboratories clearly distinguished between the 0.1 and 1% mutated samples. In conclusion, a qualitative approach is not sensitive enough to detect the JAK2V617F mutation, especially at low AB. In contrast, the ipsogen JAK2 MutaQuant CE-IVD kit provided highly efficient and sensitive quantification of all mutation loads. This study sets the basis for the standardization of molecular techniques for JAK2V617F determination, which will require the employment of approved operating procedures and the use of certified standards, such as the recent WHO 1st International Reference Panel for Genomic JAK2V617F. PMID:28427233

  19. Assessment of the interlaboratory variability and robustness of JAK2V617F mutation assays: A study involving a consortium of 19 Italian laboratories.

    PubMed

    Perricone, Margherita; Palandri, Francesca; Ottaviani, Emanuela; Angelini, Mario; Bagli, Laura; Bellesia, Enrica; Donati, Meris; Gemmati, Donato; Zucchini, Patrizia; Mancini, Stefania; Marchica, Valentina; Trubini, Serena; De Matteis, Giovanna; Di Zacomo, Silvia; Favarato, Mosè; Fioroni, Annamaria; Bolzonella, Caterina; Maccari, Giorgia; Navaglia, Filippo; Gatti, Daniela; Toffolatti, Luisa; Orlandi, Linda; Laloux, Vèronique; Manfrini, Marco; Galieni, Piero; Giannini, Barbara; Tieghi, Alessia; Barulli, Sara; Serino, Maria Luisa; Maccaferri, Monica; Scortechini, Anna Rita; Giuliani, Nicola; Vallisa, Daniele; Bonifacio, Massimiliano; Accorsi, Patrizia; Salbe, Cristina; Fazio, Vinicio; Gusella, Milena; Toffoletti, Eleonora; Salvucci, Marzia; Svaldi, Mirija; Gherlinzoni, Filippo; Cassavia, Francesca; Orsini, Francesco; Martinelli, Giovanni

    2017-05-16

    To date, a wide variety of techniques for the detection of JAK2V617F is in use across different laboratories, with substantial differences in specificity and sensitivity. Therefore, to provide reliable and comparable results, the standardization of molecular techniques is mandatory. A network of 19 centers was established to 1) evaluate the inter- and intra-laboratory variability in JAK2V617F quantification, 2) identify the most robust assay for the standardization of the molecular test and 3) allow consistent interpretation of individual patient analysis results. The study was conceived in 3 different rounds, in which all centers had to blindly test DNA samples with different JAK2V617F allele burden (AB) using both quantitative and qualitative assays. The positivity of samples with an AB < 1% was not detected by qualitative assays. Conversely, laboratories performing the quantitative approach were able to determine the expected JAK2V617F AB. Quantitative results were reliable across all mutation loads with moderate variability at low AB (0.1 and 1%; CV = 0.46 and 0.77, respectively). Remarkably, all laboratories clearly distinguished between the 0.1 and 1% mutated samples. In conclusion, a qualitative approach is not sensitive enough to detect the JAK2V617F mutation, especially at low AB. On the contrary, the ipsogen JAK2 MutaQuant CE-IVD kit provided highly efficient and sensitive quantification of all mutation loads. This study sets the basis for the standardization of molecular techniques for JAK2V617F determination, which will require the employment of approved operating procedures and the use of certified standards, such as the recent WHO 1st International Reference Panel for Genomic JAK2V617F.

  20. Empirical Evidence for Childhood Depression.

    ERIC Educational Resources Information Center

    Lachar, David

    Although several theoretical positions deal with the concept of childhood depression, accurate measurement of depression can only occur if valid and reliable measures are available. Current efforts emphasize direct questioning of the child and quantification of parents' observations. One scale used to study childhood depression, the Personality…

  1. A New Enzyme-linked Sorbent Assay (ELSA) to Quantify Syncytiotrophoblast Extracellular Vesicles in Biological Fluids.

    PubMed

    Göhner, Claudia; Weber, Maja; Tannetta, Dionne S; Groten, Tanja; Plösch, Torsten; Faas, Marijke M; Scherjon, Sicco A; Schleußner, Ekkehard; Markert, Udo R; Fitzgerald, Justine S

    2015-06-01

    The pregnancy-associated disease preeclampsia is related to the release of syncytiotrophoblast extracellular vesicles (STBEV) by the placenta. To improve functional research on STBEV, reliable and specific methods are needed to quantify them. However, only a few quantification methods are available and accepted, though imperfect. For this purpose, we aimed to provide an enzyme-linked sorbent assay (ELSA) to quantify STBEV in fluid samples based on their microvesicle characteristics and placental origin. Ex vivo placenta perfusion provided standards and samples for the STBEV quantification. STBEV were captured by binding of extracellular phosphatidylserine to immobilized annexin V. The membranous human placental alkaline phosphatase on the STBEV surface catalyzed a colorimetric detection reaction. The described ELSA is a rapid and simple method to quantify STBEV in diverse liquid samples, such as blood or perfusion suspension. The reliability of the ELSA was proven by comparison with nanoparticle tracking analysis. © 2015 John Wiley & Sons A/S. Published by John Wiley & Sons Ltd.

  2. Automated muscle fiber type population analysis with ImageJ of whole rat muscles using rapid myosin heavy chain immunohistochemistry.

    PubMed

    Bergmeister, Konstantin D; Gröger, Marion; Aman, Martin; Willensdorfer, Anna; Manzano-Szalai, Krisztina; Salminger, Stefan; Aszmann, Oskar C

    2016-08-01

    Skeletal muscle consists of different fiber types which adapt to exercise, aging, disease, or trauma. Here we present a protocol for fast staining, automatic acquisition, and quantification of fiber populations with ImageJ. Biceps and lumbrical muscles were harvested from Sprague-Dawley rats. Quadruple immunohistochemical staining was performed on single sections using antibodies against myosin heavy chains and secondary fluorescent antibodies. Slides were scanned automatically with a slide scanner. Manual and automatic analyses were performed and compared statistically. The protocol provided rapid and reliable staining for automated image acquisition. Analyses between manual and automatic data indicated Pearson correlation coefficients for biceps of 0.645-0.841 and 0.564-0.673 for lumbrical muscles. Relative fiber populations were accurate to a degree of ± 4%. This protocol provides a reliable tool for quantification of muscle fiber populations. Using freely available software, it decreases the required time to analyze whole muscle sections. Muscle Nerve 54: 292-299, 2016. © 2016 Wiley Periodicals, Inc.

  3. Shape matters: animal colour patterns as signals of individual quality

    PubMed Central

    2017-01-01

    Colour patterns (e.g. irregular, spotted or barred forms) are widespread in the animal kingdom, yet their potential role as signals of quality has been mostly neglected. However, a review of the published literature reveals that pattern itself (irrespective of its size or colour intensity) is a promising signal of individual quality across species of many different taxa. We propose at least four main pathways whereby patterns may reliably reflect individual quality: (i) as conventional signals of status, (ii) as indices of developmental homeostasis, (iii) by amplifying cues of somatic integrity and (iv) by amplifying individual investment in maintenance activities. Methodological constraints have traditionally hampered research on the signalling potential of colour patterns. To overcome this, we report a series of tools (e.g. colour adjacency and pattern regularity analyses, Fourier and granularity approaches, fractal geometry, geometric morphometrics) that allow objective quantification of pattern variability. We discuss how information provided by these methods should consider the visual system of the model species and behavioural responses to pattern metrics, in order to allow biologically meaningful conclusions. Finally, we propose future challenges in this research area that will require a multidisciplinary approach, bringing together inputs from genetics, physiology, behavioural ecology and evolutionary-developmental biology. PMID:28228513

  4. Calibration transfer of a Raman spectroscopic quantification method for the assessment of liquid detergent compositions between two at-line instruments installed at two liquid detergent production plants.

    PubMed

    Brouckaert, D; Uyttersprot, J-S; Broeckx, W; De Beer, T

    2017-09-01

    Calibration transfer of partial least squares (PLS) quantification models is established between two Raman spectrometers located at two liquid detergent production plants. As full recalibration of existing calibration models is time-consuming, labour-intensive and costly, it is investigated whether the use of mathematical correction methods requiring only a handful of standardization samples can overcome the dissimilarities in spectral response observed between both measurement systems. Univariate and multivariate standardization approaches are investigated, ranging from simple slope/bias correction (SBC), local centring (LC) and single wavelength standardization (SWS) to more complex direct standardization (DS) and piecewise direct standardization (PDS). The results of these five calibration transfer methods are compared reciprocally, as well as with regard to a full recalibration. Four PLS quantification models, each predicting the concentration of one of the four main ingredients in the studied liquid detergent composition, are targeted for transfer. Accuracy profiles are established from the original and transferred quantification models for validation purposes. A reliable representation of the calibration models' performance before and after transfer is thus established, based on β-expectation tolerance intervals. For each transferred model, it is investigated whether every future measurement performed in routine use will be close enough to the unknown true value of the sample. From this validation, it is concluded that instrument standardization is successful for three out of four investigated calibration models using multivariate (DS and PDS) transfer approaches. The fourth transferred PLS model could not be validated over the investigated concentration range, due to a lack of precision of the slave instrument. Comparing these transfer results to a full recalibration on the slave instrument allows comparison of the predictive power of both Raman systems and leads to the formulation of guidelines for further standardization projects. It is concluded that it is essential to evaluate the performance of the slave instrument prior to transfer, even when it is theoretically identical to the master apparatus. Copyright © 2017 Elsevier B.V. All rights reserved.
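
    The simplest of the transfer approaches mentioned above, slope/bias correction (SBC), regresses the slave-instrument predictions for a handful of standardization samples against the master (or reference) values and applies the fitted line to future slave predictions. The following is a minimal sketch of that idea only; the arrays and numbers are hypothetical, not data from the study.

    ```python
    import numpy as np

    # Hypothetical slope/bias correction (SBC) for calibration transfer:
    # predictions from the slave instrument for a few standardization samples
    # are regressed against the master/reference values, and the resulting
    # slope and bias are applied to future slave predictions.
    y_master = np.array([10.2, 15.1, 20.3, 25.0, 30.4])   # master / reference values
    y_slave  = np.array([ 9.1, 14.0, 19.5, 24.6, 30.1])   # same samples, slave predictions

    # Fit y_master ≈ slope * y_slave + bias
    slope, bias = np.polyfit(y_slave, y_master, deg=1)

    def correct(y_slave_new):
        """Apply the slope/bias correction to new slave-instrument predictions."""
        return slope * np.asarray(y_slave_new) + bias

    print(correct([12.0, 22.0]))
    ```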

  5. Simultaneous Quantification of Seven Bioactive Flavonoids in Citri Reticulatae Pericarpium by Ultra-Fast Liquid Chromatography Coupled with Tandem Mass Spectrometry.

    PubMed

    Zhao, Lian-Hua; Zhao, Hong-Zheng; Zhao, Xue; Kong, Wei-Jun; Hu, Yi-Chen; Yang, Shi-Hai; Yang, Mei-Hua

    2016-05-01

    Citri Reticulatae Pericarpium (CRP) is a commonly-used traditional Chinese medicine with flavonoids as the major bioactive components. Nevertheless, the contents of the flavonoids in CRP of different sources may vary significantly, affecting their therapeutic effects. Thus, establishing a reliable and comprehensive quality assessment method for flavonoids in CRP is necessary. The aim was to set up a rapid and sensitive ultra-fast liquid chromatography coupled with tandem mass spectrometry (UFLC-MS/MS) method for simultaneous quantification of seven bioactive flavonoids in CRP. A UFLC-MS/MS method coupled to ultrasound-assisted extraction was developed for simultaneous separation and quantification of seven flavonoids including hesperidin, neohesperidin, naringin, narirutin, tangeretin, nobiletin and sinensetin in 16 batches of CRP samples from different sources in China. The established method showed good linearity for all analytes with correlation coefficients (R) over 0.9980, together with satisfactory accuracy, precision and reproducibility. Furthermore, the recoveries at the three spiked levels were higher than 89.71% with relative standard deviations (RSDs) lower than 5.19%. The results indicated that the contents of the seven bioactive flavonoids in CRP varied significantly among different sources. Among the samples under study, hesperidin showed the highest content, ranging from 27.50 mg/g (CRP-15) to 86.30 mg/g (CRP-9) across the 16 samples, whereas the amount of narirutin was too low to be measured in some samples. This study revealed that the developed UFLC-MS/MS method was simple, sensitive and reliable for simultaneous quantification of multiple components in CRP, with potential for the quality control of complex matrices. Copyright © 2016 John Wiley & Sons, Ltd.

  6. Reliability Analysis of a Glacier Lake Warning System Using a Bayesian Net

    NASA Astrophysics Data System (ADS)

    Sturny, Rouven A.; Bründl, Michael

    2013-04-01

    Besides structural mitigation measures like avalanche defense structures, dams and galleries, warning and alarm systems have become important measures for dealing with Alpine natural hazards. Integrating them into risk mitigation strategies and comparing their effectiveness with structural measures requires quantification of the reliability of these systems. However, little is known about how the reliability of warning systems can be quantified and which methods are suitable for comparing their contribution to risk reduction with that of structural mitigation measures. We present a reliability analysis of a warning system located in Grindelwald, Switzerland. The warning system was built for warning and protecting residents and tourists from glacier outburst floods occurring as a consequence of a rapid drainage of the glacier lake. We have set up a Bayesian Net (BN) that allowed for a qualitative and quantitative reliability analysis. The Conditional Probability Tables (CPT) of the BN were determined according to the manufacturer's reliability data for each component of the system as well as by assigning weights to specific BN nodes accounting for information flows and decision-making processes of the local safety service. The presented results focus on the two alerting units 'visual acoustic signal' (VAS) and 'alerting of the intervention entities' (AIE). For the summer of 2009, the reliability was determined to be 94 % for the VAS and 83 % for the AIE. The probability of occurrence of a major event was calculated as 0.55 % per day, resulting in an overall reliability of 99.967 % for the VAS and 99.906 % for the AIE. We concluded that a failure of the VAS alerting unit would be the consequence of a simultaneous failure of the four probes located in the lake and the gorge. Similarly, we deduced that the AIE would fail either if there were a simultaneous connectivity loss of the mobile and fixed network in Grindelwald, an Internet access loss or a failure of the regional operations centre. However, the probability of a common failure of these components was assumed to be low. Overall, it can be stated that due to numerous redundancies, the investigated warning system is highly reliable and its influence on risk reduction is very high. Comparable studies are needed in the future to put these results into context and to gain more experience of how the reliability of warning systems could be determined in practice.
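
    The overall reliabilities reported above follow from a simple combination: the daily unreliability is the probability of a major event multiplied by the probability that the alerting unit fails on demand. A short sketch of that arithmetic, using only the figures quoted in the abstract, reproduces the reported values.

    ```python
    # Reconstruction of the reported overall reliabilities from the abstract's
    # own numbers: overall = 1 - p_event * (1 - r_unit), assuming the alerting
    # unit only matters on days when a major event actually occurs.
    p_event = 0.0055                      # probability of a major event per day (0.55 %)
    units = {"VAS": 0.94, "AIE": 0.83}    # alerting-unit reliabilities

    for name, r in units.items():
        overall = 1.0 - p_event * (1.0 - r)
        print(f"{name}: overall reliability = {overall:.5f}")
    # VAS -> 0.99967, AIE -> 0.99906 (rounded), matching the reported values
    ```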

  7. A Facile and Sensitive Method for Quantification of Cyclic Nucleotide Monophosphates in Mammalian Organs: Basal Levels of Eight cNMPs and Identification of 2',3'-cIMP

    PubMed Central

    Jia, Xin; Fontaine, Benjamin M.; Strobel, Fred; Weinert, Emily E.

    2014-01-01

    A sensitive, versatile and economical method to extract and quantify cyclic nucleotide monophosphates (cNMPs), including both 3',5'-cNMPs and 2',3'-cNMPs, in mammalian tissues and cellular systems using LC-MS/MS has been developed. Problems such as matrix effects from complex biological samples are addressed, and the extraction and analysis conditions have been optimized to minimize them. This protocol allows for comparison of multiple cNMPs in the same system and was used to examine the relationship between tissue levels of cNMPs in a panel of rat organs. In addition, the study reports the first identification and quantification of 2',3'-cIMP. The developed method will allow for quantification of cNMP levels in cells and tissues with varying disease states, which will provide insight into the role(s) and interplay of cNMP signalling pathways. PMID:25513747

  8. A facile and sensitive method for quantification of cyclic nucleotide monophosphates in mammalian organs: basal levels of eight cNMPs and identification of 2',3'-cIMP.

    PubMed

    Jia, Xin; Fontaine, Benjamin M; Strobel, Fred; Weinert, Emily E

    2014-12-12

    A sensitive, versatile and economical method to extract and quantify cyclic nucleotide monophosphates (cNMPs), including both 3',5'-cNMPs and 2',3'-cNMPs, in mammalian tissues and cellular systems using LC-MS/MS has been developed. Problems such as matrix effects from complex biological samples are addressed, and the extraction and analysis conditions have been optimized to minimize them. This protocol allows for comparison of multiple cNMPs in the same system and was used to examine the relationship between tissue levels of cNMPs in a panel of rat organs. In addition, the study reports the first identification and quantification of 2',3'-cIMP. The developed method will allow for quantification of cNMP levels in cells and tissues with varying disease states, which will provide insight into the role(s) and interplay of cNMP signalling pathways.

  9. Quantification of Bacterial Twitching Motility in Dense Colonies Using Transmitted Light Microscopy and Computational Image Analysis.

    PubMed

    Smith, Benjamin; Li, Jianfang; Metruccio, Matteo; Wan, Stephanie; Evans, David; Fleiszig, Suzanne

    2018-04-20

    A method was developed to allow the quantification and mapping of relative bacterial twitching motility in dense samples, where tracking of individual bacteria was not feasible. In this approach, movies of bacterial films were acquired using differential interference contrast microscopy (DIC), and bacterial motility was then indirectly quantified by the degree to which the bacteria modulated the intensity of light in the field-of-view over time. This allowed the mapping of areas of relatively high and low motility within a single field-of-view, and comparison of the total distribution of motility between samples.
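
    As a rough illustration of this kind of indirect quantification, the sketch below (not the authors' code; the array and data are placeholders) computes a per-pixel temporal standard deviation from a movie stack and treats it as a relative motility map.

    ```python
    import numpy as np

    # Minimal sketch of the idea described above: motility is quantified
    # indirectly as the degree to which pixel intensities fluctuate over time
    # in a movie of a dense bacterial film. `movie` is a hypothetical
    # (n_frames, height, width) array of grey levels.
    rng = np.random.default_rng(0)
    movie = rng.normal(loc=100.0, scale=5.0, size=(50, 64, 64))  # placeholder data

    # Per-pixel temporal standard deviation serves as a simple motility map:
    # regions where bacteria move modulate the transmitted light more strongly.
    motility_map = movie.std(axis=0)

    # A single scalar per field of view allows comparison between samples.
    print(f"mean motility index: {motility_map.mean():.2f}")
    ```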

  10. Toward decentralized analysis of mercury (II) in real samples. A critical review on nanotechnology-based methodologies.

    PubMed

    Botasini, Santiago; Heijo, Gonzalo; Méndez, Eduardo

    2013-10-24

    In recent years, the number of works focused on the development of novel nanoparticle-based sensors for mercury detection has increased, mainly motivated by the need for low-cost portable devices capable of giving a fast and reliable analytical response, thus contributing to analytical decentralization. Methodologies employing colorimetric, fluorometric, magnetic, and electrochemical output signals have allowed detection limits within the pM and nM ranges to be reached. Most of these developments proved their suitability in detecting and quantifying mercury (II) ions in synthetic solutions or spiked water samples. However, the state of the art in these technologies is still behind the standard methods of mercury quantification, such as cold vapor atomic absorption spectrometry and inductively coupled plasma techniques, in terms of reliability and sensitivity. This is mainly because the response of nanoparticle-based sensors is highly affected by the sample matrix. The developed analytical nanosystems may fail in real samples because of the negative influence of ionic strength and the presence of exchangeable ligands. The aim of this review is to critically consider the recently published innovations in this area, and to highlight the need to include more realistic assays in future research in order to make these advances suitable for on-site analysis. Copyright © 2013 Elsevier B.V. All rights reserved.

  11. Multiscale Reconstruction for Magnetic Resonance Fingerprinting

    PubMed Central

    Pierre, Eric Y.; Ma, Dan; Chen, Yong; Badve, Chaitra; Griswold, Mark A.

    2015-01-01

    Purpose: To reduce the acquisition time needed to obtain reliable parametric maps with Magnetic Resonance Fingerprinting. Methods: An iterative-denoising algorithm is initialized by reconstructing the MRF image series at low image resolution. For subsequent iterations, the method enforces pixel-wise fidelity to the best-matching dictionary template, then enforces fidelity to the acquired data at slightly higher spatial resolution. After convergence, parametric maps with desirable spatial resolution are obtained through template matching of the final image series. The proposed method was evaluated on phantom and in-vivo data using the highly-undersampled, variable-density spiral trajectory and compared with the original MRF method. The benefits of additional sparsity constraints were also evaluated. When available, gold standard parameter maps were used to quantify the performance of each method. Results: The proposed approach allowed convergence to accurate parametric maps with as few as 300 time points of acquisition, as compared to 1000 in the original MRF work. Simultaneous quantification of T1, T2, proton density (PD) and B0 field variations in the brain was achieved in vivo for a 256×256 matrix for a total acquisition time of 10.2 s, representing a 3-fold reduction in acquisition time. Conclusions: The proposed iterative multiscale reconstruction reliably increases MRF acquisition speed and accuracy. PMID:26132462
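
    The template-matching step referred to above is, in its general form, a maximum-correlation search over a precomputed dictionary of simulated signal evolutions. The sketch below illustrates that general idea with synthetic data; the array shapes, random signals and parameter ranges are assumptions for illustration, not those of the paper.

    ```python
    import numpy as np

    # Generic sketch of dictionary template matching as used in MR
    # Fingerprinting: each pixel's measured signal evolution is compared
    # against simulated evolutions, and the best-matching entry (maximum
    # normalized inner product) yields that pixel's parameter estimates.
    rng = np.random.default_rng(1)
    n_timepoints, n_entries = 300, 1000
    dictionary = rng.normal(size=(n_entries, n_timepoints))    # simulated signals (placeholder)
    dict_params = rng.uniform(0.1, 3.0, size=(n_entries, 2))   # hypothetical (T1, T2) per entry

    # Normalize dictionary rows once so the inner product acts as a correlation.
    dictionary /= np.linalg.norm(dictionary, axis=1, keepdims=True)

    def match(signal):
        """Return the (T1, T2) of the dictionary entry best matching `signal`."""
        signal = signal / np.linalg.norm(signal)
        best = np.argmax(np.abs(dictionary @ signal))
        return dict_params[best]

    measured = rng.normal(size=n_timepoints)   # placeholder pixel time course
    print(match(measured))
    ```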

  12. High throughput quantification of N-glycans using one-pot sialic acid modification and matrix assisted laser desorption ionization time of flight mass spectrometry

    PubMed Central

    Gil, Geun-Cheol; Iliff, Bryce; Cerny, Ron; Velander, William H.; Van Cott, Kevin E.

    2010-01-01

    Appropriate glycosylation of recombinant therapeutic glycoproteins has been emphasized in the biopharmaceutical industry because the carbohydrate component can affect the safety, efficacy, and consistency of the glycoproteins. Reliable quantification methods are essential to ensure consistency of these products with respect to glycosylation, particularly sialylation. Mass spectrometry (MS) has become a popular tool to analyze glycan profiles and structures, showing high resolution and sensitivity with structure identification ability. However, quantification of sialylated glycans using MS is not as reliable because of the different ionization efficiencies of neutral and acidic glycans. We report here that amidation under mildly acidic conditions can be used to neutralize acidic N-glycans still attached to the protein. The resulting amidated N-glycans can then be released from the protein using PNGase F and labeled with permanent charges on the reducing end to avoid any modification and the formation of metal adducts during MS analysis. The N-glycan modification, digestion, and desalting steps were performed using a single-pot method that can be done in microcentrifuge tubes or 96-well microfilter plates, enabling high-throughput glycan analysis. Using this method we were able to perform quantitative MALDI-TOF MS of a recombinant human glycoprotein to determine changes in fucosylation and sialylation that were in very good agreement with a normal phase HPLC oligosaccharide mapping method. PMID:20586471

  13. Determination of multi-walled carbon nanotube bioaccumulation in earthworms measured by a microwave-based detection technique

    EPA Science Inventory

    Reliable quantification techniques for carbon nanotubes (CNTs) are limited. In this study, a new procedure was developed for quantifying multi-walled carbon nanotubes (MWNTs) in earthworms (Eisenia fetida) based on freeze drying and microwave-induced heating. Specifically, earthw...

  14. Quantitative PCR for Detection and Enumeration of Genetic Markers of Bovine Fecal Pollution

    EPA Science Inventory

    Accurate assessment of health risks associated with bovine (cattle) fecal pollution requires a reliable host-specific genetic marker and a rapid quantification method. We report the development of quantitative PCR assays for the detection of two recently described cow feces-spec...

  15. Protein quantification using a cleavable reporter peptide.

    PubMed

    Duriez, Elodie; Trevisiol, Stephane; Domon, Bruno

    2015-02-06

    Peptide and protein quantification based on isotope dilution and mass spectrometry analysis are widely employed for the measurement of biomarkers and in system biology applications. The accuracy and reliability of such quantitative assays depend on the quality of the stable-isotope labeled standards. Although the quantification using stable-isotope labeled peptides is precise, the accuracy of the results can be severely biased by the purity of the internal standards, their stability and formulation, and the determination of their concentration. Here we describe a rapid and cost-efficient method to recalibrate stable isotope labeled peptides in a single LC-MS analysis. The method is based on the equimolar release of a protein reference peptide (used as surrogate for the protein of interest) and a universal reporter peptide during the trypsinization of a concatenated polypeptide standard. The quality and accuracy of data generated with such concatenated polypeptide standards are highlighted by the quantification of two clinically important proteins in urine samples and compared with results obtained with conventional stable isotope labeled reference peptides. Furthermore, the application of the UCRP standards in complex samples is described.

  16. Lamb Wave Damage Quantification Using GA-Based LS-SVM.

    PubMed

    Sun, Fuqiang; Wang, Ning; He, Jingjing; Guan, Xuefei; Yang, Jinsong

    2017-06-12

    Lamb waves have been reported to be an efficient tool for non-destructive evaluations (NDE) for various application scenarios. However, accurate and reliable damage quantification using the Lamb wave method is still a practical challenge, due to the complex underlying mechanism of Lamb wave propagation and damage detection. This paper presents a Lamb wave damage quantification method using a least square support vector machine (LS-SVM) and a genetic algorithm (GA). Three damage sensitive features, namely, normalized amplitude, phase change, and correlation coefficient, were proposed to describe changes of Lamb wave characteristics caused by damage. In view of commonly used data-driven methods, the GA-based LS-SVM model using the proposed three damage sensitive features was implemented to evaluate the crack size. The GA method was adopted to optimize the model parameters. The results of GA-based LS-SVM were validated using coupon test data and lap joint component test data with naturally developed fatigue cracks. Cases of different loading and manufacturer were also included to further verify the robustness of the proposed method for crack quantification.

  17. Lamb Wave Damage Quantification Using GA-Based LS-SVM

    PubMed Central

    Sun, Fuqiang; Wang, Ning; He, Jingjing; Guan, Xuefei; Yang, Jinsong

    2017-01-01

    Lamb waves have been reported to be an efficient tool for non-destructive evaluations (NDE) for various application scenarios. However, accurate and reliable damage quantification using the Lamb wave method is still a practical challenge, due to the complex underlying mechanism of Lamb wave propagation and damage detection. This paper presents a Lamb wave damage quantification method using a least square support vector machine (LS-SVM) and a genetic algorithm (GA). Three damage sensitive features, namely, normalized amplitude, phase change, and correlation coefficient, were proposed to describe changes of Lamb wave characteristics caused by damage. In view of commonly used data-driven methods, the GA-based LS-SVM model using the proposed three damage sensitive features was implemented to evaluate the crack size. The GA method was adopted to optimize the model parameters. The results of GA-based LS-SVM were validated using coupon test data and lap joint component test data with naturally developed fatigue cracks. Cases of different loading and manufacturer were also included to further verify the robustness of the proposed method for crack quantification. PMID:28773003

  18. 1H NMR quantification in very dilute toxin solutions: application to anatoxin-a analysis.

    PubMed

    Dagnino, Denise; Schripsema, Jan

    2005-08-01

    A complete procedure is described for the extraction, detection and quantification of anatoxin-a in biological samples. Anatoxin-a is extracted from biomass by a routine acid-base extraction. The extract is analysed by GC-MS, without the need for derivatization, with a detection limit of 0.5 ng. A method was developed for the accurate quantification of anatoxin-a in the standard solution to be used for the calibration of the GC analysis. 1H NMR allowed the accurate quantification of microgram quantities of anatoxin-a. The accurate quantification of compounds in standard solutions is rarely discussed, but for compounds like anatoxin-a (toxins with prices in the range of a million dollars a gram), of which generally only milligram quantities or less are available, this factor in the quantitative analysis is certainly not trivial. The method that was developed can easily be adapted for the accurate quantification of other toxins in very dilute solutions.

  19. 18O-labeled proteome reference as global internal standards for targeted quantification by selected reaction monitoring-mass spectrometry.

    PubMed

    Kim, Jong-Seo; Fillmore, Thomas L; Liu, Tao; Robinson, Errol; Hossain, Mahmud; Champion, Boyd L; Moore, Ronald J; Camp, David G; Smith, Richard D; Qian, Wei-Jun

    2011-12-01

    Selected reaction monitoring (SRM)-MS is an emerging technology for high throughput targeted protein quantification and verification in biomarker discovery studies; however, the cost associated with the application of stable isotope-labeled synthetic peptides as internal standards can be prohibitive for screening a large number of candidate proteins as often required in the preverification phase of discovery studies. Herein we present a proof of concept study using an (18)O-labeled proteome reference as global internal standards (GIS) for SRM-based relative quantification. The (18)O-labeled proteome reference (or GIS) can be readily prepared and contains a heavy isotope ((18)O)-labeled internal standard for every possible tryptic peptide. Our results showed that the percentage of heavy isotope ((18)O) incorporation applying an improved protocol was >99.5% for most peptides investigated. The accuracy, reproducibility, and linear dynamic range of quantification were further assessed based on known ratios of standard proteins spiked into the labeled mouse plasma reference. Reliable quantification was observed with high reproducibility (i.e. coefficient of variation <10%) for analyte concentrations that were set at 100-fold higher or lower than those of the GIS based on the light ((16)O)/heavy ((18)O) peak area ratios. The utility of (18)O-labeled GIS was further illustrated by accurate relative quantification of 45 major human plasma proteins. Moreover, quantification of the concentrations of C-reactive protein and prostate-specific antigen was illustrated by coupling the GIS with standard additions of purified protein standards. Collectively, our results demonstrated that the use of an (18)O-labeled proteome reference as GIS provides a convenient, low-cost, and effective strategy for relative quantification of a large number of candidate proteins in biological or clinical samples using SRM.

  20. Quantification of free fatty acids in human stratum corneum using tandem mass spectrometry and surrogate analyte approach.

    PubMed

    Dapic, Irena; Kobetic, Renata; Brkljacic, Lidija; Kezic, Sanja; Jakasa, Ivone

    2018-02-01

    The free fatty acids (FFAs) are one of the major components of the lipids in the stratum corneum (SC), the uppermost layer of the skin. Relative composition of FFAs has been proposed as a biomarker of the skin barrier status in patients with atopic dermatitis (AD). Here, we developed an LC-ESI-MS/MS method for simultaneous quantification of a range of FFAs with long and very long chain length in the SC collected by adhesive tape (D-Squame). The method, based on derivatization with 2-bromo-1-methylpyridinium iodide and 3-carbinol-1-methylpyridinium iodide, allowed highly sensitive detection and quantification of FFAs using multiple reaction monitoring. For the quantification, we applied a surrogate analyte approach and internal standardization using isotope labeled derivatives of FFAs. Adhesive tapes showed the presence of several FFAs, which are also present in the SC, a problem encountered in previous studies. Therefore, the levels of FFAs in the SC were corrected using C12:0, which was present on the adhesive tape, but not detected in the SC. The method was applied to SC samples from patients with atopic dermatitis and healthy subjects. Quantification using multiple reaction monitoring allowed sufficient sensitivity to analyze FFAs of chain lengths C16-C28 in the SC collected on only one tape strip. Copyright © 2017 John Wiley & Sons, Ltd.

  1. Uncertainty Quantification in Alchemical Free Energy Methods.

    PubMed

    Bhati, Agastya P; Wan, Shunzhou; Hu, Yuan; Sherborne, Brad; Coveney, Peter V

    2018-06-12

    Alchemical free energy methods have gained much importance recently from several reports of improved ligand-protein binding affinity predictions based on their implementation using molecular dynamics simulations. A large number of variants of such methods implementing different accelerated sampling techniques and free energy estimators are available, each claimed to be better than the others in its own way. However, the key features of reproducibility and quantification of associated uncertainties in such methods have barely been discussed. Here, we apply a systematic protocol for uncertainty quantification to a number of popular alchemical free energy methods, covering both absolute and relative free energy predictions. We show that a reliable measure of error estimation is provided by ensemble simulation-an ensemble of independent MD simulations-which applies irrespective of the free energy method. The need to use ensemble methods is fundamental and holds regardless of the duration of time of the molecular dynamics simulations performed.
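
    A minimal sketch of the ensemble idea (synthetic numbers, not the paper's data or code): the free energy is computed independently in each replica simulation, and the spread across replicas provides the error estimate, for example via a bootstrap confidence interval.

    ```python
    import numpy as np

    # Ensemble-based uncertainty estimate: one free energy value per
    # independent replica; the ensemble spread gives the error bar.
    rng = np.random.default_rng(42)
    replica_dG = rng.normal(loc=-7.3, scale=0.4, size=25)  # hypothetical dG per replica (kcal/mol)

    mean_dG = replica_dG.mean()

    # Bootstrap confidence interval over replicas
    boot = np.array([
        rng.choice(replica_dG, size=replica_dG.size, replace=True).mean()
        for _ in range(10000)
    ])
    lo, hi = np.percentile(boot, [2.5, 97.5])
    print(f"dG = {mean_dG:.2f} kcal/mol, 95% CI [{lo:.2f}, {hi:.2f}]")
    ```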

  2. Bayesian forecasting and uncertainty quantifying of stream flows using Metropolis–Hastings Markov Chain Monte Carlo algorithm

    DOE PAGES

    Wang, Hongrui; Wang, Cheng; Wang, Ying; ...

    2017-04-05

    This paper presents a Bayesian approach using the Metropolis-Hastings Markov Chain Monte Carlo algorithm and applies this method to daily river flow rate forecasting and uncertainty quantification for the Zhujiachuan River, using data collected from Qiaotoubao Gage Station and 13 other gage stations in the Zhujiachuan watershed in China. The proposed method is also compared with conventional maximum likelihood estimation (MLE) for parameter estimation and quantification of associated uncertainties. While the Bayesian method performs similarly in estimating the mean value of the daily flow rate, it outperforms the conventional MLE method in uncertainty quantification, providing a relatively narrower reliable interval than the MLE confidence interval and thus a more precise estimation by using the related information from regional gage stations. As a result, the Bayesian MCMC method might be more favorable for uncertainty analysis and risk management.
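
    For readers unfamiliar with the sampler, the sketch below shows a generic random-walk Metropolis-Hastings loop for a one-parameter posterior with synthetic data; it illustrates the algorithm only, not the paper's hydrological model or priors.

    ```python
    import numpy as np

    # Generic Metropolis-Hastings sketch: sample from a posterior known up to
    # a constant, using a Gaussian random-walk proposal.
    rng = np.random.default_rng(0)
    data = rng.normal(loc=2.0, scale=1.0, size=100)   # hypothetical observations

    def log_posterior(theta):
        # Flat prior; Gaussian likelihood with known unit variance.
        return -0.5 * np.sum((data - theta) ** 2)

    n_steps, step_size = 20000, 0.2
    samples = np.empty(n_steps)
    theta = 0.0
    logp = log_posterior(theta)
    for i in range(n_steps):
        prop = theta + step_size * rng.normal()       # propose a move
        logp_prop = log_posterior(prop)
        if np.log(rng.random()) < logp_prop - logp:   # accept/reject step
            theta, logp = prop, logp_prop
        samples[i] = theta

    posterior = samples[5000:]                        # discard burn-in
    print(posterior.mean(), np.percentile(posterior, [2.5, 97.5]))
    ```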

  3. Quantification of Cell-Penetrating Peptide Associated with Polymeric Nanoparticles Using Isobaric-Tagging and MALDI-TOF MS/MS

    NASA Astrophysics Data System (ADS)

    Chiu, Jasper Z. S.; Tucker, Ian G.; McDowell, Arlene

    2016-11-01

    High sensitivity quantification of the putative cell-penetrating peptide di-arginine-histidine (RRH) associated with poly (ethyl-cyanoacrylate) (PECA) nanoparticles was achieved without analyte separation, using a novel application of isobaric-tagging and high matrix-assisted laser desorption/ionization coupled to time-of-flight (MALDI-TOF) mass spectrometry. Isobaric-tagging reaction equilibrium was reached after 5 min, with 90% or greater RRH peptide successfully isobaric-tagged after 60 min. The accuracy was greater than 90%, which indicates good reliability of using isobaric-tagged RRH as an internal standard for RRH quantification. The sample intra- and inter-spot coefficients of variations were less than 11%, which indicate good repeatability. The majority of RRH peptides in the nanoparticle formulation were physically associated with the nanoparticles (46.6%), whereas only a small fraction remained unassociated (13.7%). The unrecovered RRH peptide (~40%) was assumed to be covalently associated with PECA nanoparticles.

  4. A fault tree model to assess probability of contaminant discharge from shipwrecks.

    PubMed

    Landquist, H; Rosén, L; Lindhe, A; Norberg, T; Hassellöv, I-M; Lindgren, J F; Dahllöf, I

    2014-11-15

    Shipwrecks on the sea floor around the world may contain hazardous substances that can cause harm to the marine environment. Today there are no comprehensive methods for environmental risk assessment of shipwrecks, and thus there is poor support for decision-making on prioritization of mitigation measures. The purpose of this study was to develop a tool for quantitative risk estimation of potentially polluting shipwrecks, and in particular an estimation of the annual probability of hazardous substance discharge. The assessment of the probability of discharge is performed using fault tree analysis, facilitating quantification of the probability with respect to a set of identified hazardous events. This approach enables a structured assessment providing transparent uncertainty and sensitivity analyses. The model facilitates quantification of risk, quantification of the uncertainties in the risk calculation and identification of parameters to be investigated further in order to obtain a more reliable risk calculation. Copyright © 2014 Elsevier Ltd. All rights reserved.
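
    For independent basic events, the fault-tree quantification described above reduces to combining probabilities through AND and OR gates. The sketch below illustrates this with hypothetical events and numbers; neither the tree structure nor the values are taken from the paper.

    ```python
    from functools import reduce

    # Minimal fault-tree sketch with hypothetical numbers: assuming independent
    # basic events, an AND gate multiplies probabilities and an OR gate
    # combines them as 1 - prod(1 - p).
    def and_gate(*p):
        return reduce(lambda a, b: a * b, p, 1.0)

    def or_gate(*p):
        return 1.0 - reduce(lambda a, b: a * (1.0 - b), p, 1.0)

    # Hypothetical annual probabilities of hazardous events for one wreck
    p_hull_corrosion_breach = 0.02
    p_anchor_strike         = 0.005
    p_trawling_damage       = 0.01
    p_oil_left              = 0.6     # tank still contains hazardous substance

    # Discharge requires a breach (any cause) AND remaining contaminant
    p_breach    = or_gate(p_hull_corrosion_breach, p_anchor_strike, p_trawling_damage)
    p_discharge = and_gate(p_breach, p_oil_left)
    print(f"annual probability of discharge ≈ {p_discharge:.4f}")
    ```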

  5. Effect of detergent on the quantification of grapevine downy mildew Sporangia from leaf discs

    USDA-ARS?s Scientific Manuscript database

    Grapevine downy mildew (DM), caused by the oomycete Plasmopara viticola (Berk. & Curt.) Berlese & de Toni, is a major disease, especially in humid viticultural areas. Development of resistant cultivars is an important objective for grapevine breeding. In order to establish a reliable and inexpensive...

  6. Chemotaxis of cancer cells in three-dimensional environment monitored label-free by quantitative phase digital holographic microscopy

    NASA Astrophysics Data System (ADS)

    Kemper, Björn; Schnekenburger, Jürgen; Ketelhut, Steffi

    2017-02-01

    We investigated the capabilities of digital holographic microscopy (DHM) for label-free quantification of the response of living single cells to chemical stimuli in 3D assays. Fibrosarcoma cells were observed in a collagen matrix inside 3D chemotaxis chambers with a Mach-Zehnder interferometer-based DHM setup. From the obtained series of quantitative phase images, the migration trajectories of single cells were retrieved by automated cell tracking and subsequently analyzed for maximum migration distance and motility. Our results demonstrate DHM as a highly reliable and efficient tool for label-free quantification of chemotaxis in 2D and 3D environments.

  7. Evaluation of the performance of quantitative detection of the Listeria monocytogenes prfA locus with droplet digital PCR.

    PubMed

    Witte, Anna Kristina; Fister, Susanne; Mester, Patrick; Schoder, Dagmar; Rossmanith, Peter

    2016-11-01

    Fast and reliable pathogen detection is an important issue for human health. Since conventional microbiological methods are rather slow, there is growing interest in detection and quantification using molecular methods. The droplet digital polymerase chain reaction (ddPCR) is a relatively new PCR method for absolute and accurate quantification without external standards. Using the Listeria monocytogenes-specific prfA assay, we focused on the questions of whether the assay was directly transferable to ddPCR and whether ddPCR was suitable for samples derived from heterogeneous matrices, such as foodstuffs, which often include inhibitors and a non-target bacterial background flora. Although the prfA assay showed suboptimal cluster formation, use of ddPCR for quantification of L. monocytogenes from pure bacterial cultures, artificially contaminated cheese, and naturally contaminated foodstuff was satisfactory over a relatively broad dynamic range. Moreover, results demonstrated the outstanding detection limit of one copy. However, while poorer DNA quality, such as that resulting from longer storage, can impair ddPCR, the internal amplification control (IAC) of prfA by ddPCR, which is integrated in the genome of L. monocytogenes ΔprfA, showed even slightly better quantification over a broader dynamic range. Graphical Abstract Evaluating the absolute quantification potential of ddPCR targeting Listeria monocytogenes prfA.
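
    Absolute quantification in ddPCR generally rests on Poisson statistics: if p is the fraction of positive droplets, the mean number of target copies per droplet is λ = -ln(1 - p), and dividing by the droplet volume gives the concentration. The sketch below shows that standard calculation; the droplet counts and the droplet volume are placeholders, not values from the study.

    ```python
    import numpy as np

    # Standard Poisson-based ddPCR quantification (general ddPCR math, not code
    # from the paper). All numbers below are placeholders.
    positive_droplets = 4200
    total_droplets    = 15000
    droplet_volume_ul = 0.85e-3          # assumed droplet volume in microlitres

    p = positive_droplets / total_droplets
    lam = -np.log(1.0 - p)               # mean copies per droplet
    conc = lam / droplet_volume_ul       # copies per microlitre of reaction
    print(f"{conc:.0f} copies/µL")
    ```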

  8. Quantitative evaluation of microvascular blood flow by contrast-enhanced ultrasound (CEUS).

    PubMed

    Greis, Christian

    2011-01-01

    Ultrasound contrast agents consist of tiny gas-filled microbubbles the size of red blood cells. Due to their size distribution, they are purely intravascular tracers which do not extravasate into the interstitial fluid, and thus they are perfect agents for imaging blood distribution and flow. Using ultrasound scanners with contrast-specific software, the specific microbubble-derived echo signals can be separated from tissue signals in realtime, allowing selective imaging of the contrast agent. The signal intensity obtained lies in a linear relationship to the amount of microbubbles in the target organ, which allows easy and reliable assessment of relative blood volume. Imaging of the contrast wash-in and wash-out after bolus injection, or more precisely using the flash-replenishment technique, allows assessment of regional blood flow velocity. Commercially available quantification software packages can calculate time-related intensity values from the contrast wash-in and wash-out phase for each image pixel from stored video clips. After fitting of a mathematical model curve according to the respective kinetic model (bolus or flash-replenishment kinetics), time/intensity curves (TIC) can be calculated from single pixels or user-defined regions of interest (ROI). Characteristic parameters of these TICs (e.g. peak intensity, area under the curve, wash-in rate, etc.) can be displayed as color-coded parametric maps on top of the anatomical image, to identify cold and hot spots with abnormal perfusion.
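
    A hedged sketch of the time-intensity-curve (TIC) analysis described above: synthetic bolus data are fitted with a gamma-variate curve, one commonly used bolus kinetic model, and the characteristic parameters are read off the fit. The function and parameter names are illustrative, not those of a commercial quantification package.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    # Gamma-variate bolus model: zero before arrival time t0, then a rise and
    # exponential wash-out. Synthetic data stand in for a real TIC.
    def gamma_variate(t, A, t0, alpha, beta):
        dt = np.clip(np.asarray(t, dtype=float) - t0, 0.0, None)
        return A * dt**alpha * np.exp(-dt / beta)

    t = np.linspace(0, 60, 121)                               # seconds
    truth = gamma_variate(t, 50.0, 5.0, 1.5, 6.0)
    signal = truth + np.random.default_rng(0).normal(0, 1.0, t.size)

    popt, _ = curve_fit(gamma_variate, t, signal, p0=[40.0, 4.0, 1.0, 5.0],
                        bounds=([0, 0, 0.1, 0.1], [np.inf, 20, 10, 50]))
    fit = gamma_variate(t, *popt)

    # Characteristic TIC parameters derived from the fitted curve
    peak_intensity   = fit.max()
    time_to_peak     = t[fit.argmax()]
    area_under_curve = np.trapz(fit, t)
    wash_in_rate     = np.max(np.gradient(fit, t))            # steepest upslope
    print(peak_intensity, time_to_peak, area_under_curve, wash_in_rate)
    ```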

  9. A new life for the wavelength-dispersive X-ray spectrometer (WDS): incorporation of a silicon drift detector into the WDS for improved quantification and X-ray mapping

    NASA Astrophysics Data System (ADS)

    Wuhrer, R.; Moran, K.

    2018-01-01

    The wavelength-dispersive X-ray spectrometer (WDS) has been around for a long time and the design has not changed much since its original development. The electron microprobe operator using WDS has to be meticulous in monitoring items such as gas flow, gas purity, gas pressure, noise levels of baseline and window, gas flow proportional counter (GFPC) voltage levels, count rate suppression, anode wire contamination and other detector parameters. Recent development and improvements of silicon drift detectors (SDD’s) has allowed the incorporation of a SDD as the X-ray detector in place of the proportional counter (PC) and/or gas flow proportional counter (GFPC). This allows minimal mechanical alteration and no loss of movement range. The superiority of a WDS with a SDD, referred to as SD-WDS, is easily seen once in operation. The SD-WDS removes many artefacts including the worse of all high order diffraction, thus allowing more accurate analysis. The incorporation of the SDD has been found to improve the light and mid element range and consequently improving the detection limit for these elements. It is also possible to obtain much more reliable results at high count rates with almost no change in resolution, gain and zero-peak characteristics of the energy spectrum.

  10. Rapid and Easy Protocol for Quantification of Next-Generation Sequencing Libraries.

    PubMed

    Hawkins, Steve F C; Guest, Paul C

    2018-01-01

    The emergence of next-generation sequencing (NGS) over the last 10 years has increased the efficiency of DNA sequencing in terms of speed, ease, and price. However, the exact quantification of an NGS library is crucial in order to obtain good data on sequencing platforms developed by the current market leader Illumina. Different approaches for DNA quantification are currently available, and the most commonly used are based on analysis of the physical properties of the DNA through spectrophotometric or fluorometric methods. Although these methods are technically simple, they do not allow exact quantification as can be achieved using a real-time quantitative PCR (qPCR) approach. A qPCR protocol for DNA quantification with applications in NGS library preparation studies is presented here. This can be applied in various fields of study, such as the investigation of medical disorders resulting from nutritional programming disturbances.
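
    A brief sketch of the calculation that typically follows such a qPCR run: Ct values of a serial dilution of a standard are regressed against log10 concentration, and the fitted line is inverted to obtain the concentration of the unknown library. The Ct values and concentrations below are hypothetical, not from the protocol.

    ```python
    import numpy as np

    # Standard-curve quantification: Ct is approximately linear in
    # log10(concentration), so a line fitted to the standards can be inverted
    # for an unknown library. All numbers are hypothetical.
    std_conc_pM = np.array([20.0, 2.0, 0.2, 0.02])     # serial dilution standards
    std_ct      = np.array([12.1, 15.5, 18.9, 22.3])   # measured Ct values

    slope, intercept = np.polyfit(np.log10(std_conc_pM), std_ct, 1)
    efficiency = 10 ** (-1.0 / slope) - 1.0            # ~1.0 means 100 % efficiency

    ct_unknown = 16.8
    conc_unknown = 10 ** ((ct_unknown - intercept) / slope)
    print(f"efficiency ≈ {efficiency:.2f}, library ≈ {conc_unknown:.2f} pM")
    ```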

  11. Easy, Fast, and Reproducible Quantification of Cholesterol and Other Lipids in Human Plasma by Combined High Resolution MSX and FTMS Analysis

    NASA Astrophysics Data System (ADS)

    Gallego, Sandra F.; Højlund, Kurt; Ejsing, Christer S.

    2018-01-01

    Reliable, cost-effective, and gold-standard absolute quantification of non-esterified cholesterol in human plasma is of paramount importance in clinical lipidomics and for the monitoring of metabolic health. Here, we compared the performance of three mass spectrometric approaches available for direct detection and quantification of cholesterol in extracts of human plasma. These approaches are high resolution full scan Fourier transform mass spectrometry (FTMS) analysis, parallel reaction monitoring (PRM), and novel multiplexed MS/MS (MSX) technology, where fragments from selected precursor ions are detected simultaneously. Evaluating the performance of these approaches in terms of dynamic quantification range, linearity, and analytical precision showed that the MSX-based approach is superior to the FTMS- and PRM-based approaches. To further show the efficacy of this approach, we devised a simple routine for extensive plasma lipidome characterization using only 8 μL of plasma, using a new commercially available ready-to-spike-in mixture with 14 synthetic lipid standards, and executing a single 6 min sample injection with combined MSX analysis for cholesterol quantification and FTMS analysis for quantification of sterol esters, glycerolipids, glycerophospholipids, and sphingolipids. Using this simple routine afforded reproducible and absolute quantification of 200 lipid species encompassing 13 lipid classes in human plasma samples. Notably, the analysis time of this procedure can be shortened for high throughput-oriented clinical lipidomics studies or extended with more advanced MSALL technology (Almeida R. et al., J. Am. Soc. Mass Spectrom. 26, 133-148 [1]) to support in-depth structural elucidation of lipid molecules.

  12. Easy, Fast, and Reproducible Quantification of Cholesterol and Other Lipids in Human Plasma by Combined High Resolution MSX and FTMS Analysis.

    PubMed

    Gallego, Sandra F; Højlund, Kurt; Ejsing, Christer S

    2018-01-01

    Reliable, cost-effective, and gold-standard absolute quantification of non-esterified cholesterol in human plasma is of paramount importance in clinical lipidomics and for the monitoring of metabolic health. Here, we compared the performance of three mass spectrometric approaches available for direct detection and quantification of cholesterol in extracts of human plasma. These approaches are high resolution full scan Fourier transform mass spectrometry (FTMS) analysis, parallel reaction monitoring (PRM), and novel multiplexed MS/MS (MSX) technology, where fragments from selected precursor ions are detected simultaneously. Evaluating the performance of these approaches in terms of dynamic quantification range, linearity, and analytical precision showed that the MSX-based approach is superior to the FTMS- and PRM-based approaches. To further show the efficacy of this approach, we devised a simple routine for extensive plasma lipidome characterization using only 8 μL of plasma, using a new commercially available ready-to-spike-in mixture with 14 synthetic lipid standards, and executing a single 6 min sample injection with combined MSX analysis for cholesterol quantification and FTMS analysis for quantification of sterol esters, glycerolipids, glycerophospholipids, and sphingolipids. Using this simple routine afforded reproducible and absolute quantification of 200 lipid species encompassing 13 lipid classes in human plasma samples. Notably, the analysis time of this procedure can be shortened for high throughput-oriented clinical lipidomics studies or extended with more advanced MSALL technology (Almeida R. et al., J. Am. Soc. Mass Spectrom. 26, 133-148 [1]) to support in-depth structural elucidation of lipid molecules.

  13. Fully automated system for the quantification of human osteoarthritic knee joint effusion volume using magnetic resonance imaging.

    PubMed

    Li, Wei; Abram, François; Pelletier, Jean-Pierre; Raynauld, Jean-Pierre; Dorais, Marc; d'Anjou, Marc-André; Martel-Pelletier, Johanne

    2010-01-01

    Joint effusion is frequently associated with osteoarthritis (OA) flare-up and is an important marker of therapeutic response. This study aimed at developing and validating a fully automated system based on magnetic resonance imaging (MRI) for the quantification of joint effusion volume in knee OA patients. MRI examinations consisted of two axial sequences: a T2-weighted true fast imaging with steady-state precession sequence and a T1-weighted gradient echo sequence. An automated joint effusion volume quantification system using MRI was developed and validated (a) with calibrated phantoms (cylinder and sphere) and effusion from knee OA patients; (b) with assessment by manual quantification; and (c) by direct aspiration. Twenty-five knee OA patients with joint effusion were included in the study. The automated joint effusion volume quantification was developed as a four-stage sequential process: bone segmentation, filtering of unrelated structures, segmentation of joint effusion, and subvoxel volume calculation. Validation experiments revealed excellent coefficients of variation with the calibrated cylinder (1.4%) and sphere (0.8%) phantoms. Comparison of the OA knee joint effusion volume assessed by the developed automated system and by manual quantification was also excellent (r = 0.98; P < 0.0001), as was the comparison with direct aspiration (r = 0.88; P = 0.0008). The newly developed fully automated MRI-based system provided precise quantification of OA knee joint effusion volume with excellent correlation with data from phantoms, a manual system, and joint aspiration. Such an automated system will be instrumental in improving the reproducibility/reliability of the evaluation of this marker in clinical application.

  14. Automated Segmentability Index for Layer Segmentation of Macular SD-OCT Images.

    PubMed

    Lee, Kyungmoo; Buitendijk, Gabriëlle H S; Bogunovic, Hrvoje; Springelkamp, Henriët; Hofman, Albert; Wahle, Andreas; Sonka, Milan; Vingerling, Johannes R; Klaver, Caroline C W; Abràmoff, Michael D

    2016-03-01

    To automatically identify which spectral-domain optical coherence tomography (SD-OCT) scans will provide reliable automated layer segmentations for more accurate layer thickness analyses in population studies. Six hundred ninety macular SD-OCT image volumes (6.0 × 6.0 × 2.3 mm³) were obtained from one eye of each of 690 subjects (74.6 ± 9.7 [mean ± SD] years, 37.8% male) randomly selected from the population-based Rotterdam Study. The dataset consisted of 420 OCT volumes with successful automated retinal nerve fiber layer (RNFL) segmentations obtained from our previously reported graph-based segmentation method and 270 volumes with failed segmentations. To evaluate the reliability of the layer segmentations, we have developed a new metric, the segmentability index SI, which is obtained from a random forest regressor based on 12 features using OCT voxel intensities, edge-based costs, and on-surface costs. The SI was compared with well-known quality indices, the quality index (QI) and the maximum tissue contrast index (mTCI), using receiver operating characteristic (ROC) analysis. The 95% confidence interval (CI) of the area under the curve (AUC) for the QI is 0.621 to 0.805 with AUC 0.713, for the mTCI 0.673 to 0.838 with AUC 0.756, and for the SI 0.784 to 0.920 with AUC 0.852. The SI AUC is significantly larger than either the QI or mTCI AUC (P < 0.01). The segmentability index SI is well suited to identify SD-OCT scans for which successful automated intraretinal layer segmentations can be expected. Interpreting the quantification of SD-OCT images requires the underlying segmentation to be reliable, but standard SD-OCT quality metrics do not predict which segmentations are reliable and which are not. The segmentability index SI presented in this study does allow reliable segmentations to be identified, which is important for more accurate layer thickness analyses in research and population studies.
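
    As a generic illustration of this kind of segmentability scoring (synthetic features and labels, not the study's 12 features or data), a random-forest regressor can map per-volume image features to a continuous score that is then evaluated with ROC analysis against a binary segmentation-success label.

    ```python
    import numpy as np
    from sklearn.ensemble import RandomForestRegressor
    from sklearn.metrics import roc_auc_score
    from sklearn.model_selection import train_test_split

    # Synthetic stand-in data: 690 volumes, 12 hypothetical image features,
    # and a binary "segmentation succeeded" label loosely tied to the features.
    rng = np.random.default_rng(0)
    X = rng.normal(size=(690, 12))
    y = (X[:, 0] + 0.5 * X[:, 1] + rng.normal(scale=0.5, size=690) > 0).astype(float)

    X_tr, X_te, y_tr, y_te = train_test_split(X, y, test_size=0.3, random_state=0)

    model = RandomForestRegressor(n_estimators=200, random_state=0)
    model.fit(X_tr, y_tr)
    score = model.predict(X_te)                 # continuous segmentability score

    print(f"AUC = {roc_auc_score(y_te, score):.3f}")
    ```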

  15. Establishment and evaluation of a bead-based luminex assay allowing simultaneous quantification of equine IL-12 and IFN-γ.

    PubMed

    Duran, Maria Carolina; Willenbrock, Saskia; Müller, Jessika-M V; Nolte, Ingo; Feige, Karsten; Murua Escobar, Hugo

    2013-04-01

    Interleukin-12 (IL-12) and interferon gamma (IFN-γ) are key cytokines in immune-mediated equine melanoma therapy. Currently, a method for accurate simultaneous quantification of these equine cytokines is lacking. Therefore, we sought to establish an assay that allows for accurate and simultaneous quantification of equine IL-12 (eIL-12) and IFN-γ (eIFN-γ). Several antibodies were evaluated for cross-reactivity to eIL-12 and eIFN-γ and were used to establish a bead-based Luminex assay, which was subsequently applied to quantify cytokine concentrations in biological samples. Cytokine detection ranged from 31.5 to 5,000 pg/ml for eIL-12 and from 15 to 10,000 pg/ml for eIFN-γ. eIL-12 was detected in supernatants of stimulated peripheral blood mononuclear cells (PBMCs) and supernatants/cell lysates of eIL-12 expression plasmid-transfected cells. Low or undetectable cytokine concentrations were measured in negative controls. In equine serum samples, the mean measured eIL-12 concentration was 1,374 ± 8 pg/ml. The bead-based assay and an eIFN-γ ELISA used to measure eIFN-γ concentrations showed similar results. Results demonstrate, to our knowledge for the first time, that cross-reactive antibody pairs to eIL-12 and eIFN-γ and Luminex bead-based technology allow for accurate, simultaneous and multiplexed quantification of these key cytokines in biological samples.

  16. [Quantitative PCR in the diagnosis of Leishmania].

    PubMed

    Mortarino, M; Franceschi, A; Mancianti, F; Bazzocchi, C; Genchi, C; Bandi, C

    2004-06-01

    Polymerase chain reaction (PCR) is a sensitive and rapid method for the diagnosis of canine Leishmania infection and can be performed on a variety of biological samples, including peripheral blood, lymph node, bone marrow and skin. Standard PCR requires electrophoretic analysis of the amplification products and is usually not suitable for quantification of the template DNA (unless competitor-based or other methods are developed), making it of reduced usefulness when accurate monitoring of target DNA is required. Quantitative real-time PCR allows the continuous monitoring of the accumulation of PCR products during the amplification reaction. This allows the identification of the cycle of near-logarithmic PCR product generation (threshold cycle) and, by inference, the relative quantification of the template DNA present at the start of the reaction. Since the amplification products are monitored in "real time" as they form, cycle by cycle, no post-amplification handling is required. Absolute quantification is performed according either to an internal standard co-amplified with the sample DNA, or to an external standard curve obtained by parallel amplification of serial known concentrations of a reference DNA sequence. From the quantification of the template DNA, an estimate of the relative parasite load in the different samples can be obtained. The advantages over standard and semi-quantitative PCR techniques are reduced assay time, lower contamination risk and improved sensitivity. As for standard PCR, the minimal components of the quantitative PCR reaction mixture are the DNA target of the amplification, an oligonucleotide primer pair flanking the target sequence, a suitable DNA polymerase, deoxynucleotides, buffer and salts. Different technologies have been developed for the monitoring of amplification products, generally based on the use of fluorescent probes. For instance, SYBR Green technology is a non-specific detection system based on a fluorescent dsDNA intercalator and is applicable to all potential targets. TaqMan technology is more specific, since it directly assesses the amount of amplified DNA using a fluorescent probe specific for the target sequence flanked by the primer pair. This probe is an oligonucleotide labelled with a reporter dye (fluorescent) and a quencher (which absorbs the fluorescent signal generated by the reporter). The thermal protocol of amplification allows the binding of the fluorescent probe to the target sequence before the binding of the primers and the start of polymerization by Taq polymerase. During polymerization, the 5'-3' exonuclease activity of Taq polymerase digests the probe, releasing the reporter dye so that a fluorescent signal is detected. The signal intensity accumulates at the end of each cycle and is related to the amount of amplification product. In recent years, quantitative PCR methods based either on SYBR Green or TaqMan technology have been set up for the quantification of Leishmania in mouse liver, mouse skin and human peripheral blood, targeting either single-copy chromosomal or multi-copy minicircle sequences with high sensitivity and reproducibility. In particular, real-time PCR appears to be a reliable, rapid and noninvasive method for the diagnosis and follow-up of visceral leishmaniasis in humans. At present, the application of real-time PCR to research and clinical diagnosis of Leishmania infection in dogs is still on the horizon. As for standard PCR, the high sensitivity of real-time PCR could allow the use of blood sampling, which is less invasive and easily performed, for monitoring the status of dogs. The development of a real-time PCR assay for Leishmania infantum infection in dogs could complement the standard and optimized serological and PCR methods currently in use for the diagnosis and follow-up of canine leishmaniasis, and perhaps allow prediction of recurrences associated with tissue loads of residual pathogens after treatment. In this regard, a TaqMan real-time PCR method developed for the quantification of Leishmania infantum minicircle DNA in peripheral blood of naturally infected dogs, sampled before and at different time points after the beginning of a standard antileishmanial therapy, will be illustrated.
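
    For orientation, the threshold-cycle arithmetic behind absolute quantification against an external standard curve can be sketched in a few lines of Python. This is a generic illustration, not the assay described above; the dilution series, Ct values and the sample Ct are invented for the example.

        import numpy as np

        # Illustrative external standard curve: serial dilutions of a reference DNA
        # with assumed copy numbers and hypothetical measured threshold cycles (Ct).
        std_copies = np.array([1e6, 1e5, 1e4, 1e3, 1e2])       # copies per reaction
        std_ct = np.array([15.1, 18.5, 21.9, 25.3, 28.8])      # measured Ct values

        # Linear fit of Ct against log10(copies): Ct = slope * log10(N) + intercept
        slope, intercept = np.polyfit(np.log10(std_copies), std_ct, 1)

        # Amplification efficiency implied by the slope (100% corresponds to ~ -3.32)
        efficiency = 10 ** (-1.0 / slope) - 1.0

        def copies_from_ct(ct):
            """Invert the standard curve to estimate starting template copies."""
            return 10 ** ((ct - intercept) / slope)

        unknown_ct = 23.4   # hypothetical Ct for a sample of unknown parasite load
        print(f"efficiency = {efficiency:.1%}, estimated copies = {copies_from_ct(unknown_ct):.3g}")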

  17. Quantification of Global DNA Methylation Levels by Mass Spectrometry.

    PubMed

    Fernandez, Agustin F; Valledor, Luis; Vallejo, Fernando; Cañal, Maria Jesús; Fraga, Mario F

    2018-01-01

    Global DNA methylation was classically considered the relative percentage of 5-methylcytosine (5mC) with respect to total cytosine (C). Early approaches were based on the use of high-performance separation technologies and UV detection. However, the recent development of protocols using mass spectrometry for detection has increased sensitivity and permitted the precise identification of peak compounds based on their molecular masses. This allows work to be conducted with much less genomic DNA starting material and also permits quantification of 5-hydroxymethylcytosine (5hmC), a recently identified form of methylated cytosine that could play an important role in active DNA demethylation. Here, we describe the protocol that we currently use in our laboratory to analyze 5mC and 5hmC by mass spectrometry. The protocol, which is based on the method originally developed by Le and colleagues using Ultra Performance Liquid Chromatography (UPLC) and mass spectrometry (triple quadrupole (QqQ)) detection, allows for the rapid and accurate quantification of relative global 5mC and 5hmC levels starting from just 1 μg of genomic DNA.
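
    As a minimal sketch of the final arithmetic, the relative global levels reduce to ratios of the calibrated responses for C, 5mC and 5hmC. The peak responses below are invented; in practice each response would first be converted to a molar amount via its own calibration curve.

        # Hypothetical calibrated (mol-equivalent) responses from the UPLC-MS/MS run
        resp_C, resp_5mC, resp_5hmC = 1.00e7, 4.2e5, 9.0e3

        total_cytosine = resp_C + resp_5mC + resp_5hmC
        pct_5mC = 100.0 * resp_5mC / total_cytosine
        pct_5hmC = 100.0 * resp_5hmC / total_cytosine
        print(f"global 5mC = {pct_5mC:.2f}%, 5hmC = {pct_5hmC:.3f}% of total cytosine")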

  18. Development and validation of a reversed phase liquid chromatographic method for analysis of oxytetracycline and related impurities.

    PubMed

    Kahsay, Getu; Shraim, Fairouz; Villatte, Philippe; Rotger, Jacques; Cassus-Coussère, Céline; Van Schepdael, Ann; Hoogmartens, Jos; Adams, Erwin

    2013-03-05

    A simple, robust and fast high-performance liquid chromatographic method is described for the analysis of oxytetracycline and its related impurities. The principal peak and impurities are all baseline separated in 20 min using an Inertsil C₈ (150 mm × 4.6 mm, 5 μm) column kept at 50 °C. The mobile phase consists of a gradient mixture of mobile phases A (0.05% trifluoroacetic acid in water) and B (acetonitrile-methanol-tetrahydrofuran, 80:15:5, v/v/v) pumped at a flow rate of 1.3 ml/min. UV detection was performed at 254 nm. The developed method was validated for its robustness, sensitivity, precision and linearity in the range from limit of quantification (LOQ) to 120%. The limits of detection (LOD) and LOQ were found to be 0.08 μg/ml and 0.32 μg/ml, respectively. This method allows the separation of oxytetracycline from all known and 5 unknown impurities, which is better than previously reported in the literature. Moreover, the simple mobile phase composition devoid of non-volatile buffers made the method suitable to interface with mass spectrometry for further characterization of unknown impurities. The developed method has been applied for determination of related substances in oxytetracycline bulk samples available from four manufacturers. The validation results demonstrate that the method is reliable for quantification of oxytetracycline and its impurities. Copyright © 2012 Elsevier B.V. All rights reserved.
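
    For orientation, LOD and LOQ figures of this kind are often derived from a low-level calibration line using the ICH-style rule LOD = 3.3*sigma/S and LOQ = 10*sigma/S, with sigma the residual standard deviation and S the slope; whether this or a signal-to-noise criterion was used here is not stated, and the calibration data below are invented.

        import numpy as np

        # Hypothetical low-level calibration line for oxytetracycline (ug/mL vs. peak area)
        conc = np.array([0.1, 0.2, 0.5, 1.0, 2.0])
        area = np.array([12.0, 24.5, 61.0, 123.0, 247.0])

        slope, intercept = np.polyfit(conc, area, 1)
        residuals = area - (slope * conc + intercept)
        sigma = residuals.std(ddof=2)          # residual standard deviation of the fit

        lod = 3.3 * sigma / slope
        loq = 10.0 * sigma / slope
        print(f"LOD ~ {lod:.3f} ug/mL, LOQ ~ {loq:.3f} ug/mL")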

  19. Comparison of alternative approaches for analysing multi-level RNA-seq data

    PubMed Central

    Mohorianu, Irina; Bretman, Amanda; Smith, Damian T.; Fowler, Emily K.; Dalmay, Tamas

    2017-01-01

    RNA sequencing (RNA-seq) is widely used for RNA quantification in the environmental, biological and medical sciences. It enables the description of genome-wide patterns of expression and the identification of regulatory interactions and networks. The aim of RNA-seq data analyses is to achieve rigorous quantification of genes/transcripts to allow a reliable prediction of differential expression (DE), despite variation in levels of noise and inherent biases in sequencing data. This can be especially challenging for datasets in which gene expression differences are subtle, as in the behavioural transcriptomics test dataset from D. melanogaster that we used here. We investigated the power of existing approaches for quality checking mRNA-seq data and explored additional, quantitative quality checks. To accommodate nested, multi-level experimental designs, we incorporated sample layout into our analyses. We employed a subsampling without replacement-based normalization and an identification of DE that accounted for the hierarchy and amplitude of effect sizes within samples, then evaluated the resulting differential expression calls in comparison to existing approaches. In a final step to test for broader applicability, we applied our approaches to a published set of H. sapiens mRNA-seq samples. The dataset-tailored methods improved sample comparability and delivered a robust prediction of subtle gene expression changes. The proposed approaches have the potential to improve key steps in the analysis of RNA-seq data by incorporating the structure and characteristics of biological experiments. PMID:28792517

  20. Hyperplex-MRM: a hybrid multiple reaction monitoring method using mTRAQ/iTRAQ labeling for multiplex absolute quantification of human colorectal cancer biomarker.

    PubMed

    Yin, Hong-Rui; Zhang, Lei; Xie, Li-Qi; Huang, Li-Yong; Xu, Ye; Cai, San-Jun; Yang, Peng-Yuan; Lu, Hao-Jie

    2013-09-06

    Novel biomarker verification assays are urgently required to improve the efficiency of biomarker development. Benefitting from lower development costs, multiple reaction monitoring (MRM) has been used for biomarker verification as an alternative to immunoassay. However, in general MRM analysis, only one sample can be quantified in a single experiment, which restricts its application. Here, a Hyperplex-MRM quantification approach, which combined mTRAQ for absolute quantification and iTRAQ for relative quantification, was developed to increase the throughput of biomarker verification. In this strategy, equal amounts of internal standard peptides were labeled with mTRAQ reagents Δ0 and Δ8, respectively, as double references, while 4-plex iTRAQ reagents were used to label four different samples as an alternative to mTRAQ Δ4. From the MRM trace and MS/MS spectrum, total amounts and relative ratios of target proteins/peptides of four samples could be acquired simultaneously. Accordingly, absolute amounts of target proteins/peptides in four different samples could be achieved in a single run. In addition, double references were used to increase the reliability of the quantification results. Using this approach, three biomarker candidates, adenosylhomocysteinase (AHCY), cathepsin D (CTSD), and lysozyme C (LYZ), were successfully quantified in colorectal cancer (CRC) tissue specimens of different stages with high accuracy, sensitivity, and reproducibility. To summarize, we demonstrated a promising quantification method for high-throughput verification of biomarker candidates.
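
    The bookkeeping of the hybrid scheme can be sketched as follows, under the simplifying assumption that the MRM trace referenced to the double mTRAQ standards gives a total peptide amount for the 4-plex mixture and that the iTRAQ reporter-ion intensities give the relative shares of the four samples; all values are invented.

        # Hypothetical total amount for one target peptide, from MRM vs. mTRAQ references
        total_fmol = 480.0
        # Hypothetical iTRAQ reporter-ion intensities (relative) for the four samples
        itraq_reporters = {"114": 1.0, "115": 1.8, "116": 0.6, "117": 1.2}

        share_sum = sum(itraq_reporters.values())
        absolute = {tag: total_fmol * v / share_sum for tag, v in itraq_reporters.items()}
        for tag, fmol in absolute.items():
            print(f"sample {tag}: {fmol:.1f} fmol")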

  1. Model Improvement by Assimilating Observations of Storm-Induced Coastal Change

    NASA Astrophysics Data System (ADS)

    Long, J. W.; Plant, N. G.; Sopkin, K.

    2010-12-01

    Discrete, large scale, meteorological events such as hurricanes can cause wide-spread destruction of coastal islands, habitats, and infrastructure. The effects can vary significantly along the coast depending on the configuration of the coastline, variable dune elevations, changes in geomorphology (sandy beach vs. marshland), and alongshore variations in storm hydrodynamic forcing. There are two primary methods of determining the changing state of a coastal system. Process-based numerical models provide highly resolved (in space and time) representations of the dominant dynamics in a physical system but must employ certain parameterizations due to computational limitations. The predictive capability may also suffer from the lack of reliable initial or boundary conditions. On the other hand, observations of coastal topography before and after the storm allow the direct quantification of cumulative storm impacts. Unfortunately these measurements suffer from instrument noise and a lack of necessary temporal resolution. This research focuses on the combination of these two pieces of information to make more reliable forecasts of storm-induced coastal change. Of primary importance is the development of a data assimilation strategy that is efficient, applicable for use with highly nonlinear models, and able to quantify the remaining forecast uncertainty based on the reliability of each individual piece of information used in the assimilation process. We concentrate on an event time-scale and estimate/update unobserved model information (boundary conditions, free parameters, etc.) by assimilating direct observations of coastal change with those simulated by the model. The data assimilation can help estimate spatially varying quantities (e.g. friction coefficients) that are often modeled as homogeneous and identify processes inadequately characterized in the model.
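
    The weighting of model forecast against observation that underlies any such assimilation can be illustrated with a scalar, Kalman-style update; this is a generic sketch, not the authors' scheme, and all numbers are invented.

        # Blend a modeled elevation change with an observed change, weighting each by
        # its error variance; the posterior variance shrinks accordingly.
        def assimilate(model_value, model_var, obs_value, obs_var):
            gain = model_var / (model_var + obs_var)      # weight given to the observation
            analysis = model_value + gain * (obs_value - model_value)
            analysis_var = (1.0 - gain) * model_var
            return analysis, analysis_var

        dz_model, var_model = -1.8, 0.50    # predicted dune-elevation change (m) and its variance
        dz_obs, var_obs = -1.2, 0.10        # lidar-observed change (m) and its variance
        dz_post, var_post = assimilate(dz_model, var_model, dz_obs, var_obs)
        print(f"updated change = {dz_post:.2f} m, variance = {var_post:.3f} m^2")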

  2. Measuring Food Brand Awareness in Australian Children: Development and Validation of a New Instrument.

    PubMed

    Turner, Laura; Kelly, Bridget; Boyland, Emma; Bauman, Adrian E

    2015-01-01

    Children's exposure to food marketing is one environmental determinant of childhood obesity. Measuring the extent to which children are aware of food brands may be one way to estimate relative prior exposures to food marketing. This study aimed to develop and validate an Australian Brand Awareness Instrument (ABAI) to estimate children's food brand awareness. The ABAI incorporated 30 flashcards depicting food/drink logos and their corresponding products. An abbreviated version was also created using 12 flashcards (ABAI-a). The ABAI was presented to 60 primary school aged children (7-11 yrs) attending two Australian after-school centres. A week later, the full-version was repeated on approximately half the sample (n=27) and the abbreviated-version was presented to the remaining half (n=30). The test-retest reliability of the ABAI was analysed using Intra-class correlation coefficients. The concordance of the ABAI-a and full-version was assessed using Bland-Altman plots. The 'nomological' validity of the full tool was investigated by comparing children's brand awareness with food marketing-related variables (e.g. television habits, intake of heavily promoted foods). Brand awareness increased with age (p<0.01) but was not significantly correlated with other variables. Bland-Altman analyses showed good agreement between the ABAI and ABAI-a. Reliability analyses revealed excellent agreement between the two administrations of the full-ABAI. The ABAI was able to differentiate children's varying levels of brand awareness. It was shown to be a valid and reliable tool and may allow quantification of brand awareness as a proxy measure for children's prior food marketing exposure.
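
    A minimal sketch of the Bland-Altman agreement check used to compare the full and abbreviated instruments is shown below; the paired scores are invented, not taken from the study.

        import numpy as np

        full = np.array([18, 22, 25, 12, 27, 20, 15, 24])   # hypothetical full-ABAI scores
        abbr = np.array([17, 23, 24, 13, 26, 21, 14, 25])   # matching (rescaled) ABAI-a scores

        diff = full - abbr
        bias = diff.mean()
        loa = 1.96 * diff.std(ddof=1)                        # 95% limits-of-agreement half-width
        print(f"bias = {bias:.2f}, limits of agreement = [{bias - loa:.2f}, {bias + loa:.2f}]")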

  3. qFibrosis: A fully-quantitative innovative method incorporating histological features to facilitate accurate fibrosis scoring in animal model and chronic hepatitis B patients

    PubMed Central

    Tai, Dean C.S.; Wang, Shi; Cheng, Chee Leong; Peng, Qiwen; Yan, Jie; Chen, Yongpeng; Sun, Jian; Liang, Xieer; Zhu, Youfu; Rajapakse, Jagath C.; Welsch, Roy E.; So, Peter T.C.; Wee, Aileen; Hou, Jinlin; Yu, Hanry

    2014-01-01

    Background & Aims There is increasing need for accurate assessment of liver fibrosis/cirrhosis. We aimed to develop qFibrosis, a fully-automated assessment method combining quantification of histopathological architectural features, to address unmet needs in core biopsy evaluation of fibrosis in chronic hepatitis B (CHB) patients. Methods qFibrosis was established as a combined index based on 87 parameters of architectural features. Images acquired from 25 Thioacetamide-treated rat samples and 162 CHB core biopsies were used to train and test qFibrosis and to demonstrate its reproducibility. qFibrosis scoring was analyzed employing Metavir and Ishak fibrosis staging as standard references, and collagen proportionate area (CPA) measurement for comparison. Results qFibrosis faithfully and reliably recapitulates Metavir fibrosis scores, as it can identify differences between all stages in both animal samples (p <0.001) and human biopsies (p <0.05). It is robust to sampling size, allowing for discrimination of different stages in samples of different sizes (area under the curve (AUC): 0.93–0.99 for animal samples: 1–16 mm2; AUC: 0.84–0.97 for biopsies: 10–44 mm in length). qFibrosis can significantly predict staging underestimation in suboptimal biopsies (<15 mm) and under- and over-scoring by different pathologists (p <0.001). qFibrosis can also differentiate between Ishak stages 5 and 6 (AUC: 0.73, p = 0.008), suggesting the possibility of monitoring intra-stage cirrhosis changes. Best of all, qFibrosis demonstrates superior performance to CPA on all counts. Conclusions qFibrosis can improve fibrosis scoring accuracy and throughput, thus allowing for reproducible and reliable analysis of efficacies of anti-fibrotic therapies in clinical research and practice. PMID:24583249

  4. Selective Distance-Based K+ Quantification on Paper-Based Microfluidics.

    PubMed

    Gerold, Chase T; Bakker, Eric; Henry, Charles S

    2018-04-03

    In this study, paper-based microfluidic devices (μPADs) capable of K+ quantification in aqueous samples, as well as in human serum, using both colorimetric and distance-based methods are described. A lipophilic phase containing potassium ionophore I (valinomycin) was utilized to achieve highly selective quantification of K+ in the presence of Na+, Li+, and Mg2+ ions. Successful addition of a suspended lipophilic phase to a wax-printed paper-based device is described and offers a solution to current approaches that rely on organic solvents, which damage wax barriers. The approach provides an avenue for future alkali/alkaline quantification utilizing μPADs. Colorimetric spot tests allowed for K+ quantification from 0.1-5.0 mM using only 3.00 μL of sample solution. Selective distance-based quantification required small sample volumes (6.00 μL) and gave responses sensitive enough to distinguish between 1.0 and 2.5 mM of sample K+. μPADs using distance-based methods were also capable of differentiating between 4.3 and 6.9 mM K+ in human serum samples. Distance-based methods required no digital analysis, electronic hardware, or pumps; any steps required for quantification could be carried out using the naked eye.

  5. Determination of patulin in apple and derived products by UHPLC-MS/MS. Study of matrix effects with atmospheric pressure ionisation sources.

    PubMed

    Beltrán, Eduardo; Ibáñez, María; Sancho, Juan Vicente; Hernández, Félix

    2014-01-01

    Sensitive and reliable analytical methodology has been developed for the measurement of patulin in regulated foodstuffs by using ultra-high-performance liquid chromatography coupled to tandem mass spectrometry (UHPLC-MS/MS) with a triple quadrupole analyser. Solid samples were extracted with ethyl acetate, while liquid samples were directly injected into the chromatographic system after dilution and filtration without any clean-up step. Chromatographic separation was achieved in less than 4 min. Electrospray (ESI) and atmospheric pressure chemical ionisation (APCI) sources were evaluated in order to assess matrix effects. The use of the ESI source caused strong signal suppression in samples; however, matrix effect was negligible using APCI, allowing quantification with calibration standards prepared in solvent. The method was validated in four different apple matrices (juice, fruit, puree and compote) at two concentrations at the low μg kg(-1) level. Average recoveries (n=5) ranged from 71% to 108%, with RSDs lower than 14%. Copyright © 2013 Elsevier Ltd. All rights reserved.

  6. Advanced magnetic resonance imaging methods for planning and monitoring radiation therapy in patients with high-grade glioma.

    PubMed

    Lupo, Janine M; Nelson, Sarah J

    2014-10-01

    This review explores how the integration of advanced imaging methods with high-quality anatomical images significantly improves the characterization, target definition, assessment of response to therapy, and overall management of patients with high-grade glioma. Metrics derived from diffusion-, perfusion-, and susceptibility-weighted magnetic resonance imaging, in conjunction with magnetic resonance spectroscopic imaging, allow us to characterize regions of edema, hypoxia, increased cellularity, and necrosis within heterogeneous tumor and surrounding brain tissue. Quantification of such measures may provide a more reliable initial representation of tumor delineation and response to therapy than changes in the contrast-enhancing or T2 lesion alone and have a significant effect on targeting resection, planning radiation, and assessing treatment effectiveness. In the long term, implementation of these imaging methodologies can also aid in the identification of recurrent tumor and its differentiation from treatment-related confounds and facilitate the detection of radiation-induced vascular injury in otherwise normal-appearing brain tissue.

  7. AutoTag and AutoSnap: Standardized, semi-automatic capture of regions of interest from whole slide images

    PubMed Central

    Marien, Koen M.; Andries, Luc; De Schepper, Stefanie; Kockx, Mark M.; De Meyer, Guido R.Y.

    2015-01-01

    Tumor angiogenesis is measured by counting microvessels in tissue sections at high power magnification as a potential prognostic or predictive biomarker. Until now, regions of interest (ROIs) were selected by manual operations within a tumor by using a systematic uniform random sampling (SURS) approach. Although SURS is the most reliable sampling method, it implies a high workload. However, SURS can be semi-automated and in this way contribute to the development of a validated quantification method for microvessel counting in the clinical setting. Here, we report a method to use semi-automated SURS for microvessel counting: • Whole slide imaging with Pannoramic SCAN (3DHISTECH) • Computer-assisted sampling in Pannoramic Viewer (3DHISTECH) extended by two self-written AutoHotkey applications (AutoTag and AutoSnap) • The use of digital grids in Photoshop® and Bridge® (Adobe Systems) This rapid procedure allows traceability essential for high throughput protein analysis of immunohistochemically stained tissue. PMID:26150998
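
    The systematic uniform random sampling step that the tools semi-automate can be sketched generically: one random offset fixes an otherwise regular grid of ROI positions. The slide dimensions and grid spacing below are invented, and tissue masking is omitted.

        import random

        slide_w, slide_h = 60000, 40000      # hypothetical slide size (pixels at scan resolution)
        step = 5000                          # grid spacing between ROI centres

        x0 = random.uniform(0, step)         # a single random offset keeps the grid unbiased
        y0 = random.uniform(0, step)

        rois = []
        y = y0
        while y < slide_h:
            x = x0
            while x < slide_w:
                rois.append((int(x), int(y)))
                x += step
            y += step

        print(f"{len(rois)} ROIs selected, first few: {rois[:3]}")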

  8. Determination of BPA, BPB, BPF, BADGE and BFDGE in canned energy drinks by molecularly imprinted polymer cleaning up and UPLC with fluorescence detection.

    PubMed

    Gallo, Pasquale; Di Marco Pisciottano, Ilaria; Esposito, Francesco; Fasano, Evelina; Scognamiglio, Gelsomina; Mita, Gustavo Damiano; Cirillo, Teresa

    2017-04-01

    A new method for simultaneous determination of five bisphenols in canned energy drinks by UPLC with fluorescence detection, after clean up on molecularly imprinted polymers, is herein described. The method was validated at two concentration levels, calculating trueness, repeatability and within-laboratory reproducibility, specificity, linearity of detector response, and the limits of quantification and detection for each bisphenol. The method is specific, reliable and very sensitive, allowing for determination of bisphenol F diglycidyl ether (BFDGE), bisphenol A (BPA), bisphenol B (BPB), bisphenol F (BPF) and bisphenol A diglycidyl ether (BADGE) down to 0.50 ng/mL; it was employed to determine contamination levels of these bisphenols in forty energy drinks of different brands collected from the market in Naples. BPA was detected in 17 out of 40 samples (42.5%); BPF, BADGE and BFDGE were also detected in some energy drinks. Copyright © 2016 Elsevier Ltd. All rights reserved.

  9. Distinguishing cause from correlation in tokamak experiments to trigger edge-localised plasma instabilities

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Webster, Anthony J.; CCFE, Culham Science Centre, Abingdon OX14 3DB

    2014-11-15

    The generic question is considered: How can we determine the probability of an otherwise quasi-random event, having been triggered by an external influence? A specific problem is the quantification of the success of techniques to trigger, and hence control, edge-localised plasma instabilities (ELMs) in magnetically confined fusion (MCF) experiments. The development of such techniques is essential to ensure tolerable heat loads on components in large MCF fusion devices, and is necessary for their development into economically successful power plants. Bayesian probability theory is used to rigorously formulate the problem and to provide a formal solution. Accurate but pragmatic methods are developed to estimate triggering probabilities, and are illustrated with experimental data. These allow results from experiments to be quantitatively assessed, and rigorously quantified conclusions to be formed. Example applications include assessing whether triggering of ELMs is a statistical or deterministic process, and the establishment of thresholds to ensure that ELMs are reliably triggered.
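
    The question can be made concrete with a deliberately simplified sketch (not the paper's Bayesian formulation): given an intrinsic quasi-random ELM rate, how many coincidences with a trigger would be expected by chance, and what excess fraction remains attributable to triggering? All numbers are invented.

        import math

        elm_rate = 40.0        # intrinsic ELM rate (1/s), assumed
        window = 0.002         # coincidence window after each trigger attempt (s), assumed
        n_triggers = 200       # trigger attempts
        n_coincident = 150     # ELMs observed inside the window

        p_chance = 1.0 - math.exp(-elm_rate * window)     # chance coincidence per attempt
        expected_by_chance = n_triggers * p_chance

        # Simple excess-fraction estimate of the per-attempt triggering probability
        p_triggered = max(0.0, (n_coincident / n_triggers - p_chance) / (1.0 - p_chance))
        print(f"chance coincidences expected: {expected_by_chance:.1f}")
        print(f"estimated probability an attempt actually triggered an ELM: {p_triggered:.2f}")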

  10. Evidence of a rolling motion of a microparticle on a silicon wafer in a liquid environment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Schiwek, Simon; Stark, Robert W., E-mail: stark@csi.tu-darmstadt.de; Dietz, Christian, E-mail: dietz@csi.tu-darmstadt.de

    2016-05-21

    The interaction of micro- and nanometer-sized particles with surfaces plays a crucial role when small-scale structures are built in a bottom-up approach or structured surfaces are cleaned in the semiconductor industry. For a reliable quantification of the interaction between individual particles and a specific surface, however, the motion type of the particle must be known. We developed an approach to unambiguously distinguish between sliding and rolling particles. To this end, fluorescent particles were partially bleached in a confocal laser scanning microscope to tailor an optical inhomogeneity, which allowed for the identification of the characteristic motion pattern. For the manipulation, the water flow generated by a fast moving cantilever-tip of an atomic force microscope enabled the contactless pushing of the particle. We thus experimentally evidenced a rolling motion of a micrometer-sized particle directly with a fluorescence microscope. A similar approach could help to discriminate between rolling and sliding particles in liquid flows of microfluidic systems.

  11. Determination of volatile marker compounds in raw ham using headspace-trap gas chromatography.

    PubMed

    Bosse Née Danz, Ramona; Wirth, Melanie; Konstanz, Annette; Becker, Thomas; Weiss, Jochen; Gibis, Monika

    2017-03-15

    A simple, reliable and automated method was developed and optimized for qualification and quantification of aroma-relevant volatile marker compounds of North European raw ham using headspace (HS)-Trap gas chromatography-mass spectrometry (GC-MS) and GC-flame ionization detection (FID). A total of 38 volatile compounds were detected with this HS-Trap GC-MS method, amongst which the largest groups were ketones (12), alcohols (8), hydrocarbons (7), aldehydes (6) and esters (3). The HS-Trap GC-FID method was optimized for the following parameters: thermostatting time and temperature, vial and desorption pressure, number of extraction cycles and salt addition. A validation for 13 volatile marker compounds, with limits of detection in the ng/g range, was carried out. The optimized method can serve as an alternative to conventional headspace and solid-phase microextraction methods and allows users to determine volatile compounds in raw hams, making it of interest to industrial and academic meat scientists. Copyright © 2016 Elsevier Ltd. All rights reserved.

  12. Quantifying Multistate Cytoplasmic Molecular Diffusion in Bacterial Cells via Inverse Transform of Confined Displacement Distribution.

    PubMed

    Chen, Tai-Yen; Jung, Won; Santiago, Ace George; Yang, Feng; Krzemiński, Łukasz; Chen, Peng

    2015-11-12

    Single-molecule tracking (SMT) of fluorescently tagged cytoplasmic proteins can provide valuable information on the underlying biological processes in living cells via subsequent analysis of the displacement distributions; however, the confinement effect originating from the small size of a bacterial cell skews the protein's displacement distribution and complicates the quantification of the intrinsic diffusive behaviors. Using the inverse transformation method, we convert the skewed displacement distribution (for both 2D and 3D imaging conditions) back to that in free space for systems containing one or multiple (non)interconverting Brownian diffusion states, from which we can reliably extract the number of diffusion states as well as their intrinsic diffusion coefficients and respective fractional populations. We further demonstrate a successful application to experimental SMT data of a transcription factor in living E. coli cells. This work allows a direct quantitative connection between cytoplasmic SMT data and diffusion theory for analyzing molecular diffusive behavior in live bacteria.
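
    For orientation, the free-diffusion limit of such an analysis is simple: for 2D Brownian motion the frame-to-frame displacement r obeys <r^2> = 4*D*dt, so D follows directly from tracked positions. The confinement correction via the inverse transform, which is the point of the paper, is omitted here, and the data are simulated.

        import numpy as np

        rng = np.random.default_rng(0)
        dt = 0.01                 # s between frames (assumed)
        D_true = 1.5              # um^2/s, used only to simulate displacements
        steps = rng.normal(0.0, np.sqrt(2 * D_true * dt), size=(5000, 2))   # x, y displacements

        r2 = (steps ** 2).sum(axis=1)
        D_est = r2.mean() / (4 * dt)      # mean-squared-displacement estimator for free diffusion
        print(f"estimated D = {D_est:.2f} um^2/s")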

  13. A generic standard additions based method to determine endogenous analyte concentrations by immunoassays to overcome complex biological matrix interference.

    PubMed

    Pang, Susan; Cowen, Simon

    2017-12-13

    We describe a novel generic method to derive the unknown endogenous concentrations of analyte within complex biological matrices (e.g. serum or plasma) based upon the relationship between the immunoassay signal response of a biological test sample spiked with known analyte concentrations and the log-transformed estimated total concentration. If the estimated total analyte concentration is correct, a portion of the sigmoid on a log-log plot is very close to linear, allowing the unknown endogenous concentration to be estimated using a numerical method. This approach obviates conventional relative quantification using an internal standard curve and the need for calibrant diluent, and takes into account the individual matrix interference on the immunoassay by spiking the test sample itself. This technique is based on standard additions for chemical analytes. Unknown endogenous analyte concentrations within even 2-fold diluted human plasma may be determined reliably using as few as four reaction wells.
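
    For orientation, classical standard additions in the linear-response case reduces to a regression whose x-intercept magnitude is the endogenous concentration; the contribution of the paper is the log-log treatment that extends this idea to sigmoidal immunoassay responses. Spiked levels and signals below are invented.

        import numpy as np

        added = np.array([0.0, 5.0, 10.0, 20.0, 40.0])      # spiked analyte (ng/mL)
        signal = np.array([0.42, 0.63, 0.84, 1.27, 2.10])   # assay response (arbitrary units)

        slope, intercept = np.polyfit(added, signal, 1)
        endogenous = intercept / slope                      # |x-intercept| = endogenous concentration
        print(f"estimated endogenous concentration ~ {endogenous:.1f} ng/mL")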

  14. A novel qPCR protocol for the specific detection and quantification of the fuel-deteriorating fungus Hormoconis resinae.

    PubMed

    Martin-Sanchez, Pedro M; Gorbushina, Anna A; Kunte, Hans-Jörg; Toepel, Jörg

    2016-07-01

    A wide variety of fungi and bacteria are known to contaminate fuels and fuel systems. These microbial contaminants have been linked to fuel system fouling and corrosion. The fungus Hormoconis resinae, a common jet fuel contaminant, is used in this study as a model for developing innovative risk assessment methods. A novel qPCR protocol to detect and quantify H. resinae in, and together with, total fungal contamination of fuel systems is reported. Two primer sets, targeting the markers RPB2 and ITS, were selected for their remarkable specificity and sensitivity. These primers were successfully applied on fungal cultures and diesel samples demonstrating the validity and reliability of the established qPCR protocol. This novel tool allows clarification of the current role of H. resinae in fuel contamination cases, as well as providing a technique to detect fungal outbreaks in fuel systems. This tool can be expanded to other well-known fuel-deteriorating microorganisms.

  15. Quantifying Multistate Cytoplasmic Molecular Diffusion in Bacterial Cells via Inverse Transform of Confined Displacement Distribution

    PubMed Central

    2016-01-01

    Single-molecule tracking (SMT) of fluorescently tagged cytoplasmic proteins can provide valuable information on the underlying biological processes in living cells via subsequent analysis of the displacement distributions; however, the confinement effect originating from the small size of a bacterial cell skews the protein's displacement distribution and complicates the quantification of the intrinsic diffusive behaviors. Using the inverse transformation method, we convert the skewed displacement distribution (for both 2D and 3D imaging conditions) back to that in free space for systems containing one or multiple (non)interconverting Brownian diffusion states, from which we can reliably extract the number of diffusion states as well as their intrinsic diffusion coefficients and respective fractional populations. We further demonstrate a successful application to experimental SMT data of a transcription factor in living E. coli cells. This work allows a direct quantitative connection between cytoplasmic SMT data and diffusion theory for analyzing molecular diffusive behavior in live bacteria. PMID:26491971

  16. Two-dimensional flow nanometry of biological nanoparticles for accurate determination of their size and emission intensity

    NASA Astrophysics Data System (ADS)

    Block, Stephan; Fast, Björn Johansson; Lundgren, Anders; Zhdanov, Vladimir P.; Höök, Fredrik

    2016-09-01

    Biological nanoparticles (BNPs) are of high interest due to their key role in various biological processes and use as biomarkers. BNP size and composition are decisive for their functions, but simultaneous determination of both properties with high accuracy remains challenging. Optical microscopy allows precise determination of fluorescence/scattering intensity, but not the size of individual BNPs. The latter is better determined by tracking their random motion in bulk, but the limited illumination volume for tracking this motion impedes reliable intensity determination. Here, we show that by attaching BNPs to a supported lipid bilayer, subjecting them to hydrodynamic flows and tracking their motion via surface-sensitive optical imaging enable determination of their diffusion coefficients and flow-induced drifts, from which accurate quantification of both BNP size and emission intensity can be made. For vesicles, the accuracy of this approach is demonstrated by resolving the expected radius-squared dependence of their fluorescence intensity for radii down to 15 nm.

  17. Microscopic quantification of bacterial invasion by a novel antibody-independent staining method.

    PubMed

    Agerer, Franziska; Waeckerle, Stephanie; Hauck, Christof R

    2004-10-01

    Microscopic discrimination between extracellular and invasive, intracellular bacteria is a valuable technique in microbiology and immunology. We describe a novel fluorescence staining protocol, called FITC-biotin-avidin (FBA) staining, which allows the differentiation between extracellular and intracellular bacteria and is independent of specific antibodies directed against the microorganisms. FBA staining of eukaryotic cells infected with Gram-negative bacteria of the genus Neisseria or the Gram-positive pathogen Staphylococcus aureus is employed to validate the novel technique. The quantitative evaluation of intracellular pathogens by the FBA staining protocol yields identical results compared to parallel samples stained with conventional, antibody-dependent methods. FBA staining eliminates the need for cell permeabilization, resulting in robust and rapid detection of invasive microbes. Taken together, FBA staining provides a reliable and convenient alternative for the differential detection of intracellular and extracellular bacteria and should be a valuable technical tool for the quantitative analysis of the invasive properties of pathogenic bacteria and other microorganisms.

  18. Quantitation of Human Cytochrome P450 2D6 Protein with Immunoblot and Mass Spectrometry Analysis

    PubMed Central

    Yu, Ai-Ming; Qu, Jun; Felmlee, Melanie A.; Cao, Jin; Jiang, Xi-Ling

    2009-01-01

    Accurate quantification of cytochrome P450 (P450) protein contents is essential for reliable assessment of drug safety, including the prediction of in vivo clearance from in vitro metabolism data, which may be hampered by the use of uncharacterized standards and existence of unknown allelic isozymes. Therefore, this study aimed to delineate the variability in absolute quantification of polymorphic CYP2D6 drug-metabolizing enzyme and compare immunoblot and nano liquid chromatography coupled to mass spectrometry (nano-LC/MS) methods in identification and relative quantification of CYP2D6.1 and CYP2D6.2 allelic isozymes. Holoprotein content of in-house purified CYP2D6 isozymes was determined according to carbon monoxide difference spectrum, and total protein was quantified with bicinchoninic acid protein assay. Holoprotein/total CYP2D6 protein ratio was markedly higher for purified CYP2D6.1 (71.0%) than that calculated for CYP2D6.1 Supersomes (35.5%), resulting in distinct linear calibration range (0.05–0.50 versus 0.025–0.25 pmol) that was determined by densitometric analysis of immunoblot bands. Likewise, purified CYP2D6.2 and CYP2D6.10 and the CYP2D6.10 Supersomes all showed different holoprotein/total CYP2D6 protein ratios and distinct immunoblot linear calibration ranges. In contrast to immunoblot, nano-LC/MS readily distinguished CYP2D6.2 (R296C and S486T) from CYP2D6.1 by isoform-specific proteolytic peptides that contain the altered amino acid residues. In addition, relative quantitation of the two allelic isozymes was successfully achieved with label-free protein quantification, consistent with the nominated ratio. Because immunoblot and nano-LC/MS analyses measure total P450 protein (holoprotein and apoprotein) in a sample, complete understanding of holoprotein and apoprotein contents in P450 standards is desired toward reliable quantification. Our data also suggest that nano-LC/MS not only facilitates P450 quantitation but also provides genotypic information. PMID:18832475

  19. Method development and validation for simultaneous quantification of 15 drugs of abuse and prescription drugs and 7 of their metabolites in whole blood relevant in the context of driving under the influence of drugs--usefulness of multi-analyte calibration.

    PubMed

    Steuer, Andrea E; Forss, Anna-Maria; Dally, Annika M; Kraemer, Thomas

    2014-11-01

    In the context of driving under the influence of drugs (DUID), not only common drugs of abuse may have an influence, but also medications with similar mechanisms of action. Simultaneous quantification of a variety of drugs and medications relevant in this context allows faster and more effective analyses. Therefore, multi-analyte approaches have gained more and more popularity in recent years. Usually, calibration curves for such procedures contain a mixture of all analytes, which might lead to mutual interferences. In this study we investigated whether the use of such mixtures leads to reliable results for authentic samples containing only one or two analytes. Five hundred microliters of whole blood were extracted by routine solid-phase extraction (SPE, HCX). Analysis was performed on an ABSciex 3200 QTrap instrument with ESI+ in scheduled MRM mode. The method was fully validated according to international guidelines including selectivity, recovery, matrix effects, accuracy and precision, stabilities, and limit of quantification. The selected SPE provided recoveries >60% for all analytes except 6-monoacetylmorphine (MAM) with coefficients of variation (CV) below 15% or 20% for quality controls (QC) LOW and HIGH, respectively. Ion suppression >30% was found for benzoylecgonine, hydrocodone, hydromorphone, MDA, oxycodone, and oxymorphone at QC LOW, however CVs were always below 10% (n=6 different whole blood samples). Accuracy and precision criteria were fulfilled for all analytes except for MAM. Systematic investigation of accuracy determined for QC MED in a multi-analyte mixture compared to samples containing only single analytes revealed no relevant differences for any analyte, indicating that a multi-analyte calibration is suitable for the presented method. Comparison of approximately 60 samples to a former GC-MS method showed good correlation. The newly validated method was successfully applied to more than 1600 routine samples and 3 proficiency tests. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.

  20. Towards high-resolution 4D flow MRI in the human aorta using kt-GRAPPA and B1+ shimming at 7T.

    PubMed

    Schmitter, Sebastian; Schnell, Susanne; Uğurbil, Kâmil; Markl, Michael; Van de Moortele, Pierre-François

    2016-08-01

    To evaluate the feasibility of aortic 4D flow magnetic resonance imaging (MRI) at 7T with improved spatial resolution using kt-GRAPPA acceleration while restricting acquisition time, and to address radiofrequency (RF) excitation heterogeneities with B1+ shimming. 4D flow MRI data were obtained in the aorta of eight subjects using a 16-channel transmit/receive coil array at 7T. Flow quantification and acquisition time were compared for a kt-GRAPPA accelerated (R = 5) and a standard GRAPPA (R = 2) accelerated protocol. The impact of different dynamic B1+ shimming strategies on flow quantification was investigated. Two kt-GRAPPA accelerated protocols with 1.2 × 1.2 × 1.2 mm(3) and 1.8 × 1.8 × 2.4 mm(3) spatial resolution were compared. Using kt-GRAPPA, we achieved a 4.3-fold reduction in net acquisition time, resulting in scan times of about 10 minutes. No significant effect on flow quantification was observed compared to standard GRAPPA with R = 2. Optimizing the B1+ fields for the aorta significantly impacted flow quantification (P < 0.05), while specific B1+ settings were required for respiration navigators. The high-resolution protocol yielded similar flow quantification but allowed the depiction of branching vessels. 7T in combination with B1+ shimming allows for high-resolution 4D flow MRI acquisitions in the human aorta, while kt-GRAPPA limits total scan times without affecting flow quantification. J. Magn. Reson. Imaging 2016;44:486-499. © 2016 Wiley Periodicals, Inc.

  1. Symptomology of ozone injury to pine foliage

    Treesearch

    Kenneth Stolte

    1996-01-01

    Symptoms of ozone injury on western pines, ranging from effects on needles to effects on portions of ecosystems, can be differentiated from symptoms induced by other natural biotic and abiotic stressors occurring in the same area. Once identified in laboratory and field studies, quantification and monitoring of these symptoms can be used to provide reliable information...

  2. A dual validation approach to detect anthelmintic residues in bovine liver over an extended concentration range

    USDA-ARS?s Scientific Manuscript database

    This paper describes a method for the detection and quantification of 38 of the most widely used anthelmintics (including benzimidazoles, macrocyclic lactones and flukicides) in bovine liver at MRL and non-MRL level. A dual validation approach was adapted to reliably detect anthelmintic residues ov...

  3. 78 FR 31948 - Government-Owned Inventions; Availability for Licensing

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-05-28

    ... listed below are owned by an agency of the U.S. Government and are available for licensing in the U.S. in... coverage for companies and may also be available for licensing. FOR FURTHER INFORMATION CONTACT: Licensing... precise and reliable quantification. Currently, there is no approved drug to treat FXS. The invention...

  4. Quantification for Complex Assessment: Uncertainty Estimation in Final Year Project Thesis Assessment

    ERIC Educational Resources Information Center

    Kim, Ho Sung

    2013-01-01

    A quantitative method for estimating an expected uncertainty (reliability and validity) in assessment results arising from the relativity between four variables, viz examiner's expertise, examinee's expertise achieved, assessment task difficulty and examinee's performance, was developed for the complex assessment applicable to final…

  5. A Direct Aqueous Derivatization GC-MS Method for Determining Benzoylecgonine Concentrations in Human Urine.

    PubMed

    Chericoni, Silvio; Stefanelli, Fabio; Da Valle, Ylenia; Giusiani, Mario

    2015-09-01

    A sensitive and reliable method for extraction and quantification of benzoylecgonine (BZE) and cocaine (COC) in urine is presented. Propyl chloroformate was used as the derivatizing agent and was added directly to the urine sample; the propyl derivative and COC were then recovered by a liquid-liquid extraction procedure. Gas chromatography-mass spectrometry was used to detect the analytes in selected ion monitoring mode. The method proved to be precise for BZE and COC in terms of both intraday and interday analysis, with a coefficient of variation (CV)<6%. Limits of detection (LOD) were 2.7 ng/mL for BZE and 1.4 ng/mL for COC. The calibration curve showed a linear relationship for BZE and COC (r2>0.999 and >0.997, respectively) within the range investigated. The method, applied to thirty authentic samples, proved to be very simple, fast and reliable, so it can be easily applied in routine analysis for the quantification of BZE and COC in urine samples. © 2015 American Academy of Forensic Sciences.

  6. Design and performance testing of a real-time PCR assay for sensitive and reliable direct quantification of Brettanomyces in wine.

    PubMed

    Tessonnière, H; Vidal, S; Barnavon, L; Alexandre, H; Remize, F

    2009-02-28

    Because the yeast Brettanomyces produces volatile phenols and acetic acid, it is responsible for wine spoilage. The uncontrolled accumulation of these molecules in wine leads to sensorial defects that compromise wine quality. The need for a rapid, specific, sensitive and reliable method to detect this spoilage yeast has increased over the last decade. All these requirements are met by real-time PCR. Here we propose improvements to existing methods to enhance the robustness of the assay. Six different protocols to isolate DNA from wine and three PCR mix compositions were tested, and the best method was selected. Insoluble PVPP addition during DNA extraction by a classical phenol:chloroform protocol succeeded in removing PCR inhibitors from the wine. We developed an internal control that was effective in avoiding false-negative results due to decreases in the efficiency of DNA isolation and/or amplification. The method was evaluated in an intra-laboratory study for its specificity, linearity, repeatability and reproducibility. A standard curve was established from 14 different wines artificially inoculated. The quantification limit was 31 cfu/mL.

  7. Digital pathology: elementary, rapid and reliable automated image analysis.

    PubMed

    Bouzin, Caroline; Saini, Monika L; Khaing, Kyi-Kyi; Ambroise, Jérôme; Marbaix, Etienne; Grégoire, Vincent; Bol, Vanesa

    2016-05-01

    Slide digitalization has brought pathology into a new era, including powerful image analysis possibilities. However, despite being a powerful prognostic tool, automated analysis of immunostaining on digital images has still not been implemented worldwide in routine clinical practice. Digitalized biopsy sections from two independent cohorts of patients, immunostained for membrane or nuclear markers, were quantified with two automated methods. The first was based on stained cell counting through tissue segmentation, while the second relied upon the stained area proportion within tissue sections. Different steps of image preparation, such as automated tissue detection, fold exclusion and scanning magnification, were also assessed and validated. Quantification of either stained cells or the stained area was found to be highly correlated for all tested markers. Both methods were also correlated with visual scoring performed by a pathologist. For equivalent reliability, quantification of the stained area is, however, faster and easier to fine-tune and is therefore more compatible with the time constraints of prognosis. This work provides an incentive for the implementation of automated immunostaining analysis with a stained-area method in routine laboratory practice. © 2015 John Wiley & Sons Ltd.

  8. Quantitative characterization of fatty liver disease using x-ray scattering

    NASA Astrophysics Data System (ADS)

    Elsharkawy, Wafaa B.; Elshemey, Wael M.

    2013-11-01

    Nonalcoholic fatty liver disease (NAFLD) is a dynamic condition in which fat abnormally accumulates within the hepatocytes. It is believed to be a marker of risk of later chronic liver diseases, such as liver cirrhosis and carcinoma. The fat content measured in liver biopsies determines the suitability of the organ for transplantation. Transplantation of livers with severe NAFLD is associated with a high risk of primary non-function. Moreover, NAFLD is recognized as a clinically important feature that influences patient morbidity and mortality after hepatic resection. Unfortunately, there is a lack of a precise, reliable and reproducible method for quantification of NAFLD. This work suggests a method for the quantification of NAFLD. The method is based on the fact that fatty liver tissue would have a characteristic x-ray scattering profile with a relatively intense fat peak at a momentum transfer value of 1.1 nm-1 compared to a soft tissue peak at 1.6 nm-1. The fat content in normal and fatty liver is plotted against three profile characterization parameters (ratio of peak intensities, ratio of area under peaks and ratio of area under fat peak to total profile area) for measured and Monte Carlo simulated x-ray scattering profiles. Results show a high linear dependence (R2>0.9) of the characterization parameters on the liver fat content with a reported high correlation coefficient (>0.9) between measured and simulated data. These results indicate that the current method probably offers reliable quantification of fatty liver disease.

  9. Comparing Two Processing Pipelines to Measure Subcortical and Cortical Volumes in Patients with and without Mild Traumatic Brain Injury.

    PubMed

    Reid, Matthew W; Hannemann, Nathan P; York, Gerald E; Ritter, John L; Kini, Jonathan A; Lewis, Jeffrey D; Sherman, Paul M; Velez, Carmen S; Drennon, Ann Marie; Bolzenius, Jacob D; Tate, David F

    2017-07-01

    To compare volumetric results from NeuroQuant® and FreeSurfer in a service member setting. Since the advent of medical imaging, quantification of brain anatomy has been a major research and clinical effort. Rapid advancement of methods to automate quantification and to deploy this information into clinical practice has surfaced in recent years. NeuroQuant® is one such tool that has recently been used in clinical settings. Accurate volumetric data are useful in many clinical indications; therefore, it is important to assess the intermethod reliability and concurrent validity of similar volume quantifying tools. Volumetric data from 148 U.S. service members across three different experimental groups participating in a study of mild traumatic brain injury (mTBI) were examined. Groups included mTBI (n = 71), posttraumatic stress disorder (n = 22), or a noncranial orthopedic injury (n = 55). Correlation coefficients and nonparametric group mean comparisons were used to assess reliability and concurrent validity, respectively. Comparison of these methods across our entire sample demonstrates generally fair to excellent reliability as evidenced by large intraclass correlation coefficients (ICC = .4 to .99), but little concurrent validity as evidenced by significantly different Mann-Whitney U comparisons for 26 of 30 brain structures measured. While reliability between the two segmenting tools is fair to excellent, volumetric outcomes are statistically different between the two methods. As suggested by both developers, structure segmentation should be visually verified prior to clinical use and rigor should be used when interpreting results generated by either method. Copyright © 2017 by the American Society of Neuroimaging.

  10. Quantification of fossil organic matter in contaminated sediments from an industrial watershed: validation of the quantitative multimolecular approach by radiocarbon analysis.

    PubMed

    Jeanneau, Laurent; Faure, Pierre

    2010-09-01

    The quantitative multimolecular approach (QMA) based on an exhaustive identification and quantification of molecules from the extractable organic matter (EOM) has been recently developed in order to investigate organic contamination in sediments by a more complete method than the restrictive quantification of target contaminants. Such an approach allows (i) the comparison between natural and anthropogenic inputs, (ii) between modern and fossil organic matter and (iii) the differentiation between several anthropogenic sources. However QMA is based on the quantification of molecules recovered by organic solvent and then analyzed by gas chromatography-mass spectrometry, which represent a small fraction of sedimentary organic matter (SOM). In order to extend the conclusions of QMA to SOM, radiocarbon analyses have been performed on organic extracts and decarbonated sediments. This analysis allows (i) the differentiation between modern biomass (contemporary (14)C) and fossil organic matter ((14)C-free) and (ii) the calculation of the modern carbon percentage (PMC). At the confluence between Fensch and Moselle Rivers, a catchment highly contaminated by both industrial activities and urbanization, PMC values in decarbonated sediments are well correlated with the percentage of natural molecular markers determined by QMA. It highlights that, for this type of contamination by fossil organic matter inputs, the conclusions of QMA can be scaled up to SOM. QMA is an efficient environmental diagnostic tool that leads to a more realistic quantification of fossil organic matter in sediments. Copyright 2010 Elsevier B.V. All rights reserved.
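
    The PMC-based reading amounts to two-end-member mixing between modern biomass and 14C-free fossil carbon; a minimal sketch follows, with the measured PMC and the modern end-member value assumed for the example.

        pmc_measured = 38.0     # percent modern carbon of the decarbonated sediment (assumed)
        pmc_modern = 105.0      # end-member value for contemporary biomass (assumed)

        fossil_fraction = 1.0 - pmc_measured / pmc_modern   # fossil (14C-free) share of organic carbon
        print(f"fossil organic carbon fraction ~ {fossil_fraction:.2f}")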

  11. Quantification of atherosclerotic plaque activity and vascular inflammation using [18-F] fluorodeoxyglucose positron emission tomography/computed tomography (FDG-PET/CT).

    PubMed

    Mehta, Nehal N; Torigian, Drew A; Gelfand, Joel M; Saboury, Babak; Alavi, Abass

    2012-05-02

    Conventional non-invasive imaging modalities of atherosclerosis such as coronary artery calcium (CAC) and carotid intimal medial thickness (C-IMT) provide information about the burden of disease. However, despite multiple validation studies of CAC and C-IMT, these modalities do not accurately assess plaque characteristics, and the composition and inflammatory state of the plaque determine its stability and, therefore, the risk of clinical events. [(18)F]-2-fluoro-2-deoxy-D-glucose (FDG) imaging using positron-emission tomography (PET)/computed tomography (CT) has been extensively studied in oncologic metabolism. Studies using animal models and immunohistochemistry in humans show that FDG-PET/CT is exquisitely sensitive for detecting macrophage activity, an important source of cellular inflammation in vessel walls. More recently, we and others have shown that FDG-PET/CT enables highly precise, novel measurements of the inflammatory activity of atherosclerotic plaques in large and medium-sized arteries. FDG-PET/CT studies have many advantages over other imaging modalities: 1) high contrast resolution; 2) quantification of plaque volume and metabolic activity allowing for multi-modal atherosclerotic plaque quantification; 3) dynamic, real-time, in vivo imaging; 4) minimal operator dependence. Finally, vascular inflammation detected by FDG-PET/CT has been shown to predict cardiovascular (CV) events independent of traditional risk factors and is also highly associated with overall burden of atherosclerosis. Plaque activity by FDG-PET/CT is modulated by known beneficial CV interventions such as short-term (12 week) statin therapy as well as longer-term therapeutic lifestyle changes (16 months). The current methodology for quantification of FDG uptake in atherosclerotic plaque involves measurement of the standardized uptake value (SUV) of an artery of interest and of the venous blood pool in order to calculate a target-to-background ratio (TBR), which is obtained by dividing the arterial SUV by the venous blood pool SUV. This method has been shown to represent a stable, reproducible phenotype over time, has a high sensitivity for detection of vascular inflammation, and also has high inter- and intra-reader reliability. Here we present our methodology for patient preparation, image acquisition, and quantification of atherosclerotic plaque activity and vascular inflammation using SUV, TBR, and a global parameter called the metabolic volumetric product (MVP). These approaches may be applied to assess vascular inflammation in various study samples of interest in a consistent fashion as we have shown in several prior publications.
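
    A minimal sketch of the TBR arithmetic described above follows; the SUV values are invented and the delineation of arterial and venous regions of interest is omitted.

        import numpy as np

        arterial_suv = np.array([1.9, 2.1, 2.4, 2.2, 2.0])    # e.g. SUVmax per axial slice of the artery
        venous_blood_pool = np.array([1.1, 1.2, 1.15, 1.05])  # venous blood-pool SUV measurements

        tbr = arterial_suv / venous_blood_pool.mean()         # target-to-background ratio per slice
        print(f"mean TBR = {tbr.mean():.2f}, max TBR = {tbr.max():.2f}")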

  12. High Levels of Exosomes Expressing CD63 and Caveolin-1 in Plasma of Melanoma Patients

    PubMed Central

    Logozzi, Mariantonia; De Milito, Angelo; Lugini, Luana; Borghi, Martina; Calabrò, Luana; Spada, Massimo; Perdicchio, Maurizio; Marino, Maria Lucia; Federici, Cristina; Iessi, Elisabetta; Brambilla, Daria; Venturi, Giulietta; Lozupone, Francesco; Santinami, Mario; Huber, Veronica; Maio, Michele; Rivoltini, Licia; Fais, Stefano

    2009-01-01

    Background Metastatic melanoma is an untreatable cancer lacking reliable and non-invasive markers of disease progression. Exosomes are small vesicles secreted by normal as well as tumor cells. Human tumor-derived exosomes are involved in malignant progression and we evaluated the presence of exosomes in plasma of melanoma patients as a potential tool for cancer screening and follow-up. Methodology/Principal Findings We designed an in-house sandwich ELISA (Exotest) to capture and quantify exosomes in plasma based on expression of housekeeping proteins (CD63 and Rab-5b) and a tumor-associated marker (caveolin-1). Western blot and flow cytometry analysis of exosomes were used to confirm the Exotest-based findings. The Exotest allowed sensitive detection and quantification of exosomes purified from human tumor cell culture supernatants and plasma from SCID mice engrafted with human melanoma. Plasma levels of exosomes in melanoma-engrafted SCID mice correlated to tumor size. We evaluated the levels of plasma exosomes expressing CD63 and caveolin-1 in melanoma patients (n = 90) and healthy donors (n = 58). Consistently, plasma exosomes expressing CD63 (504±315) or caveolin-1 (619±310) were significantly increased in melanoma patients as compared to healthy donors (223±125 and 228±102, respectively). While the Exotest for CD63+ plasma exosomes had limited sensitivity (43%) the Exotest for detection of caveolin-1+ plasma exosomes showed a higher sensitivity (68%). Moreover, caveolin-1+ plasma exosomes were significantly increased with respect to CD63+ exosomes in the patients group. Conclusions/Significance We describe a new non-invasive assay allowing detection and quantification of human exosomes in plasma of melanoma patients. Our results suggest that the Exotest for detection of plasma exosomes carrying tumor-associated antigens may represent a novel tool for clinical management of cancer patients. PMID:19381331

  13. Performance of Different Analytical Software Packages in Quantification of DNA Methylation by Pyrosequencing.

    PubMed

    Grasso, Chiara; Trevisan, Morena; Fiano, Valentina; Tarallo, Valentina; De Marco, Laura; Sacerdote, Carlotta; Richiardi, Lorenzo; Merletti, Franco; Gillio-Tos, Anna

    2016-01-01

    Pyrosequencing has emerged as an alternative method of nucleic acid sequencing, well suited for many applications which aim to characterize single nucleotide polymorphisms, mutations, microbial types and CpG methylation in the target DNA. The commercially available pyrosequencing systems can harbor two different types of software, which allow analysis in AQ or CpG mode, respectively, both widely employed for DNA methylation analysis. The aim of the study was to assess the performance for DNA methylation analysis at CpG sites of the two pyrosequencing software packages, which allow analysis in AQ or CpG mode, respectively. Although CpG mode was specifically designed for CpG methylation quantification, many investigations on this topic have been carried out with AQ mode. Since proof of equivalent performance of the two software packages for this type of analysis is not available, the focus of this paper was to evaluate whether the two modes currently used for CpG methylation assessment by pyrosequencing give overlapping results. We compared the performance of the two software packages in quantifying DNA methylation in the promoter of selected genes (GSTP1, MGMT, LINE-1) by testing two case series which include DNA from paraffin-embedded prostate cancer tissues (PC study, N = 36) and DNA from blood fractions of healthy people (DD study, N = 28), respectively. We found discrepancies in the quality assignment of DNA methylation assays by the two pyrosequencing software packages. Compared to the software for analysis in AQ mode, less permissive criteria are applied by the Pyro Q-CpG software, which enables analysis in CpG mode. CpG mode warns operators about potentially unsatisfactory performance of the assay and ensures a more accurate quantitative evaluation of DNA methylation at CpG sites. The implementation of CpG mode is strongly advisable in order to improve the reliability of the methylation analysis results achievable by pyrosequencing.

  14. High-throughput analysis of sub-visible mAb aggregate particles using automated fluorescence microscopy imaging.

    PubMed

    Paul, Albert Jesuran; Bickel, Fabian; Röhm, Martina; Hospach, Lisa; Halder, Bettina; Rettich, Nina; Handrick, René; Herold, Eva Maria; Kiefer, Hans; Hesse, Friedemann

    2017-07-01

    Aggregation of therapeutic proteins is a major concern as aggregates lower the yield and can impact the efficacy of the drug as well as the patient's safety. It can occur in all production stages; thus, it is essential to perform a detailed analysis for protein aggregates. Several methods such as size exclusion high-performance liquid chromatography (SE-HPLC), light scattering, turbidity, light obscuration, and microscopy-based approaches are used to analyze aggregates. None of these methods allows determination of all types of higher molecular weight (HMW) species due to a limited size range. Furthermore, quantification and classification of different HMW species are often not possible. Moreover, automation is becoming a requirement as automated robotic laboratory systems are increasingly adopted. Hence, there is a need for a fast, high-throughput-compatible method which can detect a broad size range and enable quantification and classification. We describe a novel approach for the detection of aggregates in the size range 1 to 1000 μm combining fluorescent dyes for protein aggregate labelling and automated fluorescence microscope imaging (aFMI). After appropriate selection of the dye and method optimization, our method enabled us to detect various types of HMW species of monoclonal antibodies (mAbs). Using 10 μmol L⁻¹ 4,4'-dianilino-1,1'-binaphthyl-5,5'-disulfonate (Bis-ANS) in combination with aFMI allowed the analysis of mAb aggregates induced by different stresses occurring during downstream processing, storage, and administration. Validation of our results was performed by SE-HPLC, UV-Vis spectroscopy, and dynamic light scattering. With this new approach, we could not only reliably detect different HMW species but also quantify and classify them in an automated manner. Our method meets high-throughput requirements, and the selection of various fluorescent dyes enables a broad range of applications.
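
    The record describes binning Bis-ANS-stained aggregates by size from fluorescence microscope images. A minimal sketch of that kind of particle sizing is given below, assuming a simple intensity threshold and a known pixel size; the threshold, pixel size and size classes are illustrative, and the published aFMI pipeline is not reproduced.

```python
import numpy as np
from scipy import ndimage

def size_classes_from_fluorescence(image, threshold, um_per_px, bins=(1, 10, 100, 1000)):
    """Label fluorescent aggregate particles in a 2-D image and bin them by size.

    `threshold` separates stained aggregates from background and `um_per_px`
    is the pixel size; both are acquisition-specific assumptions. Returns
    counts of particles per equivalent-diameter class (in micrometres).
    """
    mask = image > threshold
    labels, n = ndimage.label(mask)
    if n == 0:
        return np.zeros(len(bins) - 1, dtype=int)
    areas_px = np.bincount(labels.ravel())[1:]             # pixel area per particle
    diam_um = 2.0 * np.sqrt(areas_px / np.pi) * um_per_px  # equivalent circular diameter
    counts, _ = np.histogram(diam_um, bins=bins)
    return counts

# Toy image: two bright blobs on a dark background.
img = np.zeros((200, 200))
img[20:30, 20:30] = 1.0
img[100:140, 100:140] = 1.0
print(size_classes_from_fluorescence(img, threshold=0.5, um_per_px=2.0))
```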

  15. A novel standardized algorithm using SPECT/CT evaluating unhappy patients after unicondylar knee arthroplasty--a combined analysis of tracer uptake distribution and component position.

    PubMed

    Suter, Basil; Testa, Enrique; Stämpfli, Patrick; Konala, Praveen; Rasch, Helmut; Friederich, Niklaus F; Hirschmann, Michael T

    2015-03-20

    The introduction of a standardized SPECT/CT algorithm including a localization scheme, which allows accurate identification of specific patterns and thresholds of SPECT/CT tracer uptake, could lead to a better understanding of the bone remodeling and specific failure modes of unicondylar knee arthroplasty (UKA). The purpose of the present study was to introduce a novel standardized SPECT/CT algorithm for patients after UKA and evaluate its clinical applicability, usefulness and inter- and intra-observer reliability. Tc-HDP-SPECT/CT images of consecutive patients (median age 65, range 48-84 years) with 21 knees after UKA were prospectively evaluated. The tracer activity on SPECT/CT was localized using a specific standardized UKA localization scheme. For tracer uptake analysis (intensity and anatomical distribution pattern) a 3D volumetric quantification method was used. The maximum intensity values were recorded for each anatomical area. In addition, ratios between the respective value in the measured area and the background tracer activity were calculated. The femoral and tibial component position (varus-valgus, flexion-extension, internal and external rotation) was determined in 3D-CT. The inter- and intra-observer reliability of the localization scheme, grading of the tracer activity and component measurements were determined by calculating the intraclass correlation coefficients (ICC). The localization scheme, grading of the tracer activity and component measurements showed high inter- and intra-observer reliabilities for all regions (tibia, femur and patella). For measurement of component position there was strong agreement between the readings of the two observers; the ICC for the orientation of the femoral component was 0.73-1.00 (intra-observer reliability) and 0.91-1.00 (inter-observer reliability). The ICC for the orientation of the tibial component was 0.75-1.00 (intra-observer reliability) and 0.77-1.00 (inter-observer reliability). The presented SPECT/CT algorithm, which combines mechanical information on UKA component position and alignment with metabolic data, is highly reliable and proved to be a valuable, consistent and useful tool for analysing postoperative knees after UKA. Using this standardized approach in clinical studies might be helpful in establishing the diagnosis in patients with pain after UKA.

  16. Fully automated system for the quantification of human osteoarthritic knee joint effusion volume using magnetic resonance imaging

    PubMed Central

    2010-01-01

    Introduction Joint effusion is frequently associated with osteoarthritis (OA) flare-up and is an important marker of therapeutic response. This study aimed at developing and validating a fully automated system based on magnetic resonance imaging (MRI) for the quantification of joint effusion volume in knee OA patients. Methods MRI examinations consisted of two axial sequences: a T2-weighted true fast imaging with steady-state precession and a T1-weighted gradient echo. An automated joint effusion volume quantification system using MRI was developed and validated (a) with calibrated phantoms (cylinder and sphere) and effusion from knee OA patients; (b) with assessment by manual quantification; and (c) by direct aspiration. Twenty-five knee OA patients with joint effusion were included in the study. Results The automated joint effusion volume quantification was developed as a four stage sequencing process: bone segmentation, filtering of unrelated structures, segmentation of joint effusion, and subvoxel volume calculation. Validation experiments revealed excellent coefficients of variation with the calibrated cylinder (1.4%) and sphere (0.8%) phantoms. Comparison of the OA knee joint effusion volume assessed by the developed automated system and by manual quantification was also excellent (r = 0.98; P < 0.0001), as was the comparison with direct aspiration (r = 0.88; P = 0.0008). Conclusions The newly developed fully automated MRI-based system provided precise quantification of OA knee joint effusion volume with excellent correlation with data from phantoms, a manual system, and joint aspiration. Such an automated system will be instrumental in improving the reproducibility/reliability of the evaluation of this marker in clinical application. PMID:20846392

  17. A phase quantification method based on EBSD data for a continuously cooled microalloyed steel

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhao, H.; Wynne, B.P.; Palmiere, E.J.

    2017-01-15

    Mechanical properties of steels depend on the phase constitution of the final microstructures, which can be related to the processing parameters. Therefore, accurate quantification of different phases is necessary to investigate the relationships between processing parameters, final microstructures and mechanical properties. Point counting on micrographs observed by optical or scanning electron microscopy is widely used as a phase quantification method, and different phases are discriminated according to their morphological characteristics. However, it is difficult to differentiate some of the phase constituents with similar morphology. In contrast, EBSD-based phase quantification methods can use not only morphological characteristics but also parameters derived from the orientation information for discrimination. In this research, a phase quantification method based on EBSD data in the unit of grains was proposed to identify and quantify the complex phase constitutions of a microalloyed steel subjected to accelerated cooling. Characteristics of polygonal ferrite/quasi-polygonal ferrite, acicular ferrite and bainitic ferrite in terms of grain-averaged misorientation angles, aspect ratios, high-angle grain boundary fractions and grain sizes were analysed and used to develop identification criteria for each phase. Comparing the results obtained by this EBSD-based method and point counting, it was found that the EBSD-based method can provide accurate and reliable phase quantification results for microstructures produced at relatively slow cooling rates. Highlights: •A phase quantification method based on EBSD data in the unit of grains was proposed. •The critical grain area above which GAM angles are valid parameters was obtained. •Grain size and grain boundary misorientation were used to identify acicular ferrite. •High cooling rates deteriorate the accuracy of this EBSD-based method.
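
    The abstract lists the per-grain parameters used to separate ferrite morphologies. The sketch below shows one way such rule-based grain classification could look; the decision order and all numeric thresholds are hypothetical placeholders, since the paper derives its own criteria from the measured distributions.

```python
from dataclasses import dataclass

@dataclass
class Grain:
    gam_deg: float        # grain-averaged misorientation angle
    aspect_ratio: float   # major/minor axis length
    hagb_fraction: float  # fraction of boundary with misorientation > 15 deg
    size_um: float        # equivalent grain diameter

def classify_grain(g: Grain) -> str:
    """Rule-based phase identification for one grain.

    The decision order and the numeric thresholds below are illustrative
    assumptions; the paper derives its own criteria from the measured
    distributions of these parameters.
    """
    if g.gam_deg < 0.6 and g.aspect_ratio < 2.5:
        return "polygonal/quasi-polygonal ferrite"   # low internal misorientation, equiaxed
    if g.hagb_fraction > 0.5 and g.size_um < 5.0:
        return "acicular ferrite"                    # fine grains, mostly high-angle boundaries
    return "bainitic ferrite"                        # elongated, substructured grains

grains = [Grain(0.4, 1.8, 0.7, 12.0), Grain(1.2, 1.9, 0.8, 3.0), Grain(1.5, 3.5, 0.2, 8.0)]
for g in grains:
    print(classify_grain(g))
```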

  18. Recommendations for adaptation and validation of commercial kits for biomarker quantification in drug development.

    PubMed

    Khan, Masood U; Bowsher, Ronald R; Cameron, Mark; Devanarayan, Viswanath; Keller, Steve; King, Lindsay; Lee, Jean; Morimoto, Alyssa; Rhyne, Paul; Stephen, Laurie; Wu, Yuling; Wyant, Timothy; Lachno, D Richard

    2015-01-01

    Increasingly, commercial immunoassay kits are used to support drug discovery and development. Longitudinally consistent kit performance is crucial, but the degree to which kits and reagents are characterized by manufacturers is not standardized, nor are the approaches by users to adapt them and evaluate their performance through validation prior to use. These factors can negatively impact data quality. This paper offers a systematic approach to assessment, method adaptation and validation of commercial immunoassay kits for quantification of biomarkers in drug development, expanding upon previous publications and guidance. These recommendations aim to standardize and harmonize user practices, contributing to reliable biomarker data from commercial immunoassays, thus, enabling properly informed decisions during drug development.

  19. Multiplex quantification of four DNA targets in one reaction with Bio-Rad droplet digital PCR system for GMO detection

    NASA Astrophysics Data System (ADS)

    Dobnik, David; Štebih, Dejan; Blejec, Andrej; Morisset, Dany; Žel, Jana

    2016-10-01

    The advantages of digital PCR technology are by now well documented. One way to achieve better cost efficiency of the technique is to use it in a multiplexing strategy. Droplet digital PCR platforms, which include two fluorescence filters, support at least duplex reactions, and with some development and optimization higher multiplexing is possible. The present study not only shows the development of multiplex assays in droplet digital PCR, but also presents a first thorough evaluation of several parameters in such multiplex digital PCR. Two 4-plex assays were developed for quantification of 8 different DNA targets (7 genetically modified maize events and a maize endogenous gene). Per assay, two of the targets were labelled with one fluorophore and two with another. As current analysis software does not support analysis beyond duplex reactions, a new R- and Shiny-based web application analysis tool (http://bit.ly/ddPCRmulti) was developed that automates the analysis of 4-plex results. In conclusion, the two developed multiplex assays are suitable for quantification of GMO maize events, and the same approach can be used in any other field with a need for accurate and reliable quantification of multiple DNA targets.
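
    Droplet digital PCR quantification rests on a Poisson correction of the positive/negative droplet counts. A minimal sketch of that calculation is shown below; the nominal droplet volume is an assumed constant, and the 4-plex cluster calling performed by the ddPCRmulti tool is not reproduced here.

```python
import math

def ddpcr_concentration(n_negative, n_total, droplet_volume_nl=0.85):
    """Estimate target concentration (copies per microlitre of reaction)
    from droplet counts using the Poisson relation lambda = -ln(Nneg/Ntot).

    The nominal droplet volume (here 0.85 nL) is an assumption; the
    multiplex cluster calling done by the analysis tool is not reproduced
    in this sketch.
    """
    if not (0 < n_negative <= n_total):
        raise ValueError("need 0 < n_negative <= n_total")
    lam = -math.log(n_negative / n_total)      # mean copies per droplet
    return lam / (droplet_volume_nl * 1e-3)    # copies per microlitre

# Example: 12000 of 15000 droplets negative for one target.
print(f"{ddpcr_concentration(12000, 15000):.1f} copies/uL")
```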

  20. Multiplex quantification of four DNA targets in one reaction with Bio-Rad droplet digital PCR system for GMO detection.

    PubMed

    Dobnik, David; Štebih, Dejan; Blejec, Andrej; Morisset, Dany; Žel, Jana

    2016-10-14

    The advantages of digital PCR technology are by now well documented. One way to achieve better cost efficiency of the technique is to use it in a multiplexing strategy. Droplet digital PCR platforms, which include two fluorescence filters, support at least duplex reactions, and with some development and optimization higher multiplexing is possible. The present study not only shows the development of multiplex assays in droplet digital PCR, but also presents a first thorough evaluation of several parameters in such multiplex digital PCR. Two 4-plex assays were developed for quantification of 8 different DNA targets (7 genetically modified maize events and a maize endogenous gene). Per assay, two of the targets were labelled with one fluorophore and two with another. As current analysis software does not support analysis beyond duplex reactions, a new R- and Shiny-based web application analysis tool (http://bit.ly/ddPCRmulti) was developed that automates the analysis of 4-plex results. In conclusion, the two developed multiplex assays are suitable for quantification of GMO maize events, and the same approach can be used in any other field with a need for accurate and reliable quantification of multiple DNA targets.

  1. An alternative method for irones quantification in iris rhizomes using headspace solid-phase microextraction.

    PubMed

    Roger, B; Fernandez, X; Jeannot, V; Chahboun, J

    2010-01-01

    The essential oil obtained from iris rhizomes is one of the most precious raw materials for the perfume industry. Its fragrance is due to irones, which are gradually formed by oxidative degradation of iridals during rhizome ageing. The objective was the development of an alternative method allowing irone quantification in iris rhizomes using headspace solid-phase microextraction coupled to gas chromatography (HS-SPME-GC). The HS-SPME-GC method was developed using the results obtained from a conventional method, i.e. a solid-liquid extraction (SLE) followed by irone quantification by GC. Among several calibration methods tested, internal calibration gave the best results and was the least sensitive to the matrix effect. The proposed HS-SPME-GC method is as accurate and reproducible as the conventional SLE method. These two methods were used to monitor and compare irone concentrations in iris rhizomes that had been stored for 6 months to 9 years. Irone quantification in iris rhizomes can thus be achieved using HS-SPME-GC, and this method can be used for the quality control of iris rhizomes. It offers the advantage of combining extraction and analysis with an automated device and thus allows a large number of rhizome batches to be analysed and compared in a limited amount of time. Copyright © 2010 John Wiley & Sons, Ltd.

  2. Ariadne's Thread: A Robust Software Solution Leading to Automated Absolute and Relative Quantification of SRM Data.

    PubMed

    Nasso, Sara; Goetze, Sandra; Martens, Lennart

    2015-09-04

    Selected reaction monitoring (SRM) MS is a highly selective and sensitive technique to quantify protein abundances in complex biological samples. To enhance the pace of large SRM studies, a validated, robust method to fully automate absolute quantification and substitute for interactive evaluation would be valuable. To address this demand, we present Ariadne, a Matlab software. To quantify monitored targets, Ariadne exploits metadata imported from the transition lists, and targets can be filtered according to mProphet output. Signal processing and statistical learning approaches are combined to compute peptide quantifications. To robustly estimate absolute abundances, the external calibration curve method is applied, ensuring linearity over the measured dynamic range. Ariadne was benchmarked against mProphet and Skyline by comparing its quantification performance on three different dilution series, featuring either noisy/smooth traces without background or smooth traces with complex background. Results, evaluated as efficiency, linearity, accuracy, and precision of quantification, showed that Ariadne's performance is independent of data smoothness and the presence of complex background, that Ariadne outperforms mProphet on the noisier data set, and that it improves Skyline's accuracy and precision 2-fold for the lowest-abundance dilution with complex background. Remarkably, Ariadne could statistically distinguish all the different abundances from each other, discriminating dilutions as low as 0.1 and 0.2 fmol. These results suggest that Ariadne offers reliable and automated analysis of large-scale SRM differential expression studies.
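
    Ariadne estimates absolute abundances with the external calibration curve method. The sketch below illustrates the underlying calculation, fitting a line to a dilution series and back-calculating an unknown; the peptide amounts and peak areas are invented for illustration and do not come from the paper.

```python
import numpy as np

def fit_calibration(known_fmol, peak_areas):
    """Fit a linear external calibration curve: area = a * amount + b."""
    a, b = np.polyfit(np.asarray(known_fmol, float), np.asarray(peak_areas, float), 1)
    return a, b

def back_calculate(area, a, b):
    """Convert a measured SRM peak area into an absolute amount (fmol)."""
    return (area - b) / a

# Hypothetical dilution series of a reference peptide.
amounts = [0.1, 0.2, 0.5, 1.0, 2.0, 5.0]           # fmol on column
areas   = [210, 400, 1020, 2050, 4100, 10150]      # integrated peak areas
a, b = fit_calibration(amounts, areas)
print(f"slope={a:.1f}, intercept={b:.1f}, sample = {back_calculate(3300, a, b):.2f} fmol")
```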

  3. Automated quantification of Epstein-Barr Virus in whole blood of hematopoietic stem cell transplant patients using the Abbott m2000 system.

    PubMed

    Salmona, Maud; Fourati, Slim; Feghoul, Linda; Scieux, Catherine; Thiriez, Aline; Simon, François; Resche-Rigon, Matthieu; LeGoff, Jérôme

    2016-08-01

    Accurate quantification of Epstein-Barr virus (EBV) load in blood is essential for the management of post-transplant lymphoproliferative disorders. The automation of DNA extraction and amplification may improve accuracy and reproducibility. We evaluated the EBV PCR Kit V1 with fully automated DNA extraction and amplification on the m2000 system (Abbott assay). Conversion factor between copies and international units (IU), lower limit of quantification, imprecision and linearity were determined in a whole blood (WB) matrix. Results from 339 clinical WB specimens were compared with a home-brew real-time PCR assay used in our laboratory (in-house assay). The conversion factor between copies and IU was 3.22 copies/IU. The lower limit of quantification (LLQ) was 1000 copies/mL. Intra- and inter-assay coefficients of variation were 3.1% and 7.9% respectively for samples with EBV load higher than the LLQ. The comparison between Abbott assay and in-house assay showed a good concordance (kappa = 0.77). Loads were higher with the Abbott assay (mean difference = 0.62 log10 copies/mL). The EBV PCR Kit V1 assay on the m2000 system provides a reliable and easy-to-use method for quantification of EBV DNA in WB. Copyright © 2016 Elsevier Inc. All rights reserved.
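
    The assay-specific conversion factor of 3.22 copies/IU allows reported loads to be moved between units. A minimal sketch of that conversion is given below; the example value is the assay's stated lower limit of quantification.

```python
COPIES_PER_IU = 3.22  # conversion factor reported for this assay in whole blood

def copies_to_iu(copies_per_ml: float) -> float:
    """Convert an EBV load from copies/mL to IU/mL."""
    return copies_per_ml / COPIES_PER_IU

def iu_to_copies(iu_per_ml: float) -> float:
    """Convert an EBV load from IU/mL to copies/mL."""
    return iu_per_ml * COPIES_PER_IU

# Example: the lower limit of quantification of 1000 copies/mL expressed in IU/mL.
print(f"{copies_to_iu(1000):.0f} IU/mL")
```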

  4. Simultaneous measurements of kinematics and fMRI: compatibility assessment and case report on recovery evaluation of one stroke patient.

    PubMed

    Casellato, Claudia; Ferrante, Simona; Gandolla, Marta; Volonterio, Nicola; Ferrigno, Giancarlo; Baselli, Giuseppe; Frattini, Tiziano; Martegani, Alberto; Molteni, Franco; Pedrocchi, Alessandra

    2010-09-23

    Correlating the features of the actual executed movement with the associated cortical activations can enhance the reliability of functional Magnetic Resonance Imaging (fMRI) data interpretation. This is crucial for longitudinal evaluation of motor recovery in neurological patients and for investigating detailed mutual interactions between activation maps and movement parameters. Therefore, we have explored a new set-up combining fMRI with an optoelectronic motion capture system, which provides a multi-parameter quantification of the performed motor task. The cameras of the motion system were mounted inside the MR room and passive markers were placed on the subject's skin, without any risk or encumbrance. The versatile set-up allows 3-dimensional multi-segment acquisitions, including recording of possible mirror movements, and it guarantees high inter-session repeatability. We demonstrated the reliability of the integrated set-up through compatibility tests. Then, an fMRI block-design protocol combined with kinematic recordings was tested on a healthy volunteer performing finger tapping and ankle dorsal-plantar flexion. A preliminary assessment of clinical applicability and perspectives was carried out by pre- and post-rehabilitation acquisitions on a hemiparetic patient performing ankle dorsal-plantar flexion. For all sessions, the proposed method integrating kinematic data into the model design was compared with the standard analysis. Phantom acquisitions demonstrated that image quality was not compromised. Healthy subject sessions showed the feasibility of the protocols and the reliability of the model with the kinematic regressor. The patient results showed that brain activation maps were more consistent when the image analysis included in the regression model, besides the stimuli, the kinematic regressor quantifying the actual executed movement (movement timing and amplitude), proving a significant model improvement. Moreover, concerning motor recovery evaluation, after one month of rehabilitation a greater cortical area was activated during exercise, in contrast to the usual focalization associated with functional recovery. Indeed, the availability of kinematic data allows this wider area to be correlated with a higher frequency and a larger amplitude of movement. The kinematic acquisitions proved to be reliable and versatile in enriching the information of the fMRI images and therefore the evaluation of motor recovery in neurological patients, where large differences between required and performed motion can be expected.

  5. Simultaneous measurements of kinematics and fMRI: compatibility assessment and case report on recovery evaluation of one stroke patient

    PubMed Central

    2010-01-01

    Background Correlating the features of the actual executed movement with the associated cortical activations can enhance the reliability of functional Magnetic Resonance Imaging (fMRI) data interpretation. This is crucial for longitudinal evaluation of motor recovery in neurological patients and for investigating detailed mutual interactions between activation maps and movement parameters. Therefore, we have explored a new set-up combining fMRI with an optoelectronic motion capture system, which provides a multi-parameter quantification of the performed motor task. Methods The cameras of the motion system were mounted inside the MR room and passive markers were placed on the subject's skin, without any risk or encumbrance. The versatile set-up allows 3-dimensional multi-segment acquisitions, including recording of possible mirror movements, and it guarantees high inter-session repeatability. We demonstrated the reliability of the integrated set-up through compatibility tests. Then, an fMRI block-design protocol combined with kinematic recordings was tested on a healthy volunteer performing finger tapping and ankle dorsal-plantar flexion. A preliminary assessment of clinical applicability and perspectives was carried out by pre- and post-rehabilitation acquisitions on a hemiparetic patient performing ankle dorsal-plantar flexion. For all sessions, the proposed method integrating kinematic data into the model design was compared with the standard analysis. Results Phantom acquisitions demonstrated that image quality was not compromised. Healthy subject sessions showed the feasibility of the protocols and the reliability of the model with the kinematic regressor. The patient results showed that brain activation maps were more consistent when the image analysis included in the regression model, besides the stimuli, the kinematic regressor quantifying the actual executed movement (movement timing and amplitude), proving a significant model improvement. Moreover, concerning motor recovery evaluation, after one month of rehabilitation a greater cortical area was activated during exercise, in contrast to the usual focalization associated with functional recovery. Indeed, the availability of kinematic data allows this wider area to be correlated with a higher frequency and a larger amplitude of movement. Conclusions The kinematic acquisitions proved to be reliable and versatile in enriching the information of the fMRI images and therefore the evaluation of motor recovery in neurological patients, where large differences between required and performed motion can be expected. PMID:20863391

  6. Increasing the Reliability of Circulation Model Validation: Quantifying Drifter Slip to See how Currents are Actually Moving

    NASA Astrophysics Data System (ADS)

    Anderson, T.

    2016-02-01

    Ocean circulation forecasts can help answer questions regarding larval dispersal, passive movement of injured sea animals, oil spill mitigation, and search and rescue efforts. Circulation forecasts are often validated with GPS-tracked drifter paths, but how accurately do these drifters actually move with ocean currents? Drifters are not only moved by water, but are also forced by wind and waves acting on the exposed buoy and transmitter; this imperfect movement is referred to as drifter slip. The quantification and further understanding of drifter slip will allow scientists to differentiate between drifter imperfections and actual computer model error when comparing trajectory forecasts with actual drifter tracks. This will avoid falsely accrediting all discrepancies between a trajectory forecast and an actual drifter track to computer model error. During multiple deployments of drifters in Nantucket Sound and using observed wind and wave data, we attempt to quantify the slip of drifters developed by the Northeast Fisheries Science Center's (NEFSC) Student Drifters Program. While similar studies have been conducted previously, very few have directly attached current meters to drifters to quantify drifter slip. Furthermore, none have quantified slip of NEFSC drifters relative to the oceanographic-standard "CODE" drifter. The NEFSC drifter archive has over 1000 drifter tracks primarily off the New England coast. With a better understanding of NEFSC drifter slip, modelers can reliably use these tracks for model validation.

  7. Increasing the Reliability of Circulation Model Validation: Quantifying Drifter Slip to See how Currents are Actually Moving

    NASA Astrophysics Data System (ADS)

    Anderson, T.

    2015-12-01

    Ocean circulation forecasts can help answer questions regarding larval dispersal, passive movement of injured sea animals, oil spill mitigation, and search and rescue efforts. Circulation forecasts are often validated with GPS-tracked drifter paths, but how accurately do these drifters actually move with ocean currents? Drifters are not only moved by water, but are also forced by wind and waves acting on the exposed buoy and transmitter; this imperfect movement is referred to as drifter slip. The quantification and further understanding of drifter slip will allow scientists to differentiate between drifter imperfections and actual computer model error when comparing trajectory forecasts with actual drifter tracks. This will avoid falsely accrediting all discrepancies between a trajectory forecast and an actual drifter track to computer model error. During multiple deployments of drifters in Nantucket Sound and using observed wind and wave data, we attempt to quantify the slip of drifters developed by the Northeast Fisheries Science Center's (NEFSC) Student Drifters Program. While similar studies have been conducted previously, very few have directly attached current meters to drifters to quantify drifter slip. Furthermore, none have quantified slip of NEFSC drifters relative to the oceanographic-standard "CODE" drifter. The NEFSC drifter archive has over 1000 drifter tracks primarily off the New England coast. With a better understanding of NEFSC drifter slip, modelers can reliably use these tracks for model validation.

  8. On the reliable probing of discrete ‘plasma bullet’ propagation

    NASA Astrophysics Data System (ADS)

    Svarnas, P.; Gazeli, K.; Gkelios, A.; Amanatides, E.; Mataras, D.

    2018-04-01

    This report is devoted to the imaging of the spatiotemporal evolution of ‘plasma bullets’ during their propagation at atmospheric pressure. Although numerous studies have been realized on this topic with high gating rate cameras, triggering issues and statistical analyses of single-shot events over different cycles of the driving high voltage have not been discussed properly. The present work demonstrates the related difficulties faced due to the inherently erratic propagation of the bullets. A way of capturing and statistically analysing discrete bullet events is introduced, which is reliable even when low gating rate cameras are used and multiple bullets are formed within the voltage cycle. The method is based on plasma observations by means of two photoelectron multiplier tubes. It is suggested that these signals correlate better with bullet propagation events than the driving voltage or bullet current waveforms do, and allow either the elimination of issues arising from erratic propagation and hardware delays or at least the quantification of certain uncertainties. Herein, the entire setup, the related concept and the limits of accuracy are discussed in detail. Snapshots of the bullets are captured and commented on, with the bullets being produced by a sinusoidally driven single-electrode plasma jet reactor operating with helium. Finally, the instantaneous velocities of bullets on the order of 10⁴-10⁵ m s⁻¹ are measured and propagation phases are distinguished in good agreement with the bibliography.

  9. Windowed R-PDLF recoupling: a flexible and reliable tool to characterize molecular dynamics.

    PubMed

    Gansmüller, Axel; Simorre, Jean-Pierre; Hediger, Sabine

    2013-09-01

    This work focuses on the improvement of the R-PDLF heteronuclear recoupling scheme, a method that allows quantification of molecular dynamics up to the microsecond timescale in heterogeneous materials. We show how the stability of the sequence towards rf imperfections, one of the main sources of error of this technique, can be improved by the insertion of windows without irradiation into the basic elements of the symmetry-based recoupling sequence. The impact of this modification on the overall performance of the sequence in terms of scaling factor and homonuclear decoupling efficiency is evaluated. This study indicates the experimental conditions for which precise and reliable measurement of dipolar couplings can be obtained using the popular R18₁⁷ recoupling sequence, as well as alternative symmetry-based R sequences suited for fast MAS conditions. An analytical expression for the recoupled dipolar modulation has been derived that applies to a whole class of sequences with similar recoupling properties as R18₁⁷. This analytical expression provides an efficient and precise way to extract dipolar couplings from the experimental dipolar modulation curves. We hereby provide helpful tools and information for tailoring R-PDLF recoupling schemes to specific sample properties and hardware capabilities. This approach is particularly well suited for the study of materials with strong and heterogeneous molecular dynamics where a precise measurement of dipolar couplings is crucial. Copyright © 2013 Elsevier Inc. All rights reserved.

  10. Multiscale reconstruction for MR fingerprinting.

    PubMed

    Pierre, Eric Y; Ma, Dan; Chen, Yong; Badve, Chaitra; Griswold, Mark A

    2016-06-01

    To reduce the acquisition time needed to obtain reliable parametric maps with Magnetic Resonance Fingerprinting (MRF), an iterative-denoising algorithm is initialized by reconstructing the MRF image series at low image resolution. For subsequent iterations, the method enforces pixel-wise fidelity to the best-matching dictionary template and then enforces fidelity to the acquired data at slightly higher spatial resolution. After convergence, parametric maps with the desired spatial resolution are obtained through template matching of the final image series. The proposed method was evaluated on phantom and in vivo data using a highly undersampled, variable-density spiral trajectory and compared with the original MRF method. The benefits of additional sparsity constraints were also evaluated. When available, gold standard parameter maps were used to quantify the performance of each method. The proposed approach allowed convergence to accurate parametric maps with as few as 300 time points of acquisition, compared to 1000 in the original MRF work. Simultaneous quantification of T1, T2, proton density (PD), and B0 field variations in the brain was achieved in vivo for a 256 × 256 matrix in a total acquisition time of 10.2 s, representing a three-fold reduction in acquisition time. The proposed iterative multiscale reconstruction reliably increases MRF acquisition speed and accuracy. Magn Reson Med 75:2481-2492, 2016. © 2015 Wiley Periodicals, Inc.
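
    The final step of the workflow is template matching of the reconstructed image series against a simulated dictionary. The sketch below shows the standard dictionary search by maximum normalized inner product, assuming the dictionary and parameter grid are already available; the iterative multiscale reconstruction itself is not reproduced, and the toy dictionary is illustrative only.

```python
import numpy as np

def mrf_template_match(signals, dictionary, params):
    """Match measured MRF signal evolutions to a precomputed dictionary.

    signals:    (n_pixels, n_timepoints) measured fingerprints
    dictionary: (n_entries, n_timepoints) simulated fingerprints
    params:     (n_entries, 2) T1/T2 pairs used to simulate each entry
    Matching is by maximum magnitude of the normalized inner product,
    the standard MRF dictionary search.
    """
    d = dictionary / np.linalg.norm(dictionary, axis=1, keepdims=True)
    s = signals / np.linalg.norm(signals, axis=1, keepdims=True)
    scores = np.abs(s @ d.conj().T)          # (n_pixels, n_entries)
    best = np.argmax(scores, axis=1)
    return params[best]

# Tiny synthetic example: 3 dictionary atoms, 2 "pixels".
dictionary = np.array([[1, 0.8, 0.6], [1, 0.5, 0.2], [1, 0.2, 0.05]], float)
params = np.array([[1000, 100], [800, 60], [300, 20]], float)   # (T1, T2) in ms
pixels = np.array([[2, 1.0, 0.4], [0.5, 0.1, 0.03]], float)
print(mrf_template_match(pixels, dictionary, params))
```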

  11. Reliability and safety, and the risk of construction damage in mining areas

    NASA Astrophysics Data System (ADS)

    Skrzypczak, Izabela; Kogut, Janusz P.; Kokoszka, Wanda; Oleniacz, Grzegorz

    2018-04-01

    This article concerns the reliability and safety of building structures in mining areas, with a particular emphasis on the quantitative risk analysis of buildings. The issues of threat assessment and risk estimation in the design of facilities in mining exploitation areas are presented here, indicating the difficulties and ambiguities associated with their quantification and quantitative analysis. This article presents the concept of quantitative risk assessment of the impact of mining exploitation, in accordance with ISO 13824 [1]. The risk analysis is illustrated with the example of a building located within an area affected by mining exploitation.

  12. Interim reliability-evaluation program: analysis of the Browns Ferry, Unit 1, nuclear plant. Appendix C - sequence quantification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Mays, S.E.; Poloski, J.P.; Sullivan, W.H.

    1982-07-01

    This report describes a risk study of the Browns Ferry, Unit 1, nuclear plant. The study is one of four such studies sponsored by the NRC Office of Research, Division of Risk Assessment, as part of its Interim Reliability Evaluation Program (IREP), Phase II. This report is contained in four volumes: a main report and three appendixes. Appendix C generally describes the methods used to estimate accident sequence frequency values. Information is presented concerning the approach, example collection, failure data, candidate dominant sequences, uncertainty analysis, and sensitivity analysis.

  13. Quantification of lithium at ppm level in geological samples using nuclear reaction analysis.

    PubMed

    De La Rosa, Nathaly; Kristiansson, Per; Nilsson, E J Charlotta; Ros, Linus; Pallon, Jan; Skogby, Henrik

    2018-01-01

    Proton-induced (p,α) reactions are one type of nuclear reaction analysis (NRA) especially suitable for light element quantification. In the case of the lithium quantification presented in this work, accelerated protons with an energy of about 850 keV were used to induce the ⁷Li(p,α)⁴He reaction in standard reference and geological samples such as tourmaline and other Li-minerals. It is shown that this technique for lithium quantification allowed measurement of concentrations down to below one ppm. The possibility of relating the lithium content to the boron content in a single analysis was also demonstrated using tourmaline samples, both in absolute concentration and in lateral distribution. In addition, particle-induced X-ray emission (PIXE) was utilized as a complementary IBA technique for simultaneous mapping of elements heavier than sodium.

  14. Design and application of a synthetic DNA standard for real-time PCR analysis of microbial communities in a biogas digester.

    PubMed

    May, T; Koch-Singenstreu, M; Ebling, J; Stantscheff, R; Müller, L; Jacobi, F; Polag, D; Keppler, F; König, H

    2015-08-01

    A synthetic DNA fragment containing primer binding sites for the quantification of ten different microbial groups was constructed and evaluated as a reliable enumeration standard for quantitative real-time PCR (qPCR) analyses. This approach was verified, as an example, for the quantification of several methanogenic orders and families in a series of samples drawn from a mesophilic biogas plant. Furthermore, the total amounts of bacteria as well as the numbers of sulfate-reducing and propionic acid bacteria as potential methanogenic interaction partners were successfully determined. The obtained results indicated a highly dynamic microbial community structure which was distinctly affected by the organic loading rate, the substrate selection, and the amount of free volatile fatty acids in the fermenter. Methanosarcinales was the most predominant methanogenic order during the 3 months of observation despite fluctuating process conditions. During all trials, the modified quantification standard showed maximal reproducibility and efficiency, opening up a wide range of novel application options for this method.
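
    Standard curves for such a synthetic enumeration standard are typically built from serial dilutions of a plasmid or DNA fragment of known mass and length. The sketch below shows the usual copy-number calculation behind those dilutions; the construct length and the average mass per base pair are assumptions, not values from the paper.

```python
AVOGADRO = 6.022e23
G_PER_MOL_PER_BP = 650.0   # approximate average mass of one double-stranded base pair

def standard_copies(mass_ng: float, length_bp: int) -> float:
    """Copies of a double-stranded DNA standard contained in `mass_ng`."""
    grams = mass_ng * 1e-9
    return grams / (length_bp * G_PER_MOL_PER_BP) * AVOGADRO

# Example: 1 ng of a hypothetical 4000 bp plasmid carrying the synthetic standard.
print(f"{standard_copies(1.0, 4000):.3e} copies")
```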

  15. Optically transmitted and inductively coupled electric reference to access in vivo concentrations for quantitative proton-decoupled ¹³C magnetic resonance spectroscopy.

    PubMed

    Chen, Xing; Pavan, Matteo; Heinzer-Schweizer, Susanne; Boesiger, Peter; Henning, Anke

    2012-01-01

    This report describes our efforts on quantification of tissue metabolite concentrations in mM by nuclear Overhauser enhanced and proton decoupled ¹³C magnetic resonance spectroscopy and the Electric Reference To access In vivo Concentrations (ERETIC) method. Previous work showed that a calibrated synthetic magnetic resonance spectroscopy-like signal transmitted through an optical fiber and inductively coupled into a transmit/receive coil represents a reliable reference standard for in vivo ¹H magnetic resonance spectroscopy quantification on a clinical platform. In this work, we introduce a related implementation that enables simultaneous proton decoupling and ERETIC-based metabolite quantification and hence extends the applicability of the ERETIC method to nuclear Overhauser enhanced and proton decoupled in vivo ¹³C magnetic resonance spectroscopy. In addition, ERETIC signal stability under the influence of simultaneous proton decoupling is investigated. The proposed quantification method was cross-validated against internal and external reference standards on human skeletal muscle. The ERETIC signal intensity stability was 100.65 ± 4.18% over 3 months including measurements with and without proton decoupling. Glycogen and unsaturated fatty acid concentrations measured with the ERETIC method were in excellent agreement with internal creatine and external phantom reference methods, showing a difference of 1.85 ± 1.21% for glycogen and 1.84 ± 1.00% for unsaturated fatty acid between ERETIC and creatine-based quantification, whereas the deviations between external reference and creatine-based quantification are 6.95 ± 9.52% and 3.19 ± 2.60%, respectively. Copyright © 2011 Wiley Periodicals, Inc.

  16. Statistical image quantification toward optimal scan fusion and change quantification

    NASA Astrophysics Data System (ADS)

    Potesil, Vaclav; Zhou, Xiang Sean

    2007-03-01

    Recent advance of imaging technology has brought new challenges and opportunities for automatic and quantitative analysis of medical images. With broader accessibility of more imaging modalities for more patients, fusion of modalities/scans from one time point and longitudinal analysis of changes across time points have become the two most critical differentiators to support more informed, more reliable and more reproducible diagnosis and therapy decisions. Unfortunately, scan fusion and longitudinal analysis are both inherently plagued with increased levels of statistical errors. A lack of comprehensive analysis by imaging scientists and a lack of full awareness by physicians pose potential risks in clinical practice. In this paper, we discuss several key error factors affecting imaging quantification, studying their interactions, and introducing a simulation strategy to establish general error bounds for change quantification across time. We quantitatively show that image resolution, voxel anisotropy, lesion size, eccentricity, and orientation are all contributing factors to quantification error; and there is an intricate relationship between voxel anisotropy and lesion shape in affecting quantification error. Specifically, when two or more scans are to be fused at feature level, optimal linear fusion analysis reveals that scans with voxel anisotropy aligned with lesion elongation should receive a higher weight than other scans. As a result of such optimal linear fusion, we will achieve a lower variance than naïve averaging. Simulated experiments are used to validate theoretical predictions. Future work based on the proposed simulation methods may lead to general guidelines and error lower bounds for quantitative image analysis and change detection.

  17. Lowering the quantification limit of the Qubit™ RNA HS assay using RNA spike-in.

    PubMed

    Li, Xin; Ben-Dov, Iddo Z; Mauro, Maurizio; Williams, Zev

    2015-05-06

    RNA quantification is often a prerequisite for most RNA analyses such as RNA sequencing. However, the relatively low sensitivity and large sample consumption of traditional RNA quantification methods such as UV spectrophotometry, and even of the much more sensitive fluorescence-based RNA quantification assays such as the Qubit™ RNA HS Assay, are often inadequate for measuring minute levels of RNA isolated from limited cell and tissue samples and biofluids. Thus, there is a pressing need for a more sensitive method to reliably and robustly detect trace levels of RNA without interference from DNA. To improve the quantification limit of the Qubit™ RNA HS Assay, we spiked in a known quantity of RNA to achieve the minimum reading required by the assay. Samples containing trace amounts of RNA were then added to the spike-in and measured as a reading increase over the RNA spike-in baseline. We determined the accuracy and precision of reading increases between 1 and 20 pg/μL as well as the RNA specificity in this range, and compared them to those of RiboGreen®, another sensitive fluorescence-based RNA quantification assay. We then applied the Qubit™ Assay with RNA spike-in to quantify plasma RNA samples. The RNA spike-in improved the quantification limit of the Qubit™ RNA HS Assay 5-fold, from 25 pg/μL down to 5 pg/μL, while maintaining high specificity to RNA. This enabled quantification of RNA with an original concentration as low as 55.6 pg/μL, compared to 250 pg/μL for the standard assay, and decreased sample consumption from 5 to 1 ng. Plasma RNA samples that were not measurable by the Qubit™ RNA HS Assay were measurable by our modified method. The Qubit™ RNA HS Assay with RNA spike-in is able to quantify RNA with high specificity at a 5-fold lower concentration and uses 5-fold less sample than the standard Qubit™ Assay.
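
    The modified assay reads a trace sample as an increase over the spike-in baseline and corrects for the dilution into the assay tube. A minimal sketch of that arithmetic is shown below; the volumes, readings and the assumption that the fluorometer reports in-tube concentration are illustrative, and the exact dilution scheme of the published protocol is not reproduced.

```python
def sample_concentration(reading_with_sample, reading_spike_only,
                         sample_volume_ul, assay_volume_ul=200.0):
    """Estimate the concentration of a trace RNA sample from the reading
    increase over an RNA spike-in baseline.

    Both readings are taken as in-tube concentrations (pg/uL); the
    dilution correction (assay volume / sample volume) converts the
    in-tube increase back to the original sample concentration. The
    volumes used here are illustrative assumptions.
    """
    increase = reading_with_sample - reading_spike_only   # pg/uL in the tube
    return increase * assay_volume_ul / sample_volume_ul  # pg/uL in the sample

# Example: 1 uL of sample raises the reading from 25 to 30 pg/uL.
print(f"{sample_concentration(30.0, 25.0, sample_volume_ul=1.0):.0f} pg/uL")
```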

  18. Real-time quantitative PCR for retrovirus-like particle quantification in CHO cell culture.

    PubMed

    de Wit, C; Fautz, C; Xu, Y

    2000-09-01

    Chinese hamster ovary (CHO) cells have been widely used to manufacture recombinant proteins intended for human therapeutic uses. Retrovirus-like particles, which are apparently defective and non-infectious, have been detected in all CHO cells by electron microscopy (EM). To assure viral safety of CHO cell-derived biologicals, quantification of retrovirus-like particles in production cell culture and demonstration of sufficient elimination of such retrovirus-like particles by the down-stream purification process are required for product market registration worldwide. EM, with a detection limit of 1×10⁶ particles/ml, is the standard retrovirus-like particle quantification method. The whole process, which requires a large amount of sample (3-6 litres), is labour intensive, time consuming, expensive, and subject to significant assay variability. In this paper, a novel real-time quantitative PCR assay (TaqMan assay) has been developed for the quantification of retrovirus-like particles. Each retrovirus particle contains two copies of the viral genomic particle RNA (pRNA) molecule. Therefore, quantification of retrovirus particles can be achieved by quantifying the pRNA copy number, i.e. every two copies of retroviral pRNA are equivalent to one retrovirus-like particle. The TaqMan assay takes advantage of the 5'→3' exonuclease activity of Taq DNA polymerase and utilizes the PRISM 7700 Sequence Detection System of PE Applied Biosystems (Foster City, CA, U.S.A.) for automated pRNA quantification through a dual-labelled fluorogenic probe. The TaqMan quantification technique is highly comparable to the EM analysis. In addition, it offers significant advantages over the EM analysis, such as a higher sensitivity of less than 600 particles/ml, greater accuracy and reliability, higher sample throughput, more flexibility and lower cost. Therefore, the TaqMan assay should be used as a substitute for EM analysis for retrovirus-like particle quantification in CHO cell-based production systems. Copyright 2000 The International Association for Biologicals.
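
    Converting a TaqMan Ct value into particles per millilitre combines a standard-curve back-calculation with the two-copies-per-particle relationship stated in the abstract. The sketch below illustrates this; the standard-curve slope, intercept and sampled volume fraction are hypothetical values, not the assay's actual calibration.

```python
def copies_from_ct(ct, slope, intercept):
    """Convert a Ct value to pRNA copy number using a standard-curve fit
    of the form Ct = slope * log10(copies) + intercept."""
    return 10 ** ((ct - intercept) / slope)

def particles_per_ml(ct, slope, intercept, ml_per_reaction):
    """Retrovirus-like particles per mL: each particle carries two copies
    of packaged genomic RNA, so particles = pRNA copies / 2.

    `slope` and `intercept` would come from a transcript standard curve and
    `ml_per_reaction` is the volume of culture fluid assayed per reaction;
    the numbers in the example below are illustrative only.
    """
    copies = copies_from_ct(ct, slope, intercept)
    return copies / 2.0 / ml_per_reaction

# Hypothetical curve: slope -3.32 (about 100% efficiency), intercept Ct 40 at 1 copy.
print(f"{particles_per_ml(28.0, -3.32, 40.0, ml_per_reaction=0.01):.2e} particles/mL")
```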

  19. Three-Dimensional Echocardiographic Assessment of Left Heart Chamber Size and Function with Fully Automated Quantification Software in Patients with Atrial Fibrillation.

    PubMed

    Otani, Kyoko; Nakazono, Akemi; Salgo, Ivan S; Lang, Roberto M; Takeuchi, Masaaki

    2016-10-01

    Echocardiographic determination of left heart chamber volumetric parameters by using manual tracings during multiple beats is tedious in atrial fibrillation (AF). The aim of this study was to determine the usefulness of fully automated left chamber quantification software with single-beat three-dimensional transthoracic echocardiographic data sets in patients with AF. Single-beat full-volume three-dimensional transthoracic echocardiographic data sets were prospectively acquired during consecutive multiple cardiac beats (≥10 beats) in 88 patients with AF. In protocol 1, left ventricular volumes, left ventricular ejection fraction, and maximal left atrial volume were validated using automated quantification against the manual tracing method in identical beats in 10 patients. In protocol 2, automated quantification-derived averaged values from multiple beats were compared with the corresponding values obtained from the indexed beat in all patients. Excellent correlations of left chamber parameters between automated quantification and the manual method were observed (r = 0.88-0.98) in protocol 1. The time required for the analysis with the automated quantification method (5 min) was significantly less compared with the manual method (27 min) (P < .0001). In protocol 2, there were excellent linear correlations between the averaged left chamber parameters and the corresponding values obtained from the indexed beat (r = 0.94-0.99), and test-retest variability of left chamber parameters was low (3.5%-4.8%). Three-dimensional transthoracic echocardiography with fully automated quantification software is a rapid and reliable way to measure averaged values of left heart chamber parameters during multiple consecutive beats. Thus, it is a potential new approach for left chamber quantification in patients with AF in daily routine practice. Copyright © 2016 American Society of Echocardiography. Published by Elsevier Inc. All rights reserved.

  20. Unstable Expression of Commonly Used Reference Genes in Rat Pancreatic Islets Early after Isolation Affects Results of Gene Expression Studies.

    PubMed

    Kosinová, Lucie; Cahová, Monika; Fábryová, Eva; Týcová, Irena; Koblas, Tomáš; Leontovyč, Ivan; Saudek, František; Kříž, Jan

    2016-01-01

    The use of RT-qPCR provides a powerful tool for gene expression studies; however, the proper interpretation of the obtained data is crucially dependent on accurate normalization based on stable reference genes. Recently, strong evidence has emerged indicating that the expression of many commonly used reference genes may vary significantly due to diverse experimental conditions. The isolation of pancreatic islets is a complicated procedure which creates severe mechanical and metabolic stress, possibly leading to cellular damage and alteration of gene expression. Despite this, freshly isolated islets frequently serve as a control in various gene expression and intervention studies. The aim of our study was to determine the expression of 16 candidate reference genes and one gene of interest (F3) in isolated rat pancreatic islets during short-term cultivation in order to find a suitable endogenous control for gene expression studies. We compared the expression stability of the most commonly used reference genes and evaluated the reliability of relative and absolute quantification using RT-qPCR during 0-120 hrs after isolation. In freshly isolated islets, the expression of all tested genes was markedly depressed and it increased several times throughout the first 48 hrs of cultivation. We observed significant variability among samples at 0 and 24 hrs but substantial stabilization from 48 hrs onwards. During the first 48 hrs, relative quantification failed to reflect the real changes in the respective mRNA concentrations, while in the interval 48-120 hrs the relative expression generally paralleled the results determined by absolute quantification. Thus, our data call into question the suitability of relative quantification for gene expression analysis in pancreatic islets during the first 48 hrs of cultivation, as the results may be significantly affected by unstable expression of reference genes. However, this method could provide reliable information from 48 hrs onwards.
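
    The abstract's central caution is that relative quantification inherits any instability in the reference gene. The short sketch below makes that concrete with the classic 2^-ΔΔCt formula: the same target Ct values give a four-fold different answer when the reference Ct drifts by two cycles. All Ct values are invented for illustration.

```python
def fold_change_ddct(ct_target_sample, ct_ref_sample, ct_target_control, ct_ref_control):
    """Classic 2^-ΔΔCt relative quantification (assumes ~100% PCR efficiency)."""
    d_ct_sample = ct_target_sample - ct_ref_sample
    d_ct_control = ct_target_control - ct_ref_control
    return 2.0 ** -(d_ct_sample - d_ct_control)

# Hypothetical Ct values for a target gene against a reference gene, 48 h vs 0 h.
stable_ref   = fold_change_ddct(24.0, 18.0, 26.0, 18.0)   # reference unchanged
drifting_ref = fold_change_ddct(24.0, 20.0, 26.0, 18.0)   # reference Ct shifts by 2 cycles
print(stable_ref, drifting_ref)   # 4.0 vs 16.0: same target data, a 4-fold different answer
```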

  1. Intensive management modifies soil CO2 efflux in 6-year-old Pinus taeda L. stands

    Treesearch

    Lisa J. Samuelson; Kurt Johnsen; Tom Stokes; Weinlang Lu

    2004-01-01

    Intensive forestry may reduce net CO2 emission into the atmosphere by storing carbon in living biomass, dead organic matter and soil, and durable wood products. Because quantification of belowground carbon dynamics is important for reliable estimation of the carbon sequestered by intensively managed plantations, we examined soil CO2...

  2. A reliable methodology for quantitative extraction of fruit and vegetable physiological amino acids and their subsequent analysis with commonly available HPLC systems

    USDA-ARS?s Scientific Manuscript database

    High performance liquid chromatography of dabsyl derivatives of amino acids was employed for quantification of physiological amino acids in selected fruits and vegetables. This method was found to be particularly useful because the dabsyl derivatives of glutamine and citrulline were sufficiently se...

  3. A Standardized System of Training Intensity Guidelines for the Sports of Track and Field and Cross Country

    ERIC Educational Resources Information Center

    Belcher, Christopher P.; Pemberton, Cynthia Lee A.

    2012-01-01

    Accurate quantification of training intensity is an essential component of a training program (Rowbottom, 2000). A training program designed to optimize athlete performance abilities cannot be practically planned or implemented without a valid and reliable indication of training intensity and its effect on the physiological mechanisms of the human…

  4. Determination of humic and fulvic acids in commercial solid and liquid humic products by alkaline extraction and gravimetric determination

    USDA-ARS?s Scientific Manuscript database

    Increased use of humic substances in agriculture has generated intense interest among producers, consumers, and regulators for an accurate and reliable method for quantification of humic (HA) and fulvic acids (FA) in raw ores and products. Here we present a thoroughly validated method, the Humic Pro...

  5. PSEA-Quant: a protein set enrichment analysis on label-free and label-based protein quantification data.

    PubMed

    Lavallée-Adam, Mathieu; Rauniyar, Navin; McClatchy, Daniel B; Yates, John R

    2014-12-05

    The majority of large-scale proteomics quantification methods yield long lists of quantified proteins that are often difficult to interpret and poorly reproduced. Computational approaches are required to analyze such intricate quantitative proteomics data sets. We propose a statistical approach to computationally identify protein sets (e.g., Gene Ontology (GO) terms) that are significantly enriched with abundant proteins with reproducible quantification measurements across a set of replicates. To this end, we developed PSEA-Quant, a protein set enrichment analysis algorithm for label-free and label-based protein quantification data sets. It offers an alternative approach to classic GO analyses, models protein annotation biases, and allows the analysis of samples originating from a single condition, unlike analogous approaches such as GSEA and PSEA. We demonstrate that PSEA-Quant produces results complementary to GO analyses. We also show that PSEA-Quant provides valuable information about the biological processes involved in cystic fibrosis using label-free protein quantification of a cell line expressing a CFTR mutant. Finally, PSEA-Quant highlights the differences in the mechanisms taking place in the human, rat, and mouse brain frontal cortices based on tandem mass tag quantification. Our approach, which is available online, will thus improve the analysis of proteomics quantification data sets by providing meaningful biological insights.

  6. PSEA-Quant: A Protein Set Enrichment Analysis on Label-Free and Label-Based Protein Quantification Data

    PubMed Central

    2015-01-01

    The majority of large-scale proteomics quantification methods yield long lists of quantified proteins that are often difficult to interpret and poorly reproduced. Computational approaches are required to analyze such intricate quantitative proteomics data sets. We propose a statistical approach to computationally identify protein sets (e.g., Gene Ontology (GO) terms) that are significantly enriched with abundant proteins with reproducible quantification measurements across a set of replicates. To this end, we developed PSEA-Quant, a protein set enrichment analysis algorithm for label-free and label-based protein quantification data sets. It offers an alternative approach to classic GO analyses, models protein annotation biases, and allows the analysis of samples originating from a single condition, unlike analogous approaches such as GSEA and PSEA. We demonstrate that PSEA-Quant produces results complementary to GO analyses. We also show that PSEA-Quant provides valuable information about the biological processes involved in cystic fibrosis using label-free protein quantification of a cell line expressing a CFTR mutant. Finally, PSEA-Quant highlights the differences in the mechanisms taking place in the human, rat, and mouse brain frontal cortices based on tandem mass tag quantification. Our approach, which is available online, will thus improve the analysis of proteomics quantification data sets by providing meaningful biological insights. PMID:25177766

  7. Results of a Demonstration Assessment of Passive System Reliability Utilizing the Reliability Method for Passive Systems (RMPS)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bucknor, Matthew; Grabaskas, David; Brunett, Acacia

    2015-04-26

    Advanced small modular reactor designs include many advantageous design features such as passively driven safety systems that are arguably more reliable and cost effective relative to conventional active systems. Despite their attractiveness, a reliability assessment of passive systems can be difficult using conventional reliability methods due to the nature of passive systems. Simple deviations in boundary conditions can induce functional failures in a passive system, and intermediate or unexpected operating modes can also occur. As part of an ongoing project, Argonne National Laboratory is investigating various methodologies to address passive system reliability. The Reliability Method for Passive Systems (RMPS), a systematic approach for examining reliability, is one technique chosen for this analysis. This methodology is combined with the Risk-Informed Safety Margin Characterization (RISMC) approach to assess the reliability of a passive system and the impact of its associated uncertainties. For this demonstration problem, an integrated plant model of an advanced small modular pool-type sodium fast reactor with a passive reactor cavity cooling system is subjected to a station blackout using RELAP5-3D. This paper discusses important aspects of the reliability assessment, including deployment of the methodology, the uncertainty identification and quantification process, and identification of key risk metrics.

  8. EvaGreen real-time PCR protocol for specific 'Candidatus Phytoplasma mali' detection and quantification in insects.

    PubMed

    Monti, Monia; Martini, Marta; Tedeschi, Rosemarie

    2013-01-01

    In this paper, the validation and implementation of a real-time PCR protocol based on ribosomal protein genes is described for sensitive and specific quantification of 'Candidatus (Ca.) Phytoplasma mali' (apple proliferation phytoplasma, APP) in insects. The method combines the use of EvaGreen(®) dye as the detection chemistry and the specific primer pair rpAP15f-mod/rpAP15r3, which amplifies a fragment of 238 bp of the ribosomal protein rplV (rpl22) gene of APP. Primer specificity was demonstrated by running, in the same real-time PCR, 'Ca. Phytoplasma mali' samples alongside phytoplasmas belonging to the same group (16SrX), such as 'Ca. Phytoplasma pyri' and 'Ca. Phytoplasma prunorum', and phytoplasmas from different groups, such as 'Ca. Phytoplasma phoenicium' (16SrIX) and Flavescence dorée phytoplasma (16SrV). 'Ca. Phytoplasma mali' titre in insects was quantified using a specific approach that relates the concentration of the phytoplasma to insect 18S rDNA. Absolute quantification of APP and insect 18S rDNA was performed using standard curves prepared from serial dilutions of plasmids containing rplV-rpsC and a portion of the 18S rDNA gene, respectively. APP titre in insects was expressed as genome units (GU) of phytoplasma per picogram (pg) of individual insect 18S rDNA. 'Ca. Phytoplasma mali' concentration in the examined samples (Cacopsylla melanoneura overwintered adults) ranged from 5.94 × 10^2 to 2.51 × 10^4 GU/pg of insect 18S rDNA. Repeatability and reproducibility of the method were evaluated by calculating the coefficient of variation (CV%) of GU of phytoplasma and pg of the 18S rDNA fragment for both assays. CVs of less than 14% and 9% (reproducibility) and less than 10% and 11% (repeatability) were obtained for the phytoplasma and insect qPCR assays, respectively. The sensitivity of the method was also evaluated in comparison with a conventional 16S rDNA-based nested-PCR procedure. The method described has been demonstrated to be reliable, sensitive and specific for the quantification of 'Ca. Phytoplasma mali' in insects. The possibility of studying the trend of phytoplasma titre in the vectors will allow a deeper investigation of the epidemiology of the disease. Copyright © 2013 Elsevier Ltd. All rights reserved.
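
    As a rough illustration of the absolute-quantification arithmetic described above (two plasmid standard curves, with the titre expressed as GU of phytoplasma per pg of insect 18S rDNA), a minimal sketch follows; all dilution points and Ct values are invented and the curve parameters are not those of the published assay.

```python
import numpy as np

def fit_standard_curve(log10_quantity, ct):
    # Ct = slope * log10(quantity) + intercept
    return np.polyfit(log10_quantity, ct, 1)

def quantity_from_ct(ct, slope, intercept):
    return 10.0 ** ((ct - intercept) / slope)

# plasmid dilution series: phytoplasma target in genome units (GU),
# insect 18S rDNA fragment in pg (all values invented)
app_slope, app_int = fit_standard_curve(np.log10([1e7, 1e6, 1e5, 1e4, 1e3]),
                                        [17.1, 20.5, 23.9, 27.3, 30.8])
ins_slope, ins_int = fit_standard_curve(np.log10([10, 1, 0.1, 0.01, 0.001]),
                                        [18.0, 21.4, 24.8, 28.2, 31.6])

# one insect sample, one Ct value per assay
gu = quantity_from_ct(27.0, app_slope, app_int)
pg = quantity_from_ct(21.0, ins_slope, ins_int)
print(f"titre ~ {gu / pg:.2e} GU per pg of insect 18S rDNA")
```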

  9. Emphysema quantification from CT scans using novel application of diaphragm curvature estimation: comparison with standard quantification methods and pulmonary function data

    NASA Astrophysics Data System (ADS)

    Keller, Brad M.; Reeves, Anthony P.; Yankelevitz, David F.; Henschke, Claudia I.; Barr, R. Graham

    2009-02-01

    Emphysema is a disease of the lungs that destroys the alveolar air sacs and induces long-term respiratory dysfunction. CT scans allow for imaging of the anatomical basis of emphysema and quantification of the underlying disease state. Several measures have been introduced for the quantification of emphysema directly from CT data; most, however, are based on the analysis of density information provided by the CT scans, which varies by scanner and can be hard to standardize across sites and time. Given that one of the anatomical changes associated with the progression of emphysema is flattening of the diaphragm due to the loss of elasticity in the lung parenchyma, curvature analysis of the diaphragm can provide information about emphysema from CT. We therefore propose a new, non-density-based measure of diaphragm curvature that allows for robust quantification. To evaluate the new method, 24 whole-lung scans were analyzed using the ratios of lung height and diaphragm width to diaphragm height as curvature estimates, with the emphysema index used for comparison. Pearson correlation coefficients showed a strong trend for several of the proposed diaphragm curvature measures to have higher correlations, of up to r=0.57, with DLCO% and VA than did the emphysema index. Furthermore, we found the emphysema index to have only a 0.27 correlation with the proposed measures, indicating that the proposed measures evaluate different aspects of the disease.
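
    The curvature estimates above are simple ratios of anatomical measurements that are then correlated against pulmonary function data. A minimal sketch of that comparison, assuming made-up measurements for five subjects (none of these values come from the study):

```python
import numpy as np
from scipy.stats import pearsonr

# invented measurements for five subjects
lung_height      = np.array([24.0, 22.5, 26.1, 23.3, 25.0])  # cm
diaphragm_width  = np.array([21.0, 20.2, 22.4, 20.8, 21.7])  # cm
diaphragm_height = np.array([4.8, 3.1, 2.4, 4.2, 2.0])       # cm (flatter = smaller)
dlco_percent     = np.array([92.0, 71.0, 55.0, 85.0, 48.0])  # pulmonary function

curvature_estimates = {
    "lung height / diaphragm height": lung_height / diaphragm_height,
    "diaphragm width / diaphragm height": diaphragm_width / diaphragm_height,
}
for name, ratio in curvature_estimates.items():
    r, p = pearsonr(ratio, dlco_percent)
    print(f"{name}: r = {r:.2f} (p = {p:.3f})")
```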

  10. High spatial resolution free-breathing 3D late gadolinium enhancement cardiac magnetic resonance imaging in ischaemic and non-ischaemic cardiomyopathy: quantitative assessment of scar mass and image quality.

    PubMed

    Bizino, Maurice B; Tao, Qian; Amersfoort, Jacob; Siebelink, Hans-Marc J; van den Bogaard, Pieter J; van der Geest, Rob J; Lamb, Hildo J

    2018-04-06

    To compare breath-hold (BH) with navigated free-breathing (FB) 3D late gadolinium enhancement cardiac MRI (LGE-CMR). MATERIALS AND METHODS: Fifty-one patients were retrospectively included (34 ischaemic cardiomyopathy, 14 non-ischaemic cardiomyopathy, three discarded). BH and FB 3D phase-sensitive inversion recovery sequences were performed at 3T. FB datasets were reformatted into normal resolution (FB-NR, 1.46 × 1.46 × 10 mm) and high resolution (FB-HR, isotropic 0.91-mm voxels). Scar mass, scar edge sharpness (SES), SNR and CNR were compared using paired-samples t-test, Pearson correlation and Bland-Altman analysis. Scar mass was similar in BH and FB-NR (mean ± SD: 15.5±18.0 g vs. 15.5±16.9 g, p=0.997), with good correlation (r=0.953), and no bias (mean difference ± SD: 0.00±5.47 g). FB-NR significantly overestimated scar mass compared with FB-HR (15.5±16.9 g vs 14.4±15.6 g; p=0.007). FB-NR and FB-HR correlated well (r=0.988), but Bland-Altman demonstrated systematic bias (1.15±2.84 g). SES was similar in BH and FB-NR (p=0.947), but significantly higher in FB-HR than FB-NR (p<0.01). SNR and CNR were lower in BH than FB-NR (p<0.01), and lower in FB-HR than FB-NR (p<0.01). Navigated free-breathing 3D LGE-CMR allows reliable scar mass quantification comparable to breath-hold. During free-breathing, spatial resolution can be increased, resulting in improved sharpness and reduced scar mass. • Navigated free-breathing 3D late gadolinium enhancement is reliable for myocardial scar quantification. • High-resolution 3D late gadolinium enhancement increases scar sharpness. • Ischaemic and non-ischaemic cardiomyopathy patients can be imaged using free-breathing LGE-CMR.
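
    The agreement statistics quoted above (paired-samples t-test, Pearson correlation, and Bland-Altman bias with limits of agreement) reduce to a few lines of standard arithmetic; the sketch below uses invented scar-mass values for six patients rather than the study data.

```python
import numpy as np
from scipy import stats

scar_bh = np.array([12.1, 30.4, 4.7, 22.8, 9.5, 41.2])  # breath-hold scar mass, g
scar_fb = np.array([12.6, 29.1, 5.3, 23.5, 8.9, 40.0])  # free-breathing scar mass, g

t_stat, p_val = stats.ttest_rel(scar_bh, scar_fb)        # paired-samples t-test
r_val, _ = stats.pearsonr(scar_bh, scar_fb)              # correlation

diff = scar_bh - scar_fb
bias = diff.mean()                                       # Bland-Altman bias
loa = 1.96 * diff.std(ddof=1)                            # limits-of-agreement half-width
print(f"p = {p_val:.3f}, r = {r_val:.3f}")
print(f"bias = {bias:.2f} g (LoA {bias - loa:.2f} to {bias + loa:.2f} g)")
```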

  11. Statistical tools for transgene copy number estimation based on real-time PCR.

    PubMed

    Yuan, Joshua S; Burris, Jason; Stewart, Nathan R; Mentewab, Ayalew; Stewart, C Neal

    2007-11-01

    Compared with traditional transgene copy number detection technologies such as Southern blot analysis, real-time PCR provides a fast, inexpensive and high-throughput alternative. However, real-time PCR-based transgene copy number estimation tends to be ambiguous and subjective, stemming from the lack of proper statistical analysis and data quality control needed to render a reliable copy number estimate with a prediction value. Despite recent progress in the statistical analysis of real-time PCR, few publications have integrated these advancements into real-time PCR-based transgene copy number determination. Three experimental designs and four data-quality-control-integrated statistical models are presented. In the first method, external calibration curves are established for the transgene based on serially diluted templates. The Ct numbers from a control transgenic event and a putative transgenic event are compared to derive the transgene copy number or zygosity estimation. Simple linear regression and two-group t-test procedures were combined to model the data from this design. In the second experimental design, standard curves were generated for both an internal reference gene and the transgene, and the copy number of the transgene was compared with that of the internal reference gene. Multiple regression models and ANOVA models can be employed to analyze the data and perform quality control for this approach. In the third experimental design, transgene copy number is compared with the reference gene without a standard curve, based directly on fluorescence data. Two different multiple regression models were proposed to analyze the data, based on two different approaches to amplification efficiency integration. Our results highlight the importance of proper statistical treatment and quality control integration in real-time PCR-based transgene copy number determination. These statistical methods make real-time PCR-based transgene copy number estimation more reliable and precise, and proper confidence intervals are necessary for unambiguous prediction of transgene copy number. The four statistical methods are compared for their advantages and disadvantages. Moreover, the statistical methods can also be applied to other real-time PCR-based quantification assays, including transfection efficiency analysis and pathogen quantification.
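
    A minimal sketch of the second design described above, with standard curves for the transgene and an internal reference gene and the copy number taken from the ratio of back-calculated quantities; the dilution series, Ct values, and the assumption of a two-copy diploid reference are illustrative and do not reproduce the authors' regression or ANOVA models.

```python
import numpy as np

def standard_curve(log10_template, ct):
    slope, intercept = np.polyfit(log10_template, ct, 1)
    efficiency = 10.0 ** (-1.0 / slope) - 1.0   # amplification efficiency from slope
    return slope, intercept, efficiency

def quantity(ct, slope, intercept):
    return 10.0 ** ((ct - intercept) / slope)

dilutions = np.log10([100, 10, 1, 0.1, 0.01])   # relative template amounts
tg = standard_curve(dilutions, [21.0, 24.4, 27.8, 31.2, 34.6])  # transgene
rf = standard_curve(dilutions, [20.5, 23.9, 27.3, 30.7, 34.1])  # reference gene

# putative transgenic sample: one Ct per gene
q_tg = quantity(26.9, tg[0], tg[1])
q_rf = quantity(27.5, rf[0], rf[1])
copies = 2.0 * q_tg / q_rf   # reference assumed present at 2 copies per diploid genome
print(f"efficiencies {tg[2]:.2f}/{rf[2]:.2f}; estimated transgene copies ~ {copies:.1f}")
```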

  12. Quantification of Training and Competition Loads in Endurance Sports: Methods and Applications.

    PubMed

    Mujika, Iñigo

    2017-04-01

    Training quantification is basic to evaluate an endurance athlete's responses to training loads, ensure adequate stress/recovery balance, and determine the relationship between training and performance. Quantifying both external and internal workload is important, because external workload does not measure the biological stress imposed by the exercise sessions. Generally used quantification methods include retrospective questionnaires, diaries, direct observation, and physiological monitoring, often based on the measurement of oxygen uptake, heart rate, and blood lactate concentration. Other methods in use in endurance sports include speed measurement and the measurement of power output, made possible by recent technological advances such as power meters in cycling and triathlon. Among subjective methods of quantification, rating of perceived exertion stands out because of its wide use. Concurrent assessments of the various quantification methods allow researchers and practitioners to evaluate stress/recovery balance, adjust individual training programs, and determine the relationships between external load, internal load, and athletes' performance. This brief review summarizes the most relevant external- and internal-workload-quantification methods in endurance sports and provides practical examples of their implementation to adjust the training programs of elite athletes in accordance with their individualized stress/recovery balance.
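
    One widely used internal-load metric consistent with the rating-of-perceived-exertion approach mentioned above is session RPE (RPE multiplied by session duration), from which weekly load, monotony and strain can be derived; the sketch below uses an invented training week and is offered only as an example of such quantification, not as the author's protocol.

```python
import statistics

# one training week: (duration in minutes, session RPE on a 0-10 scale)
sessions = {"Mon": (75, 6), "Tue": (60, 4), "Thu": (90, 7), "Sat": (120, 5)}
week = ["Mon", "Tue", "Wed", "Thu", "Fri", "Sat", "Sun"]

daily_load = [sessions.get(d, (0, 0))[0] * sessions.get(d, (0, 0))[1] for d in week]

weekly_load = sum(daily_load)                                  # arbitrary units (AU)
monotony = statistics.mean(daily_load) / statistics.pstdev(daily_load)
strain = weekly_load * monotony
print(f"weekly load {weekly_load} AU, monotony {monotony:.2f}, strain {strain:.0f} AU")
```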

  13. Fifty Years of THERP and Human Reliability Analysis

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ronald L. Boring

    2012-06-01

    In 1962 at a Human Factors Society symposium, Alan Swain presented a paper introducing a Technique for Human Error Rate Prediction (THERP). This was followed in 1963 by a Sandia Laboratories monograph outlining basic human error quantification using THERP and, in 1964, by a special journal edition of Human Factors on quantification of human performance. Throughout the 1960s, Swain and his colleagues focused on collecting human performance data for the Sandia Human Error Rate Bank (SHERB), primarily in connection with supporting the reliability of nuclear weapons assembly in the US. In 1969, Swain met with Jens Rasmussen of Risø National Laboratory and discussed the applicability of THERP to nuclear power applications. By 1975, in WASH-1400, Swain had articulated the use of THERP for nuclear power applications, and the approach was finalized in the watershed publication of NUREG/CR-1278 in 1983. THERP is now 50 years old, and remains the best known and most widely used HRA method. In this paper, the author discusses the history of THERP, based on published reports and personal communication and interviews with Swain. The author also outlines the significance of THERP. The foundations of human reliability analysis are found in THERP: human failure events, task analysis, performance shaping factors, human error probabilities, dependence, event trees, recovery, and pre- and post-initiating events were all introduced in THERP. While THERP is not without its detractors, and it is showing signs of its age in the face of newer technological applications, the longevity of THERP is a testament to its tremendous significance. THERP started the field of human reliability analysis. This paper concludes with a discussion of THERP in the context of newer methods, which can be seen as extensions of or departures from Swain's pioneering work.

  14. EEG-Based Quantification of Cortical Current Density and Dynamic Causal Connectivity Generalized across Subjects Performing BCI-Monitored Cognitive Tasks

    PubMed Central

    Courellis, Hristos; Mullen, Tim; Poizner, Howard; Cauwenberghs, Gert; Iversen, John R.

    2017-01-01

    Quantification of dynamic causal interactions among brain regions constitutes an important component of conducting research and developing applications in experimental and translational neuroscience. Furthermore, cortical networks with dynamic causal connectivity in brain-computer interface (BCI) applications offer a more comprehensive view of brain states implicated in behavior than do individual brain regions. However, models of cortical network dynamics are difficult to generalize across subjects because current electroencephalography (EEG) signal analysis techniques are limited in their ability to reliably localize sources across subjects. We propose an algorithmic and computational framework for identifying cortical networks across subjects in which dynamic causal connectivity is modeled among user-selected cortical regions of interest (ROIs). We demonstrate the strength of the proposed framework using a “reach/saccade to spatial target” cognitive task performed by 10 right-handed individuals. Modeling of causal cortical interactions was accomplished through measurement of cortical activity using EEG, application of independent component clustering to identify cortical ROIs as network nodes, estimation of cortical current density using cortically constrained low resolution electromagnetic brain tomography (cLORETA), multivariate autoregressive (MVAR) modeling of representative cortical activity signals from each ROI, and quantification of the dynamic causal interaction among the identified ROIs using the Short-time direct Directed Transfer function (SdDTF). The resulting cortical network and the computed causal dynamics among its nodes exhibited physiologically plausible behavior, consistent with past results reported in the literature. This physiological plausibility of the results strengthens the framework's applicability in reliably capturing complex brain functionality, which is required by applications such as diagnostics and BCI. PMID:28566997

  15. The applications of statistical quantification techniques in nanomechanics and nanoelectronics.

    PubMed

    Mai, Wenjie; Deng, Xinwei

    2010-10-08

    Although nanoscience and nanotechnology have been developing for approximately two decades and have achieved numerous breakthroughs, experimental results from nanomaterials, with a higher noise level and poorer repeatability than those from bulk materials, remain a practical issue and challenge many quantification techniques for nanomaterials. This work proposes a physical-statistical modeling approach and a global-fitting statistical method that use all the available discrete data or quasi-continuous curves to quantify a few targeted physical parameters, which can provide more accurate, efficient and reliable parameter estimates and give reasonable physical explanations. In the resonance method for measuring the elastic modulus of ZnO nanowires (Zhou et al 2006 Solid State Commun. 139 222-6), our statistical technique gives E = 128.33 GPa instead of the original E = 108 GPa, and unveils a negative bias adjustment f(0). The cause is suggested to be systematic bias in measuring the length of the nanowires. In the electronic measurement of the resistivity of a Mo nanowire (Zach et al 2000 Science 290 2120-3), the proposed method automatically identified the importance of accounting for the Ohmic contact resistance in the model of Ohmic behavior in nanoelectronics experiments. The 95% confidence interval of resistivity in the proposed one-step procedure is determined to be 3.57 ± 0.0274 × 10^(-5) ohm cm, which should be a more reliable and precise estimate. The statistical quantification technique should find wide applications in obtaining better estimates in the presence of the various systematic errors and bias effects that become more significant at the nanoscale.
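
    The contact-resistance point above can be illustrated with a simple fit: if the total two-terminal resistance is measured for wire segments of different lengths, the linear model R_total = R_contact + (rho/A) * L separates the contact term (intercept) from the resistivity (slope). The sketch below uses synthetic measurements and an assumed 50 nm wire diameter; it is not the authors' one-step global-fitting procedure.

```python
import numpy as np

# synthetic two-terminal resistance of nanowire segments of different length
length_um = np.array([2.0, 4.0, 6.0, 8.0, 10.0])            # segment length, micrometres
r_total   = np.array([325.0, 492.0, 660.0, 831.0, 1002.0])  # measured resistance, ohm

diameter_cm = 50e-7                                         # assumed 50 nm wire diameter
area_cm2 = np.pi * (diameter_cm / 2.0) ** 2

slope, r_contact = np.polyfit(length_um * 1e-4, r_total, 1)  # length converted to cm
resistivity = slope * area_cm2                               # ohm*cm
print(f"contact resistance ~ {r_contact:.0f} ohm, resistivity ~ {resistivity:.2e} ohm cm")
```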

  16. [Health protection for rural workers: the need to standardize techniques for quantifying dermal exposure to pesticides].

    PubMed

    Selmi, Giuliana da Fontoura Rodrigues; Trapé, Angelo Zanaga

    2014-05-01

    Quantification of dermal exposure to pesticides in rural workers, used in risk assessment, can be performed with different techniques such as patches or whole body evaluation. However, the wide variety of methods can jeopardize the process by producing disparate results, depending on the principles in sample collection. A critical review was thus performed on the main techniques for quantifying dermal exposure, calling attention to this issue and the need to establish a single methodology for quantification of dermal exposure in rural workers. Such harmonization of different techniques should help achieve safer and healthier working conditions. Techniques that can provide reliable exposure data are an essential first step towards avoiding harm to workers' health.

  17. A novel TaqMan® assay for Nosema ceranae quantification in honey bee, based on the protein coding gene Hsp70.

    PubMed

    Cilia, Giovanni; Cabbri, Riccardo; Maiorana, Giacomo; Cardaio, Ilaria; Dall'Olio, Raffaele; Nanetti, Antonio

    2018-04-01

    Nosema ceranae is now a widespread honey bee pathogen with a high incidence in apiculture. Rapid and reliable detection and quantification methods are a matter of concern for the research community, which currently relies mainly on biomolecular techniques such as PCR, RT-PCR or HRMA. The aim of this technical paper is to provide a new qPCR assay, based on the highly conserved protein-coding gene Hsp70, to detect and quantify the microsporidian Nosema ceranae affecting the western honey bee Apis mellifera. The validation steps to assess the efficiency, sensitivity, specificity and robustness of the assay are also described. Copyright © 2018 Elsevier GmbH. All rights reserved.

  18. Methods and techniques for measuring gas emissions from agricultural and animal feeding operations.

    PubMed

    Hu, Enzhu; Babcock, Esther L; Bialkowski, Stephen E; Jones, Scott B; Tuller, Markus

    2014-01-01

    Emissions of gases from agricultural and animal feeding operations contribute to climate change, produce odors, degrade sensitive ecosystems, and pose a threat to public health. The complexity of the processes and environmental variables affecting these emissions complicates accurate and reliable quantification of gas fluxes and production rates. Although a plethora of measurement technologies exists, each method has limitations that hinder accurate quantification of gas fluxes. Despite a growing interest in gas emission measurements, only a few available technologies include real-time, continuous monitoring capabilities. Commonly applied state-of-the-art measurement frameworks and technologies were critically examined and discussed, and recommendations for future research to address real-time monitoring requirements for forthcoming regulation and management needs are provided.

  19. Simultaneous quantification of five major active components in capsules of the traditional Chinese medicine ‘Shu-Jin-Zhi-Tong’ by high performance liquid chromatography

    PubMed Central

    Yang, Xing-Xin; Zhang, Xiao-Xia; Chang, Rui-Miao; Wang, Yan-Wei; Li, Xiao-Ni

    2011-01-01

    A simple and reliable high performance liquid chromatography (HPLC) method has been developed for the simultaneous quantification of five major bioactive components in ‘Shu-Jin-Zhi-Tong’ capsules (SJZTC), for the purposes of quality control of this commonly prescribed traditional Chinese medicine. Under the optimum conditions, excellent separation was achieved, and the assay was fully validated in terms of linearity, precision, repeatability, stability and accuracy. The validated method was applied successfully to the determination of the five compounds in SJZTC samples from different production batches. The HPLC method can be used as a valid analytical method to evaluate the intrinsic quality of SJZTC. PMID:29403711

  20. A surrogate accelerated multicanonical Monte Carlo method for uncertainty quantification

    NASA Astrophysics Data System (ADS)

    Wu, Keyi; Li, Jinglai

    2016-09-01

    In this work we consider a class of uncertainty quantification problems where the system performance or reliability is characterized by a scalar parameter y. The performance parameter y is random due to the presence of various sources of uncertainty in the system, and our goal is to estimate the probability density function (PDF) of y. We propose to use the multicanonical Monte Carlo (MMC) method, a special type of adaptive importance sampling algorithm, to compute the PDF of interest. Moreover, we develop an adaptive algorithm to construct local Gaussian process surrogates to further accelerate the MMC iterations. With numerical examples we demonstrate that the proposed method can achieve several orders of magnitude of speedup over standard Monte Carlo methods.
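
    A minimal sketch of the surrogate-acceleration half of the idea (train a Gaussian process on a small number of expensive model runs, then estimate the PDF of y by cheap Monte Carlo on the surrogate) is shown below; it deliberately omits the multicanonical reweighting and local-surrogate adaptation of the paper, and the test function, kernel, and sample sizes are arbitrary assumptions.

```python
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

def expensive_model(x):
    # stand-in for a costly simulator returning the performance parameter y
    return np.sin(3.0 * x[:, 0]) + 0.5 * x[:, 1] ** 2

rng = np.random.default_rng(0)
x_train = rng.uniform(-1.0, 1.0, size=(40, 2))    # a handful of expensive runs
y_train = expensive_model(x_train)

gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.5), normalize_y=True)
gp.fit(x_train, y_train)

x_mc = rng.uniform(-1.0, 1.0, size=(100_000, 2))  # cheap samples on the surrogate
y_mc = gp.predict(x_mc)
pdf, edges = np.histogram(y_mc, bins=60, density=True)  # crude PDF estimate of y
```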

  1. Detection and Quantification of Viable and Nonviable Trypanosoma cruzi Parasites by a Propidium Monoazide Real-Time Polymerase Chain Reaction Assay

    PubMed Central

    Cancino-Faure, Beatriz; Fisa, Roser; Alcover, M. Magdalena; Jimenez-Marco, Teresa; Riera, Cristina

    2016-01-01

    Molecular techniques based on real-time polymerase chain reaction (qPCR) allow the detection and quantification of DNA but are unable to distinguish between signals from dead or live cells. Because of the lack of simple techniques to differentiate between viable and nonviable cells, the aim of this study was to optimize and evaluate a straightforward test based on propidium monoazide (PMA) dye action combined with a qPCR assay (PMA-qPCR) for the selective quantification of viable/nonviable epimastigotes of Trypanosoma cruzi. PMA has the ability to penetrate the plasma membrane of dead cells and covalently cross-link to the DNA during exposure to bright visible light, thereby inhibiting PCR amplification. Different concentrations of PMA (50–200 μM) and epimastigotes of the Maracay strain of T. cruzi (1 × 10^5 to 10 parasites/mL) were assayed; viable and nonviable parasites were tested and quantified by qPCR with a TaqMan probe specific for T. cruzi. In the PMA-qPCR assay optimized at 100 μM PMA, a significant qPCR signal reduction was observed in the nonviable versus viable epimastigotes treated with PMA, with a mean signal reduction of 2.5 logarithm units and a percentage of signal reduction > 98%, in all concentrations of parasites assayed. This signal reduction was also observed when PMA-qPCR was applied to a mixture of live/dead parasites, which allowed the detection of live cells, except when the concentration of live parasites was low (10 parasites/mL). The PMA-qPCR developed allows differentiation between viable and nonviable epimastigotes of T. cruzi and could thus be a potential method of parasite viability assessment and quantification. PMID:27139452
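
    The signal-reduction figures above follow from simple arithmetic on the qPCR-derived quantities for PMA-treated versus untreated aliquots, as sketched below with invented parasite-equivalent numbers.

```python
import math

untreated_eq   = 8.0e4   # qPCR-derived parasite equivalents/mL, no PMA (invented)
pma_treated_eq = 3.2e2   # parasite equivalents/mL after PMA + light exposure (invented)

log_reduction = math.log10(untreated_eq / pma_treated_eq)
percent_reduction = 100.0 * (1.0 - pma_treated_eq / untreated_eq)
print(f"{log_reduction:.1f} log-unit reduction, {percent_reduction:.1f}% signal reduction")
```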

  2. The role of PET quantification in cardiovascular imaging.

    PubMed

    Slomka, Piotr; Berman, Daniel S; Alexanderson, Erick; Germano, Guido

    2014-08-01

    Positron Emission Tomography (PET) has several clinical and research applications in cardiovascular imaging. Myocardial perfusion imaging with PET allows accurate global and regional measurements of myocardial perfusion, myocardial blood flow and function at stress and rest in one exam. Simultaneous assessment of function and perfusion by PET with quantitative software is currently the routine practice. Combination of ejection fraction reserve with perfusion information may improve the identification of severe disease. The myocardial viability can be estimated by quantitative comparison of fluorodeoxyglucose (18FDG) and rest perfusion imaging. The myocardial blood flow and coronary flow reserve measurements are becoming routinely included in the clinical assessment due to enhanced dynamic imaging capabilities of the latest PET/CT scanners. Absolute flow measurements allow evaluation of the coronary microvascular dysfunction and provide additional prognostic and diagnostic information for coronary disease. Standard quantitative approaches to compute myocardial blood flow from kinetic PET data in automated and rapid fashion have been developed for 13N-ammonia, 15O-water and 82Rb radiotracers. The agreement between software methods available for such analysis is excellent. Relative quantification of 82Rb PET myocardial perfusion, based on comparisons to normal databases, demonstrates high performance for the detection of obstructive coronary disease. New tracers, such as 18F-flurpiridaz, may allow further improvements in the disease detection. Computerized analysis of perfusion at stress and rest reduces the variability of the assessment as compared to visual analysis. PET quantification can be enhanced by precise coregistration with CT angiography. In emerging clinical applications, the potential to identify vulnerable plaques by quantification of atherosclerotic plaque uptake of 18FDG and 18F-sodium fluoride tracers in carotids, aorta and coronary arteries has been demonstrated.

  3. A GEIL flow cytometry consensus proposal for quantification of plasma cells: application to differential diagnosis between MGUS and myeloma.

    PubMed

    Frébet, Elise; Abraham, Julie; Geneviève, Franck; Lepelley, Pascale; Daliphard, Sylvie; Bardet, Valérie; Amsellem, Sophie; Guy, Julien; Mullier, Francois; Durrieu, Francoise; Venon, Marie-Dominique; Leleu, Xavier; Jaccard, Arnaud; Faucher, Jean-Luc; Béné, Marie C; Feuillard, Jean

    2011-05-01

    Flow cytometry is the sole available technique for quantification of tumor plasma cells in plasma-cell disorders, but so far, no consensus technique has been proposed. Here, we report on a standardized, simple, robust five-color flow cytometry protocol developed to characterize and quantify bone marrow tumor plasma cells, validated in a multicenter manner. CD36 was used to exclude red blood cell debris and erythroblasts; CD38 and CD138 to detect plasma cells; and immunoglobulin light chains, CD45, CD56, CD19, and CD117 + CD34 to simultaneously characterize abnormal plasma cells and quantify bone marrow precursors. This approach was applied in nine centers to 229 cases, including 25 controls. Tumor plasma cells were detected in 96.8% of cases, all exhibiting an immunoglobulin peak over 1 g/L. Calculation of a plasma-cell/precursor (PC/P) ratio allowed quantification of the plasma-cell burden independently of bone marrow hemodilution. The PC/P ratio yielded the best results in terms of sensitivity (81%) and specificity (84%) for differential diagnosis between MGUS and myeloma, when compared with other criteria. Combining the PC/P ratio and the percentage of abnormal plasma cells allowed the best differential diagnosis, but these criteria were discordant in 25% of cases. Indirect calculation of the CD19-negative PC/P ratio gave the best results in terms of sensitivity (87%). This standardized multiparameter flow cytometric approach allows for the detection and quantification of bone marrow tumor plasma-cell infiltration in nearly all cases of MGUS and myeloma, independently of debris and hemodilution. This approach may also prove useful for the detection of minimal residual disease. Copyright © 2010 International Clinical Cytometry Society.
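
    A minimal sketch of the two quantities emphasized above, the plasma-cell/precursor (PC/P) ratio and the sensitivity/specificity of a ratio threshold for separating MGUS from myeloma; the event counts and the 1.5 threshold are invented for illustration and are not the study's values.

```python
import numpy as np

plasma_cells = np.array([1200, 300, 2500, 150, 900, 4000])  # plasma-cell events/sample
precursors   = np.array([400, 600, 350, 700, 500, 300])     # precursor events/sample
is_myeloma   = np.array([True, False, True, False, True, True])

pc_p = plasma_cells / precursors          # plasma-cell / precursor ratio
predicted = pc_p > 1.5                    # illustrative decision threshold

tp = np.sum(predicted & is_myeloma)
fn = np.sum(~predicted & is_myeloma)
tn = np.sum(~predicted & ~is_myeloma)
fp = np.sum(predicted & ~is_myeloma)
print(f"sensitivity = {tp / (tp + fn):.2f}, specificity = {tn / (tn + fp):.2f}")
```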

  4. Fast microwave-assisted extraction of rotenone for its quantification in seeds of yam bean (Pachyrhizus sp.).

    PubMed

    Lautié, Emmanuelle; Rasse, Catherine; Rozet, Eric; Mourgues, Claire; Vanhelleputte, Jean-Paul; Quetin-Leclercq, Joëlle

    2013-02-01

    The aim of this study was to determine whether fast microwave-assisted extraction could be an alternative to conventional Soxhlet extraction for the quantification of rotenone in yam bean seeds by SPE and HPLC-UV. For this purpose, an experimental design was used to determine the optimal conditions for the microwave extraction. The quantification values for three accessions from two different species of yam bean seeds were then compared using the two kinds of extraction. A microwave extraction of 11 min at 55°C using methanol/dichloromethane (50:50) extracted rotenone as efficiently as, or more efficiently than, the 8-h Soxhlet extraction method and was less sensitive to moisture content. The selectivity, precision, trueness, accuracy, and limit of quantification of the method with microwave extraction were also demonstrated. © 2013 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  5. Quantification of trace metals in water using complexation and filter concentration.

    PubMed

    Dolgin, Bella; Bulatov, Valery; Japarov, Julia; Elish, Eyal; Edri, Elad; Schechter, Israel

    2010-06-15

    Various metals undergo complexation with organic reagents, resulting in colored products. In practice, their molar absorptivities allow for quantification in the ppm range. However, a proper pre-concentration of the colored complex on paper filter lowers the quantification limit to the low ppb range. In this study, several pre-concentration techniques have been examined and compared: filtering the already complexed mixture, complexation on filter, and dipping of dye-covered filter in solution. The best quantification has been based on the ratio of filter reflectance at a certain wavelength to that at zero metal concentration. The studied complex formations (Ni ions with TAN and Cd ions with PAN) involve production of nanoparticle suspensions, which are associated with complicated kinetics. The kinetics of the complexation of Ni ions with TAN has been investigated and optimum timing could be found. Kinetic optimization in regard to some interferences has also been suggested.

  6. Measuring Food Brand Awareness in Australian Children: Development and Validation of a New Instrument

    PubMed Central

    Boyland, Emma; Bauman, Adrian E.

    2015-01-01

    Background: Children’s exposure to food marketing is one environmental determinant of childhood obesity. Measuring the extent to which children are aware of food brands may be one way to estimate relative prior exposures to food marketing. This study aimed to develop and validate an Australian Brand Awareness Instrument (ABAI) to estimate children’s food brand awareness. Methods: The ABAI incorporated 30 flashcards depicting food/drink logos and their corresponding products. An abbreviated version was also created using 12 flashcards (ABAI-a). The ABAI was presented to 60 primary-school-aged children (7–11 years) attending two Australian after-school centres. A week later, the full version was repeated on approximately half the sample (n=27) and the abbreviated version was presented to the remaining half (n=30). The test-retest reliability of the ABAI was analysed using intra-class correlation coefficients. The concordance of the ABAI-a and the full version was assessed using Bland-Altman plots. The ‘nomological’ validity of the full tool was investigated by comparing children’s brand awareness with food-marketing-related variables (e.g. television habits, intake of heavily promoted foods). Results: Brand awareness increased with age (p<0.01) but was not significantly correlated with other variables. Bland-Altman analyses showed good agreement between the ABAI and ABAI-a. Reliability analyses revealed excellent agreement between the two administrations of the full ABAI. Conclusions: The ABAI was able to differentiate children’s varying levels of brand awareness. It was shown to be a valid and reliable tool and may allow quantification of brand awareness as a proxy measure for children’s prior food marketing exposure. PMID:26222624

  7. Next-Generation Genotyping by Digital PCR to Detect and Quantify the BRAF V600E Mutation in Melanoma Biopsies.

    PubMed

    Lamy, Pierre-Jean; Castan, Florence; Lozano, Nicolas; Montélion, Cécile; Audran, Patricia; Bibeau, Frédéric; Roques, Sylvie; Montels, Frédéric; Laberenne, Anne-Claire

    2015-07-01

    The detection of the BRAF V600E mutation in melanoma samples is used to select patients who should respond to BRAF inhibitors. Different techniques are routinely used to determine BRAF status in clinical samples. However, low tumor cellularity and tumor heterogeneity can affect the sensitivity of somatic mutation detection. Digital PCR (dPCR) is a next-generation genotyping method that clonally amplifies nucleic acids and allows the detection and quantification of rare mutations. Our aim was to evaluate the clinical routine performance of a new dPCR-based test to detect and quantify BRAF mutation load in 47 paraffin-embedded cutaneous melanoma biopsies. We compared the results obtained by dPCR with high-resolution melting curve analysis and pyrosequencing or with one of the allele-specific PCR methods available on the market. dPCR showed the lowest limit of detection. dPCR and allele-specific amplification detected the highest number of mutated samples. For the BRAF mutation load quantification both dPCR and pyrosequencing gave similar results with strong disparities in allele frequencies in the 47 tumor samples under study (from 0.7% to 79% of BRAF V600E mutations/sample). In conclusion, the four methods showed a high degree of concordance. dPCR was the more-sensitive method to reliably and easily detect mutations. Both pyrosequencing and dPCR could quantify the mutation load in heterogeneous tumor samples. Copyright © 2015 American Society for Investigative Pathology and the Association for Molecular Pathology. Published by Elsevier Inc. All rights reserved.
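
    Although the abstract does not detail the platform software used, standard digital-PCR quantification converts positive-partition counts to copy concentrations via a Poisson correction and then expresses the mutation load as the mutant fraction of total BRAF copies; a sketch with invented droplet counts and an assumed partition volume follows.

```python
import math

n_partitions = 17000                 # accepted droplets (invented)
partition_volume_ul = 0.00085        # ~0.85 nL per droplet (assumed)

pos_mutant   = 420                   # partitions positive for V600E (invented)
pos_wildtype = 5200                  # partitions positive for wild-type BRAF (invented)

def copies_per_ul(positives, total, vol_ul):
    lam = -math.log(1.0 - positives / total)   # Poisson mean copies per partition
    return lam / vol_ul

mut = copies_per_ul(pos_mutant, n_partitions, partition_volume_ul)
wt  = copies_per_ul(pos_wildtype, n_partitions, partition_volume_ul)
print(f"V600E load ~ {100.0 * mut / (mut + wt):.1f}% of BRAF copies")
```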

  8. Capillary electrophoresis with indirect UV detection for the determination of stabilizers and citrates present in human albumin solutions.

    PubMed

    Jaworska, Małgorzata; Cygan, Paulina; Wilk, Małgorzata; Anuszewska, Elzbieta

    2009-08-15

    Sodium caprylate and N-acetyltryptophan are the stabilizers most frequently used to protect albumin from aggregation or heat-induced denaturation. Citrates, excipients remaining after the fractionation process, can in turn be treated as a by-product that favors the leaching of aluminum out of glass containers while the albumin solution is stored. Given their ionic nature, these substances are well suited to analysis by capillary electrophoresis. CE methods were therefore proposed as a new approach for quality control of human albumin solutions in terms of the determination of stabilizers and citrate residues. Human albumin solutions, both 5% and 20%, from various manufacturers were tested. An indirect detection mode was used to provide sufficient detectability of analytes lacking chromophores. As anions, the analytes were separated with reversed electroosmotic flow. As a result of method optimization, two background electrolytes based on p-hydroxybenzoic acid and 2,6-pyridinedicarboxylic acid were selected for stabilizer and citrate separation, respectively. The optimized methods were successfully validated. For citrates, which require quantification below 100 μM, the method demonstrated a precision of less than 4% and a limit of detection of 4 μM. To check the accuracy and applicability of the new methods, the samples were additionally tested with selected reference methods. The proposed methods allow reliable quantification of stabilizers and citrates in human albumin solution, as confirmed by method validation and by comparison of results with reference methods. The CE methods are considered suitable for quality control while also simplifying and reducing the cost of analysis.

  9. Non-invasive dynamic near-infrared imaging and quantification of vascular leakage in vivo.

    PubMed

    Proulx, Steven T; Luciani, Paola; Alitalo, Annamari; Mumprecht, Viviane; Christiansen, Ailsa J; Huggenberger, Reto; Leroux, Jean-Christophe; Detmar, Michael

    2013-07-01

    Preclinical vascular research has been hindered by a lack of methods that can sensitively image and quantify vascular perfusion and leakage in vivo. In this study, we have developed dynamic near-infrared imaging methods to repeatedly visualize and quantify vascular leakage in mouse skin in vivo, and we have applied these methods to transgenic mice with overexpression of vascular endothelial growth factors VEGF-A or -C. Near-infrared dye conjugates were developed to identify a suitable vascular tracer that had a prolonged circulation lifetime and slow leakage into normal tissue after intravenous injection. Dynamic simultaneous imaging of ear skin and a large blood vessel in the leg enabled determination of the intravascular signal (blood volume fraction) from the tissue signal shortly after injection and quantifications of vascular leakage into the extravascular tissue over time. This method allowed for the sensitive detection of increased blood vascularity and leakage rates in K14-VEGF-A transgenic mice and also reliably measured inflammation-induced changes of vascularity and leakage over time in the same mice. Measurements after injection of recombinant VEGF-A surprisingly revealed increased blood vascular leakage and lymphatic clearance in K14-VEGF-C transgenic mice which have an expanded cutaneous lymphatic vessel network, potentially indicating unanticipated effects of lymphatic drainage on vascular leakage. Increased vascular leakage was also detected in subcutaneous tumors, confirming that the method can also be applied to deeper tissues. This new imaging method might facilitate longitudinal investigations of the in vivo effects of drug candidates, including angiogenesis inhibitors, in preclinical disease models.

  10. Non-invasive dynamic near-infrared imaging and quantification of vascular leakage in vivo

    PubMed Central

    Proulx, Steven T.; Luciani, Paola; Alitalo, Annamari; Mumprecht, Viviane; Christiansen, Ailsa J.; Huggenberger, Reto

    2013-01-01

    Preclinical vascular research has been hindered by a lack of methods that can sensitively image and quantify vascular perfusion and leakage in vivo. In this study, we have developed dynamic near-infrared imaging methods to repeatedly visualize and quantify vascular leakage in mouse skin in vivo, and we have applied these methods to transgenic mice with overexpression of vascular endothelial growth factors VEGF-A or -C. Near-infrared dye conjugates were developed to identify a suitable vascular tracer that had a prolonged circulation lifetime and slow leakage into normal tissue after intravenous injection. Dynamic simultaneous imaging of ear skin and a large blood vessel in the leg enabled determination of the intravascular signal (blood volume fraction) from the tissue signal shortly after injection and quantifications of vascular leakage into the extravascular tissue over time. This method allowed for the sensitive detection of increased blood vascularity and leakage rates in K14-VEGF-A transgenic mice and also reliably measured inflammation-induced changes of vascularity and leakage over time in the same mice. Measurements after injection of recombinant VEGF-A surprisingly revealed increased blood vascular leakage and lymphatic clearance in K14-VEGF-C transgenic mice which have an expanded cutaneous lymphatic vessel network, potentially indicating unanticipated effects of lymphatic drainage on vascular leakage. Increased vascular leakage was also detected in subcutaneous tumors, confirming that the method can also be applied to deeper tissues. This new imaging method might facilitate longitudinal investigations of the in vivo effects of drug candidates, including angiogenesis inhibitors, in preclinical disease models. PMID:23325334

  11. Asymmetric flow field-flow fractionation coupled to inductively coupled plasma mass spectrometry for the quantification of quantum dots bioconjugation efficiency.

    PubMed

    Menéndez-Miranda, Mario; Encinar, Jorge Ruiz; Costa-Fernández, José M; Sanz-Medel, Alfredo

    2015-11-27

    Hyphenation of asymmetric flow field-flow fractionation (AF4) with on-line elemental detection (inductively coupled plasma mass spectrometry, ICP-MS) is proposed as a powerful diagnostic tool for quantum dot bioconjugation studies. In particular, the conjugation effectiveness between a "model" monoclonal IgG antibody (Ab) and CdSe/ZnS core-shell quantum dots (QDs), surface-coated with an amphiphilic polymer, has been monitored here by this hybrid AF4-ICP-MS technique. Experimental conditions were optimized to achieve proper separation of the sought bioconjugates from any excess free reagents (QDs and antibodies) employed during the bioconjugation. The composition and pH of the carrier were found to be critical parameters for efficient separation and high species recovery from the AF4 channel. An ICP-MS instrument equipped with a triple quadrupole was selected as the elemental detector to enable sensitive and reliable simultaneous quantification of the elemental constituents, including sulfur, of the nanoparticulate species and the antibody. The hyphenated technique provided nanoparticle size-based separation, elemental detection, and composition analysis capabilities that proved instrumental for an in-depth investigation of the Ab-QD bioconjugation process. Moreover, the analytical strategy proposed here allowed us not only to clearly identify the bioconjugation reaction products but also to quantify nanoparticle:antibody bioconjugation efficiency. This is a key issue in the future development of analytical and bioanalytical photoluminescent QD applications. Copyright © 2015 Elsevier B.V. All rights reserved.

  12. A Methodological Approach to Quantifying Plyometric Intensity.

    PubMed

    Jarvis, Mark M; Graham-Smith, Phil; Comfort, Paul

    2016-09-01

    Jarvis, MM, Graham-Smith, P, and Comfort, P. A methodological approach to quantifying plyometric intensity. J Strength Cond Res 30(9): 2522-2532, 2016. In contrast to other methods of training, the quantification of plyometric exercise intensity is poorly defined. The purpose of this study was to evaluate the suitability of a range of neuromuscular and mechanical variables to describe the intensity of plyometric exercises. Seven male recreationally active subjects performed a series of 7 plyometric exercises. Neuromuscular activity was measured using surface electromyography (SEMG) at vastus lateralis (VL) and biceps femoris (BF). Surface electromyography data were divided into concentric (CON) and eccentric (ECC) phases of movement. Mechanical output was measured by ground reaction forces and processed to provide peak impact ground reaction force (PF), peak eccentric power (PEP), and impulse (IMP). Statistical analysis was conducted to assess the reliability (intraclass correlation coefficient) and sensitivity (smallest detectable difference) of all variables. Mean values of SEMG demonstrated high reliability (r ≥ 0.82), excluding ECC VL during a 40-cm drop jump (r = 0.74). PF, PEP, and IMP demonstrated high reliability (r ≥ 0.85). Statistical power for force variables was excellent (power = 1.0) and good for SEMG (power ≥ 0.86), excluding CON BF (power = 0.57). There was no significant difference (p > 0.05) in CON SEMG between exercises. Eccentric-phase SEMG only distinguished between exercises involving a landing and those that did not (percentage of maximal voluntary isometric contraction [%MVIC]: no landing, 65 ± 5; landing, 140 ± 8). Peak eccentric power, PF, and IMP all distinguished between exercises. In conclusion, CON neuromuscular activity does not appear to vary when intent is maximal, whereas ECC activity is dependent on the presence of a landing. Force characteristics provide a reliable and sensitive measure enabling precise description of intensity in plyometric exercises. The present findings provide coaches and scientists with an insightful and precise method of measuring intensity in plyometrics, which will allow for greater control of programming variables.

  13. Automatic Segmentation and Quantification of Filamentous Structures in Electron Tomography

    PubMed Central

    Loss, Leandro A.; Bebis, George; Chang, Hang; Auer, Manfred; Sarkar, Purbasha; Parvin, Bahram

    2016-01-01

    Electron tomography is a promising technology for imaging ultrastructures at nanoscale resolutions. However, image and quantitative analyses are often hindered by high levels of noise, staining heterogeneity, and material damage either as a result of the electron beam or sample preparation. We have developed and built a framework that allows for automatic segmentation and quantification of filamentous objects in 3D electron tomography. Our approach consists of three steps: (i) local enhancement of filaments by Hessian filtering; (ii) detection and completion (e.g., gap filling) of filamentous structures through tensor voting; and (iii) delineation of the filamentous networks. Our approach allows for quantification of filamentous networks in terms of their compositional and morphological features. We first validate our approach using a set of specifically designed synthetic data. We then apply our segmentation framework to tomograms of plant cell walls that have undergone different chemical treatments for polysaccharide extraction. The subsequent compositional and morphological analyses of the plant cell walls reveal their organizational characteristics and the effects of the different chemical protocols on specific polysaccharides. PMID:28090597

  14. Automatic Segmentation and Quantification of Filamentous Structures in Electron Tomography.

    PubMed

    Loss, Leandro A; Bebis, George; Chang, Hang; Auer, Manfred; Sarkar, Purbasha; Parvin, Bahram

    2012-10-01

    Electron tomography is a promising technology for imaging ultrastructures at nanoscale resolutions. However, image and quantitative analyses are often hindered by high levels of noise, staining heterogeneity, and material damage either as a result of the electron beam or sample preparation. We have developed and built a framework that allows for automatic segmentation and quantification of filamentous objects in 3D electron tomography. Our approach consists of three steps: (i) local enhancement of filaments by Hessian filtering; (ii) detection and completion (e.g., gap filling) of filamentous structures through tensor voting; and (iii) delineation of the filamentous networks. Our approach allows for quantification of filamentous networks in terms of their compositional and morphological features. We first validate our approach using a set of specifically designed synthetic data. We then apply our segmentation framework to tomograms of plant cell walls that have undergone different chemical treatments for polysaccharide extraction. The subsequent compositional and morphological analyses of the plant cell walls reveal their organizational characteristics and the effects of the different chemical protocols on specific polysaccharides.
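
    A minimal sketch of step (i) above, local enhancement of filaments from the image Hessian: the eigenvalues of the Gaussian-smoothed Hessian are computed at every pixel and bright ridge-like structures (one strongly negative principal curvature) are emphasized. A 2D toy image stands in for a tomogram slice, and this is only the enhancement step, not the tensor-voting completion or network delineation described in the paper.

```python
import numpy as np
from scipy.ndimage import gaussian_filter

def ridge_response(image, sigma=2.0):
    # second derivatives of the Gaussian-smoothed image (2D Hessian entries)
    hxx = gaussian_filter(image, sigma, order=(0, 2))
    hyy = gaussian_filter(image, sigma, order=(2, 0))
    hxy = gaussian_filter(image, sigma, order=(1, 1))
    # eigenvalues of the 2x2 Hessian in closed form
    half_trace = (hxx + hyy) / 2.0
    delta = np.sqrt(((hxx - hyy) / 2.0) ** 2 + hxy ** 2)
    lam_small = half_trace - delta
    # bright line-like structures have one strongly negative eigenvalue
    return np.maximum(-lam_small, 0.0)

image = np.zeros((128, 128))
image[60:63, 10:120] = 1.0          # synthetic bright filament
response = ridge_response(image, sigma=2.0)
```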

  15. Final Technical Report on Quantifying Dependability Attributes of Software Based Safety Critical Instrumentation and Control Systems in Nuclear Power Plants

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smidts, Carol; Huang, Funqun; Li, Boyuan

    With the current transition from analog to digital instrumentation and control systems in nuclear power plants, the number and variety of software-based systems have significantly increased. The sophisticated nature and increasing complexity of software make trust in these systems a significant challenge. The trust placed in a software system is typically termed software dependability. Software dependability analysis faces uncommon challenges because software systems' characteristics differ from those of hardware systems. The lack of systematic, science-based methods for quantifying the dependability attributes of software-based instrumentation and control systems in safety-critical applications has proved to be a significant inhibitor to the expanded use of modern digital technology in the nuclear industry. Dependability refers to the ability of a system to deliver a service that can be trusted. Dependability is commonly considered a general concept that encompasses different attributes, e.g., reliability, safety, security, availability and maintainability. Dependability research has progressed significantly over the last few decades. For example, various assessment models and/or design approaches have been proposed for software reliability, software availability and software maintainability. Advances have also been made to integrate multiple dependability attributes, e.g., integrating security with other dependability attributes, measuring availability and maintainability, modeling reliability and availability, quantifying reliability and security, exploring the dependencies between security and safety, and developing integrated analysis models. However, there is still a lack of understanding of the dependencies between the various dependability attributes as a whole and of how such dependencies are formed. To address the need for quantification and give a more objective basis to the review process, thereby reducing regulatory uncertainty, measures and methods are needed to assess dependability attributes early on, as well as throughout the life-cycle process of software development. In this research, extensive expert opinion elicitation is used to identify the measures and methods for assessing software dependability. Semi-structured questionnaires were designed to elicit expert knowledge. A new notation system, Causal Mechanism Graphing, was developed to extract and represent such knowledge. The Causal Mechanism Graphs were merged, thus obtaining the consensus knowledge shared by the domain experts. In this report, we focus on how software contributes to dependability; however, software dependability is not discussed separately from the context of systems or socio-technical systems. Specifically, this report focuses on software dependability, reliability, safety, security, availability, and maintainability. Our research was conducted in the sequence of stages found below; each stage is further examined in its corresponding chapter.
    Stage 1 (Chapter 2): Elicitation of causal maps describing the dependencies between dependability attributes. These causal maps were constructed using expert opinion elicitation. This chapter describes the expert opinion elicitation process, the questionnaire design, the causal map construction method and the causal maps obtained.
    Stage 2 (Chapter 3): Elicitation of the causal map describing the occurrence of the event of interest for each dependability attribute. The causal mechanisms for the “event of interest” were extracted for each of the software dependability attributes. The “event of interest” for a dependability attribute is generally considered to be the “attribute failure”, e.g. security failure. The extraction was based on the analysis of the expert elicitation results obtained in Stage 1.
    Stage 3 (Chapter 4): Identification of relevant measurements. Measures for the “events of interest” and their causal mechanisms were obtained from expert opinion elicitation for each of the software dependability attributes. The measures extracted are presented in this chapter.
    Stage 4 (Chapter 5): Assessment of the coverage of the causal maps via measures. Coverage was assessed to determine whether the measures obtained were sufficient to quantify software dependability, and what measures are further required.
    Stage 5 (Chapter 6): Identification of “missing” measures and measurement approaches for concepts not covered. New measures, for concepts that had not been covered sufficiently as determined in Stage 4, were identified using supplementary expert opinion elicitation as well as literature reviews.
    Stage 6 (Chapter 7): Building of a detailed quantification model based on the causal maps and measurements obtained. The ability to derive such a quantification model shows that the causal models and measurements derived from the previous stages (Stages 1 to 5) can form the technical basis for developing dependability quantification models.
    Scope restrictions have led us to prioritize this demonstration effort. The demonstration was focused on a critical system, i.e. the reactor protection system. For this system, a ranking of the software dependability attributes by nuclear stakeholders was developed. As expected for this application, the stakeholder ranking identified safety as the most critical attribute to be quantified. A safety quantification model limited to the requirements phase of development was built. Two case studies were conducted for verification. A preliminary control gate for software safety for the requirements stage was proposed and applied to the first case study. The control gate allows a cost-effective selection of the duration of the requirements phase.

  16. CASL Dakota Capabilities Summary

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Adams, Brian M.; Simmons, Chris; Williams, Brian J.

    2017-10-10

    The Dakota software project serves the mission of Sandia National Laboratories and supports a worldwide user community by delivering state-of-the-art research and robust, usable software for optimization and uncertainty quantification. These capabilities enable advanced exploration and risk-informed prediction with a wide range of computational science and engineering models. Dakota is the verification and validation (V&V) / uncertainty quantification (UQ) software delivery vehicle for CASL, allowing analysts across focus areas to apply these capabilities to myriad nuclear engineering analyses.

  17. Validation and evaluation of an HPLC methodology for the quantification of the potent antimitotic compound (+)-discodermolide in the Caribbean marine sponge Discodermia dissoluta.

    PubMed

    Valderrama, Katherine; Castellanos, Leonardo; Zea, Sven

    2010-08-01

    The sponge Discodermia dissoluta is the source of the potent antimitotic compound (+)-discodermolide. The relatively abundant and shallow populations of this sponge in Santa Marta, Colombia, allow for studies to evaluate the natural and biotechnological supply options for (+)-discodermolide. In this work, an RP-HPLC-UV methodology for the quantification of (+)-discodermolide in sponge samples was tested and validated. Our protocol for extracting this compound from the sponge included lyophilization, exhaustive methanol extraction, partitioning using water and dichloromethane, purification of the organic fraction on RP-18 cartridges and, finally, retrieval of the (+)-discodermolide in the methanol-water (80:20 v/v) fraction. This fraction was injected into an HPLC system with an Xterra RP-18 column and a detection wavelength of 235 nm. The calibration curve was linear, making it possible to calculate the limits of detection and quantification in these experiments. The intra-day and inter-day precision showed relative standard deviations lower than 5%. The accuracy, determined as the percentage recovery, was 99.4%. Nine samples of the sponge from the Bahamas, Bonaire, Curaçao and Santa Marta had concentrations of (+)-discodermolide ranging from 5.3 to 29.3 μg/g of wet sponge. This methodology is quick and simple, allowing for quantification in sponges from natural environments, in situ cultures or dissociated cells.
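
    The validation figures above (linear calibration, limits of detection and quantification, percentage recovery) follow standard formulas (LOD = 3.3*sigma/slope, LOQ = 10*sigma/slope); the sketch below applies them to invented calibration and spike-recovery data, not the published values.

```python
import numpy as np

conc = np.array([0.5, 1.0, 2.5, 5.0, 10.0, 25.0])         # standard concentrations, ug/mL
area = np.array([11.8, 24.1, 60.7, 119.5, 241.0, 601.2])  # peak areas (invented)

slope, intercept = np.polyfit(conc, area, 1)              # linear calibration
sigma = (area - (slope * conc + intercept)).std(ddof=2)   # residual standard deviation

lod = 3.3 * sigma / slope
loq = 10.0 * sigma / slope
recovery = 100.0 * 4.97 / 5.0                             # found / spiked, invented values
print(f"LOD ~ {lod:.2f} ug/mL, LOQ ~ {loq:.2f} ug/mL, recovery ~ {recovery:.1f}%")
```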

  18. Methods to Detect Nitric Oxide and its Metabolites in Biological Samples

    PubMed Central

    Bryan, Nathan S.; Grisham, Matthew B.

    2007-01-01

    Nitric oxide (NO) methodology is a complex and often confusing science and the focus of many debates and discussions concerning NO biochemistry. NO is involved in many physiological processes, including the regulation of blood pressure, immune response and neural communication. Therefore, its accurate detection and quantification are critical to understanding health and disease. Due to the extremely short physiological half-life of this gaseous free radical, alternative strategies for the detection of reaction products of NO biochemistry have been developed. The quantification of NO metabolites in biological samples provides valuable information with regard to in vivo NO production, bioavailability and metabolism. Simply sampling a single compartment such as blood or plasma may not always provide an accurate assessment of whole-body NO status, particularly in tissues. Therefore, extrapolation of plasma or blood NO status to specific tissues of interest is no longer a valid approach. As a result, methods continue to be developed and validated which allow the detection and quantification of NO and NO-related products/metabolites in multiple compartments of experimental animals in vivo. This review is not an exhaustive or comprehensive discussion of all methods available for the detection of NO, but rather a description of the most commonly used and practical methods which allow accurate and sensitive quantification of NO products/metabolites in multiple biological matrices under normal physiological conditions. PMID:17664129

  19. Processing and domain selection: Quantificational variability effects

    PubMed Central

    Harris, Jesse A.; Clifton, Charles; Frazier, Lyn

    2014-01-01

    Three studies investigated how readers interpret sentences with variable quantificational domains, e.g., The army was mostly in the capital, where mostly may quantify over individuals or parts (Most of the army was in the capital) or over times (The army was in the capital most of the time). It is proposed that a general conceptual economy principle, No Extra Times (Majewski 2006, in preparation), discourages the postulation of potentially unnecessary times, and thus favors the interpretation quantifying over parts. Disambiguating an ambiguously quantified sentence to a quantification over times interpretation was rated as less natural than disambiguating it to a quantification over parts interpretation (Experiment 1). In an interpretation questionnaire, sentences with similar quantificational variability were constructed so that both interpretations of the sentence would require postulating multiple times; this resulted in the elimination of the preference for a quantification over parts interpretation, suggesting the parts preference observed in Experiment 1 is not reducible to a lexical bias of the adverb mostly (Experiment 2). An eye movement recording study showed that, in the absence of prior evidence for multiple times, readers exhibit greater difficulty when reading material that forces a quantification over times interpretation than when reading material that allows a quantification over parts interpretation (Experiment 3). These experiments contribute to understanding readers’ default assumptions about the temporal properties of sentences, which is essential for understanding the selection of a domain for adverbial quantifiers and, more generally, for understanding how situational constraints influence sentence processing. PMID:25328262

  20. Reliability evaluation methodology for NASA applications

    NASA Technical Reports Server (NTRS)

    Taneja, Vidya S.

    1992-01-01

    Liquid rocket engine technology has been characterized by the development of complex systems containing a large number of subsystems, components, and parts. The trend toward even larger and more complex systems is continuing. Liquid rocket engineers have focused mainly on performance-driven designs to increase the payload delivery of a launch vehicle for a given mission. In other words, although the failure of a single inexpensive part or component may cause the failure of the system, reliability in general has not been considered as one of the system parameters like cost or performance. Until now, quantification of reliability has not been a consideration during system design and development in the liquid rocket industry. Engineers and managers have long been aware that the reliability of a system increases during development, but no serious attempts have been made to quantify reliability. As a result, a method to quantify reliability during design and development is needed. This includes the application of probabilistic models which utilize both engineering analysis and test data. Classical methods require the use of operating data for reliability demonstration. In contrast, the method described in this paper is based on similarity, analysis, and testing combined with Bayesian statistical analysis.
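
    As a concrete illustration of the Bayesian element mentioned above, the sketch below updates a reliability estimate with test evidence using the conjugate Beta-Binomial model. The prior parameters and test counts are hypothetical; the paper's actual combination of similarity, engineering analysis and test data is not reproduced here.

    ```python
    from scipy.stats import beta

    # Prior belief about component reliability from similarity and engineering
    # analysis, encoded as a Beta(a, b) distribution (values are illustrative).
    a_prior, b_prior = 9.0, 1.0

    # Hypothetical test evidence: 48 successes in 50 firings.
    successes, failures = 48, 2

    # Conjugate Bayesian update: posterior is Beta(a + successes, b + failures).
    a_post, b_post = a_prior + successes, b_prior + failures

    posterior = beta(a_post, b_post)
    print("posterior mean reliability:", posterior.mean())
    print("90% credible interval:", posterior.interval(0.90))
    ```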

  1. Measurement of inter- and intra-annual variability of landscape fire activity at a continental scale: The Australian case

    Treesearch

    Grant J. Williamson; Lynda D. Prior; Matt Jolly; Mark A. Cochrane; Brett P. Murphy; David M. J. S. Bowman

    2016-01-01

    Climate dynamics at diurnal, seasonal and inter-annual scales shape global fire activity, although difficulties of assembling reliable fire and meteorological data with sufficient spatio-temporal resolution have frustrated quantification of this variability. Using Australia as a case study, we combine data from 4760 meteorological stations with 12 years of satellite-...

  2. High-performance liquid chromatographic method for the determination of dansyl-polyamines

    Treesearch

    Subhash C. Minocha; Rakesh Minocha; Cheryl A. Robie

    1990-01-01

    This paper describes a fast, reliable, and sensitive technique for the separation and quantification of dansylated polyamines by high-performance liquid chromatography. Using a small 33 x 4.6 mm I.D., 3 μm particle size, C18 reversed-phase cartridge column and a linear gradient of acetonitrile-heptanesulfonate (10 mM, pH 3.4...

  3. Refinements of the attending equations for several spectral methods that provide improved quantification of β-carotene and/or lycopene in selected foods

    USDA-ARS?s Scientific Manuscript database

    Developing and maintaining maximal levels of carotenoids in fruits and vegetables that contain them is a concern of the produce industry. Toward this end, reliable methods for quantifying lycopene and β-carotene, two of the major health-enhancing carotenoids, are necessary. The goal of this resear...

  4. A new semiquantitative method for evaluation of metastasis progression.

    PubMed

    Volarevic, A; Ljujic, B; Volarevic, V; Milovanovic, M; Kanjevac, T; Lukic, A; Arsenijevic, N

    2012-01-01

    Although recent technical advancements are directed toward developing novel assays and methods for the detection of micro- and macrometastases, there are still no reports of reliable, simple-to-use imaging software that could be used for the detection and quantification of metastasis in tissue sections. We herein report a new semiquantitative method for evaluation of metastasis progression in a well-established 4T1 orthotopic mouse model of breast cancer metastasis. The new semiquantitative method presented here was implemented using the Autodesk AutoCAD 2012 program, a computer-aided design program used primarily for preparing technical drawings in two dimensions. By using the Autodesk AutoCAD 2012 software-aided graphical evaluation we managed to detect each metastatic lesion and precisely calculated the average percentage of lung and liver tissue parenchyma with metastasis in 4T1 tumor-bearing mice. The data were highly specific and consistent with descriptive histological analysis, confirming the reliability and accuracy of the AutoCAD 2012 software as a new method for quantification of metastatic lesions. The new semiquantitative method using AutoCAD 2012 software provides a novel approach for the estimation of metastatic progression in histological tissue sections.

  5. Modeling and Quantification of Team Performance in Human Reliability Analysis for Probabilistic Risk Assessment

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Jeffrey C. Joe; Ronald L. Boring

    Probabilistic Risk Assessment (PRA) and Human Reliability Assessment (HRA) are important technical contributors to the United States (U.S.) Nuclear Regulatory Commission's (NRC) risk-informed and performance-based approach to regulating U.S. commercial nuclear activities. Furthermore, all currently operating commercial nuclear power plants (NPPs) in the U.S. are required by federal regulation to be staffed with crews of operators. Yet, aspects of team performance are underspecified in most HRA methods that are widely used in the nuclear industry. There are a variety of "emergent" team cognition and teamwork errors (e.g., communication errors) that are 1) distinct from individual human errors, and 2) important to understand from a PRA perspective. The lack of robust models or quantification of team performance is an issue that affects the accuracy and validity of HRA methods and models, leading to significant uncertainty in estimating human error probabilities (HEPs). This paper describes research with the objective to model and quantify team dynamics and teamwork within NPP control room crews for risk-informed applications, thereby improving the technical basis of HRA, which in turn improves the risk-informed approach the NRC uses to regulate the U.S. commercial nuclear industry.

  6. Simultaneous quantification of 21 water soluble vitamin circulating forms in human plasma by liquid chromatography-mass spectrometry.

    PubMed

    Meisser Redeuil, Karine; Longet, Karin; Bénet, Sylvie; Munari, Caroline; Campos-Giménez, Esther

    2015-11-27

    This manuscript reports a validated analytical approach for the quantification of 21 water-soluble vitamins and their main circulating forms in human plasma. Isotope dilution-based sample preparation consisted of protein precipitation using acidic methanol enriched with stable isotope-labelled internal standards. Separation was achieved by reversed-phase liquid chromatography and detection performed by tandem mass spectrometry in positive electrospray ionization mode. Instrumental lower limits of detection and quantification reached <0.1-10 nM and 0.2-25 nM, respectively. Commercially available pooled human plasma was used to build matrix-matched calibration curves ranging from 2-500, 5-1250, 20-5000 or 150-37500 nM depending on the analyte. The overall performance of the method was considered adequate, with intra- and inter-day precision of 2.8-20.9% and 5.2-20.0%, respectively, and average accuracy of 91-108%. Recovery experiments were also performed and averaged 82%. This analytical approach was then applied for the quantification of circulating water-soluble vitamins in single-donor human plasma samples. The present report provides a sensitive and reliable approach for the quantification of water-soluble vitamins and their main circulating forms in human plasma. In the future, the application of this analytical approach will allow a more comprehensive assessment of water-soluble vitamin nutritional status and support bioavailability studies in humans. Copyright © 2015 Elsevier B.V. All rights reserved.

  7. Multiplex quantification of four DNA targets in one reaction with Bio-Rad droplet digital PCR system for GMO detection

    PubMed Central

    Dobnik, David; Štebih, Dejan; Blejec, Andrej; Morisset, Dany; Žel, Jana

    2016-01-01

    The advantages of digital PCR technology are by now well documented. One way to achieve better cost efficiency of the technique is to use it in a multiplexing strategy. Droplet digital PCR platforms, which include two fluorescence filters, support at least duplex reactions; with some development and optimization, higher-order multiplexing is possible. The present study not only shows the development of multiplex assays in droplet digital PCR, but also presents the first thorough evaluation of several parameters of such multiplex digital PCR. Two 4-plex assays were developed for quantification of 8 different DNA targets (7 genetically modified maize events and a maize endogene). Per assay, two of the targets were labelled with one fluorophore and two with another. As current analysis software does not support analysis beyond duplex reactions, a new R- and Shiny-based web application analysis tool (http://bit.ly/ddPCRmulti) was developed that automates the analysis of 4-plex results. In conclusion, the two developed multiplex assays are suitable for quantification of GMO maize events, and the same approach can be used in any other field with a need for accurate and reliable quantification of multiple DNA targets. PMID:27739510
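
    For context, target concentration in droplet digital PCR is derived from the fraction of positive droplets through a Poisson correction. The sketch below shows that generic calculation only; the counts are hypothetical and the nominal droplet volume is an assumption, not a value taken from the paper.

    ```python
    import numpy as np

    def copies_per_ul(positive, total, droplet_volume_nl=0.85):
        """Estimate target concentration from droplet counts via the standard
        Poisson correction used in digital PCR: lambda = -ln(1 - p)."""
        p = positive / total
        lam = -np.log(1.0 - p)                   # mean copies per droplet
        return lam / (droplet_volume_nl * 1e-3)  # copies per microlitre of reaction

    # Hypothetical counts for one target in a 4-plex reaction
    print(copies_per_ul(positive=3200, total=15000))
    ```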

  8. Real-time PCR based on SYBR-Green I fluorescence: an alternative to the TaqMan assay for a relative quantification of gene rearrangements, gene amplifications and micro gene deletions.

    PubMed

    Ponchel, Frederique; Toomes, Carmel; Bransfield, Kieran; Leong, Fong T; Douglas, Susan H; Field, Sarah L; Bell, Sandra M; Combaret, Valerie; Puisieux, Alain; Mighell, Alan J; Robinson, Philip A; Inglehearn, Chris F; Isaacs, John D; Markham, Alex F

    2003-10-13

    Real-time PCR is increasingly being adopted for RNA quantification and genetic analysis. At present the most popular real-time PCR assay is based on the hybridisation of a dual-labelled probe to the PCR product, and the development of a signal by loss of fluorescence quenching as PCR degrades the probe. Though this so-called 'TaqMan' approach has proved easy to optimise in practice, the dual-labelled probes are relatively expensive. We have designed a new assay based on SYBR-Green I binding that is quick, reliable, easily optimised and compares well with the published assay. Here we demonstrate its general applicability by measuring copy number in three different genetic contexts: the quantification of a gene rearrangement (T-cell receptor excision circles (TREC) in peripheral blood mononuclear cells); the detection and quantification of GLI, MYC-C and MYC-N gene amplification in cell lines and cancer biopsies; and the detection of deletions in the OPA1 gene in dominant optic atrophy. Our assay has important clinical applications, providing accurate diagnostic results in less time, from less biopsy material and at less cost than currently employed assays such as FISH or Southern blotting.
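
    Relative quantification of copy number from real-time PCR data is commonly done with the comparative Ct approach, normalising the target locus to a reference locus and a calibrator sample. The sketch below illustrates that generic calculation under the assumption of near-equal amplification efficiencies; the Ct values are hypothetical and this is not necessarily the exact arithmetic used in the paper.

    ```python
    def relative_copy_number(ct_target_test, ct_ref_test,
                             ct_target_ctrl, ct_ref_ctrl,
                             efficiency=2.0):
        """Comparative Ct estimate of target copy number in a test sample
        relative to a normal control, normalised to a reference locus.
        Assumes near-identical amplification efficiencies for both amplicons."""
        d_ct_test = ct_target_test - ct_ref_test
        d_ct_ctrl = ct_target_ctrl - ct_ref_ctrl
        return efficiency ** -(d_ct_test - d_ct_ctrl)

    # Hypothetical Ct values; scaling by 2 assumes a two-copy control locus
    print(2 * relative_copy_number(20.1, 24.3, 23.0, 24.2))
    ```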

  9. Target analyte quantification by isotope dilution LC-MS/MS directly referring to internal standard concentrations--validation for serum cortisol measurement.

    PubMed

    Maier, Barbara; Vogeser, Michael

    2013-04-01

    Isotope dilution LC-MS/MS methods used in the clinical laboratory typically involve multi-point external calibration in each analytical series. Our aim was to test the hypothesis that determination of target analyte concentrations directly derived from the ratio of the target analyte peak area to the peak area of a corresponding stable isotope-labelled internal standard compound [direct isotope dilution analysis (DIDA)] may not be inferior to conventional external calibration with respect to accuracy and reproducibility. Quality control samples and human serum pools were analysed by LC-MS/MS in a comparative validation protocol with cortisol as an exemplary analyte. Accuracy and reproducibility were compared between quantification involving either a six-point external calibration function or a result calculation based merely on peak area ratios of unlabelled and labelled analyte. Both quantification approaches resulted in similar accuracy and reproducibility. For specified analytes, reliable analyte quantification directly derived from the ratio of peak areas of labelled and unlabelled analyte, without the need for a time-consuming multi-point calibration series, is possible. This DIDA approach is of considerable practical importance for the application of LC-MS/MS in the clinical laboratory, where short turnaround times often have high priority.
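
    A minimal sketch of the peak-area-ratio calculation that DIDA relies on: the analyte concentration follows directly from the ratio of analyte to internal-standard peak areas and the known internal-standard concentration. The numbers and the unity response factor are illustrative assumptions, not values from the study.

    ```python
    def dida_concentration(area_analyte, area_internal_standard,
                           internal_standard_conc_nmol_l,
                           response_factor=1.0):
        """Direct isotope dilution estimate: analyte concentration from the
        peak-area ratio to a stable-isotope-labelled internal standard of known
        concentration. The response factor corrects for any difference in
        detector response between labelled and unlabelled compound (assumed
        to be 1.0 here for illustration)."""
        return (area_analyte / area_internal_standard) \
            * internal_standard_conc_nmol_l / response_factor

    # Hypothetical cortisol measurement
    print(dida_concentration(area_analyte=152000, area_internal_standard=98000,
                             internal_standard_conc_nmol_l=250.0))
    ```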

  10. Detection and quantification of genetically modified organisms using very short, locked nucleic acid TaqMan probes.

    PubMed

    Salvi, Sergio; D'Orso, Fabio; Morelli, Giorgio

    2008-06-25

    Many countries have introduced mandatory labeling requirements for foods derived from genetically modified organisms (GMOs). Real-time quantitative polymerase chain reaction (PCR) based on the TaqMan probe chemistry has become the most widely used method to support these regulations; moreover, event-specific PCR is the preferred method in GMO detection because of its high specificity, which is based on the flanking sequence of the exogenous integrant. The aim of this study was to evaluate the use of very short (eight-nucleotide long), locked nucleic acid (LNA) TaqMan probes in 5'-nuclease PCR assays for the detection and quantification of GMOs. Classic TaqMan and LNA TaqMan probes were compared for the analysis of the maize MON810 transgene. The performance of the two types of probes was tested on the maize endogenous reference gene hmga, the CaMV 35S promoter, and the hsp70/cryIA(b) construct, as well as on the event-specific 5'-integration junction of MON810, using plasmids as standard reference molecules. The results of our study demonstrate that the LNA 5'-nuclease PCR assays represent a valid and reliable analytical system for the detection and quantification of transgenes. Application of very short LNA TaqMan probes to GMO quantification can simplify the design of 5'-nuclease assays.

  11. Feasibility and accuracy of dual-layer spectral detector computed tomography for quantification of gadolinium: a phantom study.

    PubMed

    van Hamersvelt, Robbert W; Willemink, Martin J; de Jong, Pim A; Milles, Julien; Vlassenbroek, Alain; Schilham, Arnold M R; Leiner, Tim

    2017-09-01

    The aim of this study was to evaluate the feasibility and accuracy of dual-layer spectral detector CT (SDCT) for the quantification of clinically encountered gadolinium concentrations. The cardiac chamber of an anthropomorphic thoracic phantom was equipped with 14 tubular inserts containing different gadolinium concentrations, ranging from 0 to 26.3 mg/mL (0.0, 0.1, 0.2, 0.4, 0.5, 1.0, 2.0, 3.0, 4.0, 5.1, 10.6, 15.7, 20.7 and 26.3 mg/mL). Images were acquired using a novel 64-detector row SDCT system at 120 and 140 kVp. Acquisitions were repeated five times to assess reproducibility. Regions of interest (ROIs) were drawn on three slices per insert. A spectral plot was extracted for every ROI and mean attenuation profiles were fitted to known attenuation profiles of water and pure gadolinium using in-house-developed software to calculate gadolinium concentrations. At both 120 and 140 kVp, excellent correlations between scan repetitions and true and measured gadolinium concentrations were found (R > 0.99, P < 0.001; ICCs > 0.99, CI 0.99-1.00). Relative mean measurement errors stayed below 10% down to 2.0 mg/mL true gadolinium concentration at 120 kVp and below 5% down to 1.0 mg/mL true gadolinium concentration at 140 kVp. SDCT allows for accurate quantification of gadolinium at both 120 and 140 kVp. Lowest measurement errors were found for 140 kVp acquisitions. • Gadolinium quantification may be useful in patients with contraindication to iodine. • Dual-layer spectral detector CT allows for overall accurate quantification of gadolinium. • Interscan variability of gadolinium quantification using SDCT material decomposition is excellent.
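
    The material-decomposition step described above can be illustrated as a small least-squares problem: the mean spectral attenuation profile of each ROI is expressed as a linear combination of water and gadolinium basis profiles, and the gadolinium coefficient gives the concentration. The basis vectors and measurement below are toy values, not the vendor software's actual spectral data.

    ```python
    import numpy as np

    def gadolinium_concentration(measured_profile, water_profile, gd_profile_per_mg_ml):
        """Least-squares decomposition of a mean ROI attenuation profile into
        water and gadolinium basis profiles; the gadolinium coefficient is the
        estimated concentration. All inputs here are hypothetical."""
        basis = np.column_stack([water_profile, gd_profile_per_mg_ml])
        coeffs, *_ = np.linalg.lstsq(basis, measured_profile, rcond=None)
        water_fraction, gd_mg_ml = coeffs
        return gd_mg_ml

    # Toy example with 5 spectral bins
    water = np.array([1.00, 0.95, 0.90, 0.86, 0.83])
    gd_per_mg_ml = np.array([0.40, 0.30, 0.22, 0.16, 0.12])
    measured = water + 5.0 * gd_per_mg_ml          # simulated 5 mg/mL insert
    print(gadolinium_concentration(measured, water, gd_per_mg_ml))  # ~5.0
    ```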

  12. Development and validation of InnoQuant™, a sensitive human DNA quantitation and degradation assessment method for forensic samples using high copy number mobile elements Alu and SVA.

    PubMed

    Pineda, Gina M; Montgomery, Anne H; Thompson, Robyn; Indest, Brooke; Carroll, Marion; Sinha, Sudhir K

    2014-11-01

    There is a constant need in forensic casework laboratories for an improved way to increase the first-pass success rate of forensic samples. The recent advances in mini-STR analysis, SNP, and Alu marker systems have now made it possible to analyze highly compromised samples, yet few tools are available that can simultaneously provide an assessment of quantity, inhibition, and degradation in a sample prior to genotyping. Currently there are several different approaches used for fluorescence-based quantification assays which provide a measure of quantity and inhibition. However, a system which can also assess the extent of degradation in a forensic sample will be a useful tool for DNA analysts. Possessing this information prior to genotyping will allow an analyst to make more informed downstream decisions for the successful typing of a forensic sample without unnecessarily consuming DNA extract. Real-time PCR provides a reliable method for determining the amount and quality of amplifiable DNA in a biological sample. Alu elements are Short Interspersed Elements (SINEs), approximately 300 bp insertions which are distributed throughout the human genome in large copy number. The use of an internal primer to amplify a segment of an Alu element allows for human specificity as well as high sensitivity when compared to a single-copy target. The advantage of an Alu system is the presence of a large number (>1000) of fixed insertions in every human genome, which minimizes the individual-specific variation possible when using a multi-copy target quantification system. This study utilizes two independent retrotransposon genomic targets to obtain quantification of an 80 bp "short" DNA fragment and a 207 bp "long" DNA fragment in a degraded DNA sample in the multiplex system InnoQuant™. The ratio of the two quantitation values provides a "Degradation Index", or a qualitative measure of a sample's extent of degradation. The Degradation Index was found to be predictive of the observed loss of STR markers and alleles as degradation increases. Use of a synthetic target as an internal positive control (IPC) provides an additional assessment for the presence of PCR inhibitors in the test sample. In conclusion, a DNA-based qualitative/quantitative/inhibition assessment system that accurately predicts the status of a biological sample will be a valuable tool for deciding which DNA test kit to utilize and how much target DNA to use when processing compromised forensic samples for DNA testing. Copyright © 2014 Elsevier Ireland Ltd. All rights reserved.
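
    The Degradation Index described above is simply the ratio of the two quantitation values; a short sketch makes the calculation explicit. The example concentrations are hypothetical and interpretation thresholds are laboratory-specific.

    ```python
    def degradation_index(short_target_conc, long_target_conc):
        """Degradation Index as described for the assay: ratio of the 80 bp
        'short' target concentration to the 207 bp 'long' target concentration.
        Values near 1 suggest intact DNA; larger values indicate increasing
        degradation."""
        return short_target_conc / long_target_conc

    # Hypothetical quantification results in ng/uL
    print(degradation_index(short_target_conc=1.20, long_target_conc=0.15))  # 8.0
    ```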

  13. A new analytical method for quantification of olive and palm oil in blends with other vegetable edible oils based on the chromatographic fingerprints from the methyl-transesterified fraction.

    PubMed

    Jiménez-Carvelo, Ana M; González-Casado, Antonio; Cuadros-Rodríguez, Luis

    2017-03-01

    A new analytical method for the quantification of olive oil and palm oil in blends with other vegetable edible oils (canola, safflower, corn, peanut, seeds, grapeseed, linseed, sesame and soybean) using normal-phase liquid chromatography and applying chemometric tools was developed. The procedure for obtaining the chromatographic fingerprint of the methyl-transesterified fraction from each blend is described. The multivariate quantification methods used were Partial Least Squares Regression (PLS-R) and Support Vector Regression (SVR). The quantification results were evaluated by several parameters, such as the Root Mean Square Error of Validation (RMSEV), Mean Absolute Error of Validation (MAEV) and Median Absolute Error of Validation (MdAEV). It has to be highlighted that with the newly proposed analytical method the chromatographic analysis takes only eight minutes; the results obtained showed the potential of this method and allowed quantification of mixtures of olive oil and palm oil with other vegetable oils. Copyright © 2016 Elsevier B.V. All rights reserved.
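
    A compact sketch of the PLS-R branch of the chemometric workflow using scikit-learn: fingerprints are regressed against known blend percentages and the validation errors named in the abstract (RMSEV, MAEV, MdAEV) are computed. The data here are random placeholders standing in for chromatographic fingerprints, and the number of latent variables is an arbitrary choice.

    ```python
    import numpy as np
    from sklearn.cross_decomposition import PLSRegression
    from sklearn.model_selection import train_test_split
    from sklearn.metrics import mean_squared_error, mean_absolute_error

    # Hypothetical data: X holds chromatographic fingerprints (samples x time points),
    # y holds the known olive-oil percentage of each blend.
    rng = np.random.default_rng(0)
    X = rng.random((60, 300))
    y = rng.uniform(0, 100, 60)

    X_cal, X_val, y_cal, y_val = train_test_split(X, y, test_size=0.3, random_state=0)

    pls = PLSRegression(n_components=5).fit(X_cal, y_cal)
    y_pred = pls.predict(X_val).ravel()

    rmsev = np.sqrt(mean_squared_error(y_val, y_pred))   # RMSEV
    maev = mean_absolute_error(y_val, y_pred)            # MAEV
    mdaev = np.median(np.abs(y_val - y_pred))            # MdAEV
    print(rmsev, maev, mdaev)
    ```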

  14. A Probabilistic Framework for Peptide and Protein Quantification from Data-Dependent and Data-Independent LC-MS Proteomics Experiments

    PubMed Central

    Richardson, Keith; Denny, Richard; Hughes, Chris; Skilling, John; Sikora, Jacek; Dadlez, Michał; Manteca, Angel; Jung, Hye Ryung; Jensen, Ole Nørregaard; Redeker, Virginie; Melki, Ronald; Langridge, James I.; Vissers, Johannes P.C.

    2013-01-01

    A probability-based quantification framework is presented for the calculation of relative peptide and protein abundance in label-free and label-dependent LC-MS proteomics data. The results are accompanied by credible intervals and regulation probabilities. The algorithm takes into account data uncertainties via Poisson statistics modified by a noise contribution that is determined automatically during an initial normalization stage. Protein quantification relies on assignments of component peptides to the acquired data. These assignments are generally of variable reliability and may not be present across all of the experiments comprising an analysis. It is also possible for a peptide to be identified to more than one protein in a given mixture. For these reasons the algorithm accepts a prior probability of peptide assignment for each intensity measurement. The model is constructed in such a way that outliers of any type can be automatically reweighted. Two discrete normalization methods can be employed. The first method is based on a user-defined subset of peptides, while the second method relies on the presence of a dominant background of endogenous peptides for which the concentration is assumed to be unaffected. Normalization is performed using the same computational and statistical procedures employed by the main quantification algorithm. The performance of the algorithm will be illustrated on example data sets, and its utility demonstrated for typical proteomics applications. The quantification algorithm supports relative protein quantification based on precursor and product ion intensities acquired by means of data-dependent methods, originating from all common isotopically-labeled approaches, as well as label-free ion intensity-based data-independent methods. PMID:22871168
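
    As a rough illustration of the ideas above (peptide assignment probabilities as weights and Poisson-like counting noise), the sketch below computes a single protein's relative abundance between two conditions with an approximate uncertainty. It is a deliberately simplified stand-in, not the published Bayesian framework.

    ```python
    import numpy as np

    def protein_ratio(intensities_a, intensities_b, assign_prob):
        """Assignment-probability-weighted relative abundance of one protein
        between two conditions, with an approximate Poisson-based uncertainty.
        A simplified illustration only; the published framework uses a full
        Bayesian model with noise-modified Poisson statistics and outlier
        reweighting."""
        ia = np.asarray(intensities_a, dtype=float)
        ib = np.asarray(intensities_b, dtype=float)
        w = np.asarray(assign_prob, dtype=float)

        total_a, total_b = np.sum(w * ia), np.sum(w * ib)
        ratio = total_a / total_b

        # Poisson-like variance propagation (var ~ intensity) for a rough interval
        var_a, var_b = np.sum(w**2 * ia), np.sum(w**2 * ib)
        rel_se = np.sqrt(var_a / total_a**2 + var_b / total_b**2)
        return ratio, ratio * rel_se

    # Hypothetical peptide intensities and assignment probabilities
    print(protein_ratio([1200, 800, 300], [600, 420, 160], [0.95, 0.90, 0.60]))
    ```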

  15. Computer-aided Assessment of Regional Abdominal Fat with Food Residue Removal in CT

    PubMed Central

    Makrogiannis, Sokratis; Caturegli, Giorgio; Davatzikos, Christos; Ferrucci, Luigi

    2014-01-01

    Rationale and Objectives: Separate quantification of abdominal subcutaneous and visceral fat regions is essential to understand the role of regional adiposity as a risk factor in epidemiological studies. Fat quantification is often based on computed tomography (CT) because fat density is distinct from other tissue densities in the abdomen. However, the presence of intestinal food residues with densities similar to fat may reduce fat quantification accuracy. We introduce an abdominal fat quantification method in CT with interest in food residue removal. Materials and Methods: Total fat was identified in the feature space of Hounsfield units and divided into subcutaneous and visceral components using model-based segmentation. Regions of food residues were identified and removed from visceral fat using a machine learning method integrating intensity, texture, and spatial information. Cost-weighting and bagging techniques were investigated to address class imbalance. Results: We validated our automated food residue removal technique against semimanual quantifications. Our feature selection experiments indicated that joint intensity and texture features produce the highest classification accuracy at 95%. We explored generalization capability using k-fold cross-validation and receiver operating characteristic (ROC) analysis with variable k. Losses in accuracy and area under the ROC curve between maximum and minimum k were limited to 0.1% and 0.3%. We validated tissue segmentation against reference semimanual delineations. The Dice similarity scores were as high as 93.1 for subcutaneous fat and 85.6 for visceral fat. Conclusions: Computer-aided regional abdominal fat quantification is a reliable computational tool for large-scale epidemiological studies. Our proposed intestinal food residue reduction scheme is an original contribution of this work. Validation experiments indicate very good accuracy and generalization capability. PMID:24119354
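
    The first step above, identifying fat in the Hounsfield-unit feature space, is often approximated in practice by a fixed HU window. The sketch below uses the commonly cited adipose range of about -190 to -30 HU as an assumption; the paper's actual feature-space model is more involved and is not reproduced here.

    ```python
    import numpy as np

    def fat_mask(ct_hu, low=-190, high=-30):
        """Voxel-wise adipose-tissue mask from a CT image in Hounsfield units.
        The [-190, -30] HU window is a commonly used fat range, not necessarily
        the exact feature-space model of the paper."""
        return (ct_hu >= low) & (ct_hu <= high)

    # Hypothetical axial slice (HU values); fat area in cm^2 given pixel spacing
    slice_hu = np.random.default_rng(1).integers(-200, 200, size=(512, 512))
    pixel_area_cm2 = 0.07 * 0.07  # e.g. 0.7 mm isotropic in-plane spacing
    print(fat_mask(slice_hu).sum() * pixel_area_cm2)
    ```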

  16. Computer-aided assessment of regional abdominal fat with food residue removal in CT.

    PubMed

    Makrogiannis, Sokratis; Caturegli, Giorgio; Davatzikos, Christos; Ferrucci, Luigi

    2013-11-01

    Separate quantification of abdominal subcutaneous and visceral fat regions is essential to understand the role of regional adiposity as a risk factor in epidemiological studies. Fat quantification is often based on computed tomography (CT) because fat density is distinct from other tissue densities in the abdomen. However, the presence of intestinal food residues with densities similar to fat may reduce fat quantification accuracy. We introduce an abdominal fat quantification method in CT with interest in food residue removal. Total fat was identified in the feature space of Hounsfield units and divided into subcutaneous and visceral components using model-based segmentation. Regions of food residues were identified and removed from visceral fat using a machine learning method integrating intensity, texture, and spatial information. Cost-weighting and bagging techniques were investigated to address class imbalance. We validated our automated food residue removal technique against semimanual quantifications. Our feature selection experiments indicated that joint intensity and texture features produce the highest classification accuracy at 95%. We explored generalization capability using k-fold cross-validation and receiver operating characteristic (ROC) analysis with variable k. Losses in accuracy and area under the ROC curve between maximum and minimum k were limited to 0.1% and 0.3%. We validated tissue segmentation against reference semimanual delineations. The Dice similarity scores were as high as 93.1 for subcutaneous fat and 85.6 for visceral fat. Computer-aided regional abdominal fat quantification is a reliable computational tool for large-scale epidemiological studies. Our proposed intestinal food residue reduction scheme is an original contribution of this work. Validation experiments indicate very good accuracy and generalization capability. Published by Elsevier Inc.

  17. Automated saccharification assay for determination of digestibility in plant materials.

    PubMed

    Gomez, Leonardo D; Whitehead, Caragh; Barakate, Abdellah; Halpin, Claire; McQueen-Mason, Simon J

    2010-10-27

    Cell wall resistance represents the main barrier for the production of second generation biofuels. The deconstruction of lignocellulose can provide sugars for the production of fuels or other industrial products through fermentation. Understanding the biochemical basis of the recalcitrance of cell walls to digestion will allow development of more effective and cost efficient ways to produce sugars from biomass. One approach is to identify plant genes that play a role in biomass recalcitrance, using association genetics. Such an approach requires a robust and reliable high throughput (HT) assay for biomass digestibility, which can be used to screen the large numbers of samples involved in such studies. We developed a HT saccharification assay based on a robotic platform that can carry out in a 96-well plate format the enzymatic digestion and quantification of the released sugars. The handling of the biomass powder for weighing and formatting into 96 wells is performed by a robotic station, where the plant material is ground, delivered to the desired well in the plates and weighed with a precision of 0.1 mg. Once the plates are loaded, an automated liquid handling platform delivers an optional mild pretreatment (< 100°C) followed by enzymatic hydrolysis of the biomass. Aliquots from the hydrolysis are then analyzed for the release of reducing sugar equivalents. The same platform can be used for the comparative evaluation of different enzymes and enzyme cocktails. The sensitivity and reliability of the platform was evaluated by measuring the saccharification of stems from lignin modified tobacco plants, and the results of automated and manual analyses compared. The automated assay systems are sensitive, robust and reliable. The system can reliably detect differences in the saccharification of plant tissues, and is able to process large number of samples with a minimum amount of human intervention. The automated system uncovered significant increases in the digestibility of certain lignin modified lines in a manner compatible with known effects of lignin modification on cell wall properties. We conclude that this automated assay platform is of sufficient sensitivity and reliability to undertake the screening of the large populations of plants necessary for mutant identification and genetic association studies.

  18. Development and validation of an ultra high performance liquid chromatography-electrospray tandem mass spectrometry method using selective derivatisation, for the quantification of two reactive aldehydes produced by lipid peroxidation, HNE (4-hydroxy-2(E)-nonenal) and HHE (4-hydroxy-2(E)-hexenal) in faecal water.

    PubMed

    Chevolleau, S; Noguer-Meireles, M-H; Jouanin, I; Naud, N; Pierre, F; Gueraud, F; Debrauwer, L

    2018-04-15

    Diets rich in red or processed meat have been shown to be associated with an elevated risk of colorectal cancer (CRC). One major hypothesis involves dietary heme iron, which induces lipid peroxidation. The quantification of the resulting reactive aldehydes (e.g. HNE and HHE) in the colon lumen is therefore of great concern, since these compounds are known for their cytotoxic and genotoxic properties. A UHPLC-ESI-MS/MS method has been developed and validated for HNE and HHE quantification in rat faeces. Samples were derivatised using a brominated reagent (BBHA) in the presence of pre-synthesized deuterated internal standards (HNE-d11/HHE-d5), extracted by solid-phase extraction, and then analysed by LC-positive ESI-MS/MS (MRM) on a TSQ Vantage mass spectrometer. The use of BBHA allowed the efficient stabilisation of the unstable and reactive hydroxy-alkenals HNE and HHE. The MRM method allowed selective detection of HNE and HHE on the basis of characteristic transitions monitored from both the 79 and 81 bromine isotopic peaks. This method was validated according to the European Medicines Agency (EMEA) guidelines, by determining selectivity, sensitivity, linearity, carry-over effect, recovery, matrix effect, repeatability, trueness and intermediate precision. The performance of the method enabled the quantification of HNE and HHE at concentrations of 0.10-0.15 μM in faecal water. Results are presented on the application of the method to the quantification of HNE and HHE in different faecal waters obtained from faeces of rats fed diets with various fatty acid compositions, thus corresponding to different pro-oxidative features. Copyright © 2018 Elsevier B.V. All rights reserved.

  19. Quantification of silver nanoparticle uptake and distribution within individual human macrophages by FIB/SEM slice and view.

    PubMed

    Guehrs, Erik; Schneider, Michael; Günther, Christian M; Hessing, Piet; Heitz, Karen; Wittke, Doreen; López-Serrano Oliver, Ana; Jakubowski, Norbert; Plendl, Johanna; Eisebitt, Stefan; Haase, Andrea

    2017-03-21

    Quantification of nanoparticle (NP) uptake in cells or tissues is very important for safety assessment. Often, electron microscopy-based approaches are used for this purpose, which allow imaging at very high resolution. However, precise quantification of NP numbers in cells and tissues remains challenging. The aim of this study was to present a novel approach that combines precise quantification of NPs in individual cells with high-resolution imaging of their intracellular distribution, based on focused ion beam/scanning electron microscopy (FIB/SEM) slice and view approaches. We quantified cellular uptake of 75 nm diameter citrate-stabilized silver NPs (Ag75Cit) into an individual human macrophage derived from monocytic THP-1 cells using a FIB/SEM slice and view approach. Cells were treated with 10 μg/ml for 24 h. We investigated a single cell and found in total 3138 ± 722 silver NPs inside this cell. Most of the silver NPs were located in large agglomerates; only a few were found in clusters of fewer than five NPs. Furthermore, we cross-checked our results using inductively coupled plasma mass spectrometry and could confirm the FIB/SEM results. Our approach based on FIB/SEM slice and view is currently the only one that allows quantification of the absolute dose of silver NPs in individual cells and, at the same time, assessment of their intracellular distribution at high resolution. We therefore propose to use FIB/SEM slice and view to systematically analyse the cellular uptake of various NPs as a function of size, concentration and incubation time.

  20. Quantification of source uncertainties in Seismic Probabilistic Tsunami Hazard Analysis (SPTHA)

    NASA Astrophysics Data System (ADS)

    Selva, J.; Tonini, R.; Molinari, I.; Tiberti, M. M.; Romano, F.; Grezio, A.; Melini, D.; Piatanesi, A.; Basili, R.; Lorito, S.

    2016-06-01

    We propose a procedure for uncertainty quantification in Probabilistic Tsunami Hazard Analysis (PTHA), with a special emphasis on the uncertainty related to statistical modelling of the earthquake source in Seismic PTHA (SPTHA), and on the separate treatment of subduction and crustal earthquakes (treated as background seismicity). An event tree approach and ensemble modelling are used in place of more classical approaches, such as the hazard integral and the logic tree. This procedure consists of four steps: (1) exploration of aleatory uncertainty through an event tree, with alternative implementations for exploring epistemic uncertainty; (2) numerical computation of tsunami generation and propagation up to a given offshore isobath; (3) (optional) site-specific quantification of inundation; (4) simultaneous quantification of aleatory and epistemic uncertainty through ensemble modelling. The proposed procedure is general and independent of the kind of tsunami source considered; however, we implement step 1, the event tree, specifically for SPTHA, focusing on seismic source uncertainty. To exemplify the procedure, we develop a case study considering seismic sources in the Ionian Sea (central-eastern Mediterranean Sea), using the coasts of Southern Italy as a target zone. The results show that an efficient and complete quantification of all the uncertainties is feasible even when treating a large number of potential sources and a large set of alternative model formulations. We also find that (i) treating subduction and background (crustal) earthquakes separately allows for optimal use of available information and avoids significant biases; (ii) both subduction interface and crustal faults contribute to the SPTHA, in proportions that depend on source-target position and tsunami intensity; (iii) the proposed framework allows sensitivity and deaggregation analyses, demonstrating the applicability of the method for operational assessments.
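
    Step (4), the ensemble quantification of epistemic uncertainty, can be pictured as combining the hazard curves produced by the alternative model formulations with credibility weights and reading off the weighted mean and percentile curves. The sketch below is a generic illustration of that idea under simplified assumptions; it is not the authors' implementation.

    ```python
    import numpy as np

    def ensemble_hazard(exceedance_curves, weights, percentiles=(16, 50, 84)):
        """Ensemble summary of epistemic uncertainty: each row of
        exceedance_curves is one alternative model's annual probability of
        exceedance over a common set of tsunami-intensity thresholds, and
        weights are the model credibilities. Returns the weighted mean curve
        and weighted percentile curves. Illustrative sketch only."""
        curves = np.asarray(exceedance_curves, dtype=float)
        w = np.asarray(weights, dtype=float)
        w = w / w.sum()

        mean_curve = w @ curves

        # Weighted percentiles computed threshold by threshold
        order = np.argsort(curves, axis=0)
        sorted_curves = np.take_along_axis(curves, order, axis=0)
        cum_w = np.cumsum(w[order], axis=0)
        pct_curves = {
            p: np.array([sorted_curves[np.searchsorted(cum_w[:, j], p / 100), j]
                         for j in range(curves.shape[1])])
            for p in percentiles
        }
        return mean_curve, pct_curves

    # Hypothetical example: 3 alternative models, 4 intensity thresholds
    curves = [[1e-2, 3e-3, 8e-4, 1e-4],
              [2e-2, 5e-3, 1e-3, 2e-4],
              [8e-3, 2e-3, 5e-4, 5e-5]]
    print(ensemble_hazard(curves, weights=[0.5, 0.3, 0.2]))
    ```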

  1. Macular pigment optical density measurements: evaluation of a device using heterochromatic flicker photometry

    PubMed Central

    de Kinkelder, R; van der Veen, R L P; Verbaak, F D; Faber, D J; van Leeuwen, T G; Berendschot, T T J M

    2011-01-01

    Purpose: Accurate assessment of the amount of macular pigment (MPOD) is necessary to investigate the role of carotenoids and their assumed protective functions. High repeatability and reliability are important to monitor patients in studies investigating the influence of diet and supplements on MPOD. We evaluated the Macuscope (Macuvision Europe Ltd., Lapworth, Solihull, UK), a recently introduced device for measuring MPOD using the technique of heterochromatic flicker photometry (HFP). We determined agreement with another HFP device (QuantifEye; MPS 9000 series: Tinsley Precision Instruments Ltd., Croydon, Essex, UK) and a fundus reflectance method. Methods: The right eyes of 23 healthy subjects (mean age 33.9±15.1 years) were measured. We determined agreement with QuantifEye and correlation with a fundus reflectance method. Repeatability of QuantifEye was assessed in 20 other healthy subjects (mean age 32.1±7.3 years). Repeatability was also compared with measurements by a fundus reflectance method in 10 subjects. Results: We found low agreement between test and retest measurements with Macuscope. The average difference and the limits of agreement were −0.041±0.32. We found high agreement between test and retest measurements of QuantifEye (−0.02±0.18) and the fundus reflectance method (−0.04±0.18). MPOD data obtained by Macuscope and QuantifEye showed poor agreement: −0.017±0.44. For Macuscope and the fundus reflectance method, the correlation coefficient was r=0.05 (P=0.83). A significant correlation of r=0.87 (P<0.001) was found between QuantifEye and the fundus reflectance method. Conclusions: Because repeatability of Macuscope measurements was low (ie, wide limits of agreement) and MPOD values correlated poorly with the fundus reflectance method, and agreed poorly with QuantifEye, the tested Macuscope protocol seems less suitable for studying MPOD. PMID:21057522
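
    The repeatability figures quoted above (mean difference plus limits of agreement) correspond to a Bland-Altman style analysis of test versus retest values. A minimal sketch of that calculation is shown below with hypothetical MPOD readings; these are not the study's data.

    ```python
    import numpy as np

    def limits_of_agreement(test, retest):
        """Bland-Altman style repeatability summary: mean difference (bias) and
        the 95% limits of agreement (bias +/- 1.96 SD of the differences), the
        quantities reported as e.g. -0.041 +/- 0.32 in the abstract."""
        d = np.asarray(test, dtype=float) - np.asarray(retest, dtype=float)
        bias = d.mean()
        loa = 1.96 * d.std(ddof=1)
        return bias, loa

    # Hypothetical MPOD test/retest values for a handful of subjects
    test = [0.41, 0.55, 0.32, 0.60, 0.48]
    retest = [0.44, 0.50, 0.35, 0.58, 0.52]
    print(limits_of_agreement(test, retest))
    ```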

  2. 75 FR 74713 - Reliability Monitoring, Enforcement and Compliance Issues; Notice Allowing Post-Technical...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2010-12-01

    ... Monitoring, Enforcement and Compliance Issues; Notice Allowing Post-Technical Conference Comments November 23... Commission-led technical conference to explore issues associated with reliability monitoring, enforcement and...- 000, on or before December 9, 2010. \\1\\ Reliability Monitoring, Enforcement and Compliance Issues...

  3. Assessing senescence in Drosophila using video tracking.

    PubMed

    Ardekani, Reza; Tavaré, Simon; Tower, John

    2013-01-01

    Senescence is associated with changes in gene expression, including the upregulation of stress response- and innate immune response-related genes. In addition, aging animals exhibit characteristic changes in movement behaviors including decreased gait speed and a deterioration in sleep/wake rhythms. Here, we describe methods for tracking Drosophila melanogaster movements in 3D with simultaneous quantification of fluorescent transgenic reporters. This approach allows for the assessment of correlations between behavior, aging, and gene expression as well as for the quantification of biomarkers of aging.

  4. Quantification of skin wrinkles using low coherence interferometry

    NASA Astrophysics Data System (ADS)

    Oh, Jung-Taek; Kim, Beop-Min; Son, Sang-Ryoon; Lee, Sang-Won; Kim, Dong-Yoon; Kim, Youn-Soo

    2004-07-01

    We measure the skin wrinkle topology by means of low coherence interferometry (LCI), which forms the basis of optical coherence tomography (OCT). The skin topology obtained using LCI and the corresponding 2-D fast Fourier transform allow quantification of skin wrinkles. It took approximately 2 minutes to obtain a 2.1 mm x 2.1 mm topological image with 4 μm and 16 μm resolutions in the axial and transverse directions, respectively. Measurement examples show the particular case of skin contour change after anti-wrinkle cosmeceutical treatments and in atopic dermatitis.
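
    A small numpy sketch of how a 2-D FFT of the measured topography can yield simple wrinkle metrics (RMS roughness and the dominant spatial frequency). The pixel spacing and the random surface below are placeholders; the paper's actual post-processing is not described in detail here.

    ```python
    import numpy as np

    def wrinkle_spectrum_metrics(height_map_um, pixel_mm=2.1 / 512):
        """Simple wrinkle metrics from an LCI topography map: RMS roughness and
        the spatial frequency carrying the most spectral power, obtained from
        the 2-D FFT of the detrended surface. A generic sketch only."""
        z = height_map_um - height_map_um.mean()
        rms_roughness = np.sqrt(np.mean(z ** 2))

        spectrum = np.abs(np.fft.fftshift(np.fft.fft2(z))) ** 2
        freqs = np.fft.fftshift(np.fft.fftfreq(z.shape[0], d=pixel_mm))
        fx, fy = np.meshgrid(freqs, freqs)
        radial_freq = np.hypot(fx, fy)

        spectrum[radial_freq == 0] = 0           # ignore the DC component
        peak = np.unravel_index(np.argmax(spectrum), spectrum.shape)
        return rms_roughness, radial_freq[peak]  # um, cycles per mm

    # Hypothetical 512 x 512 topography over a 2.1 mm x 2.1 mm field
    z = np.random.default_rng(2).normal(0, 5, (512, 512))
    print(wrinkle_spectrum_metrics(z))
    ```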

  5. Cochlear microdialysis for quantification of dexamethasone and fluorescein entry into scala tympani during round window administration.

    PubMed

    Hahn, Hartmut; Kammerer, Bernd; DiMauro, Andre; Salt, Alec N; Plontke, Stefan K

    2006-02-01

    Before new drugs for the treatment of inner ear disorders can be studied in controlled clinical trials, it is important that their pharmacokinetics be established in inner ear fluids. Microdialysis allows drug levels to be measured in perilymph without the volume disturbances and potential cerebrospinal fluid contamination associated with fluid sampling. The aims of this study were to show: (i) that despite low recovery rates from miniature dialysis probes, significant amounts of drug are removed from small fluid compartments, (ii) that dialysis sampling artifacts can be accounted for using computer simulations and (iii) that microdialysis allows quantification of the entry rates through the round window membrane (RWM) into scala tympani (ST). Initial experiments used microdialysis probes in small compartments in vitro containing sodium fluorescein. Stable concentrations were observed in large compartments (1000 microl) but significant concentration declines were observed in smaller compartments (100, 10 and 5.6 microl) comparable to the size of the inner ear. Computer simulations of these experiments closely approximated the experimental data. In in vivo experiments, sodium fluorescein 10 mg/ml and dexamethasone-dihydrogen-phosphate disodium salt 8 mg/ml were simultaneously applied to the RWM of guinea pigs. Perilymph concentration in the basal turn of ST was monitored using microdialysis. The fluorescein concentration reached after 200 min application (585+/-527 microg/ml) was approximately twice that of dexamethasone phosphate (291+/-369 microg/ml). Substantial variation in concentrations was found between animals by approximately a factor of 34 for fluorescein and at least 41 for dexamethasone phosphate. This is, to a large extent, thought to be the result of the RWM permeability varying in different animals. It was not caused by substance analysis variations, because two different analytic methods were used and the concentration ratio between the two substances remained nearly constant across the experiments and because differences were apparent for the repeated samples obtained in each animal. Interpretation of the results using computer simulations allowed RWM permeability to be quantified. It also demonstrated, however, that cochlear clearance values could not be reliably obtained with microdialysis because of the significant contribution of dialysis to clearance. The observed interanimal variation, e.g., in RWM permeability, is likely to be clinically relevant to the local application of drugs in patients.

  6. Cochlear Microdialysis for Quantification of Dexamethasone and Fluorescein Entry into Scala Tympani During Round Window Administration

    PubMed Central

    Hahn, Hartmut; Kammerer, Bernd; DiMauro, Andre; Salt, Alec N.; Plontke, Stefan K.

    2006-01-01

    Before new drugs for the treatment of inner ear disorders can be studied in controlled clinical trials, it is important that their pharmacokinetics be established in inner ear fluids. Microdialysis allows drug levels to be measured in perilymph without the volume disturbances and potential cerebrospinal fluid contamination associated with fluid sampling. The aims of this study were to show: (i) that despite low recovery rates from miniature dialysis probes, significant amounts of drug are removed from small fluid compartments, (ii) that dialysis sampling artifacts can be accounted for using computer simulations and (iii) that microdialysis allows quantification of the entry rates through the round window membrane (RWM) into scala tympani (ST). Initial experiments used microdialysis probes in small compartments in vitro containing sodium fluorescein. Stable concentrations were observed in large compartments (1000 μl) but significant concentration declines were observed in smaller compartments (100, 10 and 5.6 μl) comparable to the size of the inner ear. Computer simulations of these experiments closely approximated the experimental data. In in vivo experiments, sodium fluorescein 10 mg/ml and dexamethasone-dihydrogen-phosphate disodium salt 8 mg/ml were simultaneously applied to the RWM of guinea pigs. Perilymph concentration in the basal turn of ST was monitored using microdialysis. The fluorescein concentration reached after 200 min application (585 ± 527 μg/ml) was approximately twice that of dexamethasone phosphate (291 ± 369 μg/ml). Substantial variation in concentrations was found between animals by approximately a factor of 34 for fluorescein and at least 41 for dexamethasone phosphate. This is, to a large extent, thought to be the result of the RWM permeability varying in different animals. It was not caused by substance analysis variations, because two different analytic methods were used and the concentration ratio between the two substances remained nearly constant across the experiments and because differences were apparent for the repeated samples obtained in each animal. Interpretation of the results using computer simulations allowed RWM permeability to be quantified. It also demonstrated, however, that cochlear clearance values could not be reliably obtained with microdialysis because of the significant contribution of dialysis to clearance. The observed interanimal variation, e.g., in RWM permeability, is likely to be clinically relevant to the local application of drugs in patients. PMID:16442251

  7. A quantitative witness for Greenberger-Horne-Zeilinger entanglement.

    PubMed

    Eltschka, Christopher; Siewert, Jens

    2012-01-01

    Along with the vast progress in experimental quantum technologies there is an increasing demand for the quantification of entanglement between three or more quantum systems. Theory still does not provide adequate tools for this purpose. The objective is, besides the quest for exact results, to develop operational methods that allow for efficient entanglement quantification. Here we put forward an analytical approach that serves both these goals. We provide a simple procedure to quantify Greenberger-Horne-Zeilinger-type multipartite entanglement in arbitrary three-qubit states. For two qubits this method is equivalent to Wootters' seminal result for the concurrence. It establishes a close link between entanglement quantification and entanglement detection by witnesses, and can be generalised both to higher dimensions and to more than three parties.
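
    For two qubits, the paper's approach reduces to Wootters' concurrence, which is straightforward to compute from a density matrix. The sketch below implements that two-qubit special case only; the authors' three-qubit GHZ-type quantifier is not reproduced here.

    ```python
    import numpy as np

    def concurrence(rho):
        """Wootters concurrence of a two-qubit density matrix rho (4x4).
        Illustrative sketch of the two-qubit case to which the paper's
        quantifier reduces; not the three-qubit construction itself."""
        sy = np.array([[0, -1j], [1j, 0]])
        yy = np.kron(sy, sy)
        rho_tilde = yy @ rho.conj() @ yy
        # Square roots of the eigenvalues of rho * rho_tilde, in decreasing order
        eigvals = np.linalg.eigvals(rho @ rho_tilde).real
        lam = np.sqrt(np.sort(np.abs(eigvals))[::-1])
        return max(0.0, lam[0] - lam[1] - lam[2] - lam[3])

    # Example: the Bell state |Phi+> has concurrence 1
    phi_plus = np.zeros((4, 1)); phi_plus[0] = phi_plus[3] = 1 / np.sqrt(2)
    print(concurrence(phi_plus @ phi_plus.T))  # ~1.0
    ```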

  8. A quantitative witness for Greenberger-Horne-Zeilinger entanglement

    PubMed Central

    Eltschka, Christopher; Siewert, Jens

    2012-01-01

    Along with the vast progress in experimental quantum technologies there is an increasing demand for the quantification of entanglement between three or more quantum systems. Theory still does not provide adequate tools for this purpose. The objective is, besides the quest for exact results, to develop operational methods that allow for efficient entanglement quantification. Here we put forward an analytical approach that serves both these goals. We provide a simple procedure to quantify Greenberger-Horne-Zeilinger–type multipartite entanglement in arbitrary three-qubit states. For two qubits this method is equivalent to Wootters' seminal result for the concurrence. It establishes a close link between entanglement quantification and entanglement detection by witnesses, and can be generalised both to higher dimensions and to more than three parties. PMID:23267431

  9. Toward reliable and repeatable automated STEM-EDS metrology with high throughput

    NASA Astrophysics Data System (ADS)

    Zhong, Zhenxin; Donald, Jason; Dutrow, Gavin; Roller, Justin; Ugurlu, Ozan; Verheijen, Martin; Bidiuk, Oleksii

    2018-03-01

    New materials and designs in complex 3D architectures in logic and memory devices have raised the complexity of S/TEM metrology. In this paper, we report on a newly developed, automated, scanning transmission electron microscopy (STEM) based, energy-dispersive X-ray spectroscopy (STEM-EDS) metrology method that addresses these challenges. Different methodologies toward repeatable and efficient, automated STEM-EDS metrology with high throughput are presented: we introduce the best-known auto-EDS acquisition and quantification methods for robust and reliable metrology and present how electron exposure dose impacts EDS metrology reproducibility, either due to poor signal-to-noise ratio (SNR) at low dose or due to sample modifications at high-dose conditions. Finally, we discuss the limitations of the STEM-EDS metrology technique and propose strategies to optimize the process both in terms of throughput and metrology reliability.

  10. Central nervous system remyelination in culture--a tool for multiple sclerosis research.

    PubMed

    Zhang, Hui; Jarjour, Andrew A; Boyd, Amanda; Williams, Anna

    2011-07-01

    Multiple sclerosis is a demyelinating disease of the central nervous system which only affects humans. This makes it difficult to study at a molecular level, and to develop and test potential therapies that may change the course of the disease. The development of therapies to promote remyelination in multiple sclerosis is a key research aim, to both aid restoration of electrical impulse conduction in nerves and provide neuroprotection, reducing disability in patients. Testing a remyelination therapy in the many and various in vivo models of multiple sclerosis is expensive in terms of time, animals and money. We report the development and characterisation of an ex vivo slice culture system using mouse brain and spinal cord, allowing investigation of myelination, demyelination and remyelination, which can be used as an initial reliable screen to select the most promising remyelination strategies. We have automated the quantification of myelin to provide a high content and moderately-high-throughput screen for testing therapies for remyelination both by endogenous and exogenous means and as an invaluable way of studying the biology of remyelination. Copyright © 2011 Elsevier Inc. All rights reserved.

  11. Central nervous system remyelination in culture — A tool for multiple sclerosis research

    PubMed Central

    Zhang, Hui; Jarjour, Andrew A.; Boyd, Amanda; Williams, Anna

    2011-01-01

    Multiple sclerosis is a demyelinating disease of the central nervous system which only affects humans. This makes it difficult to study at a molecular level, and to develop and test potential therapies that may change the course of the disease. The development of therapies to promote remyelination in multiple sclerosis is a key research aim, to both aid restoration of electrical impulse conduction in nerves and provide neuroprotection, reducing disability in patients. Testing a remyelination therapy in the many and various in vivo models of multiple sclerosis is expensive in terms of time, animals and money. We report the development and characterisation of an ex vivo slice culture system using mouse brain and spinal cord, allowing investigation of myelination, demyelination and remyelination, which can be used as an initial reliable screen to select the most promising remyelination strategies. We have automated the quantification of myelin to provide a high content and moderately-high-throughput screen for testing therapies for remyelination both by endogenous and exogenous means and as an invaluable way of studying the biology of remyelination. PMID:21515259

  12. Quantification of cellular autofluorescence of human skin using multiphoton tomography and fluorescence lifetime imaging in two spectral detection channels

    PubMed Central

    Patalay, Rakesh; Talbot, Clifford; Alexandrov, Yuriy; Munro, Ian; Neil, Mark A. A.; König, Karsten; French, Paul M. W.; Chu, Anthony; Stamp, Gordon W.; Dunsby, Chris

    2011-01-01

    We explore the diagnostic potential of imaging endogenous fluorophores using two-photon microscopy and fluorescence lifetime imaging (FLIM) in human skin with two spectral detection channels. Freshly excised benign dysplastic nevi (DN) and malignant nodular Basal Cell Carcinomas (nBCCs) were excited at 760 nm. The resulting fluorescence signal was binned manually on a cell-by-cell basis. This improved the reliability of fitting using a double-exponential decay model and allowed the fluorescence signatures from different cell populations within the tissue to be identified and studied. We also performed a direct comparison between different diagnostic groups. A statistically significant difference in the median mean fluorescence lifetime, 2.79 ns versus 2.52 ns (blue channel, 300-500 nm) and 2.08 ns versus 1.33 ns (green channel, 500-640 nm), was found between nBCCs and DN, respectively, using the Mann-Whitney U test (p < 0.01). Further differences in the distribution of fluorescence lifetime parameters and inter-patient variability are also discussed. PMID:22162820
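
    A minimal sketch of the kind of double-exponential decay fit referred to above, using scipy: a biexponential model is fitted to a hypothetical binned decay histogram and an amplitude-weighted mean lifetime is derived, which is one common definition of a "mean fluorescence lifetime". The data and initial guesses are illustrative only.

    ```python
    import numpy as np
    from scipy.optimize import curve_fit

    def biexp(t, a1, tau1, a2, tau2):
        """Double-exponential fluorescence decay model."""
        return a1 * np.exp(-t / tau1) + a2 * np.exp(-t / tau2)

    def mean_lifetime(a1, tau1, a2, tau2):
        """Amplitude-weighted mean lifetime, one common FLIM summary statistic."""
        return (a1 * tau1 + a2 * tau2) / (a1 + a2)

    # Hypothetical decay histogram binned over one cell (time in ns, photon counts)
    t = np.linspace(0, 10, 256)
    counts = biexp(t, 800, 0.4, 200, 2.8) + np.random.poisson(5, t.size)

    popt, _ = curve_fit(biexp, t, counts, p0=(500, 0.5, 100, 2.5))
    print(mean_lifetime(*popt))
    ```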

  13. Rapid Modified QuEChERS Method for Pesticides Detection in Honey by High-Performance Liquid Chromatography UV-visible

    PubMed Central

    Ceci, Edmondo; Montemurro, Nicola; Tantillo, Giuseppina; Di Pinto, Angela; Celano, Gaetano Vitale; Bozzo, Giancarlo

    2014-01-01

    The extensive use of pesticides in agriculture plays an important role in bee die-offs and leads to the presence of residues in hive products, particularly in honey. An accurate and reliable analytical method, based on the QuEChERS extractive technique, has been developed for the quantitative determination by high-performance liquid chromatography with UV-visible detection of 5 pesticides (Deltamethrin, Dimethoate, Imidacloprid, Acetamiprid, Chlorfenvinphos) in honey. The method, according to Commission Directive 2002/63/EC and Regulation 882/2004/EC, provided excellent results with respect to linearity (correlation coefficient up to 0.993), limits of detection and quantification (0.005 and 0.01 µg/mL for Dimethoate, Deltamethrin and Chlorfenvinphos; 0.02 and 0.05 µg/mL for Acetamiprid and Imidacloprid), recovery values (86.4 to 96.3%), precision and relative expanded uncertainty of measurement, demonstrating the conformity of this method with the European directives. The proposed method was applied to 23 samples of Apulian honey. None of the investigated pesticides was detected in these samples. PMID:27800334

  14. Automated GC-MS analysis of free amino acids in biological fluids.

    PubMed

    Kaspar, Hannelore; Dettmer, Katja; Gronwald, Wolfram; Oefner, Peter J

    2008-07-15

    A gas chromatography-mass spectrometry (GC-MS) method was developed for the quantitative analysis of free amino acids as their propyl chloroformate derivatives in biological fluids. Derivatization with propyl chloroformate is carried out directly in the biological samples without prior protein precipitation or solid-phase extraction of the amino acids, thereby allowing automation of the entire procedure, including addition of reagents, extraction and injection into the GC-MS. The total analysis time was 30 min and 30 amino acids could be reliably quantified using 19 stable isotope-labeled amino acids as internal standards. Limits of detection (LOD) and lower limits of quantification (LLOQ) were in the range of 0.03-12 microM and 0.3-30 microM, respectively. The method was validated using a certified amino acid standard and reference plasma, and its applicability to different biological fluids was shown. Intra-day precision for the analysis of human urine, blood plasma, and cell culture medium was 2.0-8.8%, 0.9-8.3%, and 2.0-14.3%, respectively, while the inter-day precision for human urine was 1.5-14.1%.

  15. Pesticide residues determination in Polish organic crops in 2007-2010 applying gas chromatography-tandem quadrupole mass spectrometry.

    PubMed

    Walorczyk, Stanisław; Drożdżyński, Dariusz; Kowalska, Jolanta; Remlein-Starosta, Dorota; Ziółkowski, Andrzej; Przewoźniak, Monika; Gnusowski, Bogusław

    2013-08-15

    A sensitive, accurate and reliable multiresidue method based on the application of gas chromatography-tandem quadrupole mass spectrometry (GC-QqQ-MS/MS) has been established for screening, identification and quantification of a large number of pesticide residues in produce. The method was accredited in compliance with the PN-EN ISO/IEC 17025:2005 standard and was operated under a flexible scope as method PB-11. The flexible scope of accreditation allowed for minor modifications and extension of the analytical scope while using the same analytical technique. During the years 2007-2010, the method was used for the purpose of verification of organic crop production by multiresidue analysis for the presence of pesticides. A total of 528 samples of differing matrices such as fruits, vegetables, cereals, plant leaves and other green parts were analysed, of which 4.4% contained pesticide residues above the threshold value of 0.01 mg/kg. A total of 20 different pesticide residues were determined in the samples. Copyright © 2013 Elsevier Ltd. All rights reserved.

  16. Experimental Review of DNA-Based Methods for Wine Traceability and Development of a Single-Nucleotide Polymorphism (SNP) Genotyping Assay for Quantitative Varietal Authentication.

    PubMed

    Catalano, Valentina; Moreno-Sanz, Paula; Lorenzi, Silvia; Grando, Maria Stella

    2016-09-21

    The genetic varietal authentication of wine was investigated according to DNA isolation procedures reported for enological matrices and also by testing 11 commercial extraction kits and various protocol modifications. Samples were collected at different stages of the winemaking process of the renowned Italian wines Brunello di Montalcino, Lambruschi Modenesi, and Trento DOC. Results demonstrated not only that grape DNA loss is produced by the fermentation process but also that clarification and stabilization operations contribute to the reduction of double-stranded DNA content in wine. Despite the presence of inhibitors, downstream PCR genotyping yielded reliable nuclear and chloroplast SSR markers for must samples, whereas no amplification or inconsistent results were obtained at later stages of the vinification. In addition, a TaqMan genotyping assay based on cultivar-specific single-nucleotide polymorphisms (SNPs) was designed, which allowed assessment of grapevine DNA mixtures. Once the wine matrix limitations are overcome, this sensitive tool may be implemented for the relative quantification of cultivars used in blended wines or frauds.

  17. Cutting-edge analysis of extracellular microparticles using ImageStream(X) imaging flow cytometry.

    PubMed

    Headland, Sarah E; Jones, Hefin R; D'Sa, Adelina S V; Perretti, Mauro; Norling, Lucy V

    2014-06-10

    Interest in extracellular vesicle biology has exploded in the past decade, since these microstructures seem endowed with multiple roles, from blood coagulation to inter-cellular communication in pathophysiology. In order for microparticle research to evolve as a preclinical and clinical tool, accurate quantification of microparticle levels is a fundamental requirement, but their size and the complexity of sample fluids present major technical challenges. Flow cytometry is commonly used, but suffers from low sensitivity and accuracy. Use of Amnis ImageStream(X) Mk II imaging flow cytometer afforded accurate analysis of calibration beads ranging from 1 μm to 20 nm; and microparticles, which could be observed and quantified in whole blood, platelet-rich and platelet-free plasma and in leukocyte supernatants. Another advantage was the minimal sample preparation and volume required. Use of this high throughput analyzer allowed simultaneous phenotypic definition of the parent cells and offspring microparticles along with real time microparticle generation kinetics. With the current paucity of reliable techniques for the analysis of microparticles, we propose that the ImageStream(X) could be used effectively to advance this scientific field.

  18. Determination of water-extractable nonstructural carbohydrates, including inulin, in grass samples with high-performance anion exchange chromatography and pulsed amperometric detection.

    PubMed

    Raessler, Michael; Wissuwa, Bianka; Breul, Alexander; Unger, Wolfgang; Grimm, Torsten

    2008-09-10

    The exact and reliable determination of carbohydrates in plant samples of different origin is of great importance with respect to plant physiology. Additionally, the identification and quantification of carbohydrates are necessary for the evaluation of the impact of these compounds on the biogeochemistry of carbon. To attain this goal, it is necessary to analyze a great number of samples with both high sensitivity and selectivity within a limited time frame. This paper presents a rugged and easy method that allows the isocratic chromatographic determination of 12 carbohydrates and sugar alcohols from one sample within 30 min. The method was successfully applied to a variety of plant materials with particular emphasis on perennial ryegrass samples of the species Lolium perenne. The method was easily extended to the analysis of the polysaccharide inulin after its acidic hydrolysis into the corresponding monomers without the need for substantial change of chromatographic conditions or even the use of enzymes. It therefore offers a fundamental advantage for the analysis of the complex mixture of nonstructural carbohydrates often found in plant samples.

  19. Quantification of Nε-(2-Furoylmethyl)-L-lysine (furosine), Nε-(Carboxymethyl)-L-lysine (CML), Nε-(Carboxyethyl)-L-lysine (CEL) and total lysine through stable isotope dilution assay and tandem mass spectrometry.

    PubMed

    Troise, Antonio Dario; Fiore, Alberto; Wiltafsky, Markus; Fogliano, Vincenzo

    2015-12-01

    Control of the Maillard reaction (MR) is a key point in ensuring processed food quality. Due to the presence of a primary amino group on its side chain, lysine is particularly prone to chemical modifications with the formation of Amadori products (AP), Nε-(Carboxymethyl)-L-lysine (CML) and Nε-(Carboxyethyl)-L-lysine (CEL). A new analytical strategy was proposed that allows the simultaneous quantification of lysine, CML, CEL and Nε-(2-Furoylmethyl)-L-lysine (furosine), the indirect marker of AP. The procedure is based on a stable isotope dilution assay followed by liquid chromatography-tandem mass spectrometry. It showed high sensitivity and good reproducibility and repeatability in different foods. The limit of detection and the RSD% were lower than 5 ppb and below 8%, respectively. Results obtained with the new procedure not only improved the knowledge about the reliability of thermal treatment markers, but also provided new insights into the relationship between Maillard reaction products and their precursors. Copyright © 2015 Elsevier Ltd. All rights reserved.
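
    The core of a stable isotope dilution calculation can be sketched as follows: the analyte concentration follows from the analyte-to-labelled-internal-standard peak-area ratio, assuming equal response factors. The peak areas and spike level below are hypothetical, and the simple single-point formula stands in for the full calibrated LC-MS/MS workflow.

```python
def isotope_dilution_conc(area_analyte, area_labeled, spike_conc, response_factor=1.0):
    """Analyte concentration from the analyte / labelled-internal-standard peak-area
    ratio, assuming the labelled analogue behaves identically (response factor ~1)."""
    return response_factor * (area_analyte / area_labeled) * spike_conc

# Hypothetical MRM peak areas for CML and its labelled analogue spiked at 50 ppb.
print(f"{isotope_dilution_conc(8.2e4, 1.6e5, spike_conc=50.0):.1f} ppb CML")
```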

  20. Exploitation of the complexation reaction of ortho-dihydroxylated anthocyanins with aluminum(III) for their quantitative spectrophotometric determination in edible sources.

    PubMed

    Bernal, Freddy A; Orduz-Diaz, Luisa L; Coy-Barrera, Ericsson

    2015-10-15

    Anthocyanins are natural pigments known for their color and antioxidant activity. These properties allow their use in various fields, including food and pharmaceutical ones. Quantitative determination of anthocyanins has traditionally been performed by non-specific methods that limit the accuracy and reliability of the results. Therefore, a novel, simple spectrophotometric method for anthocyanin quantification, based on the formation of blue-colored complexes via the known reaction between catechol- and pyrogallol-containing anthocyanins and aluminum(III), is presented. The method proved to be reproducible and repeatable (RSD < 1.5%) and highly sensitive to ortho-dihydroxylated anthocyanins (LOD = 0.186 μg/mL). Compliance with Beer's law was also evident over the concentration range of 2-16 μg/mL for cyanidin 3-O-glucoside. Good recoveries (98.8-103.3%) were calculated using anthocyanin-rich plant samples. The described method correlated directly with pH differential method results for several common anthocyanin-containing fruits, indicating its strong analytical potential. The presented method was successfully validated. Copyright © 2015 Elsevier Ltd. All rights reserved.
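
    The external-calibration step behind such a spectrophotometric assay can be sketched as a Beer's-law linear fit over the stated working range for cyanidin 3-O-glucoside, followed by back-calculation of an unknown; the absorbance values below are invented for illustration.

```python
import numpy as np

# Hypothetical cyanidin 3-O-glucoside standards (µg/mL) vs. absorbance of the Al(III) complex.
conc = np.array([2.0, 4.0, 8.0, 12.0, 16.0])
absorbance = np.array([0.11, 0.22, 0.44, 0.66, 0.87])

slope, intercept = np.polyfit(conc, absorbance, 1)   # Beer's law: A = m * c + b

def anthocyanin_ug_per_ml(sample_abs, dilution_factor=1.0):
    """Back-calculate the ortho-dihydroxylated anthocyanin concentration of a sample."""
    return dilution_factor * (sample_abs - intercept) / slope

print(f"{anthocyanin_ug_per_ml(0.35, dilution_factor=10):.1f} µg/mL in the undiluted extract")
```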

  1. Assessment of maternal antibody decay and response to canine parvovirus vaccination using a clinic-based enzyme-linked immunosorbent assay.

    PubMed

    Waner, T; Naveh, A; Wudovsky, I; Carmichael, L E

    1996-10-01

    Interference caused by maternal antibodies is considered a major cause of canine parvovirus (CPV) vaccination failure. In this study, an immunoblot clinic-based enzyme-linked immunosorbent assay (ELISA) method was used to detect CPV antibodies in sera of pregnant bitches and their offspring to study the response of pups to vaccination. With an easily accessible procedure for CPV antibody determination, the veterinarian should be able to gauge the response of pups after vaccination. The validity of the technique was tested in parallel against the standard hemagglutination inhibition (HI) test. Results of the ELISA were correlated with those of the standard HI method for quantification of CPV antibodies. With the ELISA, successfully immunized pups were identified, allowing for a more reliable and cost-effective program of vaccination. This simple clinic-based test could be used for the assessment of vaccination status of pups during the critical phase of 6 to about 16 weeks of age. This study is the first in which vaccination response to CPV in pups was followed, using a clinic-based ELISA for CPV antibody monitoring.

  2. Recurrence Methods for the Identification of Morphogenetic Patterns

    PubMed Central

    Facchini, Angelo; Mocenni, Chiara

    2013-01-01

    This paper addresses the problem of identifying the parameters involved in the formation of spatial patterns in nonlinear two-dimensional systems. To this aim, we perform numerical experiments on a prototypical model generating morphogenetic Turing patterns, by changing both the spatial frequency and shape of the patterns. The features of the patterns and their relationship with the model parameters are characterized by means of the Generalized Recurrence Quantification measures. We show that the recurrence measures Determinism and Recurrence Entropy, as well as the distribution of the line lengths, allow for a full characterization of the patterns in terms of power-law decay with respect to the parameters involved in the determination of their spatial frequency and shape. A comparison with the standard two-dimensional Fourier transform is performed and the results show a better performance of the recurrence indicators in identifying a reliable connection with the spatial frequency of the patterns. Finally, in order to evaluate the robustness of the estimation of the power-law decay, extensive simulations have been performed by adding different levels of noise to the patterns. PMID:24066062
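
    As a hedged, one-dimensional illustration of the recurrence measures mentioned above (the paper works with a generalized, two-dimensional variant), the sketch below builds a recurrence matrix and computes Determinism for a periodic signal and for noise; the threshold, series and minimum line length are arbitrary choices.

```python
import numpy as np

def recurrence_matrix(series, eps):
    """Binary recurrence matrix of a scalar time series."""
    x = np.asarray(series, dtype=float)
    dist = np.abs(x[:, None] - x[None, :])
    return (dist <= eps).astype(int)

def determinism(rec, lmin=2):
    """Fraction of recurrence points lying on diagonal lines of length >= lmin
    (the line of identity is included here for simplicity)."""
    n = rec.shape[0]
    on_lines = 0
    for offset in range(-(n - 1), n):
        diag = np.diagonal(rec, offset=offset)
        run = 0
        for value in np.append(diag, 0):   # trailing zero flushes the last run
            if value:
                run += 1
            else:
                if run >= lmin:
                    on_lines += run
                run = 0
    return on_lines / max(rec.sum(), 1)

rng = np.random.default_rng(1)
periodic = np.sin(np.linspace(0.0, 20.0 * np.pi, 400))
noise = rng.standard_normal(400)
for name, series in (("periodic", periodic), ("noise", noise)):
    print(name, f"DET = {determinism(recurrence_matrix(series, eps=0.1)):.2f}")
```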

  3. Assessment of spill flow emissions on the basis of measured precipitation and waste water data

    NASA Astrophysics Data System (ADS)

    Hochedlinger, Martin; Gruber, Günter; Kainz, Harald

    2005-09-01

    Combined sewer overflows (CSOs) are substantial contributors to the total emissions into surface water bodies. The emitted pollution results from dry-weather waste water loads, surface runoff pollution and from the remobilisation of sewer deposits and sewer slime during storm events. One possibility to estimate overflow loads is a calculation with load quantification models. Input data for these models are pollution concentrations, e.g. Total Chemical Oxygen Demand (COD tot), Total Suspended Solids (TSS) or Soluble Chemical Oxygen Demand (COD sol), rainfall series and flow measurements for model calibration and validation. Reliable input data are essential for overflow load modelling; otherwise the results are inevitably poor. In this paper, corrections of precipitation measurements and online sewer measurements are presented to satisfy the load quantification model requirements described above. The main focus is on tipping bucket gauge measurements and their corrections. The results demonstrate the importance of these corrections for load quantification modelling and show the difference between corrected and uncorrected data for storm events with high rain intensities.
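
    The load quantification itself reduces to integrating concentration times discharge over an event; a minimal sketch with hypothetical 5-minute COD and flow series is shown below (units chosen so that mg/L times m3/s gives g/s).

```python
import numpy as np

def event_load_kg(conc_mg_per_l, flow_m3_per_s, dt_s):
    """Event load in kg: mg/L equals g/m3, so concentration times discharge
    gives g/s; integrating over the event and dividing by 1000 gives kg."""
    conc = np.asarray(conc_mg_per_l, dtype=float)
    flow = np.asarray(flow_m3_per_s, dtype=float)
    return float(np.sum(conc * flow * dt_s) / 1000.0)

# Hypothetical 5-minute CODtot and overflow discharge series for one storm event.
cod = [180.0, 220.0, 260.0, 210.0, 150.0]   # mg/L
discharge = [0.35, 0.60, 0.80, 0.55, 0.30]  # m3/s
print(f"event COD load: {event_load_kg(cod, discharge, dt_s=300):.1f} kg")
```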

  4. A new dimethyl labeling-based SID-MRM-MS method and its application to three proteases involved in insulin maturation.

    PubMed

    Cheng, Dongwan; Zheng, Li; Hou, Junjie; Wang, Jifeng; Xue, Peng; Yang, Fuquan; Xu, Tao

    2015-01-01

    The absolute quantification of target proteins in proteomics involves stable isotope dilution coupled with multiple reaction monitoring mass spectrometry (SID-MRM-MS). The successful preparation of stable isotope-labeled internal standard peptides is an important prerequisite for the SID-MRM absolute quantification methods. Dimethyl labeling has been widely used in relative quantitative proteomics and is fast, simple, reliable, cost-effective, and applicable to any protein sample, making it an ideal candidate method for the preparation of stable isotope-labeled internal standards. MRM mass spectrometry offers high sensitivity, specificity and throughput, and can quantify multiple proteins simultaneously, including low-abundance proteins in precious samples such as pancreatic islets. In this study, a new method for the absolute quantification of three proteases involved in insulin maturation, namely PC1/3, PC2 and CPE, was developed by coupling a stable isotope dimethyl labeling strategy for internal standard peptide preparation with SID-MRM-MS quantitative technology. This method offers a new and effective approach for a deeper understanding of the functional status of pancreatic β cells and of the pathogenesis of diabetes.

  5. Fundamentals of Counting Statistics in Digital PCR: I Just Measured Two Target Copies-What Does It Mean?

    PubMed

    Tzonev, Svilen

    2018-01-01

    Current commercially available digital PCR (dPCR) systems and assays are capable of detecting individual target molecules with considerable reliability. As tests are developed and validated for use on clinical samples, the need to understand and develop robust statistical analysis routines increases. This chapter covers the fundamental processes and limitations of single-molecule detection and reporting. We cover the basics of quantification of targets and sources of imprecision. We describe the basic test concepts: sensitivity, specificity, limit of blank, limit of detection, and limit of quantification in the context of dPCR. We provide basic guidelines on how to determine these, how to choose and interpret the operating point, and which factors may influence overall test performance in practice.
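
    The Poisson step at the heart of dPCR quantification can be sketched in a few lines: the mean number of copies per partition is lambda = -ln(1 - fraction positive), and the concentration follows from the partition volume. The partition count and volume below are assumptions for illustration, not values from the chapter.

```python
import numpy as np

def dpcr_copies_per_ul(positive, total, partition_volume_ul):
    """Mean copies per partition via Poisson statistics, lambda = -ln(1 - p),
    converted to copies per microliter of reaction volume."""
    p = positive / total
    lam = -np.log(1.0 - p)
    return lam / partition_volume_ul

# "I just measured two target copies": 2 positives among 20,000 partitions of
# an assumed 0.85 nL each. At such low counts the relative uncertainty is
# roughly 1/sqrt(k), i.e. about 70% for k = 2, which is why the limits of
# blank, detection and quantification matter so much in practice.
estimate = dpcr_copies_per_ul(positive=2, total=20_000, partition_volume_ul=0.85e-3)
print(f"approximately {estimate:.3f} copies/µL")
```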

  6. Absolute Quantification of Middle- to High-Abundant Plasma Proteins via Targeted Proteomics.

    PubMed

    Dittrich, Julia; Ceglarek, Uta

    2017-01-01

    The increasing number of peptide and protein biomarker candidates requires expeditious and reliable quantification strategies. The utilization of liquid chromatography coupled to quadrupole tandem mass spectrometry (LC-MS/MS) for the absolute quantitation of plasma proteins and peptides facilitates the multiplexed verification of tens to hundreds of biomarkers from the smallest sample quantities. Targeted proteomics assays derived from bottom-up proteomics principles rely on the identification and analysis of proteotypic peptides formed in an enzymatic digestion of the target protein. This protocol proposes a procedure for the establishment of a targeted absolute quantitation method for middle- to high-abundance plasma proteins without the need for depletion or enrichment steps. Essential topics such as proteotypic peptide identification and LC-MS/MS method development as well as sample preparation and calibration strategies are described in detail.

  7. Accelerated benzene polycarboxylic acid analysis by liquid chromatography-time-of-flight-mass spectrometry for the determination of petrogenic and pyrogenic carbon.

    PubMed

    Hindersmann, Benjamin; Achten, Christine

    2017-08-11

    Pyrogenic carbon species are of particular interest due to their ubiquitous occurrence in the environment and their high sorption capacities for nonpolar organic compounds. It has recently been shown that the analysis of the molecular markers for complex aromatic carbon structures, benzene polycarboxylic acids (BPCA), has high potential to aid in the identification of different carbon sources. In this study, the first LC method using mass spectrometry (MS) for reliable and accelerated (<24 h) quantification of pyrogenic and petrogenic carbon by BPCA analysis has been developed. The main advantage of LC-MS compared to previous methods is the higher sensitivity, which is important if only small sample amounts are available. Sample pre-treatment could be reduced to a minimum. Deuterated phthalic acid was introduced as internal standard due to its structural similarity to BPCA and its lack of occurrence in the environment. Linear quantification with r² ≥ 0.997 was accomplished for all BPCA. Method validation showed an excellent quantification reproducibility (mean CV<5%) which is comparable to LC-DAD methods and more reliable than GC-FID measurements (CV 16-23%). In summary, the presented BPCA method is more economical, efficient and presumably more attractive to use. Besides reference materials, various pyrogenic and petrogenic samples were analyzed to test if the sources were indicated by BPCA analysis. In addition to pyrogenic carbon, large amounts of petrogenic carbon species can also be present in urban soils and river sediments, especially in mining regions. They also to a large degree consist of aromatic carbon structures and therefore have an impact on source identification by BPCA analysis. Comparison of petrogenic and pyrogenic carbon samples shows similarities in the BPCA concentrations and patterns, in their aromaticity and degree of aromatic condensation. Thus, a differentiation between petrogenic and pyrogenic carbon only by BPCA analysis of samples with unknown carbon sources is not possible. For reliable source identification of the carbon species, the combination with other methods, such as the analysis of polycyclic aromatic hydrocarbons, may be successful. Copyright © 2017 Elsevier B.V. All rights reserved.

  8. Imaging tools to measure treatment response in gout.

    PubMed

    Dalbeth, Nicola; Doyle, Anthony J

    2018-01-01

    Imaging tests are in clinical use for diagnosis, assessment of disease severity and as a marker of treatment response in people with gout. Various imaging tests have differing properties for assessing the three key disease domains in gout: urate deposition (including tophus burden), joint inflammation and structural joint damage. Dual-energy CT allows measurement of urate deposition and bone damage, and ultrasonography allows assessment of all three domains. Scoring systems have been described that allow radiological quantification of disease severity and these scoring systems may play a role in assessing the response to treatment in gout. This article reviews the properties of imaging tests, describes the available scoring systems for quantification of disease severity and discusses the challenges and controversies regarding the use of imaging tools to measure treatment response in gout. © The Author 2018. Published by Oxford University Press on behalf of the British Society for Rheumatology. All rights reserved. For Permissions, please email: journals.permissions@oup.com.

  9. Development of the Hand Assessment for Infants: evidence of internal scale validity.

    PubMed

    Krumlinde-Sundholm, Lena; Ek, Linda; Sicola, Elisa; Sjöstrand, Lena; Guzzetta, Andrea; Sgandurra, Giuseppina; Cioni, Giovanni; Eliasson, Ann-Christin

    2017-12-01

    The aim of this study was to develop a descriptive and evaluative assessment of upper limb function for infants aged 3 to 12 months and to investigate its internal scale validity for use with infants at risk of unilateral cerebral palsy. The concepts of the test items and scoring criteria were developed. Internal scale validity and aspects of reliability were investigated on the basis of 156 assessments of infants at 3 to 12 months corrected age (mean 7.2mo, SD 2.5) with signs of asymmetric hand use. Rasch measurement model analysis and non-parametric statistics were used. The new test, the Hand Assessment for Infants (HAI), consists of 12 unimanual and five bimanual items, each scored on a 3-point rating scale. It demonstrated a unidimensional construct and good fit to the Rasch model requirements. The excellent person reliability enabled person separation to six significant ability strata. The HAI produced an interval-level measure of bilateral hand use as well as unimanual scores of each hand, allowing a quantification of possible asymmetry expressed as an asymmetry index. The HAI can be considered a valid assessment tool for measuring bilateral hand use and quantifying side difference between hands among infants at risk of developing unilateral cerebral palsy. The Hand Assessment for Infants (HAI) measures the use of both hands and quantifies a possible asymmetry of hand use. HAI is valid for infants at 3 to 12 months corrected age at risk of unilateral cerebral palsy. © 2017 Mac Keith Press.

  10. snoU6 and 5S RNAs are not reliable miRNA reference genes in neuronal differentiation.

    PubMed

    Lim, Q E; Zhou, L; Ho, Y K; Wan, G; Too, H P

    2011-12-29

    Accurate profiling of microRNAs (miRNAs) is an essential step for understanding the functional significance of these small RNAs in both physiological and pathological processes. Quantitative real-time PCR (qPCR) has gained acceptance as a robust and reliable transcriptomic method to profile subtle changes in miRNA levels and requires reference genes for accurate normalization of gene expression. 5S and snoU6 RNAs are commonly used as reference genes in microRNA quantification. It is currently unknown if these small RNAs are stably expressed during neuronal differentiation. Panels of miRNAs have been suggested as alternative reference genes to 5S and snoU6 in various physiological contexts. To test the hypothesis that miRNAs may serve as stable references during neuronal differentiation, the expressions of eight miRNAs, 5S and snoU6 RNAs in five differentiating neuronal cell types were analyzed using qPCR. The stabilities of the expressions were evaluated using two complementary statistical approaches (geNorm and Normfinder). Expressions of 5S and snoU6 RNAs were stable under some but not all conditions of neuronal differentiation and thus are not suitable reference genes. In contrast, a combination of three miRNAs (miR-103, miR-106b and miR-26b) allowed accurate expression normalization across different models of neuronal differentiation. Copyright © 2011 IBRO. Published by Elsevier Ltd. All rights reserved.
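
    Normalization against a panel of reference miRNAs, as recommended above, amounts to referencing the target Cq to the mean Cq of the panel (the Cq-domain equivalent of taking the geometric mean of expression levels) before applying 2^-ddCq; the Cq values below are hypothetical, not data from the study.

```python
import numpy as np

def relative_expression(cq_target, cq_refs, cq_target_cal, cq_refs_cal):
    """2^-ddCq with a multi-miRNA normalizer: averaging reference Cq values is
    the Cq-domain equivalent of the geometric mean of their expression levels."""
    dcq_sample = cq_target - np.mean(cq_refs)
    dcq_calibrator = cq_target_cal - np.mean(cq_refs_cal)
    return 2.0 ** -(dcq_sample - dcq_calibrator)

# Hypothetical Cq values; the reference panel is miR-103, miR-106b and miR-26b.
fold_change = relative_expression(
    cq_target=27.8, cq_refs=[22.1, 23.0, 21.5],
    cq_target_cal=29.0, cq_refs_cal=[22.3, 23.1, 21.6])
print(f"fold change relative to the undifferentiated calibrator: {fold_change:.2f}")
```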

  11. Multiplex SNaPshot-a new simple and efficient CYP2D6 and ADRB1 genotyping method.

    PubMed

    Ben, Songtao; Cooper-DeHoff, Rhonda M; Flaten, Hanna K; Evero, Oghenero; Ferrara, Tracey M; Spritz, Richard A; Monte, Andrew A

    2016-04-23

    Reliable, inexpensive, high-throughput genotyping methods are required for clinical trials. Traditional assays require numerous enzyme digestions or are too expensive for large sample volumes. Our objective was to develop an inexpensive, efficient, and reliable assay for CYP2D6 and ADRB1 accounting for numerous polymorphisms including gene duplications. We utilized the multiplex SNaPshot® custom genotype method to genotype CYP2D6 and ADRB1. We compared the method to reference standards genotyped using the Taqman Copy Number Variant Assay followed by pyrosequencing quantification and determined assigned genotype concordance. We genotyped 119 subjects. Seven (5.9 %) were found to be CYP2D6 poor metabolizers (PMs), 18 (15.1 %) intermediate metabolizers (IMs), 89 (74.8 %) extensive metabolizers (EMs), and 5 (4.2 %) ultra-rapid metabolizers (UMs). We genotyped two variants in the β1-adrenoreceptor, rs1801253 (Gly389Arg) and rs1801252 (Ser49Gly). The Gly389Arg genotype is Gly/Gly 18 (15.1 %), Gly/Arg 58 (48.7 %), and Arg/Arg 43 (36.1 %). The Ser49Gly genotype is Ser/Ser 82 (68.9 %), Ser/Gly 32 (26.9), and Gly/Gly 5 (4.2 %). The multiplex SNaPshot method was concordant with genotypes in reference samples. The multiplex SNaPshot method allows for specific and accurate detection of CYP2D6 genotypes and ADRB1 genotypes and haplotypes. This platform is simple and efficient and suited for high throughput.

  12. External ocular hyperemia: a quantifiable indicator of spacecraft air quality.

    PubMed

    Ogle, J W; Cohen, K L

    1996-05-01

    Eye irritation consistently ranks as a top astronaut complaint but is difficult to measure. Exposure to internal air pollution hypothetically disrupts the eye's tear film, thereby exposing the crewmembers' conjunctivae to the irritating effects of the recirculated, contaminant-laden atmosphere of the space vehicle. Causes elude engineers and toxicologists, who report that measured irritants remain below established Spacecraft Maximum Allowable Concentrations. Lack of objective ocular endpoints stymies efforts to identify etiologies. Computers offer a practical means of analyzing ocular hyperemia in space. We use computer analysis to quantify redness and blood vessels of digitized images of bulbar conjunctivae in near real time. Custom software masks artifacts, lids and lashes for each photographic or telemedicine ocular image. Algorithms then generate semi-independent measurements of hyperemia. Computed difference scores between 34 pairs of images were compared with subjective difference scores as voted on by a panel of ophthalmology residents. Objective data were reliably extracted from ocular images and significantly correlated (r = 0.583, p < 0.05) with subjective scores. This ground-based methodology generates accurate and reliable ocular endpoint data without mass, volume, or power penalty. To assist in identifying and eliminating onboard ocular irritants, these objective data can be regressed against independent variables such as mission elapsed time, subjective astronaut complaints, levels of chemical and electromagnetic contaminants, and nephelometric and barothermal data. As missions lengthen, sensitive tools such as hyperemia quantification will become increasingly important for assessing and optimizing spacecraft environments.
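
    The reported agreement between objective and subjective scores is a plain Pearson correlation over paired image difference scores; a minimal sketch with invented scores follows.

```python
import numpy as np
from scipy.stats import pearsonr

# Hypothetical paired difference scores for conjunctival image pairs:
# computer-derived hyperemia change vs. the residents' subjective vote.
objective = np.array([0.12, -0.05, 0.30, 0.08, -0.15, 0.22, 0.01, 0.18])
subjective = np.array([1.0, -0.5, 2.0, 0.5, -1.0, 1.5, 0.0, 1.0])

r, p = pearsonr(objective, subjective)
print(f"r = {r:.3f}, p = {p:.4f}")
```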

  13. Is qPCR a Reliable Indicator of Cyanotoxin Risk in Freshwater?

    PubMed Central

    Pacheco, Ana Beatriz F.; Guedes, Iame A.; Azevedo, Sandra M.F.O.

    2016-01-01

    The wide distribution of cyanobacteria in aquatic environments leads to the risk of water contamination by cyanotoxins, which generate environmental and public health issues. Measurements of cell densities or pigment contents allow both the early detection of cellular growth and bloom monitoring, but these methods are not sufficiently accurate to predict actual cyanobacterial risk. To quantify cyanotoxins, analytical methods are considered the gold standards, but they are laborious, expensive, time-consuming and available in a limited number of laboratories. In cyanobacterial species with toxic potential, cyanotoxin production is restricted to some strains, and blooms can contain varying proportions of both toxic and non-toxic cells, which are morphologically indistinguishable. The sequencing of cyanobacterial genomes led to the description of gene clusters responsible for cyanotoxin production, which paved the way for the use of these genes as targets for PCR and then quantitative PCR (qPCR). Thus, the quantification of cyanotoxin genes appeared as a new method for estimating the potential toxicity of blooms. This raises a question concerning whether qPCR-based methods would be a reliable indicator of toxin concentration in the environment. Here, we review studies that report the parallel detection of microcystin genes and microcystin concentrations in natural populations and also a smaller number of studies dedicated to cylindrospermopsin and saxitoxin. We discuss the possible issues associated with the contradictory findings reported to date, present methodological limitations and consider the use of qPCR as an indicator of cyanotoxin risk. PMID:27338471

  14. A 3D Freehand Ultrasound System for Multi-view Reconstructions from Sparse 2D Scanning Planes

    PubMed Central

    2011-01-01

    Background: A significant limitation of existing 3D ultrasound systems comes from the fact that the majority of them work with fixed acquisition geometries. As a result, the users have very limited control over the geometry of the 2D scanning planes. Methods: We present a low-cost and flexible ultrasound imaging system that integrates several image processing components to allow for 3D reconstructions from limited numbers of 2D image planes and multiple acoustic views. Our approach is based on a 3D freehand ultrasound system that allows users to control the 2D acquisition imaging using conventional 2D probes. For reliable performance, we develop new methods for image segmentation and robust multi-view registration. We first present a new hybrid geometric level-set approach that provides reliable segmentation performance with relatively simple initializations and minimum edge leakage. Optimization of the segmentation model parameters and its effect on performance is carefully discussed. Second, using the segmented images, a new coarse to fine automatic multi-view registration method is introduced. The approach uses a 3D Hotelling transform to initialize an optimization search. Then, the fine scale feature-based registration is performed using a robust, non-linear least squares algorithm. The robustness of the multi-view registration system allows for accurate 3D reconstructions from sparse 2D image planes. Results: Volume measurements from multi-view 3D reconstructions are found to be consistently and significantly more accurate than measurements from single view reconstructions. The volume error of multi-view reconstruction is measured to be less than 5% of the true volume. We show that volume reconstruction accuracy is a function of the total number of 2D image planes and the number of views for calibrated phantom. In clinical in-vivo cardiac experiments, we show that volume estimates of the left ventricle from multi-view reconstructions are found to be in better agreement with clinical measures than measures from single view reconstructions. Conclusions: Multi-view 3D reconstruction from sparse 2D freehand B-mode images leads to more accurate volume quantification compared to single view systems. The flexibility and low-cost of the proposed system allow for fine control of the image acquisition planes for optimal 3D reconstructions from multiple views. PMID:21251284

  15. A 3D freehand ultrasound system for multi-view reconstructions from sparse 2D scanning planes.

    PubMed

    Yu, Honggang; Pattichis, Marios S; Agurto, Carla; Beth Goens, M

    2011-01-20

    A significant limitation of existing 3D ultrasound systems comes from the fact that the majority of them work with fixed acquisition geometries. As a result, the users have very limited control over the geometry of the 2D scanning planes. We present a low-cost and flexible ultrasound imaging system that integrates several image processing components to allow for 3D reconstructions from limited numbers of 2D image planes and multiple acoustic views. Our approach is based on a 3D freehand ultrasound system that allows users to control the 2D acquisition imaging using conventional 2D probes. For reliable performance, we develop new methods for image segmentation and robust multi-view registration. We first present a new hybrid geometric level-set approach that provides reliable segmentation performance with relatively simple initializations and minimum edge leakage. Optimization of the segmentation model parameters and its effect on performance is carefully discussed. Second, using the segmented images, a new coarse to fine automatic multi-view registration method is introduced. The approach uses a 3D Hotelling transform to initialize an optimization search. Then, the fine scale feature-based registration is performed using a robust, non-linear least squares algorithm. The robustness of the multi-view registration system allows for accurate 3D reconstructions from sparse 2D image planes. Volume measurements from multi-view 3D reconstructions are found to be consistently and significantly more accurate than measurements from single view reconstructions. The volume error of multi-view reconstruction is measured to be less than 5% of the true volume. We show that volume reconstruction accuracy is a function of the total number of 2D image planes and the number of views for calibrated phantom. In clinical in-vivo cardiac experiments, we show that volume estimates of the left ventricle from multi-view reconstructions are found to be in better agreement with clinical measures than measures from single view reconstructions. Multi-view 3D reconstruction from sparse 2D freehand B-mode images leads to more accurate volume quantification compared to single view systems. The flexibility and low-cost of the proposed system allow for fine control of the image acquisition planes for optimal 3D reconstructions from multiple views.

  16. Rapid discrimination of different Apiaceae species based on HPTLC fingerprints and targeted flavonoids determination using multivariate image analysis.

    PubMed

    Shawky, Eman; Abou El Kheir, Rasha M

    2018-02-11

    Species of Apiaceae are used in folk medicine as spices and in officinal medicinal preparations. They are an excellent source of phenolics exhibiting antioxidant activity, which are of great benefit to human health. Discrimination among Apiaceae medicinal herbs remains an intricate challenge due to their morphological similarity. In this study, a combined "untargeted" and "targeted" approach to investigate different Apiaceae plant species was proposed by merging high-performance thin layer chromatography (HPTLC) image analysis and pattern recognition methods, which were used for fingerprinting and classification of 42 different Apiaceae samples collected from Egypt. Software for image processing was applied for fingerprinting and data acquisition. HPTLC fingerprinting assisted by principal component analysis (PCA) and hierarchical cluster analysis (HCA)-heat maps resulted in a reliable untargeted approach for discrimination and classification of the different samples. The "targeted" approach was performed by developing and validating an HPTLC method allowing the quantification of eight flavonoids. The combination of quantitative data with PCA and HCA-heat maps allowed the different samples to be discriminated from each other. The use of chemometric tools for evaluation of fingerprints reduced expense and analysis time. The proposed method can be adopted for routine discrimination and evaluation of the phytochemical variability in different Apiaceae species extracts. Copyright © 2018 John Wiley & Sons, Ltd.

  17. In vivo determination of the time and location of mucoadhesive drug delivery systems disintegration in the gastrointestinal tract.

    PubMed

    Kremser, Christian; Albrecht, Karin; Greindl, Melanie; Wolf, Christian; Debbage, Paul; Bernkop-Schnürch, Andreas

    2008-06-01

    The objective of this study was to use magnetic resonance imaging (MRI) to detect the time when and the location at which orally delivered mucoadhesive drugs are released. Drug delivery systems comprising tablets or capsules containing a mucoadhesive polymer were designed to deliver the polymer to the intestine in dry powder form. Dry Gd-DTPA [diethylenetriaminepentaacetic acid gadolinium(III) dihydrogen salt hydrate] powder was added to the mucoadhesive polymer, resulting in a susceptibility artifact that allows tracking of the application forms before their disintegration and that gives a strong positive signal on disintegration. Experiments were performed with rats using T(1)-weighted spin-echo imaging on a standard 1.5-T MRI system. The susceptibility artifact produced by the dry Gd-DTPA powder in tablets or capsules was clearly visible within the stomach of the rats and could be followed during movement towards the intestine. Upon disintegration, a strong positive signal was unambiguously observed. The time between ingestion and observation of a positive signal was significantly different for different application forms. Quantification of the remaining mucoadhesive polymer in the intestine 3 h after observed release showed significant differences in mucoadhesive effectiveness. MRI allows detection of the exact time of release of the mucoadhesive polymer in vivo, which is a prerequisite for a reliable quantitative comparison between different application forms.

  18. Quantitative Susceptibility Mapping of Human Brain Reflects Spatial Variation in Tissue Composition

    PubMed Central

    Li, Wei; Wu, Bing; Liu, Chunlei

    2011-01-01

    Image phase from gradient echo MRI provides a unique contrast that reflects brain tissue composition variations, such as iron and myelin distribution. Phase imaging is emerging as a powerful tool for the investigation of functional brain anatomy and disease diagnosis. However, the quantitative value of phase is compromised by its nonlocal and orientation dependent properties. There is an increasing need for reliable quantification of magnetic susceptibility, the intrinsic property of tissue. In this study, we developed a novel and accurate susceptibility mapping method that is also phase-wrap insensitive. The proposed susceptibility mapping method utilized two complementary equations: (1) the Fourier relationship of phase and magnetic susceptibility; and (2) the first-order partial derivative of the first equation in the spatial frequency domain. In numerical simulation, this method reconstructed the susceptibility map almost free of streaking artifact. Further, the iterative implementation of this method allowed for high quality reconstruction of susceptibility maps of human brain in vivo. The reconstructed susceptibility map provided excellent contrast of iron-rich deep nuclei and white matter bundles from surrounding tissues. Further, it also revealed anisotropic magnetic susceptibility in brain white matter. Hence, the proposed susceptibility mapping method may provide a powerful tool for the study of brain physiology and pathophysiology. Further elucidation of anisotropic magnetic susceptibility in vivo may allow us to gain more insight into the white matter microarchitectures. PMID:21224002
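
    For context only, the Fourier phase-susceptibility relationship referenced above (the first of the two equations) can be illustrated with a simple thresholded k-space division (TKD) sketch; this is not the authors' derivative-augmented method, and the field map, grid, field direction and threshold below are all assumptions for a toy example.

```python
import numpy as np

def dipole_kernel(shape, b0_dir=(0, 0, 1)):
    """Unit dipole kernel D(k) = 1/3 - (k . B0_hat)^2 / |k|^2 in k-space."""
    axes = [np.fft.fftfreq(n) for n in shape]
    kx, ky, kz = np.meshgrid(*axes, indexing="ij")
    k2 = kx**2 + ky**2 + kz**2
    k2[k2 == 0] = np.inf                       # avoid division by zero at k = 0
    k_dot_b = kx * b0_dir[0] + ky * b0_dir[1] + kz * b0_dir[2]
    return 1.0 / 3.0 - k_dot_b**2 / k2

def tkd_susceptibility(field, threshold=0.2):
    """Thresholded k-space division: invert D(k) only where |D(k)| > threshold."""
    d = dipole_kernel(field.shape)
    d_inv = np.where(np.abs(d) > threshold, 1.0 / d, 0.0)
    return np.real(np.fft.ifftn(np.fft.fftn(field) * d_inv))

# Toy example: forward-simulate the field of a cubic inclusion, then invert it.
chi_true = np.zeros((32, 32, 32))
chi_true[14:18, 14:18, 14:18] = 1.0
field = np.real(np.fft.ifftn(np.fft.fftn(chi_true) * dipole_kernel(chi_true.shape)))
chi_est = tkd_susceptibility(field)
print(f"peak recovered susceptibility: {chi_est.max():.2f} (true value 1.0)")
```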

  19. Measurements of Mode I Interlaminar Properties of Carbon Fiber Reinforced Polymers Using Digital Image Correlation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Merzkirch, Matthias; Ahure Powell, Louise; Foecke, Tim

    Numerical models based on cohesive zones are usually used to model and simulate the mechanical behavior of laminated carbon fiber reinforced polymers (CFRP) in automotive and aerospace applications and require different interlaminar properties. This work focuses on determining the interlaminar fracture toughness (GIC) under Mode I loading of a double cantilever beam (DCB) specimen of unidirectional CFRP, serving as prototypical material. The novelty of this investigation is the improvement of the testing methodology by introducing digital image correlation (DIC) as an extensometer and this tool allows for crack growth measurement, phenomenological visualization and quantification of various material responses to Mode I loading. Multiple methodologies from different international standards and other common techniques are compared for the determination of the evolution of GIC as crack resistance curves (R-curves). The primarily metrological sources of uncertainty, in contrast to material specific related uncertainties, are discussed through a simple sensitivity analysis. Additionally, the current work offers a detailed insight into the constraints and assumptions to allow exploration of different methods for the determination of material properties using the DIC measured data. The main aim is an improvement of the measurement technique and an increase in the reliability of measured data during static testing, in advance of future rate dependent testing for crashworthiness simulations.

  20. Tibiofemoral forces for the native and post-arthroplasty knee: relationship to maximal laxity through a functional arc of motion.

    PubMed

    Manning, William A; Ghosh, Kanishka; Blain, Alasdair; Longstaff, Lee; Deehan, David John

    2017-06-01

    Accurate soft tissue balance must be achieved to improve functional outcome after total knee arthroplasty (TKA). Sensor-integrated tibial trials have been introduced that allow real-time measurement of tibiofemoral kinematics during TKA. This study examined the interplay between tibiofemoral force and laxity, under defined intraoperative conditions, so as to quantify the kinematic behaviour of the CR femoral single-radius knee. TKA was undertaken in eight loaded cadaveric specimens. Computer navigation in combination with sensor data defined laxity and tibiofemoral contact force, respectively, during manual laxity testing. Fixed-effect linear modelling allowed quantification of the effect of flexion angle, direction of movement and TKA implantation upon the knee. An inverse relationship between laxity and contact force was demonstrated. With flexion, laxity increased as contact force decreased under manual stress. Change in laxity was significant beyond 30° for coronal plane laxity and beyond 60° for rotatory laxity (p < 0.01). Rotational stress in mid-flexion demonstrated the greatest mismatch in inter-compartmental forces. Contact point position over the tibial sensor demonstrated paradoxical roll-forward with knee flexion. Traditional balancing techniques may not reliably equate to uniform laxity or contact forces across the tibiofemoral joint through a range of flexion; these findings argue for the role of per-operative sensor use to aid final balancing of the knee.

  1. Measurements of Mode I Interlaminar Properties of Carbon Fiber Reinforced Polymers Using Digital Image Correlation

    DOE PAGES

    Merzkirch, Matthias; Ahure Powell, Louise; Foecke, Tim

    2017-07-01

    Numerical models based on cohesive zones are usually used to model and simulate the mechanical behavior of laminated carbon fiber reinforced polymers (CFRP) in automotive and aerospace applications and require different interlaminar properties. This work focuses on determining the interlaminar fracture toughness (GIC) under Mode I loading of a double cantilever beam (DCB) specimen of unidirectional CFRP, serving as prototypical material. The novelty of this investigation is the improvement of the testing methodology by introducing digital image correlation (DIC) as an extensometer and this tool allows for crack growth measurement, phenomenological visualization and quantification of various material responses to Mode I loading. Multiple methodologies from different international standards and other common techniques are compared for the determination of the evolution of GIC as crack resistance curves (R-curves). The primarily metrological sources of uncertainty, in contrast to material specific related uncertainties, are discussed through a simple sensitivity analysis. Additionally, the current work offers a detailed insight into the constraints and assumptions to allow exploration of different methods for the determination of material properties using the DIC measured data. The main aim is an improvement of the measurement technique and an increase in the reliability of measured data during static testing, in advance of future rate dependent testing for crashworthiness simulations.

  2. Flow cytometry for intracellular SPION quantification: specificity and sensitivity in comparison with spectroscopic methods

    PubMed Central

    Friedrich, Ralf P; Janko, Christina; Poettler, Marina; Tripal, Philipp; Zaloga, Jan; Cicha, Iwona; Dürr, Stephan; Nowak, Johannes; Odenbach, Stefan; Slabu, Ioana; Liebl, Maik; Trahms, Lutz; Stapf, Marcus; Hilger, Ingrid; Lyer, Stefan; Alexiou, Christoph

    2015-01-01

    Due to their special physicochemical properties, iron nanoparticles offer new promising possibilities for biomedical applications. For bench to bedside translation of super-paramagnetic iron oxide nanoparticles (SPIONs), safety issues have to be comprehensively clarified. To understand concentration-dependent nanoparticle-mediated toxicity, the exact quantification of intracellular SPIONs by reliable methods is of great importance. In the present study, we compared three different SPION quantification methods (ultraviolet spectrophotometry, magnetic particle spectroscopy, atomic absorption spectroscopy) and discussed the shortcomings and advantages of each method. Moreover, we used those results to evaluate the possibility of using flow cytometry to determine the cellular SPION content. For this purpose, we correlated the side scatter data received from flow cytometry with the actual cellular SPION amount. We showed that flow cytometry provides a rapid and reliable method to assess the cellular SPION content. Our data also demonstrate that internalization of iron oxide nanoparticles in human umbilical vein endothelial cells is strongly dependent on the SPION type and results in a dose-dependent increase of toxicity. Thus, treatment with lauric acid-coated SPIONs (SEONLA) resulted in a significant increase in the intensity of side scatter and toxicity, whereas SEONLA with an additional protein corona formed by bovine serum albumin (SEONLA-BSA) and commercially available Rienso® particles showed only a minimal increase in both side scatter intensity and cellular toxicity. The increase in side scatter was in accordance with the measurements for SPION content by the atomic absorption spectroscopy reference method. In summary, our data show that flow cytometry analysis can be used for estimation of uptake of SPIONs by mammalian cells and provides a fast tool for scientists to evaluate the safety of nanoparticle products. PMID:26170658

  3. A novel LC-MS/MS method for the simultaneous quantification of topiramate and its main metabolites in human plasma.

    PubMed

    Milosheska, Daniela; Roškar, Robert

    2017-05-10

    The aim of the present report was to develop and validate a simple, sensitive and reliable LC-MS/MS method for quantification of topiramate (TPM) and its main metabolites: 2,3-desisopropylidene TPM, 4,5-desisopropylidene TPM, 10-OH TPM and 9-OH TPM in human plasma samples. The most abundant metabolite, 2,3-desisopropylidene TPM, was isolated from patients' urine, characterized and afterwards used as an authentic standard for method development and validation. The sample preparation method employs 100μL of plasma sample and liquid-liquid extraction with a mixture of ethyl acetate and diethyl ether as extraction solvent. Chromatographic separation was achieved on a 1290 Infinity UHPLC coupled to a 6460 Triple Quad Mass Spectrometer operated in negative MRM mode using a Kinetex C18 column (50×2.1mm, 2.6μm) by gradient elution using water and methanol as a mobile phase and stable isotope labeled TPM as internal standard. The method was shown to be selective, accurate, precise and linear over the concentration ranges of 0.10-20μg/mL for TPM, 0.01-2.0μg/mL for 2,3-desisopropylidene TPM, and 0.001-0.200μg/mL for 4,5-desisopropylidene TPM, 10-OH TPM and 9-OH TPM. The described method is the first fully validated method capable of simultaneous determination of TPM and its main metabolites in plasma over the selected analytical range. The suitability of the method was successfully demonstrated by the quantification of all analytes in plasma samples of patients with epilepsy, and it can be considered a reliable analytical tool for future investigations of TPM metabolism. Copyright © 2017 Elsevier B.V. All rights reserved.

  4. Enantiomeric separation and quantification of R/S-amphetamine in urine by ultra-high performance supercritical fluid chromatography tandem mass spectrometry.

    PubMed

    Hegstad, S; Havnen, H; Helland, A; Spigset, O; Frost, J

    2018-03-01

    To distinguish between legal and illegal consumption of amphetamine, reliable analytical methods for chiral separation of the R- and S-enantiomers of amphetamine in biological specimens are required. In this regard, supercritical fluid chromatography (SFC) has several potential advantages over liquid chromatography, including rapid separation of enantiomers due to the low viscosity and high diffusivity of supercritical carbon dioxide, the main component in the SFC mobile phase. A method for enantiomeric separation and quantification of R- and S-amphetamine in urine was developed and validated using ultra-high performance supercritical fluid chromatography-tandem mass spectrometry (UHPSFC-MS/MS). Sample preparation prior to UHPSFC-MS/MS analysis was a semi-automatic solid phase extraction method. The UHPSFC-MS/MS method used a Chiralpak AD-3 column with a mobile phase consisting of CO2 and 0.2% cyclohexylamine in 2-propanol. The injection volume was 2 μL and run-time was 6 min. MS/MS detection was performed with positive electrospray ionization and two multiple reaction monitoring transitions (m/z 136.1 > 119.0 and m/z 136.1 > 91.0). The calibration range was 50-10,000 ng/mL for each enantiomer. The between-assay relative standard deviations were in the range of 3.7-7.6%. Recovery was 92-93% and matrix effects ranged from 100 to 104% corrected with internal standard. After development and validation, the method has been successfully implemented in routine use at our laboratory for both separation and quantification of R/S-amphetamine, and has proved to be a reliable and useful tool for distinguishing intake of R- and S-amphetamine in authentic patient samples. Copyright © 2018 Elsevier B.V. All rights reserved.

  5. Use of capillary Western immunoassay (Wes) for quantification of dystrophin levels in skeletal muscle of healthy controls and individuals with Becker and Duchenne muscular dystrophy.

    PubMed

    Beekman, Chantal; Janson, Anneke A; Baghat, Aabed; van Deutekom, Judith C; Datson, Nicole A

    2018-01-01

    Duchenne muscular dystrophy (DMD) is a neuromuscular disease characterized by progressive weakness of the skeletal and cardiac muscles. This X-linked disorder is caused by open reading frame disrupting mutations in the DMD gene, resulting in strong reduction or complete absence of dystrophin protein. In order to use dystrophin as a supportive or even surrogate biomarker in clinical studies on investigational drugs aiming at correcting the primary cause of the disease, the ability to reliably quantify dystrophin expression in muscle biopsies of DMD patients pre- and post-treatment is essential. Here we demonstrate the application of the ProteinSimple capillary immunoassay (Wes) method, a gel- and blot-free method requiring less sample, antibody and time to run than a conventional Western blot assay. We optimized dystrophin quantification by Wes using 2 different antibodies and found it to be highly sensitive, reproducible and quantitative over a large dynamic range. Using a healthy control muscle sample as a reference and α-actinin as a protein loading/muscle content control, a panel of skeletal muscle samples consisting of 31 healthy controls, 25 Becker muscular dystrophy (BMD) and 17 DMD samples was subjected to Wes analysis. In healthy controls dystrophin levels varied 3 to 5-fold between the highest and lowest muscle samples, with the reference sample representing the average of all 31 samples. In BMD muscle samples dystrophin levels ranged from 10% to 90%, with an average of 33% of the healthy muscle average, while for the DMD samples the average dystrophin level was 1.3%, ranging from 0.7% to 7% of the healthy muscle average. In conclusion, Wes is a suitable, efficient and reliable method for quantification of dystrophin expression as a biomarker in DMD clinical drug development.
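
    The quantification step described above reduces to a ratio of ratios: each sample's dystrophin signal is corrected by its α-actinin signal and then expressed relative to the healthy reference; a minimal sketch with hypothetical Wes peak areas follows.

```python
def dystrophin_percent_of_reference(dys_area, actinin_area, dys_area_ref, actinin_area_ref):
    """Dystrophin level as % of the healthy reference, after correcting both
    samples for muscle content with the alpha-actinin loading control."""
    return 100.0 * (dys_area / actinin_area) / (dys_area_ref / actinin_area_ref)

# Hypothetical Wes peak areas for one BMD biopsy and the healthy reference sample.
print(f"{dystrophin_percent_of_reference(4.2e5, 2.1e6, 1.5e6, 2.4e6):.0f}% of healthy control")
```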

  6. Quantifying uncertainties in streamflow predictions through signature based inference of hydrological model parameters

    NASA Astrophysics Data System (ADS)

    Fenicia, Fabrizio; Reichert, Peter; Kavetski, Dmitri; Albert, Carlo

    2016-04-01

    The calibration of hydrological models based on signatures (e.g. Flow Duration Curves - FDCs) is often advocated as an alternative to model calibration based on the full time series of system responses (e.g. hydrographs). Signature based calibration is motivated by various arguments. From a conceptual perspective, calibration on signatures is a way to filter out errors that are difficult to represent when calibrating on the full time series. Such errors may for example occur when observed and simulated hydrographs are shifted, either on the "time" axis (i.e. left or right), or on the "streamflow" axis (i.e. above or below). These shifts may be due to errors in the precipitation input (time or amount), and if not properly accounted for in the likelihood function, may cause biased parameter estimates (e.g. estimated model parameters that do not reproduce the recession characteristics of a hydrograph). From a practical perspective, signature based calibration is seen as a possible solution for making predictions in ungauged basins. Where streamflow data are not available, it may in fact be possible to reliably estimate streamflow signatures. Previous research has for example shown how FDCs can be reliably estimated at ungauged locations based on climatic and physiographic influence factors. Typically, the goal of signature based calibration is not the prediction of the signatures themselves, but the prediction of the system responses. Ideally, the prediction of system responses should be accompanied by a reliable quantification of the associated uncertainties. Previous approaches for signature based calibration, however, do not allow reliable estimates of streamflow predictive distributions. Here, we illustrate how the Bayesian approach can be employed to obtain reliable streamflow predictive distributions based on signatures. A case study is presented, where a hydrological model is calibrated on FDCs and additional signatures. We propose an approach where the likelihood function for the signatures is derived from the likelihood for streamflow (rather than using an "ad-hoc" likelihood for the signatures as done in previous approaches). This likelihood is not easily tractable analytically and we therefore cannot apply "simple" MCMC methods. This numerical problem is solved using Approximate Bayesian Computation (ABC). Our results indicate that the proposed approach is suitable for producing reliable streamflow predictive distributions based on calibration to signature data. Moreover, our results provide indications on which signatures are more appropriate to represent the information content of the hydrograph.
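
    The ABC step can be pictured with a minimal rejection sampler: draw parameters from a prior, simulate streamflow, summarize it with signatures (here a few FDC quantiles), and keep draws whose signatures fall within a tolerance of the observed ones. The toy linear-reservoir model, prior, noise and tolerance below are illustrative stand-ins, not the study's model or likelihood.

```python
import numpy as np

rng = np.random.default_rng(42)

def linear_reservoir(rain, k):
    """Toy rainfall-runoff model: a single linear reservoir with recession constant k."""
    runoff, storage = np.empty_like(rain), 0.0
    for i, p in enumerate(rain):
        storage = (storage + p) * np.exp(-1.0 / k)
        runoff[i] = storage / k
    return runoff

def fdc_signature(flow, quantiles=(0.1, 0.5, 0.9)):
    """A few flow-duration-curve points used as calibration signatures."""
    return np.quantile(flow, quantiles)

# Synthetic "observed" streamflow with multiplicative noise (true k = 12).
rain = rng.exponential(2.0, size=365)
q_obs = linear_reservoir(rain, k=12.0) * rng.lognormal(0.0, 0.1, size=365)
s_obs = fdc_signature(q_obs)

# ABC rejection: keep prior draws whose simulated signatures lie close to s_obs.
accepted = []
for _ in range(5000):
    k = rng.uniform(2.0, 40.0)                        # prior on the recession constant
    distance = np.linalg.norm(fdc_signature(linear_reservoir(rain, k)) - s_obs)
    if distance < 0.1 * np.linalg.norm(s_obs):        # crude acceptance tolerance
        accepted.append(k)

print(f"accepted {len(accepted)} draws, posterior mean k = {np.mean(accepted):.1f}")
```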

  7. Quantification of dsDNA using the Hitachi F-7000 Fluorescence Spectrophotometer and PicoGreen dye.

    PubMed

    Moreno, Luis A; Cox, Kendra L

    2010-11-05

    Quantification of DNA, especially in small concentrations, is an important task with a wide range of biological applications including standard molecular biology assays such as synthesis and purification of DNA, diagnostic applications such as quantification of DNA amplification products, and detection of DNA molecules in drug preparations. During this video we will demonstrate the capability of the Hitachi F-7000 Fluorescence Spectrophotometer equipped with a Micro Plate Reader accessory to perform dsDNA quantification using Molecular Probes Quant-it PicoGreen dye reagent kit. The F-7000 Fluorescence Spectrophotometer offers high sensitivity and high speed measurements. It is a highly flexible system capable of measuring fluorescence, luminescence, and phosphorescence. Several measuring modes are available, including wavelength scan, time scan, photometry and 3-D scan measurement. The spectrophotometer has sensitivity in the range of 50 picomoles of fluorescein when using a 300 μL sample volume in the microplate, and is capable of measuring scan speeds of 60,000 nm/minute. It also has a wide dynamic range of up to 5 orders of magnitude which allows for the use of calibration curves over a wide range of concentrations. The optical system uses all reflective optics for maximum energy and sensitivity. The standard wavelength range is 200 to 750 nm, and can be extended to 900 nm when using one of the optional near infrared photomultipliers. The system allows optional temperature control for the plate reader from 5 to 60 degrees Celsius using an optional external temperature controlled liquid circulator. The microplate reader allows for the use of 96 well microplates, and the measuring speed for 96 wells is less than 60 seconds when using the kinetics mode. Software controls for the F-7000 and Microplate Reader are also highly flexible. Samples may be set in either column or row formats, and any combination of wells may be chosen for sample measurements. This allows for optimal utilization of the microplate. Additionally, the software allows importing micro plate sample configurations created in Excel and saved in comma separated values, or "csv" format. Microplate measuring configurations can be saved and recalled by the software for convenience and increased productivity. Data results can be output to a standard report, to Excel, or to an optional Report Generator Program.
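
    A PicoGreen readout is typically converted to a dsDNA concentration by a standard curve; the hedged sketch below fits a line to hypothetical standards and back-calculates a sample well, mirroring the calibration-curve use of the wide dynamic range described above. All values are invented for illustration.

```python
import numpy as np

# Hypothetical PicoGreen standards: dsDNA (ng/mL) vs. background-corrected fluorescence.
std_conc = np.array([1.0, 10.0, 100.0, 500.0, 1000.0])
std_fluor = np.array([55.0, 530.0, 5200.0, 26500.0, 52800.0])

slope, intercept = np.polyfit(std_conc, std_fluor, 1)   # linear standard curve

def dsdna_ng_per_ml(sample_fluor, dilution_factor=1.0):
    """Back-calculate the dsDNA concentration of a well from its fluorescence."""
    return dilution_factor * (sample_fluor - intercept) / slope

print(f"sample: {dsdna_ng_per_ml(12400.0, dilution_factor=2):.0f} ng/mL")
```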

  8. Quantification of dsDNA using the Hitachi F-7000 Fluorescence Spectrophotometer and PicoGreen Dye

    PubMed Central

    Moreno, Luis A.; Cox, Kendra L.

    2010-01-01

    Quantification of DNA, especially in small concentrations, is an important task with a wide range of biological applications including standard molecular biology assays such as synthesis and purification of DNA, diagnostic applications such as quantification of DNA amplification products, and detection of DNA molecules in drug preparations. During this video we will demonstrate the capability of the Hitachi F-7000 Fluorescence Spectrophotometer equipped with a Micro Plate Reader accessory to perform dsDNA quantification using Molecular Probes Quant-it PicoGreen dye reagent kit. The F-7000 Fluorescence Spectrophotometer offers high sensitivity and high speed measurements. It is a highly flexible system capable of measuring fluorescence, luminescence, and phosphorescence. Several measuring modes are available, including wavelength scan, time scan, photometry and 3-D scan measurement. The spectrophotometer has sensitivity in the range of 50 picomoles of fluorescein when using a 300 μL sample volume in the microplate, and is capable of measuring scan speeds of 60,000 nm/minute. It also has a wide dynamic range of up to 5 orders of magnitude which allows for the use of calibration curves over a wide range of concentrations. The optical system uses all reflective optics for maximum energy and sensitivity. The standard wavelength range is 200 to 750 nm, and can be extended to 900 nm when using one of the optional near infrared photomultipliers. The system allows optional temperature control for the plate reader from 5 to 60 degrees Celsius using an optional external temperature controlled liquid circulator. The microplate reader allows for the use of 96 well microplates, and the measuring speed for 96 wells is less than 60 seconds when using the kinetics mode. Software controls for the F-7000 and Microplate Reader are also highly flexible. Samples may be set in either column or row formats, and any combination of wells may be chosen for sample measurements. This allows for optimal utilization of the microplate. Additionally, the software allows importing micro plate sample configurations created in Excel and saved in comma separated values, or "csv" format. Microplate measuring configurations can be saved and recalled by the software for convenience and increased productivity. Data results can be output to a standard report, to Excel, or to an optional Report Generator Program. PMID:21189464

  9. Quantitative multi-color FRET measurements by Fourier lifetime excitation-emission matrix spectroscopy.

    PubMed

    Zhao, Ming; Huang, Run; Peng, Leilei

    2012-11-19

    Förster resonant energy transfer (FRET) is extensively used to probe macromolecular interactions and conformation changes. The established FRET lifetime analysis method measures the FRET process through its effect on the donor lifetime. In this paper we present a method that directly probes the time-resolved FRET signal with frequency domain Fourier lifetime excitation-emission matrix (FLEEM) measurements. FLEEM separates fluorescent signals by their different photon energy pathways from excitation to emission. The FRET process generates a unique signal channel that is initiated by donor excitation but ends with acceptor emission. Time-resolved analysis of the FRET EEM channel allows direct measurements on the FRET process, unaffected by free fluorophores that might be present in the sample. Together with time-resolved analysis on non-FRET channels, i.e. donor and acceptor EEM channels, time resolved EEM analysis allows precise quantification of FRET in the presence of free fluorophores. The method is extended to three-color FRET processes, where quantification with traditional methods remains challenging because of the significantly increased complexity in the three-way FRET interactions. We demonstrate the time-resolved EEM analysis method with quantification of three-color FRET in incompletely hybridized triple-labeled DNA oligonucleotides. Quantitative measurements of the three-color FRET process in triple-labeled dsDNA are obtained in the presence of free single-labeled ssDNA and double-labeled dsDNA. The results establish a quantification method for studying multi-color FRET between multiple macromolecules in biochemical equilibrium.
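
    For reference, the classical donor-lifetime readout mentioned above reduces to a one-line calculation; the sketch below shows it with hypothetical lifetimes (the FLEEM analysis itself is more involved and is not reproduced here).

```python
def fret_efficiency_from_lifetimes(tau_da_ns, tau_d_ns):
    """Donor-lifetime readout of FRET efficiency: E = 1 - tau_DA / tau_D."""
    return 1.0 - tau_da_ns / tau_d_ns

# Hypothetical lifetimes: donor alone 4.0 ns, donor in the presence of the acceptor 2.6 ns
print(f"E = {fret_efficiency_from_lifetimes(2.6, 4.0):.2f}")   # 0.35
```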

  10. Quantitative multi-color FRET measurements by Fourier lifetime excitation-emission matrix spectroscopy

    PubMed Central

    Zhao, Ming; Huang, Run; Peng, Leilei

    2012-01-01

    Förster resonant energy transfer (FRET) is extensively used to probe macromolecular interactions and conformation changes. The established FRET lifetime analysis method measures the FRET process through its effect on the donor lifetime. In this paper we present a method that directly probes the time-resolved FRET signal with frequency domain Fourier lifetime excitation-emission matrix (FLEEM) measurements. FLEEM separates fluorescent signals by their different photon energy pathways from excitation to emission. The FRET process generates a unique signal channel that is initiated by donor excitation but ends with acceptor emission. Time-resolved analysis of the FRET EEM channel allows direct measurements on the FRET process, unaffected by free fluorophores that might be present in the sample. Together with time-resolved analysis on non-FRET channels, i.e. donor and acceptor EEM channels, time resolved EEM analysis allows precise quantification of FRET in the presence of free fluorophores. The method is extended to three-color FRET processes, where quantification with traditional methods remains challenging because of the significantly increased complexity in the three-way FRET interactions. We demonstrate the time-resolved EEM analysis method with quantification of three-color FRET in incompletely hybridized triple-labeled DNA oligonucleotides. Quantitative measurements of the three-color FRET process in triple-labeled dsDNA are obtained in the presence of free single-labeled ssDNA and double-labeled dsDNA. The results establish a quantification method for studying multi-color FRET between multiple macromolecules in biochemical equilibrium. PMID:23187535

  11. Measurements of VOC fluxes by Eddy-covariance with a PTR-Qi-TOF-MS over a mature wheat crop near Paris: Evaluation of data quality and uncertainties.

    NASA Astrophysics Data System (ADS)

    Buysse, Pauline; Loubet, Benjamin; Ciuraru, Raluca; Lafouge, Florence; Zurfluh, Olivier; Gonzaga-Gomez, Lais; Fanucci, Olivier; Gueudet, Jean-Christophe; Decuq, Céline; Gros, Valérie; Sarda, Roland; Zannoni, Nora

    2017-04-01

    The quantification of volatile organic compound (VOC) fluxes exchanged by terrestrial ecosystems is of great interest because of their influence on the chemistry and composition of the atmosphere, including aerosols and oxidants. The latest developments in the techniques for detecting, identifying and measuring VOC fluxes have considerably improved the ability to obtain reliable estimates. Among these, the eddy-covariance (EC) methodology constitutes the most direct approach, and relies on both well-established principles (Aubinet et al. 2000) and a sound, continuously improving worldwide experience. The combination of the EC methodology with the latest proton-transfer-reaction mass spectrometer (PTR-MS) device, the PTR-Qi-TOF-MS, which allows the identification and quantification of more than 500 VOC at high frequency, now provides a very powerful and precise tool for accurate quantification of VOC fluxes over various types of terrestrial ecosystems. The complexity of the whole methodology, however, demands that several data quality requirements be fulfilled. VOC fluxes were measured by EC with a PTR-Qi-TOF-MS (national instrument within the ANAEE-France framework) for a month and a half over a mature wheat crop near Paris (FR-GRI ICOS site). The largest emissions (in descending order) were observed for detected compounds with mass-over-charge (m/z) ratios of 33.033 (methanol), 45.033 (acetaldehyde), 93.033 (not identified yet), 59.049 (acetone), and 63.026 (dimethyl sulfide or DMS). Emissions from higher-mass compounds, which might be due to pesticide applications at the beginning of our observation period, were also detected. Some compounds were also seen to deposit (e.g. m/z 47.013, 71.085, 75.044, 83.05) while others exhibited bidirectional fluxes (e.g. m/z 57.07, 69.07). Before analyzing VOC flux responses to meteorological and crop development drivers, a data quality check was performed which included (i) uncertainty analysis of the mass and concentration calibration, (ii) determination of fragmentation patterns, (iii) determination of lag times and high-frequency losses for all ions that showed a flux, and (iv) determination of the flux random uncertainties and of the limit of detection.
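
    The core eddy-covariance calculation, the covariance of vertical-wind and concentration fluctuations maximized over a small lag window, can be sketched as follows on synthetic 10 Hz series; the numbers and the simple lag search are illustrative and do not reproduce the PTR-Qi-TOF-MS processing chain.

```python
import numpy as np

rng = np.random.default_rng(1)
fs = 10.0                        # sampling frequency (Hz)
n = int(30 * 60 * fs)            # one 30-minute averaging period

# Synthetic vertical wind w (m s-1) and VOC mixing ratio c (ppb), correlated at a 5-sample lag
w = rng.normal(0.0, 0.3, n)
c = 2.0 + 0.5 * np.roll(w, 5) + rng.normal(0.0, 0.1, n)

def ec_flux(w, c, max_lag=50):
    """Eddy-covariance flux: covariance of the fluctuations w' and c',
    maximised over a small lag window to account for the sampling-line delay."""
    m = len(w)
    wp = w - w.mean()
    best = (0, 0.0)
    for lag in range(max_lag + 1):
        cp = c[lag:] - c[lag:].mean()
        cov = np.mean(wp[:m - lag] * cp)
        if abs(cov) > abs(best[1]):
            best = (lag, cov)
    return best

lag, flux = ec_flux(w, c)
print(f"lag = {lag} samples ({lag / fs:.1f} s), flux ~ {flux:.3f} ppb m s-1")
```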

  12. Lessons from Astrobiological Planetary Analogue Exploration in Iceland: Biomarker Assay Performance and Downselection

    NASA Technical Reports Server (NTRS)

    Gentry, D. M.; Amador, E. S.; Cable, M. L.; Cantrell, T.; Chaudry, N.; Cullen, T.; Duca, Z.; Kirby, J.; Jacobsen, M.; McCaig, H.; hide

    2017-01-01

    Understanding the sensitivity of biomarker assays to the local physicochemical environment, and the underlying spatial distribution of the target biomarkers in 'homogeneous' environments, can increase mission science return. We have conducted four expeditions to Icelandic Mars analogue sites in which an increasingly refined battery of physicochemical measurements and biomarker assays were performed, staggered with scouting of further sites. Completed expeditions took place in 2012 (location scouting and field assay use testing), 2013 (sampling of two major sites with three assays and observational physicochemical measurements), 2015 (repeat sampling of prior sites and one new site, scouting of new sites, three assays and three instruments), and 2016 (preliminary sampling of new sites with analysis of returned samples). Target sites were geologically recent basaltic lava flows, and sample loci were arranged in hierarchically nested grids at 10 cm, 1 m, 10 m, 100 m, and >1 km order scales, subject to field constraints. Assays were intended to represent a diversity of potential biomarker types (cell counting via nucleic acid staining and fluorescence microscopy, ATP quantification via luciferase luminescence, and relative DNA quantification with simple domain-level primers) rather than a specific mission science target, and were selected to reduce laboratory overhead, require limited consumables, and allow rapid turnaround. All analytical work was performed in situ or in a field laboratory within a day's travel of the field sites unless otherwise noted. We have demonstrated the feasibility of performing ATP quantification and qPCR analysis in a field-based laboratory with single-day turnaround. The ATP assay was generally robust and reliable and required minimal field equipment and training to produce a large amount of useful data. DNA was successfully extracted from all samples, but the serial-batch nature of qPCR significantly limited the number of primers (hence classifications) and replicates that could be run in a single day. Fluorescence microscopy did not prove feasible under the same constraints, primarily due to the large number of person-hours required to view, analyze, and record results from the images; however, this could be mitigated with higher-quality imaging instruments and appropriate image analysis software.

  13. Quantification of glomerular filtration rate by measurement of gadobutrol clearance from the extracellular fluid volume: comparison of a TurboFLASH and a TrueFISP approach

    NASA Astrophysics Data System (ADS)

    Boss, Andreas; Martirosian, Petros; Artunc, Ferruh; Risler, Teut; Claussen, Claus D.; Schlemmer, Heinz-Peter; Schick, Fritz

    2007-03-01

    Purpose: As the MR contrast-medium gadobutrol is completely eliminated via glomerular filtration, the glomerular filtration rate (GFR) can be quantified after bolus-injection of gadobutrol and complete mixing in the extracellular fluid volume (ECFV) by measuring the signal decrease within the liver parenchyma. Two different navigator-gated single-shot saturation-recovery sequences have been tested for their suitability for GFR quantification: a TurboFLASH and a TrueFISP readout technique. Materials and Methods: Ten healthy volunteers (mean age 26.1+/-3.6) were divided equally into two subgroups. After bolus-injection of 0.05 mmol/kg gadobutrol, coronal single-slice images of the liver were recorded every 4-5 seconds during free breathing using either the TurboFLASH or the TrueFISP technique. Time-intensity curves were determined from manually drawn regions-of-interest over the liver parenchyma. Both sequences were subsequently evaluated regarding signal to noise ratio (SNR) and the behaviour of signal intensity curves. The calculated GFR values were compared to an iopromide clearance gold standard. Results: The TrueFISP sequence exhibited a 3.4-fold higher SNR as compared to the TurboFLASH sequence and markedly lower variability of the recorded time-intensity curves. The calculated mean GFR values were 107.0+/-16.1 ml/min/1.73m2 (iopromide: 92.1+/-14.5 ml/min/1.73m2) for the TrueFISP technique and 125.6+/-24.1 ml/min/1.73m2 (iopromide: 97.7+/-6.3 ml/min/1.73m2) for the TurboFLASH approach. The mean paired difference with TrueFISP was lower (15.0 ml/min/1.73m2) than with the TurboFLASH method (27.9 ml/min/1.73m2). Conclusion: The global GFR can be quantified via measurement of gadobutrol clearance from the ECFV. A saturation-recovery TrueFISP sequence allows for more reliable GFR quantification than a saturation-recovery TurboFLASH technique.
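
    A minimal sketch of the underlying single-compartment idea, fitting a mono-exponential decay to a liver time-intensity curve and converting the rate constant to a clearance via GFR = k x V_ECFV, is given below; the synthetic data, the assumed ECFV of 0.2 l/kg and the omission of body-surface normalization are simplifications, not the study's protocol.

```python
import numpy as np
from scipy.optimize import curve_fit

rng = np.random.default_rng(9)

# Synthetic liver time-intensity data after complete mixing in the ECFV (arbitrary units)
t_min = np.linspace(5, 90, 20)                               # minutes after injection
signal = 100.0 * np.exp(-0.0075 * t_min) + rng.normal(0.0, 1.0, t_min.size)

def mono_exp(t, s0, k):
    return s0 * np.exp(-k * t)

(s0, k), _ = curve_fit(mono_exp, t_min, signal, p0=(100.0, 0.01))

ecfv_ml = 0.2 * 75.0 * 1000.0    # assumed ECFV of ~0.2 l/kg for a 75 kg subject (rough textbook value)
gfr = k * ecfv_ml                # single-compartment clearance: GFR = k * V_ECFV (ml/min)
print(f"k ~ {k:.4f} 1/min  ->  GFR ~ {gfr:.0f} ml/min (not normalised to 1.73 m2)")
```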

  14. Development and validation of a rapid and simple LC-MS/MS method for quantification of vemurafenib in human plasma: application to a human pharmacokinetic study.

    PubMed

    Bihan, Kevin; Sauzay, Chloé; Goldwirt, Lauriane; Charbonnier-Beaupel, Fanny; Hulot, Jean-Sebastien; Funck-Brentano, Christian; Zahr, Noël

    2015-02-01

    Vemurafenib (Zelboraf) is a new kinase inhibitor that selectively targets the activated BRAF V600E kinase and is indicated for the treatment of advanced BRAF mutation-positive melanoma. We developed a simple method for vemurafenib quantification using liquid chromatography-tandem mass spectrometry. A stability study of vemurafenib in human plasma was also performed. (13)C(6)-vemurafenib was used as the internal standard. A single-step protein precipitation was used for plasma sample preparation. Chromatography was performed on an Acquity UPLC system (Waters) with chromatographic separation by the use of an Acquity UPLC BEH C18 column (2.1 × 50 mm, 1.7-µm particle size; Waters). Quantification was performed using multiple reaction monitoring of the following transitions: m/z 488.2 → 381.0 for vemurafenib and m/z 494.2 → 387.0 for the internal standard. This method was linear over the range from 1.0 to 100.0 mcg/mL. The lower limit of quantification was 0.1 mcg/mL for vemurafenib in plasma. Vemurafenib remained stable for 1 month at all levels tested, whether stored at room temperature (20 °C), at +4 °C, or at -20 °C. This method was used successfully to perform a plasma pharmacokinetic study of vemurafenib in a patient after oral administration at steady state. This liquid chromatography-tandem mass spectrometry method for vemurafenib quantification in human plasma is simple, rapid, specific, sensitive, accurate, precise, and reliable.

  15. Digital Quantification of Proteins and mRNA in Single Mammalian Cells.

    PubMed

    Albayrak, Cem; Jordi, Christian A; Zechner, Christoph; Lin, Jing; Bichsel, Colette A; Khammash, Mustafa; Tay, Savaş

    2016-03-17

    Absolute quantification of macromolecules in single cells is critical for understanding and modeling biological systems that feature cellular heterogeneity. Here we show extremely sensitive and absolute quantification of both proteins and mRNA in single mammalian cells by a very practical workflow that combines proximity ligation assay (PLA) and digital PCR. This digital PLA method has femtomolar sensitivity, which enables the quantification of very small protein concentration changes over its entire 3-log dynamic range, a quality necessary for accounting for single-cell heterogeneity. We counted both endogenous (CD147) and exogenously expressed (GFP-p65) proteins from hundreds of single cells and determined the correlation between CD147 mRNA and the protein it encodes. Using our data, a stochastic two-state model of the central dogma was constructed and verified using joint mRNA/protein distributions, allowing us to estimate transcription burst sizes and extrinsic noise strength and calculate the transcription and translation rate constants in single mammalian cells. Copyright © 2016 Elsevier Inc. All rights reserved.
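
    Digital assays of this kind convert the count of positive partitions into absolute molecule numbers through a Poisson correction; the generic sketch below illustrates that calculation with hypothetical droplet numbers and is not the authors' specific workflow.

```python
import math

def digital_count(n_positive, n_partitions, partition_volume_nl):
    """Poisson-corrected absolute quantification for a digital PCR/PLA experiment.

    With a fraction p of positive partitions, the mean occupancy per partition is
    lambda = -ln(1 - p); total molecules = lambda * n_partitions.
    """
    p = n_positive / n_partitions
    if p >= 1.0:
        raise ValueError("all partitions positive: sample too concentrated to quantify")
    lam = -math.log(1.0 - p)
    molecules = lam * n_partitions
    conc_per_ul = lam / (partition_volume_nl * 1e-3)    # molecules per microlitre of partitioned volume
    return molecules, conc_per_ul

# Hypothetical run: 4,800 of 20,000 droplets positive, 0.85 nl per droplet
mol, conc = digital_count(4800, 20000, 0.85)
print(f"~{mol:.0f} molecules loaded, ~{conc:.1f} molecules/µl")
```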

  16. Direct liquid chromatography method for the simultaneous quantification of hydroxytyrosol and tyrosol in red wines.

    PubMed

    Piñeiro, Zulema; Cantos-Villar, Emma; Palma, Miguel; Puertas, Belen

    2011-11-09

    A validated HPLC method with fluorescence detection for the simultaneous quantification of hydroxytyrosol and tyrosol in red wines is described. Detection conditions for both compounds were optimized (excitation at 279 and 278 nm and emission at 631 and 598 nm for hydroxytyrosol and tyrosol, respectively). The validation of the analytical method was based on selectivity, linearity, robustness, detection and quantification limits, repeatability, and recovery. The detection and quantification limits in red wines were set at 0.023 and 0.076 mg L(-1) for hydroxytyrosol and at 0.007 and 0.024 mg L(-1) for tyrosol determination, respectively. Precision values, both within-day and between-day (n = 5), remained below 3% for both compounds. In addition, a fractional factorial experimental design was developed to analyze the influence of six different conditions on the analysis. The final optimized HPLC-fluorescence method allowed the analysis of 30 nonpretreated Spanish red wines to evaluate their hydroxytyrosol and tyrosol contents.

  17. A new background subtraction method for Western blot densitometry band quantification through image analysis software.

    PubMed

    Gallo-Oller, Gabriel; Ordoñez, Raquel; Dotor, Javier

    2018-06-01

    Since its first description, Western blot has been widely used in molecular labs. It constitutes a multistep method that allows the detection and/or quantification of proteins from simple to complex protein mixtures. The Western blot quantification step is critical for obtaining accurate and reproducible results. Due to the technical knowledge required for densitometry analysis and to limited resource availability, standard office scanners are often used for the image acquisition of developed Western blot films. Furthermore, the use of semi-quantitative software such as ImageJ (Java-based image-processing and analysis software) is clearly increasing in different scientific fields. In this work, we describe the use of an office scanner coupled with the ImageJ software together with a new image background subtraction method for accurate Western blot quantification. The proposed method represents an affordable, accurate and reproducible approach that can be used where resources are limited. Copyright © 2018 Elsevier B.V. All rights reserved.
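
    For orientation, the sketch below shows one common local-background strategy for densitometry (subtracting a straight line estimated from the edges of the band window before integrating); it is a generic illustration on a synthetic lane profile, not the new subtraction method proposed in the paper.

```python
import numpy as np

def band_volume(lane_profile, band_slice, edge_px=5):
    """Integrate a densitometry band after subtracting a local linear background
    estimated from a few pixels on either side of the band window."""
    region = np.asarray(lane_profile[band_slice], dtype=float)
    left = region[:edge_px].mean()
    right = region[-edge_px:].mean()
    background = np.linspace(left, right, region.size)    # straight line under the band
    return float(np.clip(region - background, 0.0, None).sum())

# Hypothetical lane profile: sloping background plus one Gaussian band
x = np.arange(200)
profile = 10.0 + 0.05 * x + 80.0 * np.exp(-0.5 * ((x - 100.0) / 8.0) ** 2)

print(f"background-subtracted band volume ~ {band_volume(profile, slice(70, 130)):.0f}")
```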

  18. Probabilistic Design of a Plate-Like Wing to Meet Flutter and Strength Requirements

    NASA Technical Reports Server (NTRS)

    Stroud, W. Jefferson; Krishnamurthy, T.; Mason, Brian H.; Smith, Steven A.; Naser, Ahmad S.

    2002-01-01

    An approach is presented for carrying out reliability-based design of a metallic, plate-like wing to meet strength and flutter requirements that are given in terms of risk/reliability. The design problem is to determine the thickness distribution such that wing weight is a minimum and the probability of failure is less than a specified value. Failure is assumed to occur if either the flutter speed is less than a specified allowable or the stress caused by a pressure loading is greater than a specified allowable. Four uncertain quantities are considered: wing thickness, calculated flutter speed, allowable stress, and magnitude of a uniform pressure load. The reliability-based design optimization approach described herein starts with a design obtained using conventional deterministic design optimization with margins on the allowables. Reliability is calculated using Monte Carlo simulation with response surfaces that provide values of stresses and flutter speed. During the reliability-based design optimization, the response surfaces and move limits are coordinated to ensure accuracy of the response surfaces. Studies carried out in the paper show the relationship between reliability and weight and indicate that, for the design problem considered, increases in reliability can be obtained with modest increases in weight.
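
    A bare-bones Monte Carlo failure-probability estimate of the kind described above can be sketched as follows; the toy response surfaces, distributions and allowables are invented stand-ins for the paper's fitted surfaces and data.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 200_000

# Illustrative random inputs (distributions and numbers are invented)
thickness   = rng.normal(5.0, 0.1, n)       # wing panel thickness
pressure    = rng.normal(20.0, 2.0, n)      # pressure load
sigma_allow = rng.normal(200.0, 15.0, n)    # allowable stress
v_allow     = 265.0                         # required flutter speed

# Toy "response surfaces": stress falls with thickness, flutter speed rises with it
stress  = 40.0 * pressure / thickness
flutter = 40.0 * thickness**1.2

failed = (stress > sigma_allow) | (flutter < v_allow)
pf = failed.mean()
se = np.sqrt(pf * (1.0 - pf) / n)           # binomial standard error of the estimate
print(f"P(failure) ~ {pf:.4f} +/- {se:.4f}")
```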

  19. Stable isotope labelling methods in mass spectrometry-based quantitative proteomics.

    PubMed

    Chahrour, Osama; Cobice, Diego; Malone, John

    2015-09-10

    Mass-spectrometry based proteomics has evolved as a promising technology over the last decade and is undergoing a dramatic development in a number of different areas, such as mass spectrometric instrumentation, peptide identification algorithms and bioinformatic computational data analysis. The improved methodology allows quantitative measurement of relative or absolute protein amounts, which is essential for gaining insights into their functions and dynamics in biological systems. Several different strategies involving stable isotope labels (ICAT, ICPL, IDBEST, iTRAQ, TMT, IPTL, SILAC), label-free statistical assessment approaches (MRM, SWATH) and absolute quantification methods (AQUA) are possible, each having specific strengths and weaknesses. Inductively coupled plasma mass spectrometry (ICP-MS), which is still widely recognised as an elemental detector, has recently emerged as a complementary technique to the previous methods. The new application area for ICP-MS is targeting the fast growing field of proteomics related research, allowing absolute protein quantification using suitable elemental based tags. This document describes the different stable isotope labelling methods which incorporate metabolic labelling in live cells, ICP-MS based detection and post-harvest chemical label tagging for protein quantification, in addition to summarising their pros and cons. Copyright © 2015 Elsevier B.V. All rights reserved.

  20. Mixture quantification using PLS in plastic scintillation measurements.

    PubMed

    Bagán, H; Tarancón, A; Rauret, G; García, J F

    2011-06-01

    This article reports the capability of plastic scintillation (PS) combined with multivariate calibration (partial least squares, PLS) to detect and quantify alpha and beta emitters in mixtures. While several attempts have been made with this purpose in mind using liquid scintillation (LS), no attempt had been made using PS, which has the great advantage of not producing mixed waste after the measurements are performed. Following this objective, ternary mixtures of alpha and beta emitters ((241)Am, (137)Cs and (90)Sr/(90)Y) have been quantified. Procedure optimisation evaluated the use of the net spectra or the sample spectra, the inclusion of different spectra obtained at different values of the Pulse Shape Analysis parameter and the application of the PLS1 or PLS2 algorithms. The conclusions show that the use of PS+PLS2 applied to the sample spectra, without the use of any pulse shape discrimination, allows quantification of the activities with relative errors less than 10% in most cases. This procedure not only allows quantification of mixtures but also reduces measurement time (no blanks are required), and it does not require detectors that include the pulse shape analysis parameter. Copyright © 2011 Elsevier Ltd. All rights reserved.
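
    The PLS2 calibration step can be illustrated on synthetic spectra as below, using scikit-learn's PLSRegression; the Gaussian "spectra", activity ranges and noise level are placeholders for real plastic-scintillation data.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(3)
channels = np.arange(256)

def peak(mu, sigma):
    """Synthetic single-nuclide spectral shape (a stand-in for a PS spectrum)."""
    return np.exp(-0.5 * ((channels - mu) / sigma) ** 2)

pure = np.vstack([peak(60, 15), peak(120, 25), peak(180, 35)])   # three "nuclides"

# Calibration set: mixtures with known activities and their noisy summed spectra
Y_cal = rng.uniform(0.1, 10.0, size=(40, 3))
X_cal = Y_cal @ pure + rng.normal(0.0, 0.05, size=(40, 256))

pls = PLSRegression(n_components=3)
pls.fit(X_cal, Y_cal)                 # PLS2: the three activities are modelled jointly

# "Unknown" mixture
y_true = np.array([2.0, 5.0, 1.0])
x_new = y_true @ pure + rng.normal(0.0, 0.05, 256)
y_hat = pls.predict(x_new.reshape(1, -1))[0]
print("true activities:", y_true, "estimated:", np.round(y_hat, 2))
```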

  1. Detection, mapping, and quantification of single walled carbon nanotubes in histological specimens with photoacoustic microscopy.

    PubMed

    Avti, Pramod K; Hu, Song; Favazza, Christopher; Mikos, Antonios G; Jansen, John A; Shroyer, Kenneth R; Wang, Lihong V; Sitharaman, Balaji

    2012-01-01

    In the present study, the efficacy of multi-scale photoacoustic microscopy (PAM) was investigated for detecting, mapping, and quantifying trace amounts [nanograms (ng) to micrograms (µg)] of SWCNTs in a variety of histological tissue specimens consisting of cancer and benign tissue biopsies (histological specimens from implanted tissue engineering scaffolds). Optical-resolution (OR) and acoustic-resolution (AR) PAM were employed to detect, map and quantify the SWCNTs in a variety of histological tissue specimens, and the results were compared with other optical techniques (bright-field optical microscopy, Raman microscopy, and near-infrared (NIR) fluorescence microscopy). Both optical-resolution and acoustic-resolution PAM allow the detection and quantification of SWCNTs in histological specimens with scalable spatial resolution and depth penetration. The noise-equivalent detection sensitivity to SWCNTs in the specimens was calculated to be as low as ∼7 pg. Image processing analysis further allowed the mapping, distribution, and quantification of the SWCNTs in the histological sections. The results demonstrate the potential of PAM as a promising imaging technique to detect, map, and quantify SWCNTs in histological specimens, and suggest that PAM could complement the capabilities of current optical and electron microscopy techniques in the analysis of histological specimens containing SWCNTs.

  2. CEQer: a graphical tool for copy number and allelic imbalance detection from whole-exome sequencing data.

    PubMed

    Piazza, Rocco; Magistroni, Vera; Pirola, Alessandra; Redaelli, Sara; Spinelli, Roberta; Redaelli, Serena; Galbiati, Marta; Valletta, Simona; Giudici, Giovanni; Cazzaniga, Giovanni; Gambacorti-Passerini, Carlo

    2013-01-01

    Copy number alterations (CNA) are common events occurring in leukaemias and solid tumors. Comparative Genome Hybridization (CGH) is currently the gold standard technique for analyzing CNAs; however, CGH analysis requires dedicated instruments and is able to perform only low-resolution Loss of Heterozygosity (LOH) analyses. Here we present CEQer (Comparative Exome Quantification analyzer), a new graphical, event-driven tool for CNA/allelic-imbalance (AI) coupled analysis of exome sequencing data. By using case-control matched exome data, CEQer performs a comparative digital exonic quantification to generate CNA data and couples this information with exome-wide LOH and allelic imbalance detection. These data are used to build mixed statistical/heuristic models allowing the identification of CNA/AI events. To test our tool, we initially used in silico generated data; we then performed whole-exome sequencing on 20 leukemic specimens and corresponding matched controls and analyzed the results using CEQer. Taken globally, these analyses showed that the combined use of comparative digital exon quantification and LOH/AI allows the generation of very accurate CNA data. Therefore, we propose CEQer as an efficient, robust and user-friendly graphical tool for the identification of CNA/AI in the context of whole-exome sequencing data.

  3. Improving the quantification of contrast enhanced ultrasound using a Bayesian approach

    NASA Astrophysics Data System (ADS)

    Rizzo, Gaia; Tonietto, Matteo; Castellaro, Marco; Raffeiner, Bernd; Coran, Alessandro; Fiocco, Ugo; Stramare, Roberto; Grisan, Enrico

    2017-03-01

    Contrast Enhanced Ultrasound (CEUS) is a sensitive imaging technique for assessing tissue vascularity that can be useful in the quantification of different perfusion patterns. This can be particularly important in the early detection and staging of arthritis. In a recent study we have shown that a Gamma-variate can accurately quantify synovial perfusion and it is flexible enough to describe many heterogeneous patterns. Moreover, we have shown that through a pixel-by-pixel analysis the quantitative information gathered characterizes the perfusion more effectively. However, the SNR of the data and the nonlinearity of the model make parameter estimation difficult. Using a classical non-linear least-squares (NLLS) approach, the number of unreliable estimates (those with an asymptotic coefficient of variation greater than a user-defined threshold) is significant, thus affecting the overall description of the perfusion kinetics and of its heterogeneity. In this work we propose to solve the parameter estimation at the pixel level within a Bayesian framework using Variational Bayes (VB), and an automatic and data-driven prior initialization. When evaluating the pixels for which both VB and NLLS provided reliable estimates, we demonstrated that the parameter values provided by the two methods are well correlated (Pearson's correlation between 0.85 and 0.99). Moreover, the mean percentage of unreliable pixels drastically reduces from 54% (NLLS) to 26% (VB), without increasing the computational time (0.05 s/pixel for NLLS and 0.07 s/pixel for VB). When considering the efficiency of the algorithms as computational time per reliable estimate, VB outperforms NLLS (0.11 versus 0.25 seconds per reliable estimate, respectively).
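
    As a point of reference for the NLLS baseline, the sketch below fits one common gamma-variate parameterization to a synthetic pixel time-intensity curve with scipy and reports the asymptotic coefficients of variation used to flag unreliable fits; the functional form, noise level and threshold idea are illustrative, and the Variational Bayes alternative is not shown.

```python
import numpy as np
from scipy.optimize import curve_fit

def gamma_variate(t, A, t0, alpha, beta):
    """One common gamma-variate form for a contrast-enhancement time-intensity curve."""
    dt = np.clip(t - t0, 0.0, None)
    return A * dt**alpha * np.exp(-dt / beta)

rng = np.random.default_rng(7)
t = np.linspace(0.0, 60.0, 120)                                            # seconds
y = gamma_variate(t, 5.0, 4.0, 1.8, 6.0) + rng.normal(0.0, 2.0, t.size)    # noisy pixel curve

p0 = (1.0, 2.0, 1.5, 5.0)
bounds = ([1e-6, 0.0, 0.1, 0.1], [np.inf, 30.0, 10.0, 60.0])   # keep the exponents positive
popt, pcov = curve_fit(gamma_variate, t, y, p0=p0, bounds=bounds)

cv = np.sqrt(np.diag(pcov)) / np.abs(popt)                     # asymptotic coefficients of variation
print("estimates (A, t0, alpha, beta):", np.round(popt, 2))
print("CV:", np.round(cv, 3), "-> the pixel would be flagged if any CV exceeded a chosen threshold")
```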

  4. Strawberry: Fast and accurate genome-guided transcript reconstruction and quantification from RNA-Seq.

    PubMed

    Liu, Ruolin; Dickerson, Julie

    2017-11-01

    We propose a novel method and software tool, Strawberry, for transcript reconstruction and quantification from RNA-Seq data under the guidance of genome alignment and independent of gene annotation. Strawberry consists of two modules: assembly and quantification. The novelty of Strawberry is that the two modules use different optimization frameworks but utilize the same data graph structure, which allows a highly efficient, expandable and accurate algorithm for dealing with large data. The assembly module parses aligned reads into splicing graphs, and uses network flow algorithms to select the most likely transcripts. The quantification module uses a latent class model to assign read counts from the nodes of splicing graphs to transcripts. Strawberry simultaneously estimates the transcript abundances and corrects for sequencing bias through an EM algorithm. Based on simulations, Strawberry outperforms Cufflinks and StringTie in terms of both assembly and quantification accuracies. In an evaluation on a real data set, the transcript expression estimated by Strawberry has the highest correlation with Nanostring probe counts, an independent experimental measure of transcript expression. Strawberry is written in C++14, and is available as open source software at https://github.com/ruolin/strawberry under the MIT license.

  5. Multiplex quantification of protein toxins in human biofluids and food matrices using immunoextraction and high-resolution targeted mass spectrometry.

    PubMed

    Dupré, Mathieu; Gilquin, Benoit; Fenaille, François; Feraudet-Tarisse, Cécile; Dano, Julie; Ferro, Myriam; Simon, Stéphanie; Junot, Christophe; Brun, Virginie; Becher, François

    2015-08-18

    The development of rapid methods for unambiguous identification and precise quantification of protein toxins in various matrices is essential for public health surveillance. Nowadays, analytical strategies classically rely on sensitive immunological assays, but mass spectrometry constitutes an attractive complementary approach thanks to direct measurement and protein characterization ability. We developed here an innovative multiplex immuno-LC-MS/MS method for the simultaneous and specific quantification of the three potential biological warfare agents, ricin, staphylococcal enterotoxin B, and epsilon toxin, in complex human biofluids and food matrices. At least 7 peptides were targeted for each toxin (43 peptides in total) with a quadrupole-Orbitrap high-resolution instrument for exquisite detection specificity. Quantification was performed using stable isotope-labeled toxin standards spiked early in the sample. Lower limits of quantification were determined at or close to 1 ng·mL(-1). The whole process was successfully applied to the quantitative analysis of toxins in complex samples such as milk, human urine, and plasma. Finally, we report new data on toxin stability with no evidence of toxin degradation in milk in a 48 h time frame, allowing relevant quantitative toxin analysis for samples collected in this time range.

  6. A microRNA detection system based on padlock probes and rolling circle amplification

    PubMed Central

    Jonstrup, Søren Peter; Koch, Jørn; Kjems, Jørgen

    2006-01-01

    The differential expression and the regulatory roles of microRNAs (miRNAs) are being studied intensively these years. Their minute size of only 19–24 nucleotides and strong sequence similarity among related species call for enhanced methods for reliable detection and quantification. Moreover, miRNA expression is generally restricted to a limited number of specific cells within an organism and therefore requires highly sensitive detection methods. Here we present a simple and reliable miRNA detection protocol based on padlock probes and rolling circle amplification. It can be performed without specialized equipment and is capable of measuring the content of specific miRNAs in a few nanograms of total RNA. PMID:16888321

  7. Simultaneous determination of airborne acetaldehyde, acetone, 2-butanone, and cyclohexanone using sampling tubes with 2,4-dinitrophenylhydrazine-coated solid sorbent.

    PubMed

    Binding, N; Schilder, K; Czeschinski, P A; Witting, U

    1998-08-01

    The 2,4-dinitrophenylhydrazine (2,4-DNPH) derivatization method mainly used for the determination of airborne formaldehyde was extended for acetaldehyde, acetone, 2-butanone, and cyclohexanone, the next four carbonyl compounds of industrial importance. Sampling devices and sampling conditions were adjusted for the respective limit value regulations. Analytical reliability criteria were established and compared to those of other recommended methods. With a minimum analytical range from one tenth to the 3-fold limit value in all cases and with relative standard deviations below 5%, the adjusted method meets all requirements for the reliable quantification of the four compounds in workplace air as well as in ambient air.

  8. Use of iris recognition camera technology for the quantification of corneal opacification in mucopolysaccharidoses.

    PubMed

    Aslam, Tariq Mehmood; Shakir, Savana; Wong, James; Au, Leon; Ashworth, Jane

    2012-12-01

    Mucopolysaccharidoses (MPS) can cause corneal opacification that is currently difficult to objectively quantify. With newer treatments for MPS comes an increased need for a more objective, valid and reliable index of disease severity for clinical and research use. Clinical evaluation by slit lamp is very subjective and techniques based on colour photography are difficult to standardise. In this article the authors present evidence for the utility of dedicated image analysis algorithms applied to images obtained by a highly sophisticated iris recognition camera that is small, manoeuvrable and adapted to achieve rapid, reliable and standardised objective imaging in a wide variety of patients while minimising artefactual interference in image quality.

  9. A microRNA detection system based on padlock probes and rolling circle amplification.

    PubMed

    Jonstrup, Søren Peter; Koch, Jørn; Kjems, Jørgen

    2006-09-01

    The differential expression and the regulatory roles of microRNAs (miRNAs) are being studied intensively these years. Their minute size of only 19-24 nucleotides and strong sequence similarity among related species call for enhanced methods for reliable detection and quantification. Moreover, miRNA expression is generally restricted to a limited number of specific cells within an organism and therefore requires highly sensitive detection methods. Here we present a simple and reliable miRNA detection protocol based on padlock probes and rolling circle amplification. It can be performed without specialized equipment and is capable of measuring the content of specific miRNAs in a few nanograms of total RNA.

  10. Confocal nanoscanning, bead picking (CONA): PickoScreen microscopes for automated and quantitative screening of one-bead one-compound libraries.

    PubMed

    Hintersteiner, Martin; Buehler, Christof; Uhl, Volker; Schmied, Mario; Müller, Jürgen; Kottig, Karsten; Auer, Manfred

    2009-01-01

    Solid phase combinatorial chemistry provides fast and cost-effective access to large bead based libraries with compound numbers easily exceeding tens of thousands of compounds. Incubating one-bead one-compound library beads with fluorescently labeled target proteins and identifying and isolating the beads which contain a bound target protein potentially represents one of the most powerful generic primary high throughput screening formats. On-bead screening (OBS) based on this detection principle can be carried out with limited automation. Often hit bead detection, i.e. recognizing beads with a fluorescently labeled protein bound to the compound on the bead, relies on visual inspection under a wide-field microscope. Using low resolution detection techniques, the identification of hit beads and their ranking is limited by a low fluorescence signal intensity and varying levels of the library beads' autofluorescence. To exploit the full potential of an OBS process, reliable methods for both automated quantitative detection of hit beads and their subsequent isolation are needed. In a joint collaborative effort with Evotec Technologies (now Perkin-Elmer Cellular Technologies Germany GmbH), we have built two confocal bead scanner and picker platforms, PS02 and a high-speed variant PS04, dedicated to automated high resolution OBS. The PS0X instruments combine fully automated confocal large area scanning of a bead monolayer at the bottom of standard MTP plates with semiautomated isolation of individual hit beads via hydraulic-driven picker capillaries. The quantification of fluorescence intensities with high spatial resolution in the equatorial plane of each bead allows for a reliable discrimination between entirely bright autofluorescent beads and real hit beads which exhibit an increased fluorescence signal at the outer few micrometers of the bead. The achieved screening speed of up to 200,000 beads assayed in less than 7 h and the picking time of approximately 1 bead/min allow exploitation of one-bead one-compound libraries with high sensitivity, accuracy, and speed.

  11. Self-optimized construction of transition rate matrices from accelerated atomistic simulations with Bayesian uncertainty quantification

    NASA Astrophysics Data System (ADS)

    Swinburne, Thomas D.; Perez, Danny

    2018-05-01

    A massively parallel method to build large transition rate matrices from temperature-accelerated molecular dynamics trajectories is presented. Bayesian Markov model analysis is used to estimate the expected residence time in the known state space, providing crucial uncertainty quantification for higher-scale simulation schemes such as kinetic Monte Carlo or cluster dynamics. The estimators are additionally used to optimize where exploration is performed and the degree of temperature acceleration on the fly, giving an autonomous, optimal procedure to explore the state space of complex systems. The method is tested against exactly solvable models and used to explore the dynamics of C15 interstitial defects in iron. Our uncertainty quantification scheme allows for accurate modeling of the evolution of these defects over timescales of several seconds.
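
    A toy version of the uncertainty-quantification idea, a conjugate Gamma posterior for an escape rate given the number of transitions observed in a known amount of trajectory time, is sketched below; the prior, counts and times are invented, and the sketch is not the paper's Bayesian Markov model.

```python
import numpy as np

rng = np.random.default_rng(5)

# Toy observation: escapes seen from one state during accumulated (accelerated) MD time
n_escapes = 4
t_observed = 2.0e-9           # seconds of trajectory spent in the state

# Gamma(alpha0, beta0) prior on the escape rate k; Poisson counting gives a Gamma posterior
alpha0, beta0 = 0.5, 1.0e-12
alpha_post = alpha0 + n_escapes
beta_post = beta0 + t_observed

k_samples = rng.gamma(alpha_post, 1.0 / beta_post, 100_000)    # posterior draws of k (1/s)
tau = 1.0 / k_samples                                          # corresponding residence times

lo, hi = np.percentile(tau, [2.5, 97.5])
print(f"posterior mean escape rate ~ {k_samples.mean():.2e} 1/s")
print(f"95% credible interval for the residence time: [{lo:.2e}, {hi:.2e}] s")
```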

  12. Supercritical fluid chromatography coupled with tandem mass spectrometry: A high-efficiency detection technique to quantify Taxane drugs in whole-blood samples.

    PubMed

    Jin, Chan; Guan, Jibin; Zhang, Dong; Li, Bing; Liu, Hongzhuo; He, Zhonggui

    2017-10-01

    We present a technique to rapidly determine taxanes in blood samples by supercritical fluid chromatography coupled with mass spectrometry. The aim of this study was to develop a supercritical fluid chromatography with mass spectrometry method for the analysis of paclitaxel, cabazitaxel, and docetaxel in whole-blood samples of rats. Liquid-dry matrix spot extraction was selected for the sample preparation procedure. Supercritical fluid chromatography separation of paclitaxel, cabazitaxel, docetaxel, and glyburide (internal standard) was accomplished within 3 min by using a gradient mobile phase consisting of methanol as the compensation solvent and carbon dioxide at a flow rate of 1.0 mL/min. The method was validated regarding specificity, the lower limit of quantification, repeatability, and reproducibility of quantification, extraction recovery, and matrix effects. The lower limit of quantification was found to be 10 ng/mL, the lowest level at which acceptable precision and accuracy were obtained. All interday accuracies and precisions were within the accepted criteria of ±15% of the nominal value and within ±20% at the lower limit of quantification, implying that the method was reliable and reproducible. In conclusion, this method is a promising tool to support and improve preclinical or clinical pharmacokinetic studies with the taxane anticancer drugs. © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  13. Single-step transesterification with simultaneous concentration and stable isotope analysis of fatty acid methyl esters by gas chromatography-combustion-isotope ratio mass spectrometry.

    PubMed

    Panetta, Robert J; Jahren, A Hope

    2011-05-30

    Gas chromatography-combustion-isotope ratio mass spectrometry (GC-C-IRMS) is increasingly applied to food and metabolic studies for stable isotope analysis (δ(13) C), with the quantification of analyte concentration often obtained via a second alternative method. We describe a rapid direct transesterification of triacylglycerides (TAGs) for fatty acid methyl ester (FAME) analysis by GC-C-IRMS demonstrating robust simultaneous quantification of amount of analyte (mean r(2) =0.99, accuracy ±2% for 37 FAMEs) and δ(13) C (±0.13‰) in a single analytical run. The maximum FAME yield and optimal δ(13) C values are obtained by derivatizing with 10% (v/v) acetyl chloride in methanol for 1 h, while lower levels of acetyl chloride and shorter reaction times skewed the δ(13) C values by as much as 0.80‰. A Bland-Altman evaluation of the GC-C-IRMS measurements resulted in excellent agreement for pure oils (±0.08‰) and oils extracted from French fries (±0.49‰), demonstrating reliable simultaneous quantification of FAME concentration and δ(13) C values. Thus, we conclude that for studies requiring both the quantification of analyte and δ(13) C data, such as authentication or metabolic flux studies, GC-C-IRMS can be used as the sole analytical method. Copyright © 2011 John Wiley & Sons, Ltd.
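
    The Bland-Altman agreement numbers quoted above come from a simple calculation of the mean difference and its 1.96-standard-deviation limits; the sketch below shows it on synthetic paired δ13C values.

```python
import numpy as np

rng = np.random.default_rng(13)

# Synthetic paired delta13C values (per mil) from two methods on the same samples
method_a = rng.normal(-28.0, 1.5, 25)
method_b = method_a + rng.normal(0.02, 0.08, 25)    # small bias and scatter

diff = method_a - method_b
bias = diff.mean()
loa = 1.96 * diff.std(ddof=1)                       # 95% limits of agreement

print(f"mean bias = {bias:+.2f} per mil, "
      f"limits of agreement = {bias - loa:+.2f} to {bias + loa:+.2f} per mil")
```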

  14. Ct shift: A novel and accurate real-time PCR quantification model for direct comparison of different nucleic acid sequences and its application for transposon quantifications.

    PubMed

    Kolacsek, Orsolya; Pergel, Enikő; Varga, Nóra; Apáti, Ágota; Orbán, Tamás I

    2017-01-20

    There are numerous applications of quantitative PCR in both diagnostic and basic research. As in many other techniques, the basis of quantification is that comparisons are made between different specimens (unknown and known, or reference) of the same entity. When the aim is to compare real quantities of different species in samples, their separate, precise, absolute quantification cannot be avoided. We have established a simple and reliable method for this purpose (the Ct shift method), which combines the absolute and the relative approaches. It requires a plasmid standard containing the sequences of both amplicons to be compared (e.g. the target of interest and the endogenous control). It can serve as a reference sample with equal copies of templates for both targets. Using the ΔΔCt formula, we can quantify the exact ratio of the two templates in each unknown sample. The Ct shift method has been successfully applied for transposon gene copy measurements, as well as for comparison of different mRNAs in cDNA samples. This study provides the proof of concept and introduces some potential applications of the method; the absolute nature of the results, even without the need for real reference samples, can contribute to the universality of the method and to the comparability of different studies. Copyright © 2016 Elsevier B.V. All rights reserved.
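
    A tiny worked example of the Ct-shift idea is given below: the equimolar plasmid standard provides the assay-specific Ct offset, and the ΔΔCt formula then yields the target-to-control copy ratio in the unknown; the Ct values and the perfect-efficiency assumption are illustrative.

```python
def ct_shift_ratio(ct_target_unknown, ct_control_unknown,
                   ct_target_plasmid, ct_control_plasmid, efficiency=2.0):
    """Target/control copy-number ratio in an unknown sample, using a plasmid standard
    that carries both amplicons in a 1:1 ratio to cancel assay-specific offsets."""
    d_ct_unknown = ct_target_unknown - ct_control_unknown
    d_ct_plasmid = ct_target_plasmid - ct_control_plasmid    # the "Ct shift" between the two assays
    dd_ct = d_ct_unknown - d_ct_plasmid
    return efficiency ** (-dd_ct)

# Hypothetical runs: the target amplifies 1.5 cycles later than the control on the
# equimolar plasmid, and 3.5 cycles later in the unknown sample.
ratio = ct_shift_ratio(ct_target_unknown=26.5, ct_control_unknown=23.0,
                       ct_target_plasmid=21.5, ct_control_plasmid=20.0)
print(f"target/control copy ratio ~ {ratio:.2f}")   # 2**-(3.5 - 1.5) = 0.25
```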

  15. Quantification of Parvovirus B19 DNA Using COBAS AmpliPrep Automated Sample Preparation and LightCycler Real-Time PCR

    PubMed Central

    Schorling, Stefan; Schalasta, Gunnar; Enders, Gisela; Zauke, Michael

    2004-01-01

    The COBAS AmpliPrep instrument (Roche Diagnostics GmbH, D-68305 Mannheim, Germany) automates the entire sample preparation process of nucleic acid isolation from serum or plasma for polymerase chain reaction analysis. We report the analytical performance of the LightCycler Parvovirus B19 Quantification Kit (Roche Diagnostics) using nucleic acids isolated with the COBAS AmpliPrep instrument. Nucleic acids were extracted using the Total Nucleic Acid Isolation Kit (Roche Diagnostics) and amplified with the LightCycler Parvovirus B19 Quantification Kit. The kit combination processes 72 samples per 8-hour shift. The lower detection limit is 234 IU/ml at a 95% hit-rate, linear range approximately 10(4)-10(10) IU/ml, and overall precision 16 to 40%. Relative sensitivity and specificity in routine samples from pregnant women are 100% and 93%, respectively. Identification of a persistent parvovirus B19-infected individual by the polymerase chain reaction among 51 anti-parvovirus B19 IgM-negative samples underlines the importance of additional nucleic acid testing in pregnancy and its superiority to serology in identifying the risk of parvovirus B19 transmission via blood or blood products. Combination of the Total Nucleic Acid Isolation Kit on the COBAS AmpliPrep instrument with the LightCycler Parvovirus B19 Quantification Kit provides a reliable and time-saving tool for sensitive and accurate detection of parvovirus B19 DNA. PMID:14736825

  16. Quantification of mRNA expression by competitive PCR using non-homologous competitors containing a shifted restriction site

    PubMed Central

    Watzinger, Franz; Hörth, Elfriede; Lion, Thomas

    2001-01-01

    Despite the recent introduction of real-time PCR methods, competitive PCR techniques continue to play an important role in nucleic acid quantification because of the significantly lower cost of equipment and consumables. Here we describe a shifted restriction-site competitive PCR (SRS-cPCR) assay based on a modified type of competitor. The competitor fragments are designed to contain a recognition site for a restriction endonuclease that is also present in the target sequence to be quantified, but in a different position. Upon completion of the PCR, the amplicons are digested in the same tube with a single restriction enzyme, without the need to purify PCR products. The generated competitor- and target-specific restriction fragments display different sizes, and can be readily separated by electrophoresis and quantified by image analysis. Suboptimal digestion affects competitor- and target-derived amplicons to the same extent, thus eliminating the problem of incorrect quantification as a result of incomplete digestion of PCR products. We have established optimized conditions for a panel of 20 common restriction endonucleases permitting efficient digestion in PCR buffer. It is possible, therefore, to find a suitable restriction site for competitive PCR in virtually any sequence of interest. The assay presented is inexpensive, widely applicable, and permits reliable and accurate quantification of nucleic acid targets. PMID:11376164

  17. Comparative study of label and label-free techniques using shotgun proteomics for relative protein quantification.

    PubMed

    Sjödin, Marcus O D; Wetterhall, Magnus; Kultima, Kim; Artemenko, Konstantin

    2013-06-01

    The analytical performance of three different strategies, iTRAQ (isobaric tag for relative and absolute quantification), dimethyl labeling (DML) and label free (LF) for relative protein quantification using shotgun proteomics have been evaluated. The methods have been explored using samples containing (i) Bovine proteins in known ratios and (ii) Bovine proteins in known ratios spiked into Escherichia coli. The latter case mimics the actual conditions in a typical biological sample with a few differentially expressed proteins and a bulk of proteins with unchanged ratios. Additionally, the evaluation was performed on both QStar and LTQ-FTICR mass spectrometers. LF LTQ-FTICR was found to have the highest proteome coverage while the highest accuracy based on the artificially regulated proteins was found for DML LTQ-FTICR (54%). A varying linearity (k: 0.55-1.16, r(2): 0.61-0.96) was shown for all methods within selected dynamic ranges. All methods were found to consistently underestimate Bovine protein ratios when matrix proteins were added. However, LF LTQ-FTICR was more tolerant toward a compression effect. A single peptide was demonstrated to be sufficient for a reliable quantification using iTRAQ. A ranking system utilizing several parameters important for quantitative proteomics demonstrated that the overall performance of the five different methods was; DML LTQ-FTICR>iTRAQ QStar>LF LTQ-FTICR>DML QStar>LF QStar. Copyright © 2013 Elsevier B.V. All rights reserved.

  18. Sensitive Quantification of Aflatoxin B1 in Animal Feeds, Corn Feed Grain, and Yellow Corn Meal Using Immunomagnetic Bead-Based Recovery and Real-Time Immunoquantitative-PCR

    PubMed Central

    Babu, Dinesh; Muriana, Peter M.

    2014-01-01

    Aflatoxins are considered unavoidable natural mycotoxins encountered in foods, animal feeds, and feed grains. In this study, we demonstrate the application of our recently developed real-time immunoquantitative PCR (RT iq-PCR) assay for sensitive detection and quantification of aflatoxins in poultry feed, two types of dairy feed (1 and 2), horse feed, whole kernel corn feed grains, and retail yellow ground corn meal. Upon testing methanol/water (60:40) extractions of the above samples using competitive direct enzyme linked immunosorbent assay, the aflatoxin content was found to be <20 μg/kg. The RT iq-PCR assay exhibited high antigen hook effect in samples containing aflatoxin levels higher than the quantification limits (0.1–10 μg/kg), addressed by comparing the quantification results of undiluted and diluted extracts. In testing the reliability of the immuno-PCR assay, samples were spiked with 200 μg/kg of aflatoxin B1, but the recovery of spiked aflatoxin was found to be poor. Considering the significance of determining trace levels of aflatoxins and their serious implications for animal and human health, the RT iq-PCR method described in this study can be useful for quantifying low natural aflatoxin levels in complex matrices of food or animal feed samples without the requirement of extra sample cleanup. PMID:25474493

  19. Sensitive quantification of aflatoxin B1 in animal feeds, corn feed grain, and yellow corn meal using immunomagnetic bead-based recovery and real-time immunoquantitative-PCR.

    PubMed

    Babu, Dinesh; Muriana, Peter M

    2014-12-02

    Aflatoxins are considered unavoidable natural mycotoxins encountered in foods, animal feeds, and feed grains. In this study, we demonstrate the application of our recently developed real-time immunoquantitative PCR (RT iq-PCR) assay for sensitive detection and quantification of aflatoxins in poultry feed, two types of dairy feed (1 and 2), horse feed, whole kernel corn feed grains, and retail yellow ground corn meal. Upon testing methanol/water (60:40) extractions of the above samples using competitive direct enzyme linked immunosorbent assay, the aflatoxin content was found to be <20 μg/kg. The RT iq-PCR assay exhibited high antigen hook effect in samples containing aflatoxin levels higher than the quantification limits (0.1-10 μg/kg), addressed by comparing the quantification results of undiluted and diluted extracts. In testing the reliability of the immuno-PCR assay, samples were spiked with 200 μg/kg of aflatoxin B1, but the recovery of spiked aflatoxin was found to be poor. Considering the significance of determining trace levels of aflatoxins and their serious implications for animal and human health, the RT iq-PCR method described in this study can be useful for quantifying low natural aflatoxin levels in complex matrices of food or animal feed samples without the requirement of extra sample cleanup.

  20. Quantification of Holocene Asian monsoon rainfall from spatially separated cave records

    NASA Astrophysics Data System (ADS)

    Hu, Chaoyong; Henderson, Gideon M.; Huang, Junhua; Xie, Shucheng; Sun, Ying; Johnson, Kathleen R.

    2008-02-01

    A reconstruction of Holocene rainfall is presented for southwest China — an area prone to drought and flooding due to variability in the East Asian monsoon. The reconstruction is derived by comparing a new high-resolution stalagmite δ18O record with an existing record from the same moisture transport pathway. The new record is from Heshang Cave (30°27'N, 110°25'E; 294 m) and shows no sign of kinetic or evaporative effects so can be reliably interpreted as a record of local rainfall composition and temperature. Heshang lies 600 km downwind from Dongge Cave which has a published high-resolution δ18O record (Wang, Y.J., Cheng, H., Edwards, R.L., He, Y.Q., Kong, X.G., An, Z.S., Wu, J.Y., Kelly, M.J., Dykoski, C.A., Li, X.D., 2005. The Holocene Asian monsoon: links to solar changes and North Atlantic climate. Science 308, 854-857). By differencing co-eval δ18O values for the two caves, secondary controls on δ18O (e.g. moisture source, moisture transport, non-local rainfall, temperature) are circumvented and the resulting Δ δ18O signal is controlled directly by the amount of rain falling between the two sites. This is confirmed by comparison with rainfall data from the instrumental record, which also allows a calibration of the Δ δ18O proxy. The calibrated Δ δ18O record provides a quantitative history of rainfall in southwest China which demonstrates that rainfall was 8% higher than today during the Holocene climatic optimum (≈ 6 ka), but only 3% higher during the early Holocene. Significant multi-centennial variability also occurred, with notable dry periods at 8.2 ka, 4.8-4.1 ka, 3.7-3.1 ka, 1.4-1.0 ka and during the Little Ice Age. This Holocene rainfall record provides a good target with which to test climate models. The approach used here, of combining stalagmite records from more than one location, will also allow quantification of rainfall patterns for past times in other regions.
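
    The differencing-and-calibration step can be sketched as follows: subtract coeval δ18O values of the two caves and regress the difference against instrumental rainfall; the synthetic numbers and the assumed sign convention (wetter intervals giving more negative Δδ18O downwind) are placeholders, not the published calibration.

```python
import numpy as np

rng = np.random.default_rng(11)

# Synthetic coeval stalagmite records (per mil) and instrumental rainfall anomalies (%)
rain_anomaly = rng.normal(0.0, 6.0, 60)                      # % departure from the modern mean
delta_dongge = rng.normal(-8.0, 0.4, 60)                     # upstream record
delta_heshang = delta_dongge - 0.02 * rain_anomaly + rng.normal(0.0, 0.05, 60)   # downstream record

d_delta = delta_heshang - delta_dongge                       # differencing removes shared controls

# Calibrate the difference against instrumental rainfall by ordinary least squares
slope, intercept = np.polyfit(d_delta, rain_anomaly, 1)
print(f"rainfall anomaly ~ {slope:.0f} * d_delta {intercept:+.1f}  (%)")

# Apply the calibration to a hypothetical past value of the difference
past = -0.10
print(f"d_delta = {past} per mil  ->  rainfall ~ {slope * past + intercept:+.1f} % relative to modern")
```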

  1. Absolute quantification of DcR3 and GDF15 from human serum by LC-ESI MS

    PubMed Central

    Lancrajan, Ioana; Schneider-Stock, Regine; Naschberger, Elisabeth; Schellerer, Vera S; Stürzl, Michael; Enz, Ralf

    2015-01-01

    Biomarkers are widely used in clinical diagnosis, prognosis and therapy monitoring. Here, we developed a protocol for the efficient and selective enrichment of small and low-concentration biomarkers from human serum, involving a 95% effective depletion of high-abundance serum proteins by partial denaturation and enrichment of low-abundance biomarkers by size exclusion chromatography. The recovery of low-abundance biomarkers was above 97%. Using this protocol, we quantified the tumour markers DcR3 and growth/differentiation factor (GDF)15 from 100 μl human serum by isotope dilution mass spectrometry, using 15N metabolically labelled and concatemerized fingerprint peptides for both proteins. Analysis of three different fingerprint peptides for each protein by liquid chromatography electrospray ionization mass spectrometry resulted in comparable concentrations in three healthy human serum samples (DcR3: 27.23 ± 2.49 fmol/ml; GDF15: 98.11 ± 0.49 fmol/ml). In contrast, serum levels were significantly elevated in tumour patients for DcR3 (116.94 ± 57.37 fmol/ml) and GDF15 (164.44 ± 79.31 fmol/ml). The obtained data were in good agreement with ELISA and qPCR measurements, as well as with literature data. In summary, our protocol allows the reliable quantification of biomarkers, shows a higher resolution at low biomarker concentrations than antibody-based strategies, and offers the possibility of multiplexing. Our proof-of-principle studies in patient sera encourage the future analysis of the prognostic value of DcR3 and GDF15 for colon cancer patients in larger patient cohorts. PMID:25823874
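
    The isotope-dilution arithmetic behind such measurements is compact: the light-to-heavy peak-area ratio times the known amount of labelled standard spiked into the sample gives the endogenous amount; the sketch below uses hypothetical areas and spike levels.

```python
def isotope_dilution_conc(area_light, area_heavy, spike_fmol, serum_volume_ml):
    """Endogenous concentration (fmol/ml) from the light/heavy peak-area ratio and the
    known amount of labelled standard spiked into the serum aliquot."""
    return (area_light / area_heavy) * spike_fmol / serum_volume_ml

# Hypothetical transition areas for one fingerprint peptide, 10 fmol spike in 0.1 ml serum
conc = isotope_dilution_conc(area_light=3.2e5, area_heavy=1.15e6,
                             spike_fmol=10.0, serum_volume_ml=0.1)
print(f"endogenous protein ~ {conc:.1f} fmol/ml")
```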

  2. Ultrasensitive Single Fluorescence-Labeled Probe-Mediated Single Universal Primer-Multiplex-Droplet Digital Polymerase Chain Reaction for High-Throughput Genetically Modified Organism Screening.

    PubMed

    Niu, Chenqi; Xu, Yuancong; Zhang, Chao; Zhu, Pengyu; Huang, Kunlun; Luo, Yunbo; Xu, Wentao

    2018-05-01

    As genetically modified (GM) technology develops and genetically modified organisms (GMOs) become more available, GMOs face increasing regulation and pressure to adhere to strict labeling guidelines. A singleplex detection method cannot perform the high-throughput analysis necessary for optimal GMO detection. Combining the advantages of multiplex detection and droplet digital polymerase chain reaction (ddPCR), a single universal primer-multiplex-ddPCR (SUP-M-ddPCR) strategy was proposed for accurate broad-spectrum screening and quantification. The SUP increases the efficiency of the primers in PCR and plays an important role in establishing a high-throughput, multiplex detection method. The emerging ddPCR technology has been used for accurate quantification of nucleic acid molecules without a standard curve. Using maize as a reference point, four heterologous sequences (35S, NOS, NPTII, and PAT) were selected to evaluate the feasibility and applicability of this strategy. Surprisingly, these four sequences cover more than 93% of the transgenic maize lines and serve as preliminary screening sequences. All screening probes were labeled with FAM fluorescence, which allows the signals from samples with GMO content and those without to be easily differentiated. This fiveplex screening method is a new development in GMO screening. Using an optimized amplification assay, the specificity, limit of detection (LOD), and limit of quantitation (LOQ) were validated. The LOD and LOQ of this GMO screening method were 0.1% and 0.01%, respectively, with a relative standard deviation (RSD) < 25%. This method could serve as an important tool for the detection of GM maize in different processed, commercially available products. Further, this screening method could be applied to other fields that require reliable and sensitive detection of DNA targets.
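
    A minimal sketch of the Poisson statistics that underlie absolute quantification in ddPCR, assuming an illustrative nominal droplet volume and invented droplet counts (neither is taken from the paper):

```python
import math

def ddpcr_copies_per_ul(positive, total, droplet_volume_nl=0.85):
    """Estimate target copies per microliter of reaction from droplet counts.

    Uses the Poisson correction lambda = -ln(1 - p), where p is the fraction of
    positive droplets; the droplet volume is an assumed nominal value.
    """
    p = positive / total
    lam = -math.log(1.0 - p)                 # mean copies per droplet
    return lam / (droplet_volume_nl * 1e-3)  # copies per microliter

# Illustrative counts for one FAM-labelled screening target
print(round(ddpcr_copies_per_ul(positive=2100, total=15000), 1), "copies/uL")
```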

  3. Diagnosis and therapeutic monitoring of inborn errors of creatine metabolism and transport using liquid chromatography-tandem mass spectrometry in urine, plasma and CSF.

    PubMed

    Haas, Dorothea; Gan-Schreier, Hongying; Langhans, Claus-Dieter; Anninos, Alexandros; Haege, Gisela; Burgard, Peter; Schulze, Andreas; Hoffmann, Georg F; Okun, Jürgen G

    2014-03-15

    Biochemical detection of inborn errors of creatine metabolism or transport relies on the analysis of three main metabolites in biological fluids: guanidinoacetate (GAA), creatine (CT) and creatinine (CTN). The unspecific clinical presentation of these diseases may explain why only a few patients have been diagnosed so far. We describe an LC-MS/MS method allowing fast and reliable diagnosis by simultaneous quantification of GAA, CT and CTN in urine, plasma and cerebrospinal fluid (CSF), and we established reference values for each material. For quantification, deuterated stable-isotope-labelled analogues of each analyte were used as internal standards. GAA, CT and CTN were separated by reversed-phase HPLC. Characterization was carried out by scanning the ions of each compound by negative-ion tandem mass spectrometry. Butylation is needed to achieve sufficient signal intensity for GAA and CT, but it is not useful for analyzing CTN. The assay is linear over the broad range of analyte concentrations usually found in urine, plasma and CSF. Comparison of "traditional" cation-exchange chromatography and LC-MS/MS showed proportional differences but linear relationships between the two methods. The described method is characterized by high speed and linearity over large concentration ranges, comparable to other published LC-MS methods but with higher sensitivity for GAA and CT. In addition, we present the largest reference group ever published for guanidino compounds in all relevant body fluids. Therefore, this method is applicable for high-throughput approaches to the diagnosis and follow-up of inborn errors of creatine metabolism and transport. Copyright © 2014 Elsevier B.V. All rights reserved.
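
    A hedged sketch of quantification against a calibration curve with internal-standard normalization, the general calculation such stable-isotope-dilution LC-MS/MS assays rely on; the calibrator levels, area ratios and the unweighted linear fit are illustrative assumptions, not the published method:

```python
import numpy as np

# Calibrators: known analyte concentrations (umol/L) and measured analyte/IS area ratios
conc_cal  = np.array([1.0, 5.0, 25.0, 100.0, 400.0])
ratio_cal = np.array([0.021, 0.11, 0.52, 2.05, 8.3])

# Plain linear least squares for brevity (weighted fits are common in practice)
slope, intercept = np.polyfit(conc_cal, ratio_cal, 1)

def quantify(ratio_sample):
    """Back-calculate a concentration from a sample's analyte/deuterated-IS area ratio."""
    return (ratio_sample - intercept) / slope

print(round(quantify(1.3), 1), "umol/L")  # illustrative unknown
```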

  4. Identification and validation of reference genes for quantification of target gene expression with quantitative real-time PCR for tall fescue under four abiotic stresses.

    PubMed

    Yang, Zhimin; Chen, Yu; Hu, Baoyun; Tan, Zhiqun; Huang, Bingru

    2015-01-01

    Tall fescue (Festuca arundinacea Schreb.) is widely utilized as a major forage and turfgrass species in the temperate regions of the world and is a valuable plant material for studying molecular mechanisms of grass stress tolerance due to its superior drought and heat tolerance among cool-season species. Selection of suitable reference genes for quantification of target gene expression is important for the discovery of molecular mechanisms underlying improved growth traits and stress tolerance. The stability of nine potential reference genes (ACT, TUB, EF1a, GAPDH, SAND, CACS, F-box, PEPKR1 and TIP41) was evaluated using four programs: GeNorm, NormFinder, BestKeeper, and RefFinder. The combinations of SAND and TUB or TIP41 and TUB were most stably expressed in salt-treated roots or leaves. The combinations of GAPDH with TIP41 or TUB were stable in roots and leaves under drought stress. TIP41 and PEPKR1 exhibited stable expression in cold-treated roots, and the combination of F-box, TIP41 and TUB was stable in cold-treated leaves. CACS and TUB were the two most stable reference genes in heat-stressed roots. TIP41 combined with TUB and ACT was stably expressed in heat-stressed leaves. Finally, quantitative real-time polymerase chain reaction (qRT-PCR) assays of the target gene FaWRKY1 using the most stable reference genes identified above confirmed their reliability. The selection of suitable reference genes in tall fescue will allow more accurate identification of stress-tolerance genes and of the molecular mechanisms conferring stress tolerance in this stress-tolerant species.
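
    Once stable reference genes are chosen, target expression is typically normalized to them; a minimal 2^-ΔΔCt sketch is shown below, with invented Ct values and an assumed ~100% amplification efficiency (the study's actual analysis pipeline is not reproduced):

```python
def ddct_fold_change(ct_target_stress, ref_cts_stress, ct_target_ctrl, ref_cts_ctrl):
    """Fold change of a target gene (e.g. a hypothetical FaWRKY1 assay) by 2^-ddCt.

    Reference-gene Cts are averaged; because Ct is log2-scaled, this is equivalent to
    normalizing against the geometric mean of the reference-gene expression levels.
    Assumes ~100% amplification efficiency for all assays.
    """
    d_ct_stress = ct_target_stress - sum(ref_cts_stress) / len(ref_cts_stress)
    d_ct_ctrl   = ct_target_ctrl   - sum(ref_cts_ctrl)   / len(ref_cts_ctrl)
    return 2.0 ** -(d_ct_stress - d_ct_ctrl)

# Illustrative Cts: target plus two stable references (e.g. TIP41, TUB), stress vs. control
print(round(ddct_fold_change(24.1, [19.8, 21.2], 27.3, [19.9, 21.0]), 2))
```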

  5. Maximum Entropy Production Modeling of Evapotranspiration Partitioning on Heterogeneous Terrain and Canopy Cover: advantages and limitations.

    NASA Astrophysics Data System (ADS)

    Gutierrez-Jurado, H. A.; Guan, H.; Wang, J.; Wang, H.; Bras, R. L.; Simmons, C. T.

    2015-12-01

    Quantification of evapotranspiration (ET) and its partitioning over regions of heterogeneous topography and canopy poses a challenge for traditional approaches. In this study, we report the results of a novel field experiment design guided by the Maximum Entropy Production model of ET (MEP-ET), formulated for estimating evaporation and transpiration from homogeneous soil and canopy. A catchment with complex terrain and patchy vegetation in South Australia was instrumented to measure temperature, humidity and net radiation at soil and canopy surfaces. The performance of the MEP-ET model in quantifying transpiration and soil evaporation was evaluated during wet and dry conditions against transpiration measured directly and independently with sapflow sensors and soil evaporation measured with the Bowen Ratio Energy Balance (BREB) method. MEP-ET transpiration shows remarkable agreement with that obtained from sapflow measurements during wet conditions, but consistently overestimates the flux during dry periods. However, an additional term introduced into the original MEP-ET model to account for stronger stomatal regulation during dry spells, based on differences between leaf and air vapor pressure deficits and temperatures, significantly improves the model performance. MEP-ET soil evaporation, on the other hand, is in good agreement with that from BREB regardless of moisture conditions. The experimental design allows quantification of evaporation at the plot scale and of transpiration at the tree scale. This study confirms for the first time that the MEP-ET model, originally developed for homogeneous open bare soil and closed canopy, can be used for modeling ET over heterogeneous land surfaces. Furthermore, we show that with the addition of an empirical function simulating the plants' ability to regulate transpiration, based on the same measurements of temperature and humidity, the method can produce reliable estimates of ET during both wet and dry conditions without compromising its parsimony.
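
    For orientation, a minimal sketch of the Bowen Ratio Energy Balance calculation used here as the independent soil-evaporation benchmark (not of the MEP-ET model itself); the flux and gradient values and the psychrometric constant are illustrative:

```python
def bowen_ratio_evaporation(rn, g, d_temp, d_vap_press, gamma=0.066):
    """Latent heat flux (W m-2) from the Bowen Ratio Energy Balance.

    rn, g       : net radiation and ground heat flux at the soil surface (W m-2)
    d_temp      : air temperature difference between two measurement heights (K)
    d_vap_press : vapour pressure difference between the same heights (kPa)
    gamma       : psychrometric constant (kPa K-1), near-sea-level default
    """
    beta = gamma * d_temp / d_vap_press   # Bowen ratio, sensible/latent
    return (rn - g) / (1.0 + beta)

# Illustrative mid-day values for a bare-soil patch
le = bowen_ratio_evaporation(rn=420.0, g=60.0, d_temp=1.2, d_vap_press=0.35)
print(round(le, 1), "W m-2")
```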

  6. Monitoring microbiological changes in drinking water systems using a fast and reproducible flow cytometric method.

    PubMed

    Prest, E I; Hammes, F; Kötzsch, S; van Loosdrecht, M C M; Vrouwenvelder, J S

    2013-12-01

    Flow cytometry (FCM) is a rapid, cultivation-independent tool to assess and evaluate the bacteriological quality and biological stability of water. Here we demonstrate that a stringent, reproducible staining protocol combined with fixed FCM operational and gating settings is essential for reliable quantification of bacteria and detection of changes in aquatic bacterial communities. Triplicate measurements of diverse water samples with this protocol typically showed relative standard deviation values and 95% confidence interval values below 2.5% for all the main FCM parameters. We propose a straightforward and instrument-independent method for the characterization of water samples based on the combination of bacterial cell concentration and fluorescence distribution. Analysis of the fluorescence distribution (the so-called fluorescence fingerprint) was accomplished first through a direct comparison of the raw FCM data and subsequently simplified by quantifying the percentage of large and brightly fluorescent high nucleic acid (HNA) content bacteria in each sample. Our approach enables fast differentiation of dissimilar bacterial communities (less than 15 min from sampling to final result), and allows accurate detection of even small changes in aquatic environments (detection above 3% change). Demonstrative studies on (a) indigenous bacterial growth in water, (b) contamination of drinking water with wastewater, (c) household drinking water stagnation and (d) mixing of two drinking water types unequivocally showed that this FCM approach enables detection and quantification of relevant bacterial water quality changes with high sensitivity. This approach has the potential to be used as a new tool in the drinking water field, e.g. for rapid screening of microbial water quality and stability during water treatment and distribution in networks and premise plumbing. Copyright © 2013 Elsevier Ltd. All rights reserved.
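
    The two fingerprint descriptors combined in this approach, total cell concentration and the percentage of HNA bacteria, reduce to simple arithmetic on gated event counts; the counts and analysed volume below are invented for illustration:

```python
def cells_per_ml(gated_events, measured_volume_ul):
    """Total cell concentration from gated event counts and the analysed volume."""
    return gated_events / (measured_volume_ul * 1e-3)

def percent_hna(hna_events, total_events):
    """Percentage of high nucleic acid (HNA) content bacteria in the fingerprint."""
    return 100.0 * hna_events / total_events

# Illustrative triplicate of one drinking-water sample: (total, HNA) gated events
counts = [(18250, 7450), (18510, 7620), (18100, 7380)]
volume_ul = 50.0
for total, hna in counts:
    print(int(cells_per_ml(total, volume_ul)), "cells/mL,",
          round(percent_hna(hna, total), 1), "% HNA")
```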

  7. An explainable deep machine vision framework for plant stress phenotyping.

    PubMed

    Ghosal, Sambuddha; Blystone, David; Singh, Asheesh K; Ganapathysubramanian, Baskar; Singh, Arti; Sarkar, Soumik

    2018-05-01

    Current approaches for accurate identification, classification, and quantification of biotic and abiotic stresses in crop research and production are predominantly visual and require specialized training. However, such techniques are hindered by subjectivity resulting from inter- and intrarater cognitive variability. This translates to erroneous decisions and a significant waste of resources. Here, we demonstrate a machine learning framework's ability to identify and classify a diverse set of foliar stresses in soybean [Glycine max (L.) Merr.] with remarkable accuracy. We also present an explanation mechanism, using the top-K high-resolution feature maps that isolate the visual symptoms used to make predictions. This unsupervised identification of visual symptoms provides a quantitative measure of stress severity, allowing for identification (type of foliar stress), classification (low, medium, or high stress), and quantification (stress severity) in a single framework without detailed symptom annotation by experts. We reliably identified and classified several biotic (bacterial and fungal diseases) and abiotic (chemical injury and nutrient deficiency) stresses by learning from over 25,000 images. The learned model is robust to input image perturbations, demonstrating viability for high-throughput deployment. We also noticed that the learned model appears to be agnostic to species, suggesting a capacity for transfer learning. The availability of an explainable model that can consistently, rapidly, and accurately identify and quantify foliar stresses would have significant implications in scientific research, plant breeding, and crop production. The trained model could be deployed on mobile platforms (e.g., unmanned air vehicles and automated ground scouts) for rapid, large-scale scouting or as a mobile application for real-time detection of stress by farmers and researchers. Copyright © 2018 the Author(s). Published by PNAS.
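
    The idea of isolating the most informative activation maps can be illustrated generically as below; ranking channels by mean activation is an assumption made for this sketch and is not necessarily the selection criterion used by the authors:

```python
import numpy as np

def topk_explanation(feature_maps, k=3):
    """Average the K most strongly activated channels into one saliency map.

    feature_maps : array of shape (channels, height, width) from a convolutional layer
    Returns a (height, width) map normalized to [0, 1].
    """
    scores = feature_maps.reshape(feature_maps.shape[0], -1).mean(axis=1)
    top_idx = np.argsort(scores)[-k:]
    saliency = feature_maps[top_idx].mean(axis=0)
    saliency -= saliency.min()
    return saliency / (saliency.max() + 1e-8)

# Random activations stand in here for a real network's feature maps
maps = np.random.rand(64, 56, 56).astype(np.float32)
print(topk_explanation(maps, k=3).shape)  # (56, 56)
```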

  8. A one step real time PCR method for the quantification of hepatitis delta virus RNA using an external armored RNA standard and intrinsic internal control.

    PubMed

    Karataylı, Ersin; Altunoğlu, Yasemin Çelik; Karataylı, Senem Ceren; Alagöz, S Gökçe K; Cınar, Kubilay; Yalçın, Kendal; Idilman, Ramazan; Yurdaydın, Cihan; Bozdayı, A Mithat

    2014-05-01

    Hepatitis delta virus (HDV) RNA viral load measurement is critical for diagnosis and for monitoring the response to antiviral treatment. Our aim was to design a real-time PCR method for accurate quantitation of HDV RNA in clinical specimens using an armored RNA as an external standard and an intrinsic internal control. A plasmid bearing the delta antigen region of the genotype I HDV genome was used to develop an armored RNA. Serial dilutions of the armored HDV RNA standard at 10^12 copies/mL were used as standards for quantitation. A primer-probe set derived from the HDAg region was used with one-step EZ RT-PCR kit chemistry, which uses the rTth enzyme and allows reverse transcription and polymerization in the same tube. The kit also takes advantage of uracil-N-glycosylase (UNG) treatment to prevent PCR contamination. The established assay has a dynamic range of 10^2-10^11 copies/mL with a PCR efficiency of 96.9%. The detection limit was 858 ± 32 copies/mL (95% confidence interval). Intra- and inter-assay variabilities were low for high, medium and low levels of viremia. Incorporation of freely circulating GAPDH in serum into the assay as an intrinsic internal control prevented false-negative results and failures in PCR amplification due to inhibitors, inefficient extraction procedures or enzymatic reactions. In conclusion, this study defines a novel assay for sensitive and reliable quantification of HDV RNA using an armored HDV RNA as a standard and GAPDH in plasma or serum as an intrinsic internal control in a single tube. Copyright © 2014 Elsevier B.V. All rights reserved.
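
    A minimal sketch of standard-curve quantification and efficiency estimation from serial dilutions of an external standard; the Ct values are invented and merely chosen to give an efficiency in a plausible range:

```python
import numpy as np

# Ten-fold serial dilutions of an external standard (illustrative Ct values)
copies = np.array([1e2, 1e3, 1e4, 1e5, 1e6, 1e7, 1e8])
ct     = np.array([35.1, 31.7, 28.3, 24.9, 21.6, 18.2, 14.8])

slope, intercept = np.polyfit(np.log10(copies), ct, 1)
efficiency = 10 ** (-1.0 / slope) - 1.0   # fraction; ~0.97 for these numbers

def quantify(ct_sample):
    """Copies/mL of target RNA interpolated from the standard curve."""
    return 10 ** ((ct_sample - intercept) / slope)

print(round(efficiency * 100, 1), "% efficiency;", f"{quantify(26.0):.2e} copies/mL")
```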

  9. Simultaneous determination of LSD and 2-oxo-3-hydroxy LSD in hair and urine by LC-MS/MS and its application to forensic cases.

    PubMed

    Jang, Moonhee; Kim, Jihyun; Han, Inhoi; Yang, Wonkyung

    2015-11-10

    Lysergic acid diethylamide (LSD) is administered in low dosages, which makes its detection in biological matrices a major challenge in forensic toxicology. In this study, two sensitive and reliable methods based on liquid chromatography-tandem mass spectrometry (LC-MS/MS) were established and validated for the simultaneous determination of LSD and its metabolite, 2-oxo-3-hydroxy-LSD (O-H-LSD), in hair and urine. Target analytes in hair were extracted using methanol at 38°C for 15 h and analyzed by LC-MS/MS. For urine sample preparation, liquid-liquid extraction was performed. Limits of detection (LODs) in hair were 0.25 pg/mg for LSD and 0.5 pg/mg for O-H-LSD. In urine, LODs were 0.01 and 0.025 ng/ml for LSD and O-H-LSD, respectively. Method validation results showed good linearity and acceptable precision and accuracy. The developed methods were applied to authentic specimens from two legal cases of LSD ingestion, and allowed identification and quantification of LSD and O-H-LSD in the specimens. In the two cases, LSD concentrations in hair were 1.27 and 0.95 pg/mg; O-H-LSD was detected in one case, but its concentration was below the limit of quantification. In urine samples collected from the two suspects 8 and 3 h after ingestion, LSD concentrations were 0.48 and 2.70 ng/ml, respectively, while O-H-LSD concentrations were 4.19 and 25.2 ng/ml, respectively. These methods can be used for documenting LSD intake in clinical and forensic settings. Copyright © 2015 Elsevier B.V. All rights reserved.

  10. HPLC determination of flavonoid glycosides in Mongolian Dianthus versicolor Fisch. (Caryophyllaceae) compared with quantification by UV spectrophotometry.

    PubMed

    Obmann, Astrid; Purevsuren, Sodnomtseren; Zehl, Martin; Kletter, Christa; Reznicek, Gottfried; Narantuya, Samdan; Glasl, Sabine

    2012-01-01

    Dianthus versicolor is used in traditional Mongolian medicine against liver impairment. Fractions enriched in flavone di- and triglycosides were shown to enhance bile secretion. Therefore, reliable and accurate analytical methods are needed for the determination of these flavonoids in the crude drug and extracts thereof. The aim was to provide a validated HPLC-DAD (diode array detection) method especially developed for the separation of polar flavonoids and to compare the data obtained with those from UV spectrophotometry. Separations were carried out on an Aquasil® C₁₈ column (4.6 mm × 250.0 mm, 5 µm) with a linear gradient of acetonitrile and water (adjusted to pH 2.8 with trifluoroacetic acid) as mobile phase. Rutoside was employed as the internal standard, with linear behavior over a concentration range of 0.007-3.5 mg/mL. Accuracy was determined by spiking the crude drug with saponarin, resulting in recoveries between 92% and 102%. The method allows the quantification of highly polar flavonoid glycosides and the determination of their total content. For saponarin, a linear response was found within the range 0.007-3.5 mg/mL (R² > 0.9999). Threefold sonication was shown to be a time-saving, effective and cheap method for the extraction of the polar flavonoid glycosides. The contents determined by HPLC were shown to be in agreement with those obtained by UV spectrophotometry. The study indicates that the newly developed HPLC method represents a powerful technique for the quality control of D. versicolor. Ultraviolet spectrophotometry may be used alternatively, provided that the less polar flavonoids are removed by purification. Copyright © 2011 John Wiley & Sons, Ltd.

  11. Fingerprint analysis of Radix Aconiti using ultra-performance liquid chromatography-electrospray ionization/tandem mass spectrometry (UPLC-ESI/MSn) combined with stoichiometry.

    PubMed

    Zhu, Hongbin; Wang, Chunyan; Qi, Yao; Song, Fengrui; Liu, Zhiqiang; Liu, Shuying

    2013-01-15

    A fingerprinting approach based on UPLC-ESI/MSn (ultra-performance liquid chromatography-electrospray ionization/mass spectrometry) was developed for the quality control of processed Radix Aconiti, a widely used toxic traditional herbal medicine. The fingerprinting approach was based on the two processing methods recorded in the Chinese Pharmacopoeia for the purpose of reducing toxicity and ensuring clinical therapeutic efficacy. Similarity evaluation, hierarchical cluster analysis and principal component analysis were performed to evaluate the similarity and variation of the samples. The results showed that the well-processed, the unqualified processed and the raw Radix Aconiti samples could be reasonably clustered according to the contents of their constituents. The loading plot showed that the chemical markers with the most influence on the discrimination between qualified and unqualified samples were mainly monoester and diester diterpenoid aconitines. Finally, UPLC-UV and UPLC-ESI/MSn characteristic fingerprints were established from the well-processed and purchased qualified samples. At the same time, a complementary quantification method for six aconitine-type alkaloids was developed using UPLC-UV and UPLC-ESI/MS. The average recovery of the monoester diterpenoid aconitines was 95.4-99.1% and that of the diester diterpenoid aconitines was 103-112%. The proposed combined UPLC-UV and UPLC-ESI/MS quantification method allows samples to be analyzed over a wide concentration range. Therefore, the established fingerprinting approach in combination with chemometric analysis provides a flexible and reliable method for the quality assessment of this toxic herbal medicine. Copyright © 2012 Elsevier B.V. All rights reserved.
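
    The chemometric steps (principal component analysis and hierarchical clustering of the fingerprint matrix) can be sketched as follows; the random matrix stands in for the measured peak-area fingerprints, and the three-cluster cut is an illustrative assumption:

```python
import numpy as np
from sklearn.decomposition import PCA
from scipy.cluster.hierarchy import linkage, fcluster

# Rows: samples (e.g. raw, well processed, unqualified); columns: marker peak areas.
# Random placeholders stand in for the measured fingerprint matrix.
rng = np.random.default_rng(0)
fingerprints = rng.random((12, 20))

scores = PCA(n_components=2).fit_transform(fingerprints)     # PCA score-plot coordinates
clusters = fcluster(linkage(fingerprints, method="ward"), t=3, criterion="maxclust")
print(scores.shape, clusters)
```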

  12. Validation of the Mass-Extraction-Window for Quantitative Methods Using Liquid Chromatography High Resolution Mass Spectrometry.

    PubMed

    Glauser, Gaétan; Grund, Baptiste; Gassner, Anne-Laure; Menin, Laure; Henry, Hugues; Bromirski, Maciej; Schütz, Frédéric; McMullen, Justin; Rochat, Bertrand

    2016-03-15

    A paradigm shift is underway in the field of quantitative liquid chromatography-mass spectrometry (LC-MS) analysis thanks to the arrival of recent high-resolution mass spectrometers (HRMS). The capability of HRMS to perform sensitive and reliable quantifications of a large variety of analytes in HR-full scan mode shows that it is now realistic to perform quantitative and qualitative analysis with the same instrument. Moreover, HR-full scan acquisition offers a global view of sample extracts and allows retrospective investigations, as virtually all ionized compounds are detected with high sensitivity. In time, the versatility of HRMS together with the increasing need for relative quantification of hundreds of endogenous metabolites should promote a shift from triple-quadrupole MS to HRMS. However, a current "pitfall" in quantitative LC-HRMS analysis is the lack of HRMS-specific guidance for validated quantitative analyses. Indeed, false positive and false negative HRMS detections are rare, albeit possible, if inadequate parameters are used. Here, we investigated two key parameters for the validation of LC-HRMS quantitative analyses: the mass accuracy (MA) and the mass-extraction-window (MEW) used to construct the extracted ion chromatograms. We propose MA parameters, graphs, and equations to calculate a rational MEW width for the validation of quantitative LC-HRMS methods. MA measurements were performed on four different LC-HRMS platforms. Experimentally determined MEW values ranged between 5.6 and 16.5 ppm and depended on the HRMS platform, its working environment, the calibration procedure, and the analyte considered. The proposed procedure provides a fit-for-purpose MEW determination and prevents false detections.
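
    The two quantities examined, mass accuracy and the mass-extraction-window, reduce to simple expressions; treating the MEW as a ± ppm half-width around the target m/z is an assumption of this sketch, and the m/z values are illustrative:

```python
def mass_accuracy_ppm(measured_mz, theoretical_mz):
    """Mass accuracy of a detected ion in parts per million."""
    return 1e6 * (measured_mz - theoretical_mz) / theoretical_mz

def extraction_window(target_mz, mew_ppm):
    """Lower/upper m/z bounds of the extracted ion chromatogram for a given MEW."""
    half_width = target_mz * mew_ppm / 1e6
    return target_mz - half_width, target_mz + half_width

print(round(mass_accuracy_ppm(524.2652, 524.2640), 2), "ppm")
print(tuple(round(x, 4) for x in extraction_window(524.2640, mew_ppm=6.0)))
```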

  13. A statistical kinematic source inversion approach based on the QUESO library for uncertainty quantification and prediction

    NASA Astrophysics Data System (ADS)

    Zielke, Olaf; McDougall, Damon; Mai, Martin; Babuska, Ivo

    2014-05-01

    Seismic data, often augmented with geodetic data, are frequently used to invert for the spatio-temporal evolution of slip along a rupture plane. The resulting images of the slip evolution for a single event, inferred by different research teams, often vary distinctly, depending on the adopted inversion approach and rupture model parameterization. This observation raises the question of which of the provided kinematic source inversion solutions is most reliable and most robust and, more generally, how accurate fault parameterization and solution predictions are. These issues are not addressed in "standard" source inversion approaches. Here, we present a statistical inversion approach to constrain kinematic rupture parameters from teleseismic body waves. The approach is based (a) on a forward-modeling scheme that computes synthetic body waves for a given kinematic rupture model and (b) on the QUESO (Quantification of Uncertainty for Estimation, Simulation, and Optimization) library, which uses MCMC algorithms and Bayes' theorem for sample selection. We present Bayesian inversions for rupture parameters in synthetic earthquakes (i.e. for which the exact rupture history is known) in an attempt to identify the cross-over at which further model discretization (spatial and temporal resolution of the parameter space) no longer yields a decreasing misfit. Identification of this cross-over is important because it reveals the resolution power of the studied data set (i.e. teleseismic body waves), enabling one to constrain kinematic rupture histories of real earthquakes at a resolution that is supported by the data. In addition, the Bayesian approach allows complete posterior probability density functions of the desired kinematic source parameters to be mapped, thus enabling us to rigorously assess the uncertainties in earthquake source inversions.
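
    A generic random-walk Metropolis sketch of the MCMC sampling idea (this is not the QUESO library's implementation); the toy Gaussian log-posterior stands in for a waveform-misfit-based posterior over rupture parameters:

```python
import numpy as np

def metropolis(log_posterior, theta0, steps=5000, proposal_sd=0.1, seed=0):
    """Random-walk Metropolis sampler for a vector of rupture parameters.

    log_posterior : callable returning log p(theta | data),
                    e.g. -0.5 * waveform_misfit(theta) + log_prior(theta)
    """
    rng = np.random.default_rng(seed)
    theta = np.asarray(theta0, dtype=float)
    logp = log_posterior(theta)
    samples = []
    for _ in range(steps):
        proposal = theta + rng.normal(0.0, proposal_sd, size=theta.shape)
        logp_new = log_posterior(proposal)
        if np.log(rng.random()) < logp_new - logp:   # accept with prob min(1, ratio)
            theta, logp = proposal, logp_new
        samples.append(theta.copy())
    return np.array(samples)

# Toy posterior: two "rupture parameters" with optimum at (1.5, 0.3)
chain = metropolis(lambda t: -0.5 * np.sum((t - np.array([1.5, 0.3])) ** 2),
                   theta0=[0.0, 0.0])
print(chain[1000:].mean(axis=0))   # approximate posterior mean after burn-in
```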

  14. A High-Throughput UHPLC-MS/MS Method for the Quantification of Five Aged Butyrylcholinesterase Biomarkers from Human Exposure to Organophosphorus Nerve Agents

    PubMed Central

    Graham, Leigh Ann; Johnson, Darryl; Carter, Melissa D.; Stout, Emily G.; Erol, Huseyin A.; Isenberg, Samantha L.; Mathews, Thomas P.; Thomas, Jerry D.; Johnson, Rudolph C.

    2017-01-01

    Organophosphorus nerve agents (OPNAs) are toxic compounds that are classified as prohibited Schedule 1 chemical weapons. In the body, OPNAs bind to butyrylcholinesterase (BChE) to form nerve agent adducts (OPNA-BChE). OPNA-BChE adducts can provide a reliable, long-term protein biomarker for assessing human exposure. A major challenge facing OPNA-BChE detection is hydrolysis (aging), which can continue to occur after a clinical specimen has been collected. During aging, the o-alkyl phosphoester bond hydrolyzes, and the specific identity of the nerve agent is lost. To better identify OPNA exposure events, a high-throughput method for the detection of five aged OPNA-BChE adducts was developed. This is the first diagnostic panel to allow for the simultaneous quantification of any Chemical Weapons Convention Schedule 1 OPNA by measuring the aged adducts methyl phosphonate (MeP-BChE), ethyl phosphonate (EtP-BChE), propyl phosphonate (PrP-BChE), ethyl phosphoryl (ExP-BChE), and phosphoryl (P-BChE), together with unadducted BChE. The calibration range for all analytes is 2.00-250 ng/mL, which is consistent with similar methodologies used to detect unaged OPNA-BChE adducts. Each analytical run takes three minutes, making the time to first unknown result, including calibration curve and quality controls, less than one hour. Analysis of commercially purchased individual serum samples demonstrated no potential interferences with detection of aged OPNA-BChE adducts, and quantitative measurements of endogenous levels of BChE were similar to those previously reported in other OPNA-BChE adduct assays. PMID:27572107

  15. CIEF separation, UV detection, and quantification of ampholytic antibiotics and bacteria from different matrices.

    PubMed

    Horká, Marie; Vykydalová, Marie; Růžička, Filip; Šalplachta, Jiří; Holá, Veronika; Dvořáčková, Milada; Kubesová, Anna; Šlais, Karel

    2014-10-01

    Knowledge of the effect of antibiotics on microbial cells and of antibiotic concentrations in the human body is essential for the effective use of antimicrobial therapy. Capillary isoelectric focusing is a suitable technique for the separation and detection of bacteria and of naturally occurring amphoteric substances. The determination of the isoelectric points of ampholytic antibiotics by conventional techniques is time-consuming; capillary isoelectric focusing therefore appears to be a simple and reliable way of establishing them. The separation conditions for the capillary isoelectric focusing of selected ampholytic antibiotics with known isoelectric points and pKa values, ampicillin (pI 4.9), ciprofloxacin (pI 7.4), ofloxacin (pI 7.1), tetracycline (pI 5.4), tigecycline (pI 9.7), and vancomycin (pI 8.1), were found and optimized in the pH ranges 2.0-5.3, 2.0-9.6, and 9.0-10.4. The established isoelectric points correspond with those found in the literature, except for tigecycline, for which no literature pI value was found. As an example, a possible procedure was demonstrated for the direct detection of both ampholytic antibiotics and bacteria, Staphylococcus epidermidis, in the presence of culture media or whole human blood. Changes in the bacterial cells after treatment with tetracycline were confirmed by matrix-assisted laser desorption/ionization time-of-flight mass spectrometry. Capillary isoelectric focusing thus allows the fast and simple determination of the isoelectric points of relevant antibiotics, their quantification from the environment, and the study of their effectiveness against microorganisms in biological samples.

  16. Comparison of on-line and off-line methods to quantify reactive oxygen species (ROS) in atmospheric aerosols

    NASA Astrophysics Data System (ADS)

    Fuller, S. J.; Wragg, F. P. H.; Nutter, J.; Kalberer, M.

    2014-08-01

    Atmospheric aerosol particle concentrations have been linked with a wide range of pulmonary and cardio-vascular diseases but the particle properties responsible for these negative health effects are largely unknown. It is often speculated that reactive oxygen species (ROS) present in atmospheric particles lead to oxidative stress in, and ultimately disease of, the human lung. The quantification of ROS is highly challenging because some ROS components such as radicals are highly reactive and therefore short-lived. Thus, fast analysis methods are likely advantageous over methods with a long delay between aerosol sampling and ROS analysis. We present for the first time a detailed comparison of conventional off-line and fast on-line methods to quantify ROS in organic aerosols. For this comparison a new and fast on-line instrument was built and characterized to quantify ROS in aerosol particles with high sensitivity and a limit of detection of 4 nmol H2O2 equivalents per m3 air. ROS concentrations are measured with a time resolution of approximately 15 min, which allows the tracking of fast changing atmospheric conditions. The comparison of the off-line and on-line method shows that, in oxidized organic model aerosol particles, the majority of ROS have a very short lifetime of a few minutes whereas a small fraction is stable for a day or longer. This indicates that off-line techniques, where there is often a delay of hours to days between particle collection and ROS analysis, may severely underestimate true ROS concentrations and that fast on-line techniques are necessary for a reliable ROS quantification in atmospheric aerosol particles and a meaningful correlation with health outcomes.

  17. Quantification of the toxic hexavalent chromium content in an organic matrix by X-ray photoelectron spectroscopy (XPS) and ultra-low-angle microtomy (ULAM)

    NASA Astrophysics Data System (ADS)

    Greunz, Theresia; Duchaczek, Hubert; Sagl, Raffaela; Duchoslav, Jiri; Steinberger, Roland; Strauß, Bernhard; Stifter, David

    2017-02-01

    Cr(VI) is known for its corrosion-inhibiting properties and is, despite legal regulations, still a potential candidate to be added to thin (1-3 μm) protective coatings applied on, e.g., electrical steel as used for transformers. However, Cr(VI) is harmful to the environment and to human health; hence, its reliable quantification is of decisive interest. Commonly, alkaline extraction with photometric endpoint detection of Cr(VI) is used for such material systems. However, this procedure requires accurate knowledge of sample parameters such as dry film thickness and coating density, which are occasionally associated with significant experimental errors. We present a comprehensive study of a coating system with a defined Cr(VI) pigment concentration applied on electrical steel. X-ray photoelectron spectroscopy (XPS) was employed to resolve the elemental chromium concentration and its chemical state. Because XPS is extremely surface sensitive (<10 nm) and the best commonly achievable lateral resolution is several times larger than the coating thickness (∼2 μm), bulk analysis was achieved with XPS line scans along extended wedge-shaped tapers cut through the coating. For that purpose, a special sample preparation step performed on an ultramicrotome was required prior to analysis. Since a temperature increase leads to reduction of Cr(VI), we extended our method to samples subjected to different curing temperatures. We show that the proposed approach allows the elemental and Cr(VI) concentration and distribution inside the coating to be determined.
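
    Two pieces of arithmetic underlie this kind of analysis: converting peak areas to relative atomic concentrations via sensitivity factors, and converting lateral position along the wedge taper to depth; the sensitivity factors, areas and taper angle below are generic placeholders, not the study's values:

```python
import math

def atomic_percent(peak_areas, sensitivity_factors):
    """Relative atomic concentrations from XPS peak areas and relative sensitivity factors."""
    normalized = [a / s for a, s in zip(peak_areas, sensitivity_factors)]
    total = sum(normalized)
    return [100.0 * n / total for n in normalized]

def probing_depth_um(lateral_position_um, taper_angle_deg):
    """Depth reached at a given lateral position along an ultra-low-angle wedge taper."""
    return lateral_position_um * math.tan(math.radians(taper_angle_deg))

# Illustrative Cr 2p / O 1s / C 1s peak areas and generic sensitivity factors
print([round(x, 1) for x in atomic_percent([5200, 48000, 30000], [2.3, 0.71, 0.30])])
print(round(probing_depth_um(lateral_position_um=500.0, taper_angle_deg=0.05), 3), "um")
```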

  18. Hair analysis for long-term monitoring of buprenorphine intake in opiate withdrawal.

    PubMed

    Pirro, Valentina; Fusari, Ivana; Di Corcia, Daniele; Gerace, Enrico; De Vivo, Enrico; Salomone, Alberto; Vincenti, Marco

    2014-12-01

    Buprenorphine (BUP) is a psychoactive pharmaceutical drug widely used to treat opiate addiction. Short-term therapeutic monitoring is supported by toxicological analysis of blood and urine samples, whereas long-term monitoring by means of hair analysis is rarely used. The aim of this work was to develop and validate a highly sensitive ultrahigh-performance liquid chromatography-tandem mass spectrometry method to detect BUP and norbuprenorphine (NBUP) in head hair. The interindividual correlation between oral BUP dosage and head hair concentration was investigated. Furthermore, an intra-individual study by means of segmental analysis was performed on subjects with variable maintenance dosage. Hair samples from a population of 79 patients in treatment for opiate addiction were analyzed. The validated protocol provided limits of detection and quantification of 0.6 and 2.2 pg/mg for BUP and 5.0 and 17 pg/mg for NBUP, respectively. Validation criteria were satisfied, assuring selective analyte identification, high detection capability, and precise and accurate quantification. A significant positive correlation was found between constant oral BUP dosage (1-32 mg/d) and the summed head hair concentrations of BUP and NBUP. Nevertheless, substantial interindividual variability limits the ability to predict the oral dosage taken by each subject from the measured concentrations in head hair. In contrast, a strong correlation was observed in the results of the intra-individual segmental analysis, which proved reliable for detecting oral dosage variations during therapy. Remarkably, all hair samples yielded BUP concentrations higher than 10 pg/mg, even when the lowest dosage was administered. These results therefore support the selection of 10 pg/mg as a cutoff value.

  19. Automatic 3D segmentation of multiphoton images: a key step for the quantification of human skin.

    PubMed

    Decencière, Etienne; Tancrède-Bohin, Emmanuelle; Dokládal, Petr; Koudoro, Serge; Pena, Ana-Maria; Baldeweck, Thérèse

    2013-05-01

    Multiphoton microscopy has emerged in the past decade as a useful noninvasive imaging technique for in vivo human skin characterization. However, it has not been used until now in clinical evaluation trials, mainly because of the lack of specific image processing tools that would allow the investigator to extract pertinent quantitative three-dimensional (3D) information from the different skin components. We propose a 3D automatic segmentation method for multiphoton images, which is a key step for epidermis and dermis quantification. This method, based on the morphological watershed and graph cuts algorithms, takes into account the real shape of the skin surface and of the dermal-epidermal junction, and allows the epidermis and the superficial dermis to be separated in 3D. The automatic segmentation method and the associated quantitative measurements were developed and validated on a clinical database designed for aging characterization. The segmentation achieves its goal of epidermis-dermis separation and allows relevant quantitative measurements inside the different skin compartments. This study shows that multiphoton microscopy associated with specific image processing tools provides access to new quantitative measurements on the various skin components. The proposed 3D automatic segmentation method will contribute to building a powerful tool for characterizing human skin condition. To our knowledge, this is the first 3D approach to the segmentation and quantification of these original images. © 2013 John Wiley & Sons A/S. Published by Blackwell Publishing Ltd.
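
    A minimal marker-based 3D watershed sketch in the spirit of the first segmentation step; it omits the graph-cut refinement and the anatomical constraints of the published method, and the threshold, smoothing and synthetic stack are illustrative assumptions:

```python
import numpy as np
from scipy import ndimage as ndi
from skimage.segmentation import watershed
from skimage.measure import label

def segment_stack(volume, marker_threshold=0.6, sigma=1.5):
    """Marker-based 3D watershed on a multiphoton-like intensity volume.

    volume : 3D array of intensities, normalized to [0, 1]
    Markers come from a simple intensity threshold; the watershed is flooded on the
    gradient magnitude so region boundaries follow intensity transitions.
    """
    gradient = ndi.gaussian_gradient_magnitude(volume.astype(float), sigma=sigma)
    markers = label(volume > marker_threshold)
    return watershed(gradient, markers)

# Synthetic stack standing in for a multiphoton z-stack
stack = np.random.rand(20, 64, 64)
labels = segment_stack(stack)
print(labels.shape, labels.max())
```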

  20. An improved 96-well turbidity assay for T4 lysozyme activity

    DTIC Science & Technology

    2015-05-13

    ...enzyme present in the reaction, resulting in a measure of activity in Units mg-1. ... To improve the reliability of the activity values, perform the ... quantification of lysozyme activity with significantly lower enzyme concentrations, and the signal intensity can be enhanced by using greater amounts of ... enzyme at the expense of a shorter linear reaction time. Several parameters of the assay are critical for obtaining reproducible activity ...
