Sample records for quantification software based

  1. A multi-center study benchmarks software tools for label-free proteome quantification

    PubMed Central

    Navarro, Pedro; Kuharev, Jörg; Gillet, Ludovic C; Bernhardt, Oliver M.; MacLean, Brendan; Röst, Hannes L.; Tate, Stephen A.; Tsou, Chih-Chiang; Reiter, Lukas; Distler, Ute; Rosenberger, George; Perez-Riverol, Yasset; Nesvizhskii, Alexey I.; Aebersold, Ruedi; Tenzer, Stefan

    2016-01-01

    The consistent and accurate quantification of proteins by mass spectrometry (MS)-based proteomics depends on the performance of instruments, acquisition methods and data analysis software. In collaboration with the software developers, we evaluated OpenSWATH, SWATH 2.0, Skyline, Spectronaut and DIA-Umpire, five of the most widely used software methods for processing data from SWATH-MS (sequential window acquisition of all theoretical fragment ion spectra), a method that uses data-independent acquisition (DIA) for label-free protein quantification. We analyzed high-complexity test datasets from hybrid proteome samples of defined quantitative composition acquired on two different MS instruments using different SWATH isolation-window setups. For consistent evaluation we developed LFQbench, an R package to calculate metrics of precision and accuracy in label-free quantitative MS, and report the identification performance, robustness and specificity of each software tool. Our reference datasets enabled developers to improve their software tools. After optimization, all tools provided highly convergent identification and reliable quantification performance, underscoring their robustness for label-free quantitative proteomics. PMID:27701404
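
    To illustrate the kind of metric a benchmark like LFQbench reports, the sketch below derives accuracy (deviation of the median observed log2 ratio from the expected ratio) and precision (spread of the log2 ratios) from protein intensities. The function name and all values are invented for this example; this is not code from the LFQbench R package.

```python
from math import log2
from statistics import median, stdev

def lfq_metrics(intensities_a, intensities_b, expected_ratio):
    """Accuracy and precision metrics in the spirit of LFQbench:
    accuracy = deviation of the median observed log2 ratio from the
    expected log2 ratio; precision = spread (stdev) of the log2 ratios."""
    log_ratios = [log2(a / b) for a, b in zip(intensities_a, intensities_b)]
    accuracy = median(log_ratios) - log2(expected_ratio)
    precision = stdev(log_ratios)
    return accuracy, precision

# Hypothetical intensities for proteins of a species spiked at 2:1
sample_a = [200.0, 210.0, 190.0, 205.0]
sample_b = [100.0, 100.0, 100.0, 100.0]
acc, prec = lfq_metrics(sample_a, sample_b, expected_ratio=2.0)
```

    A tool that quantifies accurately yields an accuracy near zero and a small precision value; systematic ratio compression would show up as a nonzero accuracy term.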

  2. A multicenter study benchmarks software tools for label-free proteome quantification.

    PubMed

    Navarro, Pedro; Kuharev, Jörg; Gillet, Ludovic C; Bernhardt, Oliver M; MacLean, Brendan; Röst, Hannes L; Tate, Stephen A; Tsou, Chih-Chiang; Reiter, Lukas; Distler, Ute; Rosenberger, George; Perez-Riverol, Yasset; Nesvizhskii, Alexey I; Aebersold, Ruedi; Tenzer, Stefan

    2016-11-01

    Consistent and accurate quantification of proteins by mass spectrometry (MS)-based proteomics depends on the performance of instruments, acquisition methods and data analysis software. In collaboration with the software developers, we evaluated OpenSWATH, SWATH 2.0, Skyline, Spectronaut and DIA-Umpire, five of the most widely used software methods for processing data from sequential window acquisition of all theoretical fragment-ion spectra (SWATH)-MS, which uses data-independent acquisition (DIA) for label-free protein quantification. We analyzed high-complexity test data sets from hybrid proteome samples of defined quantitative composition acquired on two different MS instruments using different SWATH isolation-window setups. For consistent evaluation, we developed LFQbench, an R package, to calculate metrics of precision and accuracy in label-free quantitative MS and report the identification performance, robustness and specificity of each software tool. Our reference data sets enabled developers to improve their software tools. After optimization, all tools provided highly convergent identification and reliable quantification performance, underscoring their robustness for label-free quantitative proteomics.

  3. A new background subtraction method for Western blot densitometry band quantification through image analysis software.

    PubMed

    Gallo-Oller, Gabriel; Ordoñez, Raquel; Dotor, Javier

    2018-06-01

    Since its first description, the Western blot has been widely used in molecular biology labs. It is a multistep method that allows the detection and/or quantification of proteins from simple to complex protein mixtures. The quantification step is critical for obtaining accurate and reproducible Western blot results. Because of the technical knowledge required for densitometry analysis, together with limited resource availability, standard office scanners are often used for image acquisition of developed Western blot films. Furthermore, the use of semi-quantitative software such as ImageJ (Java-based image-processing and analysis software) is clearly increasing in different scientific fields. In this work, we describe the use of an office scanner coupled with the ImageJ software, together with a new image background subtraction method, for accurate Western blot quantification. The proposed method represents an affordable, accurate and reproducible approach that can be used when resources are limited. Copyright © 2018 Elsevier B.V. All rights reserved.
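
    The paper's specific background-subtraction method is not reproduced here, but the general idea of band quantification after baseline removal can be sketched as follows, using a hypothetical 1-D lane profile and a simple straight-line background drawn between the band's flanks (one common strategy, not the authors' method):

```python
def band_volume(profile, band_start, band_end):
    """Quantify a band in a 1-D densitometry lane profile by subtracting
    a straight-line background interpolated between the profile values
    flanking the band, then summing the remaining signal."""
    left, right = profile[band_start], profile[band_end]
    n = band_end - band_start
    volume = 0.0
    for i in range(band_start, band_end + 1):
        background = left + (right - left) * (i - band_start) / n
        volume += max(profile[i] - background, 0.0)
    return volume

# Hypothetical summed pixel intensities per row of one lane
lane = [10, 10, 12, 40, 80, 40, 12, 10, 10]
vol = band_volume(lane, 2, 6)
```

    In practice the choice of background model (straight line, rolling ball, local minimum) strongly affects the resulting band volumes, which is exactly why the paper's comparison of subtraction methods matters.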

  4. Reproducibility of Lobar Perfusion and Ventilation Quantification Using SPECT/CT Segmentation Software in Lung Cancer Patients.

    PubMed

    Provost, Karine; Leblond, Antoine; Gauthier-Lemire, Annie; Filion, Édith; Bahig, Houda; Lord, Martin

    2017-09-01

    Planar perfusion scintigraphy with 99mTc-labeled macroaggregated albumin is often used for pretherapy quantification of regional lung perfusion in lung cancer patients, particularly those with poor respiratory function. However, subdividing lung parenchyma into rectangular regions of interest, as done on planar images, is a poor reflection of true lobar anatomy. New tridimensional methods using SPECT and SPECT/CT have been introduced, including semiautomatic lung segmentation software. The present study evaluated inter- and intraobserver agreement on quantification using SPECT/CT software and compared the results for regional lung contribution obtained with SPECT/CT and planar scintigraphy. Methods: Thirty lung cancer patients underwent ventilation-perfusion scintigraphy with 99mTc-macroaggregated albumin and 99mTc-Technegas. The regional lung contribution to perfusion and ventilation was measured on both planar scintigraphy and SPECT/CT using semiautomatic lung segmentation software by 2 observers. Interobserver and intraobserver agreement for the SPECT/CT software was assessed using the intraclass correlation coefficient, Bland-Altman plots, and absolute differences in measurements. Measurements from planar and tridimensional methods were compared using the paired-sample t test and mean absolute differences. Results: Intraclass correlation coefficients were in the excellent range (above 0.9) for both interobserver and intraobserver agreement using the SPECT/CT software. Bland-Altman analyses showed very narrow limits of agreement. Absolute differences were below 2.0% in 96% of both interobserver and intraobserver measurements. There was a statistically significant difference between planar and SPECT/CT methods (P < 0.001) for quantification of perfusion and ventilation for all right lung lobes, with a maximal mean absolute difference of 20.7% for the right middle lobe.
There was no statistically significant difference in quantification of perfusion and ventilation for the left lung lobes using either method; however, absolute differences reached 12.0%. The total right and left lung contributions were similar for the two methods, with a mean difference of 1.2% for perfusion and 2.0% for ventilation. Conclusion: Quantification of regional lung perfusion and ventilation using SPECT/CT-based lung segmentation software is highly reproducible. This tridimensional method yields statistically significant differences in measurements for right lung lobes when compared with planar scintigraphy. We recommend that SPECT/CT-based quantification be used for all lung cancer patients undergoing pretherapy evaluation of regional lung function. © 2017 by the Society of Nuclear Medicine and Molecular Imaging.
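
    The Bland-Altman limits of agreement used above to assess observer agreement can be computed as in the following sketch. The observer values are hypothetical lobar perfusion percentages, not the study's data:

```python
from statistics import mean, stdev

def bland_altman(obs1, obs2):
    """Bland-Altman bias and 95% limits of agreement between two
    observers' paired measurements (differences assumed roughly normal):
    limits = bias +/- 1.96 * stdev(differences)."""
    diffs = [a - b for a, b in zip(obs1, obs2)]
    bias = mean(diffs)
    sd = stdev(diffs)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# Hypothetical lobar perfusion percentages from two observers
obs1 = [20.1, 9.8, 25.3, 30.0, 14.8]
obs2 = [19.9, 10.0, 25.0, 30.4, 15.1]
bias, (lo, hi) = bland_altman(obs1, obs2)
```

    Narrow limits of agreement, as reported in the study, mean the two observers' measurements can be used interchangeably for clinical purposes.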

  5. Quantification of protein expression in cells and cellular subcompartments on immunohistochemical sections using a computer supported image analysis system.

    PubMed

    Braun, Martin; Kirsten, Robert; Rupp, Niels J; Moch, Holger; Fend, Falko; Wernert, Nicolas; Kristiansen, Glen; Perner, Sven

    2013-05-01

    Quantification of protein expression based on immunohistochemistry (IHC) is an important step for translational research and clinical routine. Several manual ('eyeballing') scoring systems are used in order to semi-quantify protein expression based on chromogenic intensities and distribution patterns. However, manual scoring systems are time-consuming and subject to significant intra- and interobserver variability. The aim of our study was to explore, whether new image analysis software proves to be sufficient as an alternative tool to quantify protein expression. For IHC experiments, one nucleus specific marker (i.e., ERG antibody), one cytoplasmic specific marker (i.e., SLC45A3 antibody), and one marker expressed in both compartments (i.e., TMPRSS2 antibody) were chosen. Stainings were applied on TMAs, containing tumor material of 630 prostate cancer patients. A pathologist visually quantified all IHC stainings in a blinded manner, applying a four-step scoring system. For digital quantification, image analysis software (Tissue Studio v.2.1, Definiens AG, Munich, Germany) was applied to obtain a continuous spectrum of average staining intensity. For each of the three antibodies we found a strong correlation of the manual protein expression score and the score of the image analysis software. Spearman's rank correlation coefficient was 0.94, 0.92, and 0.90 for ERG, SLC45A3, and TMPRSS2, respectively (p < 0.01). Our data suggest that the image analysis software Tissue Studio is a powerful tool for quantification of protein expression in IHC stainings. Further, since the digital analysis is precise and reproducible, computer supported protein quantification might help to overcome intra- and interobserver variability and increase objectivity of IHC based protein assessment.
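
    Spearman's rank correlation, used above to compare manual four-step scores with the software's continuous intensities, handles ties in the discrete manual scores via average ranks. A from-scratch sketch (the scores and intensities below are invented for illustration):

```python
def rank(values):
    """Average ranks (1-based); tied values share the mean of their ranks."""
    order = sorted(range(len(values)), key=lambda i: values[i])
    ranks = [0.0] * len(values)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and values[order[j + 1]] == values[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1
        for k in range(i, j + 1):
            ranks[order[k]] = avg
        i = j + 1
    return ranks

def spearman(x, y):
    """Spearman's rho = Pearson correlation of the two rank vectors."""
    rx, ry = rank(x), rank(y)
    mx, my = sum(rx) / len(rx), sum(ry) / len(ry)
    num = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    den = (sum((a - mx) ** 2 for a in rx) * sum((b - my) ** 2 for b in ry)) ** 0.5
    return num / den

# Hypothetical manual 4-step scores vs. continuous software intensities
manual = [0, 1, 1, 2, 3, 3, 2, 0]
auto = [0.1, 0.4, 0.5, 0.9, 1.5, 1.4, 1.0, 0.2]
rho = spearman(manual, auto)
```

    A rank correlation is the right choice here because the manual score is ordinal, not interval-scaled, so a Pearson correlation on the raw values would be misleading.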

  6. Decision peptide-driven: a free software tool for accurate protein quantification using gel electrophoresis and matrix assisted laser desorption ionization time of flight mass spectrometry.

    PubMed

    Santos, Hugo M; Reboiro-Jato, Miguel; Glez-Peña, Daniel; Nunes-Miranda, J D; Fdez-Riverola, Florentino; Carvallo, R; Capelo, J L

    2010-09-15

    The decision peptide-driven tool implements a software application for assisting the user in a protocol for accurate protein quantification based on the following steps: (1) protein separation through gel electrophoresis; (2) in-gel protein digestion; (3) direct and inverse (18)O-labeling and (4) matrix assisted laser desorption ionization time of flight mass spectrometry, MALDI analysis. The DPD software compares the MALDI results of the direct and inverse (18)O-labeling experiments and quickly identifies those peptides with paralleled losses in different sets of a typical proteomic workflow. Those peptides are used for subsequent accurate protein quantification. The interpretation of the MALDI data from direct and inverse labeling experiments is time-consuming, requiring a significant amount of time to do all comparisons manually. The DPD software shortens and simplifies the search for the peptides that must be used for quantification from a week to just some minutes. To do so, it takes as input several MALDI spectra and aids the researcher in an automatic mode (i) to compare data from direct and inverse (18)O-labeling experiments, calculating the corresponding ratios to determine those peptides with paralleled losses throughout different sets of experiments; and (ii) to use those peptides as internal standards for subsequent accurate protein quantification using (18)O-labeling. In this work the DPD software is presented and explained with the quantification of protein carbonic anhydrase. Copyright (c) 2010 Elsevier B.V. All rights reserved.
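
    The core cross-check behind direct/inverse labeling can be sketched simply: if a peptide behaves consistently, its direct light/heavy ratio should approximately equal the reciprocal of its inverse-labeling ratio. The tolerance and peptide ratios below are assumptions for illustration, not values or logic taken from the DPD software:

```python
def consistent_peptides(direct, inverse, tolerance=0.2):
    """Keep peptides whose direct (18)O ratio agrees, within a relative
    tolerance, with the reciprocal of their inverse-labeling ratio --
    a simplified stand-in for the DPD direct/inverse comparison."""
    keep = []
    for pep, r_direct in direct.items():
        if pep in inverse:
            r_from_inverse = 1.0 / inverse[pep]
            if abs(r_direct - r_from_inverse) / r_direct <= tolerance:
                keep.append(pep)
    return keep

# Hypothetical light/heavy ratios per peptide
direct = {"PEP1": 1.02, "PEP2": 0.55, "PEP3": 0.98}
inverse = {"PEP1": 0.97, "PEP2": 1.01, "PEP3": 1.05}
good = consistent_peptides(direct, inverse)
```

    Peptides failing the check (PEP2 above) likely suffered differential losses in one workflow and would bias quantification if used as internal standards.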

  7. Quantification of video-taped images in microcirculation research using inexpensive imaging software (Adobe Photoshop).

    PubMed

    Brunner, J; Krummenauer, F; Lehr, H A

    2000-04-01

    Study end-points in microcirculation research are usually video-taped images rather than numeric computer print-outs. Analysis of these video-taped images for the quantification of microcirculatory parameters usually requires computer-based image analysis systems. Most software programs for image analysis are custom-made, expensive, and limited in their applicability to selected parameters and study end-points. We demonstrate herein that inexpensive, commercially available computer software (Adobe Photoshop), run on a Macintosh G3 computer with an inbuilt graphic capture board, provides versatile, easy-to-use tools for the quantification of digitized video images. Using images obtained by intravital fluorescence microscopy from the pre- and postischemic muscle microcirculation in the skinfold chamber model in hamsters, Photoshop allows simple and rapid quantification (i) of microvessel diameters, (ii) of the functional capillary density and (iii) of postischemic leakage of FITC-labeled high molecular weight dextran from postcapillary venules. We present evidence of the technical accuracy of the software tools and of a high degree of interobserver reliability. Inexpensive commercially available imaging programs (i.e., Adobe Photoshop) provide versatile tools for image analysis with a wide range of potential applications in microcirculation research.

  8. Find Pairs: The Module for Protein Quantification of the PeakQuant Software Suite

    PubMed Central

    Eisenacher, Martin; Kohl, Michael; Wiese, Sebastian; Hebeler, Romano; Meyer, Helmut E.

    2012-01-01

    Abstract Accurate quantification of proteins is one of the major tasks in current proteomics research. To address this issue, a wide range of stable isotope labeling techniques have been developed, allowing one to quantitatively study thousands of proteins by means of mass spectrometry. In this article, the FindPairs module of the PeakQuant software suite is detailed. It facilitates the automatic determination of protein abundance ratios based on the automated analysis of stable isotope-coded mass spectrometric data. Furthermore, it implements statistical methods to determine outliers due to biological as well as technical variance of proteome data obtained in replicate experiments. This provides an important means to evaluate the significance in obtained protein expression data. For demonstrating the high applicability of FindPairs, we focused on the quantitative analysis of proteome data acquired in 14N/15N labeling experiments. We further provide a comprehensive overview of the features of the FindPairs software, and compare these with existing quantification packages. The software presented here supports a wide range of proteomics applications, allowing one to quantitatively assess data derived from different stable isotope labeling approaches, such as 14N/15N labeling, SILAC, and iTRAQ. The software is publicly available at http://www.medizinisches-proteom-center.de/software and free for academic use. PMID:22909347

  9. StakeMeter: value-based stakeholder identification and quantification framework for value-based software systems.

    PubMed

    Babar, Muhammad Imran; Ghazali, Masitah; Jawawi, Dayang N A; Bin Zaheer, Kashif

    2015-01-01

    Value-based requirements engineering plays a vital role in the development of value-based software (VBS). Stakeholders are the key players in the requirements engineering process, and the selection of critical stakeholders for the VBS systems is highly desirable. Based on the stakeholder requirements, the innovative or value-based idea is realized. The quality of the VBS system is associated with the concrete set of valuable requirements, and the valuable requirements can only be obtained if all the relevant valuable stakeholders participate in the requirements elicitation phase. The existing value-based approaches focus on the design of the VBS systems. However, the focus on the valuable stakeholders and requirements is inadequate. The current stakeholder identification and quantification (SIQ) approaches are neither state-of-the-art nor systematic for the VBS systems. The existing approaches are time-consuming, complex and inconsistent which makes the initiation process difficult. Moreover, the main motivation of this research is that the existing SIQ approaches do not provide the low level implementation details for SIQ initiation and stakeholder metrics for quantification. Hence, keeping in view the existing SIQ problems, this research contributes in the form of a new SIQ framework called 'StakeMeter'. The StakeMeter framework is verified and validated through case studies. The proposed framework provides low-level implementation guidelines, attributes, metrics, quantification criteria and application procedure as compared to the other methods. The proposed framework solves the issues of stakeholder quantification or prioritization, higher time consumption, complexity, and process initiation. The framework helps in the selection of highly critical stakeholders for the VBS systems with less judgmental error.

  10. The Effect of AOP on Software Engineering, with Particular Attention to OIF and Event Quantification

    NASA Technical Reports Server (NTRS)

    Havelund, Klaus; Filman, Robert; Korsmeyer, David (Technical Monitor)

    2003-01-01

    We consider the impact of Aspect-Oriented Programming on Software Engineering, and, in particular, analyze two AOP systems, one of which does component wrapping and the other, quantification over events, for their software engineering effects.

  11. In-depth evaluation of software tools for data-independent acquisition based label-free quantification.

    PubMed

    Kuharev, Jörg; Navarro, Pedro; Distler, Ute; Jahn, Olaf; Tenzer, Stefan

    2015-09-01

    Label-free quantification (LFQ) based on data-independent acquisition workflows currently experiences increasing popularity. Several software tools have been recently published or are commercially available. The present study focuses on the evaluation of three different software packages (Progenesis, synapter, and ISOQuant) supporting ion mobility enhanced data-independent acquisition data. In order to benchmark the LFQ performance of the different tools, we generated two hybrid proteome samples of defined quantitative composition containing tryptically digested proteomes of three different species (mouse, yeast, Escherichia coli). This model dataset simulates complex biological samples containing large numbers of both unregulated (background) proteins as well as up- and downregulated proteins with exactly known ratios between samples. We determined the number and dynamic range of quantifiable proteins and analyzed the influence of applied algorithms (retention time alignment, clustering, normalization, etc.) on quantification results. Analysis of technical reproducibility revealed median coefficients of variation of reported protein abundances below 5% for MS(E) data for Progenesis and ISOQuant. Regarding accuracy of LFQ, evaluation with synapter and ISOQuant yielded superior results compared to Progenesis. In addition, we discuss reporting formats and user friendliness of the software packages. The data generated in this study have been deposited to the ProteomeXchange Consortium with identifier PXD001240 (http://proteomecentral.proteomexchange.org/dataset/PXD001240). © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
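
    The reported reproducibility metric, the median coefficient of variation of protein abundances across technical replicates, can be computed as in this sketch (the abundance table is hypothetical):

```python
from statistics import mean, median, stdev

def median_cv(abundance_table):
    """Median coefficient of variation (%) of protein abundances across
    technical replicates; each row is one protein, columns are replicates."""
    cvs = [100.0 * stdev(row) / mean(row) for row in abundance_table]
    return median(cvs)

# Hypothetical abundances, 3 technical replicates per protein
table = [
    [100.0, 102.0, 98.0],
    [50.0, 55.0, 52.0],
    [10.0, 10.5, 9.8],
]
cv = median_cv(table)
```

    Taking the median rather than the mean of the per-protein CVs makes the summary robust to a handful of poorly quantified proteins, which is why it is the conventional reproducibility statistic in LFQ benchmarks.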

  12. StakeMeter: Value-Based Stakeholder Identification and Quantification Framework for Value-Based Software Systems

    PubMed Central

    Babar, Muhammad Imran; Ghazali, Masitah; Jawawi, Dayang N. A.; Zaheer, Kashif Bin

    2015-01-01

    Value-based requirements engineering plays a vital role in the development of value-based software (VBS). Stakeholders are the key players in the requirements engineering process, and the selection of critical stakeholders for the VBS systems is highly desirable. Based on the stakeholder requirements, the innovative or value-based idea is realized. The quality of the VBS system is associated with the concrete set of valuable requirements, and the valuable requirements can only be obtained if all the relevant valuable stakeholders participate in the requirements elicitation phase. The existing value-based approaches focus on the design of the VBS systems. However, the focus on the valuable stakeholders and requirements is inadequate. The current stakeholder identification and quantification (SIQ) approaches are neither state-of-the-art nor systematic for the VBS systems. The existing approaches are time-consuming, complex and inconsistent which makes the initiation process difficult. Moreover, the main motivation of this research is that the existing SIQ approaches do not provide the low level implementation details for SIQ initiation and stakeholder metrics for quantification. Hence, keeping in view the existing SIQ problems, this research contributes in the form of a new SIQ framework called ‘StakeMeter’. The StakeMeter framework is verified and validated through case studies. The proposed framework provides low-level implementation guidelines, attributes, metrics, quantification criteria and application procedure as compared to the other methods. The proposed framework solves the issues of stakeholder quantification or prioritization, higher time consumption, complexity, and process initiation. The framework helps in the selection of highly critical stakeholders for the VBS systems with less judgmental error. PMID:25799490

  13. Comparative quantification of dietary supplemented neural creatine concentrations with (1)H-MRS peak fitting and basis spectrum methods.

    PubMed

    Turner, Clare E; Russell, Bruce R; Gant, Nicholas

    2015-11-01

    Magnetic resonance spectroscopy (MRS) is an analytical procedure that can be used to non-invasively measure the concentration of a range of neural metabolites. Creatine is an important neurometabolite, with dietary supplementation offering therapeutic potential for neurological disorders with dysfunctional energetic processes. Neural creatine concentrations can be probed using proton MRS and quantified using a range of software packages based on different analytical methods. This experiment examines the differences in quantification performance of two commonly used analysis packages following a creatine supplementation strategy with potential therapeutic application. Human participants followed a seven day dietary supplementation regime in a placebo-controlled, cross-over design interspersed with a five week wash-out period. Spectroscopy data were acquired the day immediately following supplementation and analyzed with two commonly used software packages that employ vastly different quantification methods. Results demonstrate that neural creatine concentration was augmented following creatine supplementation when analyzed using the peak fitting method of quantification (105.9%±10.1). In contrast, no change in neural creatine levels was detected with supplementation when analysis was conducted using the basis spectrum method of quantification (102.6%±8.6). Results suggest that software packages that employ the peak fitting procedure for spectral quantification are possibly more sensitive to subtle changes in neural creatine concentrations. The relative simplicity of the spectroscopy sequence and the data analysis procedure suggest that peak fitting procedures may be the most effective means of metabolite quantification when detection of subtle alterations in neural metabolites is necessary. The straightforward technique can be used on a clinical magnetic resonance imaging system. Copyright © 2015 Elsevier Inc. All rights reserved.
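
    A bare-bones version of peak-fitting quantification: with a fixed line shape g, the least-squares amplitude of a peak is dot(y, g) / dot(g, g). Real MRS fitting packages also fit peak position, width, phase, and baseline; the spectrum below is a noise-free synthetic creatine peak invented for this sketch:

```python
from math import exp

def fit_peak_area(ppm, signal, center, width):
    """Quantify a metabolite peak by least-squares fitting the amplitude
    of a fixed Gaussian line shape. With shape g fixed, the optimal
    amplitude has the closed form dot(y, g) / dot(g, g)."""
    g = [exp(-((x - center) / width) ** 2) for x in ppm]
    amplitude = sum(y * gi for y, gi in zip(signal, g)) / sum(gi * gi for gi in g)
    area = amplitude * sum(g)  # area under the fitted peak, per sample point
    return amplitude, area

# Synthetic creatine peak near 3.03 ppm (hypothetical data)
ppm = [3.00 + 0.005 * i for i in range(13)]
signal = [5.0 * exp(-((x - 3.03) / 0.01) ** 2) for x in ppm]
amp, area = fit_peak_area(ppm, signal, center=3.03, width=0.01)
```

    Basis-spectrum methods instead fit a weighted sum of full metabolite spectra to the whole acquisition, which is more constrained and, per the study's findings, possibly less sensitive to small changes in a single resonance.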

  14. A tool for selective inline quantification of co-eluting proteins in chromatography using spectral analysis and partial least squares regression.

    PubMed

    Brestrich, Nina; Briskot, Till; Osberghaus, Anna; Hubbuch, Jürgen

    2014-07-01

    Selective quantification of co-eluting proteins in chromatography is usually performed by offline analytics. This is time-consuming and can lead to late detection of irregularities in chromatography processes. To overcome this analytical bottleneck, a methodology for selective protein quantification in multicomponent mixtures by means of spectral data and partial least squares regression was presented in two previous studies. In this paper, a powerful integration of software and chromatography hardware will be introduced that enables the applicability of this methodology for a selective inline quantification of co-eluting proteins in chromatography. A specific setup consisting of a conventional liquid chromatography system, a diode array detector, and a software interface to Matlab® was developed. The established tool for selective inline quantification was successfully applied for a peak deconvolution of a co-eluting ternary protein mixture consisting of lysozyme, ribonuclease A, and cytochrome c on SP Sepharose FF. Compared to common offline analytics based on collected fractions, no loss of information regarding the retention volumes and peak flanks was observed. A comparison between the mass balances of both analytical methods showed, that the inline quantification tool can be applied for a rapid determination of pool yields. Finally, the achieved inline peak deconvolution was successfully applied to make product purity-based real-time pooling decisions. This makes the established tool for selective inline quantification a valuable approach for inline monitoring and control of chromatographic purification steps and just in time reaction on process irregularities. © 2014 Wiley Periodicals, Inc.

  15. The Infeasibility of Experimental Quantification of Life-Critical Software Reliability

    NASA Technical Reports Server (NTRS)

    Butler, Ricky W.; Finelli, George B.

    1991-01-01

    This paper affirms that quantification of life-critical software reliability is infeasible using statistical methods whether applied to standard software or fault-tolerant software. The key assumption of software fault tolerance, that separately programmed versions fail independently, is shown to be problematic. This assumption cannot be justified by experimentation in the ultra-reliability region and subjective arguments in its favor are not sufficiently strong to justify it as an axiom. Also, the implications of the recent multi-version software experiments support this affirmation.
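
    The infeasibility argument rests on simple arithmetic: demonstrating an ultra-low failure rate by life testing alone requires astronomically long failure-free test time. A sketch using the standard zero-failure exponential bound (a textbook formulation, not necessarily the paper's exact derivation):

```python
from math import log

def required_test_hours(target_rate_per_hour, confidence):
    """Failure-free test time T such that observing zero failures in T
    demonstrates, at the given confidence level, that an exponential
    failure rate is below the target. Zero-failure bound:
    rate < -ln(1 - confidence) / T, solved for T."""
    return -log(1.0 - confidence) / target_rate_per_hour

# Ultra-reliability region: 1e-9 failures/hour, 90% confidence
hours = required_test_hours(1e-9, 0.90)
years = hours / (24 * 365.25)
```

    The result is on the order of hundreds of thousands of years of failure-free testing for a single system, which is the quantitative core of the paper's infeasibility claim.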

  16. MAMA User Guide v2.0.1

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Gaschen, Brian Keith; Bloch, Jeffrey Joseph; Porter, Reid

    Morphological signatures of bulk SNM materials have significant promise, but these potential signatures are not fully utilized. This document describes software tools, collectively called the MAMA (Morphological Analysis for Material Attribution) software, that can help provide robust and accurate quantification of morphological features in bulk material microscopy images (Optical, SEM). Although many of the specific tools are not unique to MAMA, the software package has been designed specifically for nuclear material morphological analysis, and is at a point where it can be easily adapted (by Los Alamos or by collaborators) in response to new, different, or changing forensics needs. The current release of the MAMA software only includes the image quantification, descriptions, and annotation functionality. Only limited information on a sample, its pedigree, and its chemistry is recorded inside this part of the software. This was a decision based on initial feedback and the fact that there are several analytical chemistry databases being developed within the community. Currently MAMA is a standalone program that can export quantification results in a basic text format that can be imported into other programs such as Excel and Access. There is also a basic report-generating feature that produces HTML-formatted pages of the same information. We will be working with collaborators to provide better integration of MAMA into their particular systems, databases and workflows.

  17. UQTools: The Uncertainty Quantification Toolbox - Introduction and Tutorial

    NASA Technical Reports Server (NTRS)

    Kenny, Sean P.; Crespo, Luis G.; Giesy, Daniel P.

    2012-01-01

    UQTools is the short name for the Uncertainty Quantification Toolbox, a software package designed to efficiently quantify the impact of parametric uncertainty on engineering systems. UQTools is a MATLAB-based software package and was designed to be discipline independent, employing very generic representations of the system models and uncertainty. Specifically, UQTools accepts linear and nonlinear system models and permits arbitrary functional dependencies between the system's measures of interest and the probabilistic or non-probabilistic parametric uncertainty. One of the most significant features incorporated into UQTools is the theoretical development centered on homothetic deformations and their application to set bounding and approximating failure probabilities. Beyond the set bounding technique, UQTools provides a wide range of probabilistic and uncertainty-based tools to solve key problems in science and engineering.

  18. Strawberry: Fast and accurate genome-guided transcript reconstruction and quantification from RNA-Seq.

    PubMed

    Liu, Ruolin; Dickerson, Julie

    2017-11-01

    We propose a novel method and software tool, Strawberry, for transcript reconstruction and quantification from RNA-Seq data under the guidance of genome alignment and independent of gene annotation. Strawberry consists of two modules: assembly and quantification. The novelty of Strawberry is that the two modules use different optimization frameworks but utilize the same data graph structure, which allows a highly efficient, expandable and accurate algorithm for dealing large data. The assembly module parses aligned reads into splicing graphs, and uses network flow algorithms to select the most likely transcripts. The quantification module uses a latent class model to assign read counts from the nodes of splicing graphs to transcripts. Strawberry simultaneously estimates the transcript abundances and corrects for sequencing bias through an EM algorithm. Based on simulations, Strawberry outperforms Cufflinks and StringTie in terms of both assembly and quantification accuracies. Under the evaluation of a real data set, the estimated transcript expression by Strawberry has the highest correlation with Nanostring probe counts, an independent experiment measure for transcript expression. Strawberry is written in C++14, and is available as open source software at https://github.com/ruolin/strawberry under the MIT license.
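
    The EM step described above, fractionally assigning ambiguous reads to transcripts and re-estimating abundances, can be illustrated with a toy implementation. This is a generic read-assignment EM on invented data, without the length correction, sequencing-bias model, or splicing-graph machinery that Strawberry adds:

```python
def em_abundance(compat, n_iter=200):
    """Toy EM for transcript abundance. `compat` lists, per read, the
    transcripts it aligns to. E-step: split each read across its
    compatible transcripts in proportion to current abundances.
    M-step: re-estimate abundances from the fractional counts."""
    transcripts = sorted({t for ts in compat for t in ts})
    theta = {t: 1.0 / len(transcripts) for t in transcripts}
    for _ in range(n_iter):
        counts = {t: 0.0 for t in transcripts}
        for ts in compat:
            z = sum(theta[t] for t in ts)
            for t in ts:
                counts[t] += theta[t] / z
        total = sum(counts.values())
        theta = {t: c / total for t, c in counts.items()}
    return theta

# 6 hypothetical reads; two are ambiguous between isoforms T1 and T2
reads = [["T1"], ["T1"], ["T1"], ["T2"], ["T1", "T2"], ["T1", "T2"]]
theta = em_abundance(reads)
```

    The fixed point splits the ambiguous reads in proportion to the unique evidence: here T1 converges to 0.75 and T2 to 0.25, rather than the naive 4-vs-3 split obtained by counting ambiguous reads twice.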

  19. Targeted Feature Detection for Data-Dependent Shotgun Proteomics

    PubMed Central

    2017-01-01

    Label-free quantification of shotgun LC–MS/MS data is the prevailing approach in quantitative proteomics but remains computationally nontrivial. The central data analysis step is the detection of peptide-specific signal patterns, called features. Peptide quantification is facilitated by associating signal intensities in features with peptide sequences derived from MS2 spectra; however, missing values due to imperfect feature detection are a common problem. A feature detection approach that directly targets identified peptides (minimizing missing values) but also offers robustness against false-positive features (by assigning meaningful confidence scores) would thus be highly desirable. We developed a new feature detection algorithm within the OpenMS software framework, leveraging ideas and algorithms from the OpenSWATH toolset for DIA/SRM data analysis. Our software, FeatureFinderIdentification (“FFId”), implements a targeted approach to feature detection based on information from identified peptides. This information is encoded in an MS1 assay library, based on which ion chromatogram extraction and detection of feature candidates are carried out. Significantly, when analyzing data from experiments comprising multiple samples, our approach distinguishes between “internal” and “external” (inferred) peptide identifications (IDs) for each sample. On the basis of internal IDs, two sets of positive (true) and negative (decoy) feature candidates are defined. A support vector machine (SVM) classifier is then trained to discriminate between the sets and is subsequently applied to the “uncertain” feature candidates from external IDs, facilitating selection and confidence scoring of the best feature candidate for each peptide. This approach also enables our algorithm to estimate the false discovery rate (FDR) of the feature selection step. 
We validated FFId on a public benchmark data set comprising a yeast cell lysate spiked with protein standards that provide a known ground truth. The algorithm reached almost complete (>99%) quantification coverage for the full set of peptides identified at 1% FDR (PSM level). Compared with other software solutions for label-free quantification, this is an outstanding result, which was achieved at competitive quantification accuracy and reproducibility across replicates. The FDR of the feature selection was estimated at a low 1.5% on average per sample (3% for features inferred from external peptide IDs). The FFId software is open source and freely available as part of OpenMS (www.openms.org). PMID:28673088

  20. Targeted Feature Detection for Data-Dependent Shotgun Proteomics.

    PubMed

    Weisser, Hendrik; Choudhary, Jyoti S

    2017-08-04

    Label-free quantification of shotgun LC-MS/MS data is the prevailing approach in quantitative proteomics but remains computationally nontrivial. The central data analysis step is the detection of peptide-specific signal patterns, called features. Peptide quantification is facilitated by associating signal intensities in features with peptide sequences derived from MS2 spectra; however, missing values due to imperfect feature detection are a common problem. A feature detection approach that directly targets identified peptides (minimizing missing values) but also offers robustness against false-positive features (by assigning meaningful confidence scores) would thus be highly desirable. We developed a new feature detection algorithm within the OpenMS software framework, leveraging ideas and algorithms from the OpenSWATH toolset for DIA/SRM data analysis. Our software, FeatureFinderIdentification ("FFId"), implements a targeted approach to feature detection based on information from identified peptides. This information is encoded in an MS1 assay library, based on which ion chromatogram extraction and detection of feature candidates are carried out. Significantly, when analyzing data from experiments comprising multiple samples, our approach distinguishes between "internal" and "external" (inferred) peptide identifications (IDs) for each sample. On the basis of internal IDs, two sets of positive (true) and negative (decoy) feature candidates are defined. A support vector machine (SVM) classifier is then trained to discriminate between the sets and is subsequently applied to the "uncertain" feature candidates from external IDs, facilitating selection and confidence scoring of the best feature candidate for each peptide. This approach also enables our algorithm to estimate the false discovery rate (FDR) of the feature selection step. 
We validated FFId on a public benchmark data set comprising a yeast cell lysate spiked with protein standards that provide a known ground truth. The algorithm reached almost complete (>99%) quantification coverage for the full set of peptides identified at 1% FDR (PSM level). Compared with other software solutions for label-free quantification, this is an outstanding result, which was achieved at competitive quantification accuracy and reproducibility across replicates. The FDR of the feature selection was estimated at a low 1.5% on average per sample (3% for features inferred from external peptide IDs). The FFId software is open source and freely available as part of OpenMS (www.openms.org).
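
The classification step described above, training on confident positive (internal-ID) and decoy feature candidates and then scoring uncertain candidates from external IDs, can be sketched with a tiny linear SVM. This is an illustration of the idea only, not FFId's actual OpenMS implementation; a Pegasos-style subgradient trainer stands in for the real SVM, and the two-dimensional features and data are made up.

```python
# Minimal linear SVM (hinge loss, Pegasos-style subgradient descent) used
# to separate positive feature candidates from decoys, then to score an
# "uncertain" candidate inferred from an external peptide ID.

def train_linear_svm(X, y, lam=0.01, epochs=200):
    """X: list of feature vectors, y: labels in {-1, +1}."""
    w = [0.0] * len(X[0])
    b = 0.0
    t = 0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            t += 1
            eta = 1.0 / (lam * t)                    # decaying step size
            margin = yi * (sum(wj * xj for wj, xj in zip(w, xi)) + b)
            w = [wj * (1 - eta * lam) for wj in w]   # regularization shrink
            if margin < 1:                           # hinge subgradient step
                w = [wj + eta * yi * xj for wj, xj in zip(w, xi)]
                b += eta * yi
    return w, b

def score(w, b, x):
    return sum(wj * xj for wj, xj in zip(w, x)) + b

# Toy candidates: [peak intensity, chromatographic peak quality]
pos = [[5.0, 0.9], [4.5, 0.8], [6.0, 0.95]]   # from internal IDs
neg = [[1.0, 0.2], [0.5, 0.1], [1.5, 0.3]]    # decoys
w, b = train_linear_svm(pos + neg, [1, 1, 1, -1, -1, -1])
uncertain = [4.8, 0.85]                        # candidate from an external ID
```

A positive score for `uncertain` would select that candidate; ranking decoy scores against target scores is what enables the FDR estimate mentioned in the abstract.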

  1. Semi-automatic computerized approach to radiological quantification in rheumatoid arthritis

    NASA Astrophysics Data System (ADS)

    Steiner, Wolfgang; Schoeffmann, Sylvia; Prommegger, Andrea; Boegl, Karl; Klinger, Thomas; Peloschek, Philipp; Kainberger, Franz

    2004-04-01

    Rheumatoid Arthritis (RA) is a common systemic disease predominantly involving the joints. Precise diagnosis and follow-up of therapy require objective quantification. For this purpose, radiological analyses using standardized scoring systems are considered the most appropriate method. The aim of our study was to develop semi-automatic image analysis software especially applicable to the scoring of joints in rheumatic disorders. The X-Ray RheumaCoach software provides several scoring systems (Larsen score and Ratingen-Rau score) that can be applied by the scorer. In addition to the qualitative assessment of joints performed by the radiologist, a semi-automatic image analysis for joint detection and for measurements of bone diameters and swollen tissue supports the image assessment process. More than 3000 radiographs of hands and feet from more than 200 RA patients were collected, analyzed, and statistically evaluated. Radiographs were quantified using the conventional paper-based Larsen score and the X-Ray RheumaCoach software. The use of the software shortened the scoring time by about 25 percent and reduced the rate of erroneous scorings in all our studies. Compared to paper-based scoring methods, the X-Ray RheumaCoach software offers several advantages: (i) structured data analysis and input that minimizes variance by standardization, (ii) faster and more precise calculation of sum scores and indices, (iii) permanent data storage and fast access to the software's database, (iv) the possibility of cross-calculation to other scores, (v) semi-automatic assessment of images, and (vi) reliable documentation of results in the form of graphical printouts.

  2. Final Technical Report on Quantifying Dependability Attributes of Software Based Safety Critical Instrumentation and Control Systems in Nuclear Power Plants

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Smidts, Carol; Huang, Funqun; Li, Boyuan

    With the current transition from analog to digital instrumentation and control systems in nuclear power plants, the number and variety of software-based systems have significantly increased. The sophisticated nature and increasing complexity of software make trust in these systems a significant challenge. The trust placed in a software system is typically termed software dependability. Software dependability analysis faces uncommon challenges, since the characteristics of software systems differ from those of hardware systems. The lack of systematic, science-based methods for quantifying the dependability attributes of software-based instrumentation and control systems in safety-critical applications has proved to be a significant inhibitor to the expanded use of modern digital technology in the nuclear industry. Dependability refers to the ability of a system to deliver a service that can be trusted. Dependability is commonly considered a general concept that encompasses different attributes, e.g., reliability, safety, security, availability and maintainability. Dependability research has progressed significantly over the last few decades. For example, various assessment models and/or design approaches have been proposed for software reliability, software availability and software maintainability. Advances have also been made to integrate multiple dependability attributes, e.g., integrating security with other dependability attributes, measuring availability and maintainability, modeling reliability and availability, quantifying reliability and security, exploring the dependencies between security and safety, and developing integrated analysis models. However, there is still a lack of understanding of the dependencies between the various dependability attributes as a whole and of how such dependencies are formed.
To address the need for quantification and give a more objective basis to the review process -- therefore reducing regulatory uncertainty -- measures and methods are needed to assess dependability attributes early on, as well as throughout the life-cycle process of software development. In this research, extensive expert opinion elicitation is used to identify the measures and methods for assessing software dependability. Semi-structured questionnaires were designed to elicit expert knowledge. A new notation system, Causal Mechanism Graphing, was developed to extract and represent such knowledge. The Causal Mechanism Graphs were merged, thus, obtaining the consensus knowledge shared by the domain experts. In this report, we focus on how software contributes to dependability. However, software dependability is not discussed separately from the context of systems or socio-technical systems. Specifically, this report focuses on software dependability, reliability, safety, security, availability, and maintainability. Our research was conducted in the sequence of stages found below. Each stage is further examined in its corresponding chapter. Stage 1 (Chapter 2): Elicitation of causal maps describing the dependencies between dependability attributes. These causal maps were constructed using expert opinion elicitation. This chapter describes the expert opinion elicitation process, the questionnaire design, the causal map construction method and the causal maps obtained. Stage 2 (Chapter 3): Elicitation of the causal map describing the occurrence of the event of interest for each dependability attribute. The causal mechanisms for the “event of interest” were extracted for each of the software dependability attributes. The “event of interest” for a dependability attribute is generally considered to be the “attribute failure”, e.g. security failure. The extraction was based on the analysis of expert elicitation results obtained in Stage 1. 
Stage 3 (Chapter 4): Identification of relevant measurements. Measures for the “events of interest” and their causal mechanisms were obtained from expert opinion elicitation for each of the software dependability attributes. The measures extracted are presented in this chapter. Stage 4 (Chapter 5): Assessment of the coverage of the causal maps via measures. Coverage was assessed to determine whether the measures obtained were sufficient to quantify software dependability, and which measures were further required. Stage 5 (Chapter 6): Identification of “missing” measures and measurement approaches for concepts not covered. New measures, for concepts that had not been covered sufficiently as determined in Stage 4, were identified using supplementary expert opinion elicitation as well as literature reviews. Stage 6 (Chapter 7): Building of a detailed quantification model based on the causal maps and measurements obtained. The ability to derive such a quantification model shows that the causal models and measurements derived from the previous stages (Stages 1 to 5) can form the technical basis for developing dependability quantification models. Scope restrictions led us to prioritize this demonstration effort. The demonstration focused on a critical system, i.e. the reactor protection system. For this system, a ranking of the software dependability attributes by nuclear stakeholders was developed. As expected for this application, the stakeholder ranking identified safety as the most critical attribute to be quantified. A safety quantification model limited to the requirements phase of development was built. Two case studies were conducted for verification. A preliminary control gate for software safety at the requirements stage was proposed and applied to the first case study. The control gate allows a cost-effective selection of the duration of the requirements phase.

  3. Comprehensive evaluation of untargeted metabolomics data processing software in feature detection, quantification and discriminating marker selection.

    PubMed

    Li, Zhucui; Lu, Yan; Guo, Yufeng; Cao, Haijie; Wang, Qinhong; Shui, Wenqing

    2018-10-31

    Data analysis represents a key challenge for untargeted metabolomics studies, as it commonly requires extensive processing of thousands of metabolite peaks in raw high-resolution MS data. Although a number of software packages have been developed to facilitate untargeted data processing, their capabilities in feature detection, quantification and marker selection have not been comprehensively scrutinized using a well-defined benchmark sample set. In this study, we acquired a benchmark dataset from standard mixtures consisting of 1100 compounds with specified concentration ratios, including 130 compounds with significant variation of concentrations. The five software packages evaluated here (MS-DIAL, MZmine 2, XCMS, MarkerView, and Compound Discoverer) showed similar performance in the detection of true features derived from compounds in the mixtures. However, significant differences between the packages were observed in the relative quantification of true features in the benchmark dataset. MZmine 2 outperformed the other software in terms of quantification accuracy, and it reported the most true discriminating markers together with the fewest false markers. Furthermore, we assessed the selection of discriminating markers by the different packages using both the benchmark dataset and a real-case metabolomics dataset, and propose the combined usage of two packages to increase confidence in biomarker identification. Our findings from this comprehensive evaluation of untargeted metabolomics software should help guide future improvements of these widely used bioinformatics tools and enable users to properly interpret their metabolomics results.
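
The marker-selection step that the benchmark evaluates typically combines a fold-change cutoff with a univariate test. The sketch below flags a feature as a discriminating marker when both its between-group fold change and its Welch t-statistic exceed chosen cutoffs; the thresholds, feature names, and intensities are hypothetical and do not reproduce any of the evaluated packages.

```python
# Simple marker selection: |fold change| >= fc_cut AND |Welch t| >= t_cut.
import math

def welch_t(a, b):
    """Welch's t-statistic for two independent samples."""
    ma, mb = sum(a) / len(a), sum(b) / len(b)
    va = sum((x - ma) ** 2 for x in a) / (len(a) - 1)
    vb = sum((x - mb) ** 2 for x in b) / (len(b) - 1)
    return (ma - mb) / math.sqrt(va / len(a) + vb / len(b))

def select_markers(features, fc_cut=2.0, t_cut=3.0):
    markers = []
    for name, (grp1, grp2) in features.items():
        fc = (sum(grp1) / len(grp1)) / (sum(grp2) / len(grp2))
        if max(fc, 1 / fc) >= fc_cut and abs(welch_t(grp1, grp2)) >= t_cut:
            markers.append(name)
    return markers

features = {
    "m1": ([100, 110, 105], [400, 390, 410]),   # true ~4-fold change
    "m2": ([200, 210, 190], [205, 195, 200]),   # unchanged
}
```

Here `select_markers(features)` keeps only `m1`; real pipelines would additionally correct the test for multiple comparisons.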

  4. AGScan: a pluggable microarray image quantification software based on the ImageJ library.

    PubMed

    Cathelin, R; Lopez, F; Klopp, Ch

    2007-01-15

    Many different programs are available to analyze microarray images. Most are commercial packages; some are free, and of these only a few offer automatic grid alignment and a batch mode. More often than not, a program implements only one quantification algorithm. AGScan is an open-source program that works on all major platforms. It is based on the ImageJ library [Rasband (1997-2006)] and offers a plug-in extension system for adding new functions to manipulate images, align grids and quantify spots. It is appropriate for daily laboratory use and also as a framework for new algorithms. The program is freely distributed under the X11 Licence. Install instructions can be found in the user manual. The software can be downloaded from http://mulcyber.toulouse.inra.fr/projects/agscan/. Questions and plug-ins can be sent to the contact listed below.
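
The plug-in extension concept described above, where alternative quantification algorithms can be registered and looked up by the host program at run time, can be sketched in a few lines. AGScan itself is Java/ImageJ-based; this Python registry only mirrors the design idea, and all names are illustrative.

```python
# Minimal plug-in registry: quantification algorithms register under a
# name, and the host program dispatches to them at run time.

QUANTIFIERS = {}

def register(name):
    def deco(fn):
        QUANTIFIERS[name] = fn
        return fn
    return deco

@register("mean")
def mean_intensity(spot_pixels):
    return sum(spot_pixels) / len(spot_pixels)

@register("median")
def median_intensity(spot_pixels):
    s = sorted(spot_pixels)
    n = len(s)
    return s[n // 2] if n % 2 else (s[n // 2 - 1] + s[n // 2]) / 2

def quantify_spot(spot_pixels, method="mean"):
    return QUANTIFIERS[method](spot_pixels)
```

A new algorithm then needs only a decorated function, with no change to the host code, which is the point of such an extension system.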

  5. Interactive 3D-PDF Presentations for the Simulation and Quantification of Extended Endoscopic Endonasal Surgical Approaches.

    PubMed

    Mavar-Haramija, Marija; Prats-Galino, Alberto; Méndez, Juan A Juanes; Puigdelívoll-Sánchez, Anna; de Notaris, Matteo

    2015-10-01

    A three-dimensional (3D) model of the skull base was reconstructed from pre- and post-dissection head CT images and embedded in a Portable Document Format (PDF) file, which can be opened by freely available software and used offline. The CT images were segmented using a specific 3D software platform for biomedical data, and the resulting 3D geometrical models of anatomical structures served a dual purpose: to simulate the extended endoscopic endonasal transsphenoidal approaches and to perform a quantitative analysis of the procedures. The analysis consisted of bone removal quantification and the calculation of quantitative parameters (surgical freedom and exposure area) for each procedure. The results are presented in three PDF documents containing JavaScript-based functions. The 3D-PDF files include reconstructions of the nasal structures (nasal septum, vomer, middle turbinates), the bony structures of the anterior skull base and maxillofacial region, and partial reconstructions of the optic nerve, the hypoglossal and vidian canals, and the internal carotid arteries. Alongside the anatomical model, axial, sagittal and coronal CT images are shown. Interactive 3D presentations were created to explain the surgery and the associated quantification methods step by step. The resulting 3D-PDF files allow the user to interact with the model through easily available software, free of charge and in an intuitive manner. The files are available for offline use on a personal computer, and no previous specialized knowledge of informatics is required. The documents can be downloaded at http://hdl.handle.net/2445/55224.

  6. Estimation of combined sewer overflow discharge: a software sensor approach based on local water level measurements.

    PubMed

    Ahm, Malte; Thorndahl, Søren; Nielsen, Jesper E; Rasmussen, Michael R

    2016-12-01

    Combined sewer overflow (CSO) structures are constructed to effectively discharge excess water during heavy rainfall, protecting the urban drainage system from hydraulic overload. Consequently, most CSO structures are not constructed according to the basic hydraulic principles for ideal measurement weirs, and it can therefore be a challenge to quantify the discharges from CSOs. Quantification of CSO discharges is important in relation to the increased environmental awareness concerning receiving water bodies. Furthermore, CSO discharge quantification is essential for closing the rainfall-runoff mass balance in combined sewer catchments. A closed mass balance is an advantage for the calibration of all urban drainage models based on mass-balance principles. This study presents three different software sensor concepts based on local water level sensors, which can be used to estimate CSO discharge volumes from hydraulically complex CSO structures. The three concepts were tested and verified under real practical conditions. All three concepts were accurate when compared to electromagnetic flow measurements.
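
A software sensor of the kind described converts a local water level into a discharge estimate through a hydraulic model. The sketch below uses a rectangular sharp-crested weir equation and integrates over an overflow event; the weir geometry, discharge coefficient, and level series are assumptions for illustration, not the paper's calibrated concepts.

```python
# Illustrative software sensor: level above the crest -> discharge via a
# rectangular sharp-crested weir equation, integrated over the event.
import math

G = 9.81          # gravity, m/s^2
CD = 0.62         # assumed discharge coefficient
WIDTH = 2.0       # assumed crest width, m

def weir_discharge(head_m):
    """Q = Cd * (2/3) * b * sqrt(2g) * h^(3/2), for head h > 0 (m^3/s)."""
    if head_m <= 0:
        return 0.0
    return CD * (2.0 / 3.0) * WIDTH * math.sqrt(2 * G) * head_m ** 1.5

def event_volume(levels_m, crest_m, dt_s):
    """Sum discharges from a water-level series sampled every dt_s seconds."""
    return sum(weir_discharge(l - crest_m) * dt_s for l in levels_m)

# Water levels logged every 60 s during an overflow event (crest at 1.0 m)
levels = [0.95, 1.05, 1.20, 1.15, 1.02, 0.98]
volume = event_volume(levels, crest_m=1.0, dt_s=60)
```

For the non-ideal structures the abstract describes, the coefficient and exponent would be calibrated against reference flow measurements rather than taken from textbook weir theory.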

  7. Volumetric quantification of lung nodules in CT with iterative reconstruction (ASiR and MBIR).

    PubMed

    Chen, Baiyu; Barnhart, Huiman; Richard, Samuel; Robins, Marthony; Colsher, James; Samei, Ehsan

    2013-11-01

    Volume quantifications of lung nodules with multidetector computed tomography (CT) images provide useful information for monitoring nodule development. The accuracy and precision of the volume quantification, however, can be affected by imaging and reconstruction parameters. This study aimed to investigate the impact of iterative reconstruction algorithms on the accuracy and precision of volume quantification, with dose and slice thickness as additional variables. Repeated CT images were acquired from an anthropomorphic chest phantom with synthetic nodules (9.5 and 4.8 mm) at six dose levels, and reconstructed with three reconstruction algorithms [filtered backprojection (FBP), adaptive statistical iterative reconstruction (ASiR), and model-based iterative reconstruction (MBIR)] into three slice thicknesses. The nodule volumes were measured with two clinical software packages (A: Lung VCAR, B: iNtuition) and analyzed for accuracy and precision. Precision was found to be generally comparable between FBP and iterative reconstruction, with no statistically significant difference noted across dose levels, slice thicknesses, and segmentation software. Accuracy was found to be more variable. For large nodules, the accuracy was significantly different between ASiR and FBP for all slice thicknesses with both packages, and significantly different between MBIR and FBP for 0.625 mm slice thickness with software A and for all slice thicknesses with software B. For small nodules, the accuracy was more similar between FBP and iterative reconstruction, with the exception of ASiR vs FBP at 1.25 mm with software A and MBIR vs FBP at 0.625 mm with software A. The systematic difference between the accuracy of FBP and of the iterative reconstructions highlights the importance of extending current segmentation software to accommodate the image characteristics of iterative reconstructions.
In addition, a calibration process may help reduce the dependency of accuracy on the reconstruction algorithm, such that volumes quantified from scans with different reconstruction algorithms can be compared. The small difference found between the precision of FBP and of the iterative reconstructions could be a result of both the diminished noise reduction of iterative reconstruction at the edges of the nodules and the loss of resolution at high noise levels with iterative reconstruction. The findings do not rule out potential advantages of iterative reconstruction that might become evident in a study using a larger number of nodules or repeated scans.

  8. CASL Dakota Capabilities Summary

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Adams, Brian M.; Simmons, Chris; Williams, Brian J.

    2017-10-10

    The Dakota software project serves the mission of Sandia National Laboratories and supports a worldwide user community by delivering state-of-the-art research and robust, usable software for optimization and uncertainty quantification. These capabilities enable advanced exploration and risk-informed prediction with a wide range of computational science and engineering models. Dakota is the verification and validation (V&V) / uncertainty quantification (UQ) software delivery vehicle for CASL, allowing analysts across focus areas to apply these capabilities to myriad nuclear engineering analyses.

  9. Automated flow quantification in valvular heart disease based on backscattered Doppler power analysis: implementation on matrix-array ultrasound imaging systems.

    PubMed

    Buck, Thomas; Hwang, Shawn M; Plicht, Björn; Mucci, Ronald A; Hunold, Peter; Erbel, Raimund; Levine, Robert A

    2008-06-01

    Cardiac ultrasound imaging systems are limited in the noninvasive quantification of valvular regurgitation due to indirect measurements and inaccurate hemodynamic assumptions. We recently demonstrated that the principle of integrating backscattered acoustic Doppler power times velocity can be used for flow quantification in valvular regurgitation directly at the vena contracta of a regurgitant flow jet. We now aimed to implement automated Doppler power flow analysis software on a standard cardiac ultrasound system utilizing novel matrix-array transducer technology, with a detailed description of the system requirements, components and software contributing to the system. The system, based on a 3.5 MHz matrix-array cardiac ultrasound scanner (Sonos 5500, Philips Medical Systems), was validated by means of comprehensive experimental signal generator trials, in vitro flow phantom trials and in vivo testing in 48 patients with mitral regurgitation of different severity and etiology, using magnetic resonance imaging (MRI) for reference. All measurements displayed good correlation with the reference values, indicating successful implementation of automated Doppler power flow analysis on a matrix-array ultrasound imaging system. A systematic underestimation of effective regurgitant orifice areas >0.65 cm(2) and volumes >40 ml was found, due to the currently limited Doppler beam width; this could be readily overcome by the use of new-generation 2D matrix-array technology. Automated flow quantification in valvular heart disease based on backscattered Doppler power can thus be fully implemented on board routinely used matrix-array ultrasound imaging systems. Such automated Doppler power analysis quantifies valvular regurgitant flow directly, noninvasively, and user-independently, overcoming the practical limitations of current techniques.
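
The power-velocity integration principle mentioned above can be reduced to a one-line computation per Doppler spectrum: flow is proportional to the sum of backscattered power times velocity over the spectral bins, scaled by the power backscattered per unit area. The calibration value and spectrum below are illustrative numbers, not values from the study.

```python
# Power-velocity integration sketch: flow ~ sum(power * velocity) / (power
# per unit area), evaluated for one Doppler spectrum at the vena contracta.

def pvi_flow(velocities, powers, power_per_unit_area):
    """Return a flow estimate (cm^3/s) from one Doppler spectrum."""
    return sum(p * v for p, v in zip(powers, velocities)) / power_per_unit_area

# Doppler velocity bins (cm/s) and backscattered power (arbitrary units)
velocities = [100, 200, 300, 400]
powers = [4.0, 3.0, 2.0, 1.0]
flow = pvi_flow(velocities, powers, power_per_unit_area=10.0)  # -> 200.0
```

Summing these per-spectrum flow values over the regurgitant phase of the cardiac cycle would give a regurgitant volume, which is the quantity compared against MRI in the study.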

  10. RNAbrowse: RNA-Seq de novo assembly results browser.

    PubMed

    Mariette, Jérôme; Noirot, Céline; Nabihoudine, Ibounyamine; Bardou, Philippe; Hoede, Claire; Djari, Anis; Cabau, Cédric; Klopp, Christophe

    2014-01-01

    Transcriptome analysis based on a de novo assembly of next-generation RNA sequences is now performed routinely in many laboratories. The generated results, including contig sequences, quantification figures, functional annotations and variation discovery outputs, are usually bulky and quite diverse. This article presents a user-oriented storage and visualisation environment that permits exploring the data in a top-down manner, going from general graphical views to all possible details. The software package is based on BioMart and is easy to install and populate with local data. It is available under the GNU General Public License (GPL) at http://bioinfo.genotoul.fr/RNAbrowse.

  11. Automated lobar quantification of emphysema in patients with severe COPD.

    PubMed

    Revel, Marie-Pierre; Faivre, Jean-Baptiste; Remy-Jardin, Martine; Deken, Valérie; Duhamel, Alain; Marquette, Charles-Hugo; Tacelli, Nunzia; Bakai, Anne-Marie; Remy, Jacques

    2008-12-01

    Automated lobar quantification of emphysema has not yet been evaluated. Unenhanced 64-slice MDCT was performed in 47 patients evaluated before bronchoscopic lung-volume reduction. CT images reconstructed with a standard (B20) and high-frequency (B50) kernel were analyzed using dedicated prototype software (MevisPULMO) allowing lobar quantification of emphysema extent. Lobar quantification was obtained following (a) a fully automatic delineation of the lobar limits by the software and (b) a semiautomatic delineation with manual correction of the lobar limits when necessary, and was compared with the visual scoring of emphysema severity per lobe. No statistically significant difference existed between automated and semiautomated lobar quantification (p > 0.05 in the five lobes), with differences ranging from 0.4 to 3.9%. The agreement between the two methods (intraclass correlation coefficient, ICC) was excellent for the left upper lobe (ICC = 0.94), left lower lobe (ICC = 0.98), and right lower lobe (ICC = 0.80). The agreement was good for the right upper lobe (ICC = 0.68) and moderate for the middle lobe (ICC = 0.53). Bland-Altman plots confirmed these results. A good agreement was observed between the software-assessed and visually assessed lobar predominance of emphysema (kappa 0.78; 95% CI 0.64-0.92). Automated and semiautomated lobar quantifications of emphysema are concordant and show good agreement with visual scoring.
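
The per-lobe emphysema metrics such tools report are simple densitometric statistics. The sketch below computes an emphysema index (fraction of voxels below a threshold, commonly -950 HU for this purpose) and a crude nearest-rank 15th percentile; the threshold choice and voxel values are illustrative, not MevisPULMO's implementation.

```python
# Densitometric emphysema metrics for one lobe's voxel values (in HU).

def emphysema_index(hu_values, threshold=-950):
    """Percentage of voxels below the threshold."""
    return 100.0 * sum(1 for v in hu_values if v < threshold) / len(hu_values)

def percentile15(hu_values):
    """Simple nearest-rank 15th percentile of the density histogram."""
    s = sorted(hu_values)
    idx = max(0, int(0.15 * len(s)) - 1)
    return s[idx]

lobe = [-980, -960, -940, -900, -970, -850, -990, -800, -920, -880]
ei = emphysema_index(lobe)   # -> 40.0 (4 of 10 voxels below -950 HU)
```

In practice these statistics are computed per anatomical lobe after the lobar segmentation step that the study evaluates.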

  12. VAVUQ, Python and Matlab freeware for Verification and Validation, Uncertainty Quantification

    NASA Astrophysics Data System (ADS)

    Courtney, J. E.; Zamani, K.; Bombardelli, F. A.; Fleenor, W. E.

    2015-12-01

    A package of scripts is presented for automated Verification and Validation (V&V) and Uncertainty Quantification (UQ) for engineering codes that approximate partial differential equations (PDEs). The code post-processes model results to produce V&V and UQ information that can be used to assess model performance. Automated information on code performance allows for a systematic methodology to assess the quality of model approximations. The software implements common and accepted code verification schemes: it uses the Method of Manufactured Solutions (MMS), the Method of Exact Solution (MES), cross-code verification, and Richardson Extrapolation (RE) for solution (calculation) verification. It also includes common statistical measures that can be used for model skill assessment. Complete RE can be conducted for complex geometries by implementing high-order non-oscillating numerical interpolation schemes within the software. Model approximation uncertainty is quantified by calculating lower and upper bounds of the numerical error from the RE results. The software is also able to calculate the Grid Convergence Index (GCI) and to handle adaptive meshes and models that implement mixed-order schemes. Four examples are provided to demonstrate the use of the software for code and solution verification, model validation and uncertainty quantification: code verification of a mixed-order compact-difference heat transport solver; solution verification of a 2D shallow-water-wave solver for tidal flow modeling in estuaries; model validation of a two-phase flow computation in a hydraulic jump compared to experimental data; and numerical uncertainty quantification for 3D CFD modeling of the flow patterns in a Gust erosion chamber.
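
The Richardson-extrapolation and GCI computations mentioned above can be sketched for the standard three-grid case: solutions on systematically refined grids yield an observed order of accuracy, from which a GCI error band follows. The grid solutions below are made-up numbers; the safety factor 1.25 is the common choice for three-grid studies, not necessarily VAVUQ's default.

```python
# Three-grid Richardson extrapolation: observed order of accuracy and
# Grid Convergence Index (GCI) on the fine grid.
import math

def observed_order(f_coarse, f_medium, f_fine, r):
    """r: constant grid refinement ratio between successive grids."""
    return math.log(abs((f_coarse - f_medium) / (f_medium - f_fine))) / math.log(r)

def gci_fine(f_medium, f_fine, r, p, fs=1.25):
    """GCI on the fine grid as a relative percentage (fs: safety factor)."""
    e = abs((f_medium - f_fine) / f_fine)
    return 100.0 * fs * e / (r ** p - 1)

# Solutions on grids refined by r = 2, consistent with a 2nd-order scheme
f3, f2, f1 = 1.12, 1.03, 1.0075      # coarse, medium, fine
p = observed_order(f3, f2, f1, r=2)  # ~2.0 for these values
band = gci_fine(f2, f1, r=2, p=p)    # relative numerical-error band, %
```

The GCI band is then reported as the numerical-uncertainty estimate on the fine-grid solution, which is how RE results feed the lower and upper error bounds the abstract describes.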

  13. IsobariQ: software for isobaric quantitative proteomics using IPTL, iTRAQ, and TMT.

    PubMed

    Arntzen, Magnus Ø; Koehler, Christian J; Barsnes, Harald; Berven, Frode S; Treumann, Achim; Thiede, Bernd

    2011-02-04

    Isobaric peptide labeling plays an important role in relative quantitative comparisons of proteomes. Isobaric labeling techniques utilize MS/MS spectra for relative quantification, which can be either based on the relative intensities of reporter ions in the low mass region (iTRAQ and TMT) or on the relative intensities of quantification signatures throughout the spectrum due to isobaric peptide termini labeling (IPTL). Due to the increased quantitative information found in MS/MS fragment spectra generated by the recently developed IPTL approach, new software was required to extract the quantitative information. IsobariQ was specifically developed for this purpose; however, support for the reporter ion techniques iTRAQ and TMT is also included. In addition, to address recently emphasized issues about heterogeneity of variance in proteomics data sets, IsobariQ employs the statistical software package R and variance stabilizing normalization (VSN) algorithms available therein. Finally, the functionality of IsobariQ is validated with data sets of experiments using 6-plex TMT and IPTL. Notably, protein substrates resulting from cleavage by proteases can be identified as shown for caspase targets in apoptosis.
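
For the reporter-ion techniques (iTRAQ/TMT) described above, relative quantification amounts to extracting the reporter intensities from the low-mass region of each MS/MS spectrum and forming channel ratios. The sketch below illustrates that step only; the m/z tolerance, spectrum, and function names are hypothetical and do not reflect IsobariQ's internals (nor its IPTL or VSN functionality).

```python
# Reporter-ion extraction and ratio computation for a TMT 6-plex spectrum.

TMT6_REPORTERS = [126.128, 127.131, 128.134, 129.138, 130.141, 131.138]

def reporter_intensities(peaks, tol=0.01):
    """peaks: list of (m/z, intensity); returns one intensity per channel."""
    out = []
    for mz_ref in TMT6_REPORTERS:
        hits = [i for mz, i in peaks if abs(mz - mz_ref) <= tol]
        out.append(max(hits) if hits else 0.0)
    return out

def channel_ratios(intensities, reference_channel=0):
    ref = intensities[reference_channel]
    return [i / ref for i in intensities]

spectrum = [(126.128, 1000.0), (127.131, 2000.0), (128.134, 500.0),
            (129.138, 1000.0), (130.141, 1500.0), (131.138, 1000.0),
            (415.2, 9000.0)]   # a fragment peak outside the reporter region
ratios = channel_ratios(reporter_intensities(spectrum))
```

IPTL differs precisely in that its quantification signatures are spread over fragment pairs throughout the spectrum rather than confined to this low-mass region, which is why dedicated software was needed.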

  14. Mass spectrometry data from label-free quantitative proteomic analysis of harmless and pathogenic strains of infectious microalgae, Prototheca spp.

    PubMed

    Murugaiyan, Jayaseelan; Eravci, Murat; Weise, Christoph; Roesler, Uwe

    2017-06-01

    Here, we provide the dataset associated with our research article 'Label-free quantitative proteomic analysis of harmless and pathogenic strains of infectious microalgae, Prototheca spp.' (Murugaiyan et al., 2017) [1]. The dataset describes liquid chromatography-mass spectrometry (LC-MS)-based protein identification and quantification of a non-infectious strain, Prototheca zopfii genotype 1, and two strains associated with severe and mild infections, respectively, P. zopfii genotype 2 and Prototheca blaschkeae. Protein identification and label-free quantification were carried out by analysing the MS raw data with the MaxQuant-Andromeda software suite. The differences in expression level of the identified proteins among the strains were computed using Perseus software, and the results are presented in [1]. This Data in Brief article provides the MaxQuant output file and the raw data deposited in the PRIDE repository with the dataset identifier PXD005305.

  15. Fully Automated Pulmonary Lobar Segmentation: Influence of Different Prototype Software Programs onto Quantitative Evaluation of Chronic Obstructive Lung Disease

    PubMed Central

    Lim, Hyun-ju; Weinheimer, Oliver; Wielpütz, Mark O.; Dinkel, Julien; Hielscher, Thomas; Gompelmann, Daniela; Kauczor, Hans-Ulrich; Heussel, Claus Peter

    2016-01-01

    Objectives: Surgical or bronchoscopic lung volume reduction (BLVR) techniques can be beneficial for heterogeneous emphysema. Post-processing software tools for lobar emphysema quantification are useful for patient and target lobe selection, treatment planning and post-interventional follow-up. We aimed to evaluate the inter-software variability of emphysema quantification using fully automated lobar segmentation prototypes. Material and Methods: 66 patients with moderate to severe COPD who underwent CT for planning of BLVR were included. Emphysema quantification was performed using two modified versions of in-house software (without and with prototype advanced lung vessel segmentation; programs 1 [YACTA v.2.3.0.2] and 2 [YACTA v.2.4.3.1]), as well as one commercial program (3 [Pulmo3D VA30A_HF2]) and one pre-commercial prototype (4 [CT COPD ISP ver7.0]). The following parameters were computed for each segmented anatomical lung lobe and for the whole lung: lobar volume (LV), mean lobar density (MLD), 15th percentile of lobar density (15th), emphysema volume (EV) and emphysema index (EI). Bland-Altman analysis (limits of agreement, LoA) and linear random-effects models were used for comparison between the software programs. Results: Segmentation using programs 1, 3 and 4 was unsuccessful in 1 (1%), 7 (10%) and 5 (7%) patients, respectively. Program 2 could analyze all datasets. The 53 patients with successful segmentation by all 4 programs were included for further analysis. For LV, programs 1 and 4 showed the largest mean difference of 72 ml and the widest LoA of [-356, 499 ml] (p<0.05). Programs 3 and 4 showed the largest mean difference of 4% and the widest LoA of [-7, 14%] for EI (p<0.001). Conclusions: Only a single software program was able to successfully analyze all scheduled datasets. Although the mean biases of LV and EV were relatively low in lobar quantification, the ranges of disagreement were substantial for both. For longitudinal emphysema monitoring, not only the scanning protocol but also the quantification software needs to be kept constant. PMID:27029047
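    The Bland-Altman comparison used in the study above reduces to the mean bias of paired differences and 95% limits of agreement around it. A minimal sketch in Python (the function name and the lobar-volume values are invented for illustration, not taken from the study):

```python
from statistics import mean, stdev

def bland_altman(a, b):
    """Mean bias and 95% limits of agreement between paired measurements."""
    diffs = [x - y for x, y in zip(a, b)]
    bias = mean(diffs)
    sd = stdev(diffs)  # sample standard deviation of the differences
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# Hypothetical lobar volumes (ml) for the same five lobes from two programs
prog_a = [1500, 1320, 1710, 1480, 1625]
prog_b = [1460, 1345, 1650, 1500, 1580]
bias, (loa_low, loa_high) = bland_altman(prog_a, prog_b)
```

    A wide LoA interval relative to clinically acceptable differences, as reported for LV above, is the signal that two programs should not be mixed in longitudinal follow-up even when the mean bias is small.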

  16. VideoHacking: Automated Tracking and Quantification of Locomotor Behavior with Open Source Software and Off-the-Shelf Video Equipment.

    PubMed

    Conklin, Emily E; Lee, Kathyann L; Schlabach, Sadie A; Woods, Ian G

    2015-01-01

    Differences in nervous system function can result in differences in behavioral output. Measurements of animal locomotion enable the quantification of these differences. Automated tracking of animal movement is less labor-intensive and bias-prone than direct observation, and allows for simultaneous analysis of multiple animals, high spatial and temporal resolution, and data collection over extended periods of time. Here, we present a new video-tracking system built on Python-based software that is free, open source, and cross-platform, and that can analyze video input from widely available video capture devices such as smartphone cameras and webcams. We validated this software through four tests on a variety of animal species, including larval and adult zebrafish (Danio rerio), Siberian dwarf hamsters (Phodopus sungorus), and wild birds. These tests highlight the capacity of our software for long-term data acquisition, parallel analysis of multiple animals, and application to animal species of different sizes and movement patterns. We applied the software to an analysis of the effects of ethanol on thigmotaxis (wall-hugging) behavior in adult zebrafish, and found that acute ethanol treatment decreased thigmotaxis behaviors without affecting overall amounts of motion. The open source nature of our software enables flexibility, customization, and scalability in behavioral analyses. Moreover, our system presents a free alternative to commercial video-tracking systems and is thus broadly applicable to a wide variety of educational settings and research programs.
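    The core of such a tracker is background subtraction followed by a per-frame centroid, from which path metrics like total distance are computed. A toy pure-Python sketch (frames, threshold, and function names are invented for illustration and bear no relation to the published code, which operates on real video):

```python
def centroid(frame, background, threshold=50):
    """Centroid of pixels that differ from the background frame."""
    pts = [(x, y)
           for y, (row, bg_row) in enumerate(zip(frame, background))
           for x, (v, b) in enumerate(zip(row, bg_row))
           if abs(v - b) > threshold]
    if not pts:
        return None
    n = len(pts)
    return (sum(p[0] for p in pts) / n, sum(p[1] for p in pts) / n)

def path_length(path):
    """Total distance travelled along a list of (x, y) positions."""
    return sum(((x2 - x1) ** 2 + (y2 - y1) ** 2) ** 0.5
               for (x1, y1), (x2, y2) in zip(path, path[1:]))

# Synthetic 5x5 grayscale frames: a bright "animal" moving one pixel per frame
background = [[0] * 5 for _ in range(5)]
frames = []
for cx in (1, 2, 3):
    f = [row[:] for row in background]
    f[1][cx] = 255
    frames.append(f)

path = [centroid(f, background) for f in frames]
distance = path_length(path)
```

    Behavioral measures such as thigmotaxis then reduce to classifying each centroid by its distance from the arena walls.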

  17. A Java program for LRE-based real-time qPCR that enables large-scale absolute quantification.

    PubMed

    Rutledge, Robert G

    2011-03-02

    Linear regression of efficiency (LRE) introduced a new paradigm for real-time qPCR that enables large-scale absolute quantification by eliminating the need for standard curves. Developed through the application of sigmoidal mathematics to SYBR Green I-based assays, target quantity is derived directly from fluorescence readings within the central region of an amplification profile. However, a major challenge of implementing LRE quantification is the labor-intensive nature of the analysis. Utilizing the extensive resources that are available for developing Java-based software, the LRE Analyzer was written using the NetBeans IDE, and is built on top of the modular architecture and windowing system provided by the NetBeans Platform. This fully featured desktop application determines the number of target molecules within a sample with little or no intervention by the user, in addition to providing extensive database capabilities. MS Excel is used to import data, allowing LRE quantification to be conducted with any real-time PCR instrument that provides access to the raw fluorescence readings. An extensive help set also provides an in-depth introduction to LRE, in addition to guidelines on how to implement LRE quantification. The LRE Analyzer provides the automated analysis and data storage capabilities required by large-scale qPCR projects wanting to exploit the many advantages of absolute quantification. Foremost is the universal perspective afforded by absolute quantification, which among other attributes, provides the ability to directly compare quantitative data produced by different assays and/or instruments. Furthermore, absolute quantification has important implications for gene expression profiling in that it provides the foundation for comparing transcript quantities produced by any gene with any other gene, within and between samples.

  18. A Java Program for LRE-Based Real-Time qPCR that Enables Large-Scale Absolute Quantification

    PubMed Central

    Rutledge, Robert G.

    2011-01-01

    Background Linear regression of efficiency (LRE) introduced a new paradigm for real-time qPCR that enables large-scale absolute quantification by eliminating the need for standard curves. Developed through the application of sigmoidal mathematics to SYBR Green I-based assays, target quantity is derived directly from fluorescence readings within the central region of an amplification profile. However, a major challenge of implementing LRE quantification is the labor-intensive nature of the analysis. Findings Utilizing the extensive resources that are available for developing Java-based software, the LRE Analyzer was written using the NetBeans IDE, and is built on top of the modular architecture and windowing system provided by the NetBeans Platform. This fully featured desktop application determines the number of target molecules within a sample with little or no intervention by the user, in addition to providing extensive database capabilities. MS Excel is used to import data, allowing LRE quantification to be conducted with any real-time PCR instrument that provides access to the raw fluorescence readings. An extensive help set also provides an in-depth introduction to LRE, in addition to guidelines on how to implement LRE quantification. Conclusions The LRE Analyzer provides the automated analysis and data storage capabilities required by large-scale qPCR projects wanting to exploit the many advantages of absolute quantification. Foremost is the universal perspective afforded by absolute quantification, which among other attributes, provides the ability to directly compare quantitative data produced by different assays and/or instruments. Furthermore, absolute quantification has important implications for gene expression profiling in that it provides the foundation for comparing transcript quantities produced by any gene with any other gene, within and between samples. PMID:21407812
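    The central LRE idea, that per-cycle amplification efficiency declines linearly with fluorescence, can be sketched on simulated data: regressing observed cycle efficiency against fluorescence recovers the maximal efficiency (intercept) and plateau fluorescence (x-intercept). All constants below are invented for the simulation, and the further step by which the LRE Analyzer converts these into target molecule counts is omitted:

```python
def linreg(xs, ys):
    """Ordinary least-squares slope and intercept."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    slope = sxy / sxx
    return slope, my - slope * mx

# Simulate an amplification profile in which efficiency declines
# linearly with fluorescence (the LRE model assumption)
E_MAX, F_MAX, F0 = 0.95, 10_000.0, 0.1
profile = [F0]
for _ in range(60):
    f = profile[-1]
    profile.append(f * (1 + E_MAX * (1 - f / F_MAX)))

# Central region of the profile: fluorescence between 5% and 95% of plateau
cycles = [c for c in range(1, len(profile))
          if 0.05 * F_MAX < profile[c] < 0.95 * F_MAX]
f_prev = [profile[c - 1] for c in cycles]
e_obs = [profile[c] / profile[c - 1] - 1 for c in cycles]

slope, intercept = linreg(f_prev, e_obs)
e_max = intercept            # recovered maximal efficiency
f_max = -intercept / slope   # recovered plateau fluorescence
```

    Because the simulated profile satisfies the model exactly, the regression recovers E_MAX and F_MAX; on real SYBR Green data the fit quality of this line is itself a useful quality indicator.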

  19. Data-Independent MS/MS Quantification of Neuropeptides for Determination of Putative Feeding-Related Neurohormones in Microdialysate

    PubMed Central

    2015-01-01

    Food consumption is an important behavior that is regulated by an intricate array of neuropeptides (NPs). Although many feeding-related NPs have been identified in mammals, precise mechanisms are unclear and difficult to study in mammals, as current methods are not highly multiplexed and require extensive a priori knowledge about analytes. New advances in data-independent acquisition (DIA) MS/MS and the open-source quantification software Skyline have opened up the possibility to identify hundreds of compounds and quantify them from a single DIA MS/MS run. An untargeted DIA MS(E) quantification method using Skyline software for multiplexed, discovery-driven quantification was developed and found to produce linear calibration curves for peptides at physiologically relevant concentrations using a protein digest as internal standard. By using this method, preliminary relative quantification of the crab Cancer borealis neuropeptidome (<2 kDa, 137 peptides from 18 families) was possible in microdialysates from 8 replicate feeding experiments. Of these NPs, 55 were detected with an average mass error below 10 ppm. The time-resolved profiles of relative concentration changes for 6 are shown, and there is great potential for the use of this method in future experiments to aid in correlation of NP changes with behavior. This work presents an unbiased approach to winnowing candidate NPs related to a behavior of interest in a functionally relevant manner, and demonstrates the success of such a UPLC-MS(E) quantification method using the open source software Skyline. PMID:25552291

  20. Data-independent MS/MS quantification of neuropeptides for determination of putative feeding-related neurohormones in microdialysate.

    PubMed

    Schmerberg, Claire M; Liang, Zhidan; Li, Lingjun

    2015-01-21

    Food consumption is an important behavior that is regulated by an intricate array of neuropeptides (NPs). Although many feeding-related NPs have been identified in mammals, precise mechanisms are unclear and difficult to study in mammals, as current methods are not highly multiplexed and require extensive a priori knowledge about analytes. New advances in data-independent acquisition (DIA) MS/MS and the open-source quantification software Skyline have opened up the possibility to identify hundreds of compounds and quantify them from a single DIA MS/MS run. An untargeted DIA MS(E) quantification method using Skyline software for multiplexed, discovery-driven quantification was developed and found to produce linear calibration curves for peptides at physiologically relevant concentrations using a protein digest as internal standard. By using this method, preliminary relative quantification of the crab Cancer borealis neuropeptidome (<2 kDa, 137 peptides from 18 families) was possible in microdialysates from 8 replicate feeding experiments. Of these NPs, 55 were detected with an average mass error below 10 ppm. The time-resolved profiles of relative concentration changes for 6 are shown, and there is great potential for the use of this method in future experiments to aid in correlation of NP changes with behavior. This work presents an unbiased approach to winnowing candidate NPs related to a behavior of interest in a functionally relevant manner, and demonstrates the success of such a UPLC-MS(E) quantification method using the open source software Skyline.

  1. RNAbrowse: RNA-Seq De Novo Assembly Results Browser

    PubMed Central

    Mariette, Jérôme; Noirot, Céline; Nabihoudine, Ibounyamine; Bardou, Philippe; Hoede, Claire; Djari, Anis; Cabau, Cédric; Klopp, Christophe

    2014-01-01

    Transcriptome analysis based on a de novo assembly of next generation RNA sequences is now performed routinely in many laboratories. The generated results, including contig sequences, quantification figures, functional annotations and variation discovery outputs, are usually bulky and quite diverse. This article presents a user-oriented storage and visualisation environment that permits exploring the data in a top-down manner, going from general graphical views to all possible details. The software package is based on BioMart and is easy to install and populate with local data. The software package is available under the GNU General Public License (GPL) at http://bioinfo.genotoul.fr/RNAbrowse. PMID:24823498

  2. AQuA: An Automated Quantification Algorithm for High-Throughput NMR-Based Metabolomics and Its Application in Human Plasma.

    PubMed

    Röhnisch, Hanna E; Eriksson, Jan; Müllner, Elisabeth; Agback, Peter; Sandström, Corine; Moazzami, Ali A

    2018-02-06

    A key limiting step for high-throughput NMR-based metabolomics is the lack of rapid and accurate tools for absolute quantification of many metabolites. We developed, implemented, and evaluated an algorithm, AQuA (Automated Quantification Algorithm), for targeted metabolite quantification from complex ¹H NMR spectra. AQuA operates based on spectral data extracted from a library consisting of one standard calibration spectrum for each metabolite. It uses one preselected NMR signal per metabolite for determining absolute concentrations and does so by effectively accounting for interferences caused by other metabolites. AQuA was implemented and evaluated using experimental NMR spectra from human plasma. The accuracy of AQuA was tested and confirmed in comparison with a manual spectral fitting approach using the ChenomX software, in which 61 out of 67 metabolites quantified in 30 human plasma spectra showed a goodness-of-fit (r²) close to or exceeding 0.9 between the two approaches. In addition, three quality indicators generated by AQuA, namely occurrence, interference, and positional deviation, were studied. These quality indicators permit evaluation of the results each time the algorithm is operated. The efficiency was tested and confirmed by implementing AQuA for quantification of 67 metabolites in a large dataset comprising 1342 experimental spectra from human plasma, in which the whole computation took less than 1 s.
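    Correcting for interference from overlapping metabolite signals, as described above, amounts to solving a small linear system built from the per-metabolite library responses at the preselected positions. A two-metabolite sketch (the metabolite names, library responses, and observed intensities are invented; AQuA's actual procedure is considerably more elaborate):

```python
def solve2(a11, a12, a21, a22, b1, b2):
    """Solve a 2x2 linear system by Cramer's rule."""
    det = a11 * a22 - a12 * a21
    return ((b1 * a22 - b2 * a12) / det,
            (a11 * b2 - a21 * b1) / det)

# Hypothetical library: per-mM response of each metabolite at the two
# preselected positions; the off-diagonal terms are the interferences.
#   position 1: lactate 1.00, alanine 0.08
#   position 2: lactate 0.05, alanine 1.00
observed = (2.12, 1.60)  # measured intensities in one plasma spectrum
lactate, alanine = solve2(1.00, 0.08, 0.05, 1.00, *observed)
```

    Reading each position naively (ignoring the off-diagonal terms) would overestimate both concentrations; the joint solve removes the cross-talk.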

  3. Quantification of Operational Risk Using A Data Mining

    NASA Technical Reports Server (NTRS)

    Perera, J. Sebastian

    1999-01-01

    What is Data Mining?
    - Data Mining is the process of finding actionable information hidden in raw data.
    - Data Mining helps find hidden patterns, trends, and important relationships often buried in a sea of data.
    - Typically, automated software tools based on advanced statistical analysis and data modeling technology can be utilized to automate the data mining process.

  4. Three-Dimensional Echocardiographic Assessment of Left Heart Chamber Size and Function with Fully Automated Quantification Software in Patients with Atrial Fibrillation.

    PubMed

    Otani, Kyoko; Nakazono, Akemi; Salgo, Ivan S; Lang, Roberto M; Takeuchi, Masaaki

    2016-10-01

    Echocardiographic determination of left heart chamber volumetric parameters by using manual tracings during multiple beats is tedious in atrial fibrillation (AF). The aim of this study was to determine the usefulness of fully automated left chamber quantification software with single-beat three-dimensional transthoracic echocardiographic data sets in patients with AF. Single-beat full-volume three-dimensional transthoracic echocardiographic data sets were prospectively acquired during consecutive multiple cardiac beats (≥10 beats) in 88 patients with AF. In protocol 1, left ventricular volumes, left ventricular ejection fraction, and maximal left atrial volume were validated using automated quantification against the manual tracing method in identical beats in 10 patients. In protocol 2, automated quantification-derived averaged values from multiple beats were compared with the corresponding values obtained from the indexed beat in all patients. Excellent correlations of left chamber parameters between automated quantification and the manual method were observed (r = 0.88-0.98) in protocol 1. The time required for the analysis with the automated quantification method (5 min) was significantly less compared with the manual method (27 min) (P < .0001). In protocol 2, there were excellent linear correlations between the averaged left chamber parameters and the corresponding values obtained from the indexed beat (r = 0.94-0.99), and test-retest variability of left chamber parameters was low (3.5%-4.8%). Three-dimensional transthoracic echocardiography with fully automated quantification software is a rapid and reliable way to measure averaged values of left heart chamber parameters during multiple consecutive beats. Thus, it is a potential new approach for left chamber quantification in patients with AF in daily routine practice.

  5. Final Report: Quantification of Uncertainty in Extreme Scale Computations (QUEST)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Marzouk, Youssef; Conrad, Patrick; Bigoni, Daniele

    QUEST (www.quest-scidac.org) is a SciDAC Institute that is focused on uncertainty quantification (UQ) in large-scale scientific computations. Our goals are to (1) advance the state of the art in UQ mathematics, algorithms, and software; and (2) provide modeling, algorithmic, and general UQ expertise, together with software tools, to other SciDAC projects, thereby enabling and guiding a broad range of UQ activities in their respective contexts. QUEST is a collaboration among six institutions (Sandia National Laboratories, Los Alamos National Laboratory, the University of Southern California, Massachusetts Institute of Technology, the University of Texas at Austin, and Duke University) with a history of joint UQ research. Our vision encompasses all aspects of UQ in leadership-class computing. This includes the well-founded setup of UQ problems; characterization of the input space given available data/information; local and global sensitivity analysis; adaptive dimensionality and order reduction; forward and inverse propagation of uncertainty; handling of application code failures, missing data, and hardware/software fault tolerance; and model inadequacy, comparison, validation, selection, and averaging. The nature of the UQ problem requires the seamless combination of data, models, and information across this landscape in a manner that provides a self-consistent quantification of requisite uncertainties in predictions from computational models. Accordingly, our UQ methods and tools span an interdisciplinary space across applied math, information theory, and statistics. The MIT QUEST effort centers on statistical inference and methods for surrogate or reduced-order modeling. MIT personnel have been responsible for the development of adaptive sampling methods, methods for approximating computationally intensive models, and software for both forward uncertainty propagation and statistical inverse problems. A key software product of the MIT QUEST effort is the MIT Uncertainty Quantification library, called MUQ (muq.mit.edu).

  6. A semi-automated volumetric software for segmentation and perfusion parameter quantification of brain tumors using 320-row multidetector computed tomography: a validation study.

    PubMed

    Chae, Soo Young; Suh, Sangil; Ryoo, Inseon; Park, Arim; Noh, Kyoung Jin; Shim, Hackjoon; Seol, Hae Young

    2017-05-01

    We developed NPerfusion, a semi-automated volumetric software tool to segment brain tumors and quantify perfusion parameters on whole-brain CT perfusion (WBCTP) images. The purpose of this study was to assess the feasibility of the software and to validate its performance compared with manual segmentation. Twenty-nine patients with pathologically proven brain tumors who underwent preoperative WBCTP between August 2012 and February 2015 were included. Three perfusion parameters, arterial flow (AF), equivalent blood volume (EBV), and Patlak flow (PF, a measure of capillary permeability), of brain tumors were generated by a commercial software package and then quantified volumetrically by NPerfusion, which also semi-automatically segmented tumor boundaries. The quantification was validated by comparison with that of manual segmentation in terms of the concordance correlation coefficient and Bland-Altman analysis. With NPerfusion, we successfully performed segmentation and quantified whole volumetric perfusion parameters of all 29 brain tumors, which showed perfusion trends consistent with previous studies. The validation of the perfusion parameter quantification exhibited almost perfect agreement with manual segmentation, with Lin concordance correlation coefficients (ρc) for AF, EBV, and PF of 0.9988, 0.9994, and 0.9976, respectively. On Bland-Altman analysis, most differences between this software and manual segmentation on the commercial software were within the limits of agreement. NPerfusion successfully performs segmentation of brain tumors and calculates their perfusion parameters. We validated this semi-automated segmentation software by comparing it with manual segmentation. NPerfusion can be used to calculate volumetric perfusion parameters of brain tumors from WBCTP.

  7. Integrated smart structures wingbox

    NASA Astrophysics Data System (ADS)

    Simon, Solomon H.

    1993-09-01

    One objective of smart structures development is to demonstrate the ability of a mechanical component to monitor its own structural integrity and health. Achieving this objective requires the integration of three technologies: (1) structures, (2) sensors, and (3) artificial intelligence. We coordinated a team of experts from these three fields, who applied knowledge from the forefront of their technologies and combined the appropriate features into an integrated hardware/software smart structures wingbox (SSW) test article. A 1/4 in. hole was drilled into the SSW test article. Although the smart structure had never seen damage of this type, it correctly recognized and located the damage. Based on a knowledge-based simulation, quantification and assessment were also carried out. We demonstrated that the SSW integrated hardware/software test article can perform six related functions: (1) identification of a defect; (2) location of the defect; (3) quantification of the amount of damage; (4) assessment of performance degradation; (5) continued monitoring in spite of damage; and (6) continuous recording of integrity data. We present the successful results of the integrated test article in this paper, along with plans for future development and deployment of the technology.

  8. quantGenius: implementation of a decision support system for qPCR-based gene quantification.

    PubMed

    Baebler, Špela; Svalina, Miha; Petek, Marko; Stare, Katja; Rotter, Ana; Pompe-Novak, Maruša; Gruden, Kristina

    2017-05-25

    Quantitative molecular biology remains a challenge for researchers due to inconsistent approaches for control of errors in the final results. Because several factors can influence the final result, quantitative analysis and interpretation of qPCR data are still not trivial. Together with the development of high-throughput qPCR platforms, there is a need for a tool allowing robust, reliable and fast nucleic acid quantification. We have developed "quantGenius" ( http://quantgenius.nib.si ), an open-access web application for reliable qPCR-based quantification of nucleic acids. The quantGenius workflow interactively guides the user through data import, quality control (QC) and calculation steps. The input is machine- and chemistry-independent. Quantification is performed using the standard curve approach, with normalization to one or several reference genes. The special feature of the application is the implementation of a user-guided, QC-based decision support system, based on qPCR standards, that takes into account pipetting errors, assay amplification efficiencies, limits of detection and quantification of the assays, as well as the control of PCR inhibition in individual samples. The intermediate calculations and final results are exportable in a data matrix suitable for further statistical analysis or visualization. We additionally compare the most important features of quantGenius with similar advanced software tools and illustrate the importance of a proper QC system in the analysis of qPCR data in two use cases. To our knowledge, quantGenius is the only qPCR data analysis tool that integrates QC-based decision support and will help scientists obtain reliable results that are the basis for biologically meaningful data interpretation.
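    The standard-curve approach described above can be sketched as fitting Cq against log10 quantity over a dilution series, inverting the fit for unknowns, and normalizing to a reference gene. The dilution-series values and Cq readings below are hypothetical, and real pipelines (including quantGenius) add the QC checks the abstract lists on top of this core calculation:

```python
def linreg(xs, ys):
    """Ordinary least-squares slope and intercept."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxx = sum((x - mx) ** 2 for x in xs)
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    slope = sxy / sxx
    return slope, my - slope * mx

# Hypothetical dilution series: (log10 copies, measured Cq)
standards = [(6, 15.0), (5, 18.2), (4, 21.4), (3, 24.6), (2, 27.8)]
slope, intercept = linreg([s[0] for s in standards],
                          [s[1] for s in standards])

# Amplification efficiency implied by the slope (1.0 == 100%)
efficiency = 10 ** (-1 / slope) - 1

def copies(cq):
    """Invert the standard curve for an unknown sample."""
    return 10 ** ((cq - intercept) / slope)

# Normalize the target gene to a reference gene from the same sample
ratio = copies(20.1) / copies(19.2)
```

    A slope near -3.32 corresponds to ~100% efficiency; slopes far from that are exactly the kind of assay-quality flag a QC-based decision system raises.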

  9. Automated quantification of neuronal networks and single-cell calcium dynamics using calcium imaging

    PubMed Central

    Patel, Tapan P.; Man, Karen; Firestein, Bonnie L.; Meaney, David F.

    2017-01-01

    Background Recent advances in genetically engineered calcium and membrane potential indicators provide the potential to estimate the activation dynamics of individual neurons within larger, mesoscale networks (100s–1000+ neurons). However, a fully integrated automated workflow for the analysis and visualization of neural microcircuits from high-speed fluorescence imaging data is lacking. New method Here we introduce FluoroSNNAP, Fluorescence Single Neuron and Network Analysis Package. FluoroSNNAP is an open-source, interactive software developed in MATLAB for automated quantification of numerous biologically relevant features of both the calcium dynamics of single-cells and network activity patterns. FluoroSNNAP integrates and improves upon existing tools for spike detection, synchronization analysis, and inference of functional connectivity, making it most useful to experimentalists with little or no programming knowledge. Results We apply FluoroSNNAP to characterize the activity patterns of neuronal microcircuits undergoing developmental maturation in vitro. Separately, we highlight the utility of single-cell analysis for phenotyping a mixed population of neurons expressing a human mutant variant of the microtubule associated protein tau and wild-type tau. Comparison with existing method(s) We show the performance of semi-automated cell segmentation using spatiotemporal independent component analysis and significant improvement in detecting calcium transients using a template-based algorithm in comparison to peak-based or wavelet-based detection methods. Our software further enables automated analysis of microcircuits, which is an improvement over existing methods. Conclusions We expect the dissemination of this software will facilitate a comprehensive analysis of neuronal networks, promoting the rapid interrogation of circuits in health and disease. PMID:25629800
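    Template-based transient detection of the kind favored above can be illustrated by sliding a canonical fast-rise/slow-decay template along a fluorescence trace and thresholding the normalized correlation. The template shape, trace, and threshold are invented for illustration; FluoroSNNAP's own MATLAB implementation differs in detail:

```python
def ncc(a, b):
    """Normalized correlation of two equal-length windows."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    num = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    da = sum((x - ma) ** 2 for x in a) ** 0.5
    db = sum((y - mb) ** 2 for y in b) ** 0.5
    return num / (da * db) if da and db else 0.0

def detect(trace, template, threshold=0.9):
    """Indices where the trace locally matches the transient template."""
    k = len(template)
    return [i for i in range(len(trace) - k + 1)
            if ncc(trace[i:i + k], template) >= threshold]

# Canonical transient: sharp rise, exponential-like decay
template = [0.0, 1.0, 0.6, 0.36, 0.22]
# Synthetic dF/F trace with one transient (2x the template amplitude)
trace = [0.0, 0.0, 0.0, 0.0, 2.0, 1.2, 0.72, 0.44, 0.0, 0.0]
```

    Because the correlation is normalized, the detector is amplitude-invariant, which is what makes template matching more robust than simple peak thresholds on noisy traces.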

  10. Performance of Different Analytical Software Packages in Quantification of DNA Methylation by Pyrosequencing.

    PubMed

    Grasso, Chiara; Trevisan, Morena; Fiano, Valentina; Tarallo, Valentina; De Marco, Laura; Sacerdote, Carlotta; Richiardi, Lorenzo; Merletti, Franco; Gillio-Tos, Anna

    2016-01-01

    Pyrosequencing has emerged as an alternative method of nucleic acid sequencing, well suited for many applications that aim to characterize single nucleotide polymorphisms, mutations, microbial types and CpG methylation in the target DNA. Commercially available pyrosequencing systems can harbor two different types of software, allowing analysis in AQ or CpG mode, respectively; both are widely employed for DNA methylation analysis. The aim of the study was to assess the performance of these two software types for DNA methylation analysis at CpG sites. Although CpG mode was generated specifically for CpG methylation quantification, many investigations on this topic have been carried out with AQ mode. Because proof of equivalent performance of the two modes for this type of analysis is not available, the focus of this paper was to evaluate whether the two modes currently used for CpG methylation assessment by pyrosequencing give overlapping results. We compared the performance of the two software types in quantifying DNA methylation in the promoters of selected genes (GSTP1, MGMT, LINE-1) by testing two case series comprising DNA from paraffin-embedded prostate cancer tissues (PC study, N = 36) and DNA from blood fractions of healthy people (DD study, N = 28), respectively. We found discrepancies in the quality assignment of DNA methylation assays between the two software types. Compared with the software for analysis in AQ mode, less permissive criteria are applied by the Pyro Q-CpG software, which enables analysis in CpG mode. CpG mode warns operators about potentially unsatisfactory assay performance and ensures a more accurate quantitative evaluation of DNA methylation at CpG sites. Implementation of CpG mode is strongly advisable in order to improve the reliability of the methylation analysis results achievable by pyrosequencing.
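    Whichever mode is used, the quantity reported per CpG site is the fraction of cytosine signal surviving bisulfite conversion. A minimal sketch with invented pyrogram peak heights (the quality criteria the two modes disagree on sit on top of this calculation):

```python
def cpg_methylation(c_peak, t_peak):
    """Percent methylation at one CpG from pyrogram peak heights.

    After bisulfite conversion, methylated cytosines still read as C
    while unmethylated cytosines read as T, so %5mC = C / (C + T).
    """
    return 100.0 * c_peak / (c_peak + t_peak)

# Hypothetical (C, T) peak heights at three CpG sites of one assay
site_peaks = [(32.0, 8.0), (18.0, 22.0), (5.0, 45.0)]
per_site = [cpg_methylation(c, t) for c, t in site_peaks]
mean_meth = sum(per_site) / len(per_site)
```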

  11. Quantification of Uncertainty in Extreme Scale Computations (QUEST)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ghanem, Roger

    QUEST was a SciDAC Institute comprising Sandia National Laboratories, Los Alamos National Laboratory, the University of Southern California, the Massachusetts Institute of Technology, the University of Texas at Austin, and Duke University. The mission of QUEST was to: (1) develop a broad class of uncertainty quantification (UQ) methods/tools, and (2) provide UQ expertise and software to other SciDAC projects, thereby enabling/guiding their UQ activities. The USC effort centered on the development of reduced models and efficient algorithms for implementing various components of the UQ pipeline. USC personnel were responsible for the development of adaptive bases, adaptive quadrature, and reduced models to be used in estimation and inference.

  12. Myocardial blood flow quantification by Rb-82 cardiac PET/CT: A detailed reproducibility study between two semi-automatic analysis programs.

    PubMed

    Dunet, Vincent; Klein, Ran; Allenbach, Gilles; Renaud, Jennifer; deKemp, Robert A; Prior, John O

    2016-06-01

    Several analysis software packages for myocardial blood flow (MBF) quantification from cardiac PET studies exist, but they have not been compared using concordance analysis, which can characterize precision and bias separately. Reproducible measurements are needed for quantification to fully develop its clinical potential. Fifty-one patients underwent dynamic Rb-82 PET at rest and during adenosine stress. Data were processed with PMOD and FlowQuant (Lortie model). MBF and myocardial flow reserve (MFR) polar maps were quantified and analyzed using a 17-segment model. Comparisons used Pearson's correlation ρ (measuring precision), Bland-Altman limits of agreement, and Lin's concordance correlation ρc = ρ·Cb (Cb measuring systematic bias). Lin's concordance and Pearson's correlation values were very similar, suggesting no systematic bias between software packages, with excellent precision for MBF (ρ = 0.97, ρc = 0.96, Cb = 0.99) and good precision for MFR (ρ = 0.83, ρc = 0.76, Cb = 0.92). On a per-segment basis, no mean bias was observed on Bland-Altman plots, although PMOD provided slightly higher values than FlowQuant at higher MBF and MFR values (P < .0001). Concordance between software packages was excellent for MBF and MFR, despite higher values by PMOD at higher MBF values. Both software packages can be used interchangeably for quantification in daily practice of Rb-82 cardiac PET.
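    The decomposition ρc = ρ·Cb used above separates precision from accuracy: Pearson's ρ is blind to a constant offset between two programs, while Lin's ρc penalizes it. A sketch with fabricated segmental MBF values that makes the distinction concrete:

```python
def pearson(a, b):
    """Pearson correlation (precision: scatter around the best-fit line)."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    sab = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    saa = sum((x - ma) ** 2 for x in a)
    sbb = sum((y - mb) ** 2 for y in b)
    return sab / (saa * sbb) ** 0.5

def lin_ccc(a, b):
    """Lin's concordance correlation: agreement with the identity line."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    s_ab = sum((x - ma) * (y - mb) for x, y in zip(a, b)) / n
    s_a2 = sum((x - ma) ** 2 for x in a) / n
    s_b2 = sum((y - mb) ** 2 for y in b) / n
    return 2 * s_ab / (s_a2 + s_b2 + (ma - mb) ** 2)

mbf_a = [1.0, 2.0, 3.0, 4.0, 5.0]  # hypothetical segmental MBF, program A
mbf_b = [2.0, 3.0, 4.0, 5.0, 6.0]  # same segments, constant +1 offset
rho = pearson(mbf_a, mbf_b)        # perfect precision despite the offset
rho_c = lin_ccc(mbf_a, mbf_b)      # concordance pays for the offset
c_b = rho_c / rho                  # the bias-correction factor Cb
```

    Here ρ = 1 but ρc = Cb = 0.8: the two programs track each other perfectly yet are not interchangeable without recalibration, which is precisely what concordance analysis is designed to expose.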

  13. A survey of tools for the analysis of quantitative PCR (qPCR) data.

    PubMed

    Pabinger, Stephan; Rödiger, Stefan; Kriegner, Albert; Vierlinger, Klemens; Weinhäusel, Andreas

    2014-09-01

    Real-time quantitative polymerase chain reaction (qPCR) is a standard technique in most laboratories, used for various applications in basic research. Analysis of qPCR data is a crucial part of the entire experiment, which has led to the development of a plethora of methods. The released tools either cover specific parts of the workflow or provide complete analysis solutions. Here, we surveyed 27 open-access software packages and tools for the analysis of qPCR data. The survey includes 8 Microsoft Windows-based, 5 web-based, 9 R-based and 5 tools from other platforms. The reviewed packages and tools support the analysis of different qPCR applications, such as RNA quantification, DNA methylation, genotyping, identification of copy number variations, and digital PCR. We report an overview of the functionality, features and specific requirements of the individual software tools, such as data exchange formats, availability of a graphical user interface, included procedures for graphical data presentation, and offered statistical methods. In addition, we provide an overview of quantification strategies and report various applications of qPCR. Our comprehensive survey showed that most tools use their own file format and that only a fraction of the currently existing tools support the standardized data exchange format RDML. To allow a more streamlined and comparable analysis of qPCR data, more vendors and tools need to adopt the standardized format to encourage the exchange of data between instrument software, analysis tools, and researchers.

  14. Automating Embedded Analysis Capabilities and Managing Software Complexity in Multiphysics Simulation, Part II: Application to Partial Differential Equations

    DOE PAGES

    Pawlowski, Roger P.; Phipps, Eric T.; Salinger, Andrew G.; ...

    2012-01-01

    A template-based generic programming approach was presented in Part I of this series of papers [Sci. Program. 20 (2012), 197–219] that separates the development effort of programming a physical model from that of computing additional quantities, such as derivatives, needed for embedded analysis algorithms. In this paper, we describe the implementation details for using the template-based generic programming approach for simulation and analysis of partial differential equations (PDEs). We detail several of the hurdles that we have encountered, and some of the software infrastructure developed to overcome them. We end with a demonstration in which we present shape optimization and uncertainty quantification results for a 3D PDE application.
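    The core idea of template-based generic programming is to write the physics once, generic in the scalar type, so that an automatic-differentiation type can extract derivatives from the same code. A rough Python analogue (duck typing standing in for C++ templates; this is not the paper's actual Trilinos/Sacado machinery) using a minimal forward-mode dual number:

```python
class Dual:
    """Minimal forward-mode AD scalar: a value plus a derivative."""
    def __init__(self, val, dot=0.0):
        self.val, self.dot = val, dot
    def _lift(self, other):
        return other if isinstance(other, Dual) else Dual(other)
    def __add__(self, o):
        o = self._lift(o)
        return Dual(self.val + o.val, self.dot + o.dot)
    __radd__ = __add__
    def __sub__(self, o):
        o = self._lift(o)
        return Dual(self.val - o.val, self.dot - o.dot)
    def __mul__(self, o):
        o = self._lift(o)  # product rule for the derivative part
        return Dual(self.val * o.val, self.dot * o.val + self.val * o.dot)
    __rmul__ = __mul__

def residual(u):
    """Physics written once, generic in the scalar type."""
    return u * u * u - 2.0 * u - 5.0

# Plain floats give the residual value...
print(residual(2.0))   # -1.0
# ...the Dual type gives value AND derivative from the very same code.
r = residual(Dual(2.0, 1.0))
print(r.val, r.dot)    # -1.0 10.0  (d/du of u^3 - 2u - 5 is 3u^2 - 2)
```

    The design choice mirrors the paper's: embedded analysis (Jacobians, sensitivities) needs no second implementation of the model, only a different scalar type flowing through it.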

  15. RNA-Skim: a rapid method for RNA-Seq quantification at transcript level

    PubMed Central

    Zhang, Zhaojun; Wang, Wei

    2014-01-01

    Motivation: The RNA-Seq technique has been demonstrated to be a revolutionary means of exploring the transcriptome because it provides deep coverage and base-pair-level resolution. RNA-Seq quantification has proven to be an efficient alternative to the microarray technique in gene expression studies, and it is a critical component of RNA-Seq differential expression analysis. Most existing RNA-Seq quantification tools require the alignment of fragments to either a genome or a transcriptome, entailing a time-consuming and intricate alignment step. To improve the performance of RNA-Seq quantification, an alignment-free method, Sailfish, has recently been proposed to quantify transcript abundances using all k-mers in the transcriptome, demonstrating the feasibility of designing an efficient alignment-free method for transcriptome quantification. Even though Sailfish is substantially faster than alignment-dependent alternatives such as Cufflinks, using all k-mers in the transcriptome impedes the scalability of the method. Results: We propose a novel RNA-Seq quantification method, RNA-Skim, which partitions the transcriptome into disjoint transcript clusters based on sequence similarity and introduces the notion of sig-mers, a special type of k-mer uniquely associated with each cluster. We demonstrate that the sig-mer counts within a cluster are sufficient for estimating transcript abundances with accuracy comparable to any state-of-the-art method. This enables RNA-Skim to perform transcript quantification on each cluster independently, reducing a complex optimization problem into smaller optimization tasks that can be run in parallel. As a result, RNA-Skim uses <4% of the k-mers and <10% of the CPU time required by Sailfish. It is able to finish transcriptome quantification in <10 min per sample using just a single thread on a commodity computer, which represents a >100× speedup over state-of-the-art alignment-based methods, while delivering comparable or higher accuracy. Availability and implementation: The software is available at http://www.csbio.unc.edu/rs. Contact: weiwang@cs.ucla.edu Supplementary information: Supplementary data are available at Bioinformatics online. PMID:24931995
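    The sig-mer notion can be illustrated in miniature: once transcripts are partitioned into clusters, a sig-mer is a k-mer observed in exactly one cluster, so counting only sig-mers suffices to attribute reads to clusters. A toy sketch with hypothetical sequences (the real method selects and samples sig-mers far more carefully):

```python
def kmers(seq, k):
    """All length-k substrings of a sequence."""
    return {seq[i:i + k] for i in range(len(seq) - k + 1)}

def sig_mers(clusters, k):
    """For each cluster, keep the k-mers that occur in no other cluster."""
    per_cluster = {name: set().union(*(kmers(s, k) for s in seqs))
                   for name, seqs in clusters.items()}
    result = {}
    for name, own in per_cluster.items():
        others = set().union(*(v for n, v in per_cluster.items()
                               if n != name))
        result[name] = own - others
    return result

# Toy transcript clusters (hypothetical sequences); "ACGT" appears in
# both, so it is a sig-mer of neither cluster.
clusters = {"A": ["ACGTACGT"], "B": ["GGACGTCC"]}
print(sorted(sig_mers(clusters, 4)["A"]))  # ['CGTA', 'GTAC', 'TACG']
```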

  16. Global Relative Quantification with Liquid Chromatography–Matrix-assisted Laser Desorption Ionization Time-of-flight (LC-MALDI-TOF)—Cross–validation with LTQ-Orbitrap Proves Reliability and Reveals Complementary Ionization Preferences*

    PubMed Central

    Hessling, Bernd; Büttner, Knut; Hecker, Michael; Becher, Dörte

    2013-01-01

    Quantitative LC-MALDI is an underrepresented method, especially in large-scale experiments. The additional fractionation step needed for most MALDI-TOF-TOF instruments, the comparatively long analysis time, and the very limited number of established software tools for data analysis render LC-MALDI a niche application for large quantitative analyses alongside the widespread LC–electrospray ionization workflows. Here, we used LC-MALDI in a relative quantification analysis of Staphylococcus aureus for the first time on a proteome-wide scale. Samples were analyzed in parallel with an LTQ-Orbitrap, which allowed cross-validation with a well-established workflow. With nearly 850 proteins identified in the cytosolic fraction and quantitative data for more than 550 proteins obtained with the MASCOT Distiller software, we were able to show that LC-MALDI can process highly complex samples. The good correlation between the quantities determined via this method and via the LTQ-Orbitrap workflow confirmed the high reliability of our LC-MALDI approach for global quantification analysis. Because the existing literature reports differences in MALDI and electrospray ionization preferences and the respective experimental work was limited by technical or methodological constraints, we systematically compared biochemical attributes of peptides identified with either instrument. This genome-wide, comprehensive study revealed biases toward certain peptide properties for both MALDI-TOF-TOF- and LTQ-Orbitrap-based approaches. These biases are based on almost 13,000 peptides and result in a general complementarity of the two approaches that should be exploited in future experiments. PMID:23788530

  17. Global relative quantification with liquid chromatography-matrix-assisted laser desorption ionization time-of-flight (LC-MALDI-TOF)--cross-validation with LTQ-Orbitrap proves reliability and reveals complementary ionization preferences.

    PubMed

    Hessling, Bernd; Büttner, Knut; Hecker, Michael; Becher, Dörte

    2013-10-01

    Quantitative LC-MALDI is an underrepresented method, especially in large-scale experiments. The additional fractionation step needed for most MALDI-TOF-TOF instruments, the comparatively long analysis time, and the very limited number of established software tools for data analysis render LC-MALDI a niche application for large quantitative analyses alongside the widespread LC-electrospray ionization workflows. Here, we used LC-MALDI in a relative quantification analysis of Staphylococcus aureus for the first time on a proteome-wide scale. Samples were analyzed in parallel with an LTQ-Orbitrap, which allowed cross-validation with a well-established workflow. With nearly 850 proteins identified in the cytosolic fraction and quantitative data for more than 550 proteins obtained with the MASCOT Distiller software, we were able to show that LC-MALDI can process highly complex samples. The good correlation between the quantities determined via this method and via the LTQ-Orbitrap workflow confirmed the high reliability of our LC-MALDI approach for global quantification analysis. Because the existing literature reports differences in MALDI and electrospray ionization preferences and the respective experimental work was limited by technical or methodological constraints, we systematically compared biochemical attributes of peptides identified with either instrument. This genome-wide, comprehensive study revealed biases toward certain peptide properties for both MALDI-TOF-TOF- and LTQ-Orbitrap-based approaches. These biases are based on almost 13,000 peptides and result in a general complementarity of the two approaches that should be exploited in future experiments.

  18. Consistency of flow quantifications in tridirectional phase-contrast MRI

    NASA Astrophysics Data System (ADS)

    Unterhinninghofen, R.; Ley, S.; Dillmann, R.

    2009-02-01

    Tridirectionally encoded phase-contrast MRI is a technique for the non-invasive acquisition of time-resolved velocity vector fields of blood flow. These fields may not only be used to analyze pathological flow patterns, but also to quantify flow at arbitrary positions within the acquired volume. In this paper we examine the validity of this approach by analyzing the consistency of related quantifications instead of comparing them with an external reference measurement. Datasets of the thoracic aorta were acquired from 6 pigs, 1 healthy volunteer and 3 patients with artificial aortic valves. Using in-house software, an elliptical flow quantification plane was placed manually at 6 positions along the descending aorta and rotated to 5 different angles at each position. For each configuration, flow was computed based on the original data and on data that had been corrected for phase offsets. The results reveal that quantifications depend more on changes in position than on changes in angle. Phase-offset correction considerably reduces this dependency. Overall consistency is good, with a maximum variation coefficient of 9.9% and a mean variation coefficient of 7.2%.

  19. Plasma protein absolute quantification by nano-LC Q-TOF UDMSE for clinical biomarker verification

    PubMed Central

    ILIES, MARIA; IUGA, CRISTINA ADELA; LOGHIN, FELICIA; DHOPLE, VISHNU MUKUND; HAMMER, ELKE

    2017-01-01

    Background and aims Proteome-based biomarker studies target proteins that could serve as diagnostic, prognostic, and predictive molecules. In the clinical routine, immunoassays are currently used for the absolute quantification of such biomarkers, with the major limitation that only one molecule can be targeted per assay. The aim of our study was to test a mass spectrometry based absolute quantification method for the verification of plasma protein sets which might serve as reliable biomarker panels for clinical practice. Methods Six EDTA plasma samples were analyzed after tryptic digestion using a high throughput data independent acquisition nano-LC Q-TOF UDMSE proteomics approach. Synthetic Escherichia coli standard peptides were spiked into each sample for absolute quantification. Data analysis was performed using ProgenesisQI v2.0 software (Waters Corporation). Results Our method ensured absolute quantification of 242 non-redundant plasma proteins in a single-run analysis, covering a dynamic range of five orders of magnitude (10^5). 86% of these were classical plasma proteins. The overall median coefficient of variation was 0.36, and a set of 63 proteins was found to be highly stable. Absolute protein concentrations correlated strongly with values reported in the literature. Conclusions Nano-LC Q-TOF UDMSE proteomic analysis can be used for simple and rapid determination of absolute amounts of plasma proteins. A large number of plasma proteins could be analyzed over a wide dynamic range with a low coefficient of variation at the protein level. The method proved to be a reliable tool for the quantification of protein panels for biomarker verification in clinical practice. PMID:29151793
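    The spike-in strategy behind such absolute quantification reduces, at its simplest, to a single-point calibration: an analyte's signal is scaled by the known amount of a spiked standard. A deliberately simplified sketch with hypothetical intensities (real workflows average several peptide intensities per protein and calibrate per run):

```python
def absolute_amount(protein_intensity, spike_intensity, spike_fmol):
    """Single-point absolute quantification against a spiked standard:
    amount = (protein signal / spike signal) * known spike amount."""
    return protein_intensity / spike_intensity * spike_fmol

# Hypothetical MS intensities; the spiked standard is 50 fmol on column.
print(absolute_amount(2.0e6, 1.0e6, 50.0))  # 100.0 (fmol)
```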

  20. Final Technical Report: Quantification of Uncertainty in Extreme Scale Computations (QUEST)

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Knio, Omar M.

    QUEST is a SciDAC Institute comprising Sandia National Laboratories, Los Alamos National Laboratory, University of Southern California, Massachusetts Institute of Technology, University of Texas at Austin, and Duke University. The mission of QUEST is to: (1) develop a broad class of uncertainty quantification (UQ) methods/tools, and (2) provide UQ expertise and software to other SciDAC projects, thereby enabling/guiding their UQ activities. The Duke effort focused on the development of algorithms and utility software for non-intrusive sparse UQ representations, and on participation in the organization of annual workshops and tutorials to disseminate UQ tools to the community, and to gather inputmore » in order to adapt approaches to the needs of SciDAC customers. In particular, fundamental developments were made in (a) multiscale stochastic preconditioners, (b) gradient-based approaches to inverse problems, (c) adaptive pseudo-spectral approximations, (d) stochastic limit cycles, and (e) sensitivity analysis tools for noisy systems. In addition, large-scale demonstrations were performed, namely in the context of ocean general circulation models.« less

  1. Generating standardized image data for testing and calibrating quantification of volumes, surfaces, lengths, and object counts in fibrous and porous materials using X-ray microtomography.

    PubMed

    Jiřík, Miroslav; Bartoš, Martin; Tomášek, Petr; Malečková, Anna; Kural, Tomáš; Horáková, Jana; Lukáš, David; Suchý, Tomáš; Kochová, Petra; Hubálek Kalbáčová, Marie; Králíčková, Milena; Tonar, Zbyněk

    2018-06-01

    Quantification of the structure and composition of biomaterials using micro-CT requires image segmentation due to the low contrast and overlapping radiopacity of biological materials. The amount of bias introduced by segmentation procedures is generally unknown. We aim to develop software that generates three-dimensional models of fibrous and porous structures with known volumes, surfaces, lengths, and object counts, and to provide a software tool that calibrates quantitative micro-CT assessments. Virtual image stacks were generated using the newly developed software TeIGen, enabling the simulation of micro-CT scans of unconnected tubes, connected tubes, and porosities. A realistic noise generator was incorporated. Forty image stacks were evaluated using micro-CT, and the error between the true and estimated data was quantified. Starting with geometric primitives, the error of the numerical estimation of surfaces and volumes was eliminated, thereby enabling the quantification of volumes and surfaces of colliding objects. Analysis of the sensitivity of thresholding to the parameters of the generated test image sets revealed the effects of decreasing resolution and increasing noise on the accuracy of micro-CT quantification. The size of the error increased with decreasing resolution once the voxel size exceeded 1/10 of the typical object size, which simulated the smallest details that could still be reliably quantified. Open-source software for calibrating quantitative micro-CT assessments by producing and saving virtually generated image data sets with known morphometric data was made freely available to researchers involved in the morphometry of three-dimensional fibrillar and porous structures in micro-CT scans. © 2018 Wiley Periodicals, Inc.
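    The calibration principle is that a virtually generated object has exactly known morphometry, so the error of a voxel-based estimate can be measured directly against it. A minimal sketch of that idea (not TeIGen itself): voxelize a cylinder of known radius and height, then compare the voxel-count volume with the analytic value:

```python
import math

def voxel_cylinder_volume(radius, height, voxel=1.0):
    """Voxelize a z-aligned cylinder (center-of-voxel inclusion test)
    and return the voxel-count volume estimate."""
    n = int(math.ceil(2 * radius / voxel)) + 2   # grid span in x and y
    count = 0
    for iz in range(int(round(height / voxel))):
        for ix in range(n):
            for iy in range(n):
                # voxel-center coordinates relative to the cylinder axis
                x = (ix + 0.5) * voxel - radius - voxel
                y = (iy + 0.5) * voxel - radius - voxel
                if x * x + y * y <= radius * radius:
                    count += 1
    return count * voxel ** 3

true_v = math.pi * 10.0 ** 2 * 5.0              # analytic pi*r^2*h
est_v = voxel_cylinder_volume(10.0, 5.0, 1.0)   # voxel-count estimate
print(abs(est_v - true_v) / true_v < 0.05)      # True: ~1% error here
```

    Shrinking `voxel` relative to `radius` shows the resolution dependence the study quantifies: the coarser the grid, the larger the discretization error.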

  2. RipleyGUI: software for analyzing spatial patterns in 3D cell distributions

    PubMed Central

    Hansson, Kristin; Jafari-Mamaghani, Mehrdad; Krieger, Patrik

    2013-01-01

    The true revolution in the age of digital neuroanatomy is the ability to extensively quantify anatomical structures and thus investigate structure-function relationships in great detail. To facilitate the quantification of neuronal cell patterns we have developed RipleyGUI, MATLAB-based software that can be used to detect patterns in the 3D distribution of cells. RipleyGUI uses Ripley's K-function to analyze spatial distributions. In addition, the software contains statistical tools to determine quantitative statistical differences, as well as tools for spatial transformations that are useful for analyzing non-stationary point patterns. The software has a graphical user interface that makes it easy to use without programming experience, and an extensive user manual explaining the basic concepts underlying the different statistical tools used to analyze spatial point patterns. The described analysis tool can be used to determine the spatial organization of neurons, which is important for a detailed study of structure-function relationships. For example, neocortex that can be subdivided into six layers based on cell density and cell types can also be analyzed in terms of the organizational principles distinguishing the layers. PMID:23658544

  3. Standardless quantification by parameter optimization in electron probe microanalysis

    NASA Astrophysics Data System (ADS)

    Limandri, Silvina P.; Bonetto, Rita D.; Josa, Víctor Galván; Carreras, Alejo C.; Trincavelli, Jorge C.

    2012-11-01

    A method for standardless quantification by parameter optimization in electron probe microanalysis is presented. The method consists of minimizing the quadratic differences between an experimental spectrum and an analytical function proposed to describe it by optimizing the parameters involved in the analytical prediction. This algorithm, implemented in the software POEMA (Parameter Optimization in Electron Probe Microanalysis), allows the determination of the elemental concentrations along with their uncertainties. The method was tested on a set of 159 elemental constituents corresponding to 36 spectra of standards (mostly minerals) that include trace elements. The results were compared with those obtained with the commercial software GENESIS Spectrum® for standardless quantification. The quantifications performed with the method proposed here are better in 74% of the cases studied. In addition, the performance of the proposed method is compared with the first-principles standardless analysis procedure DTSA for a different data set, which excludes trace elements. The relative deviations with respect to the nominal concentrations are lower than 0.04, 0.08 and 0.35 in 66% of the cases for POEMA, GENESIS and DTSA, respectively.
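    The optimization at the heart of such standardless quantification is an ordinary least-squares fit of an analytical spectrum model to the measured spectrum. A much-reduced sketch (POEMA optimizes many nonlinear physical parameters; here only a peak amplitude and a constant background are fitted, which makes the normal equations a 2x2 linear system):

```python
import math

def fit_peak_plus_background(channels, spectrum, center, sigma):
    """Least-squares fit of  model(c) = A*gauss(c) + B  to a spectrum.
    Minimizes sum((spectrum - model)^2) over amplitude A and
    background B via the 2x2 normal equations."""
    g = [math.exp(-0.5 * ((c - center) / sigma) ** 2) for c in channels]
    n = len(channels)
    sgg = sum(x * x for x in g)
    sg = sum(g)
    sgy = sum(x * y for x, y in zip(g, spectrum))
    sy = sum(spectrum)
    det = sgg * n - sg * sg
    A = (sgy * n - sg * sy) / det
    B = (sgg * sy - sg * sgy) / det
    return A, B

# Synthetic "experimental" spectrum: A=100 peak on a B=5 background;
# a noiseless fit recovers the parameters exactly.
chans = list(range(50))
counts = [100.0 * math.exp(-0.5 * ((c - 25) / 3.0) ** 2) + 5.0
          for c in chans]
A, B = fit_peak_plus_background(chans, counts, center=25, sigma=3.0)
print(round(A, 6), round(B, 6))  # 100.0 5.0
```

    The real method iterates this idea over nonlinear parameters (peak positions, widths, physical model constants) and derives concentrations and uncertainties from the optimized parameters.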

  4. Classification of type 2 diabetes rats based on urine amino acids metabolic profiling by liquid chromatography coupled with tandem mass spectrometry.

    PubMed

    Wang, Chunyan; Zhu, Hongbin; Pi, Zifeng; Song, Fengrui; Liu, Zhiqiang; Liu, Shuying

    2013-09-15

    An analytical method for quantifying underivatized amino acids (AAs) in rat urine samples was developed using liquid chromatography coupled with tandem mass spectrometry (LC-MS/MS). Classification of type 2 diabetes rats was based on urine amino acid metabolic profiling. LC-MS/MS analysis was performed using chromatographic separation and multiple reaction monitoring (MRM) transitions. Multivariate profile-wide predictive models were constructed using partial least squares discriminant analysis (PLS-DA) with the SIMCA-P 11.5 software package and hierarchical cluster analysis (HCA) with SPSS 18.0 software. Several amino acids in rat urine showed significant changes. The results of the present study show that this method can quantify free AAs in rat urine by LC-MS/MS. The PLS-DA and HCA statistical analyses were well suited to differentiating healthy rats from type 2 diabetes rats based on the quantification of AAs in their urine samples. In addition, compared with the healthy group, the seven amino acids that were elevated in the urine of type 2 diabetes rats returned to normal under treatment with acarbose. Copyright © 2013 Elsevier B.V. All rights reserved.

  5. Compositional Solution Space Quantification for Probabilistic Software Analysis

    NASA Technical Reports Server (NTRS)

    Borges, Mateus; Pasareanu, Corina S.; Filieri, Antonio; d'Amorim, Marcelo; Visser, Willem

    2014-01-01

    Probabilistic software analysis aims at quantifying how likely a target event is to occur during program execution. Current approaches rely on symbolic execution to identify the conditions under which the target event is reached and try to quantify the fraction of the input domain satisfying these conditions. Precise quantification is usually limited to linear constraints, while only approximate solutions can be provided in general through statistical approaches. However, statistical approaches may fail to converge to an acceptable accuracy within a reasonable time. We present a compositional statistical approach for the efficient quantification of solution spaces for arbitrarily complex constraints over bounded floating-point domains. The approach leverages interval constraint propagation to improve the accuracy of the estimation by focusing the sampling on the regions of the input domain containing the sought solutions. Preliminary experiments show significant improvement over previous approaches in both accuracy and analysis time.
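    The estimation strategy can be sketched in miniature: decide as many subdomains as possible by bounds alone, and spend samples only on the undecided boundary cells. The sketch below uses a constraint that is monotone in each variable, so corner evaluations give sound bounds per cell (a simple stand-in for full interval constraint propagation):

```python
import random

def satisfied(x, y):
    """Example constraint over [0,1]^2; true fraction is pi/4."""
    return x * x + y * y <= 1.0

def quantify(grid=20, samples_per_cell=200, seed=1):
    """Estimate the fraction of [0,1]^2 satisfying the constraint.
    Because x*x + y*y is monotone increasing in x and y, the
    lower-left / upper-right corners bound each cell: decided cells
    are counted exactly, only boundary cells are sampled."""
    rng = random.Random(seed)
    h = 1.0 / grid
    frac = 0.0
    for i in range(grid):
        for j in range(grid):
            lo_ok = satisfied(i * h, j * h)              # min over cell
            hi_ok = satisfied((i + 1) * h, (j + 1) * h)  # max over cell
            if hi_ok:                 # whole cell inside: count exactly
                frac += h * h
            elif lo_ok:               # boundary cell: sample it
                hits = sum(satisfied((i + rng.random()) * h,
                                     (j + rng.random()) * h)
                           for _ in range(samples_per_cell))
                frac += h * h * hits / samples_per_cell
            # else: whole cell outside, contributes 0
    return frac

est = quantify()
print(abs(est - 3.141592653589793 / 4) < 0.01)  # True: close to pi/4
```

    Focusing the budget on boundary cells is what makes the variance shrink relative to naive uniform sampling over the whole domain.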

  6. The role of PET quantification in cardiovascular imaging.

    PubMed

    Slomka, Piotr; Berman, Daniel S; Alexanderson, Erick; Germano, Guido

    2014-08-01

    Positron emission tomography (PET) has several clinical and research applications in cardiovascular imaging. Myocardial perfusion imaging with PET allows accurate global and regional measurements of myocardial perfusion, myocardial blood flow and function at stress and rest in one exam. Simultaneous assessment of function and perfusion by PET with quantitative software is currently routine practice. Combination of ejection fraction reserve with perfusion information may improve the identification of severe disease. Myocardial viability can be estimated by quantitative comparison of fluorodeoxyglucose (¹⁸FDG) and rest perfusion imaging. Myocardial blood flow and coronary flow reserve measurements are becoming routinely included in the clinical assessment due to the enhanced dynamic imaging capabilities of the latest PET/CT scanners. Absolute flow measurements allow evaluation of coronary microvascular dysfunction and provide additional prognostic and diagnostic information for coronary disease. Standard quantitative approaches to compute myocardial blood flow from kinetic PET data in an automated and rapid fashion have been developed for ¹³N-ammonia, ¹⁵O-water and ⁸²Rb radiotracers. The agreement between software methods available for such analysis is excellent. Relative quantification of ⁸²Rb PET myocardial perfusion, based on comparisons to normal databases, demonstrates high performance for the detection of obstructive coronary disease. New tracers, such as ¹⁸F-flurpiridaz, may allow further improvements in disease detection. Computerized analysis of perfusion at stress and rest reduces the variability of the assessment as compared to visual analysis. PET quantification can be enhanced by precise coregistration with CT angiography. In emerging clinical applications, the potential to identify vulnerable plaques by quantification of atherosclerotic plaque uptake of ¹⁸FDG and ¹⁸F-sodium fluoride tracers in the carotids, aorta and coronary arteries has been demonstrated.

  7. LFQProfiler and RNP(xl): Open-Source Tools for Label-Free Quantification and Protein-RNA Cross-Linking Integrated into Proteome Discoverer.

    PubMed

    Veit, Johannes; Sachsenberg, Timo; Chernev, Aleksandar; Aicheler, Fabian; Urlaub, Henning; Kohlbacher, Oliver

    2016-09-02

    Modern mass spectrometry setups used in today's proteomics studies generate vast amounts of raw data, calling for highly efficient data processing and analysis tools. Software for analyzing these data is either monolithic (easy to use, but sometimes too rigid) or workflow-driven (easy to customize, but sometimes complex). Thermo Proteome Discoverer (PD) is powerful software for workflow-driven data analysis in proteomics which, in our eyes, achieves a good trade-off between flexibility and usability. Here, we present two open-source plugins for PD providing additional functionality: LFQProfiler for label-free quantification of peptides and proteins, and RNP(xl) for UV-induced peptide-RNA cross-linking data analysis. LFQProfiler interacts with existing PD nodes for peptide identification and validation and takes care of the entire quantitative part of the workflow. We show that it performs at least on par with other state-of-the-art software solutions for label-free quantification in a recently published benchmark (Ramus, C.; J. Proteomics 2016, 132, 51-62). The second workflow, RNP(xl), represents the first software solution to date for the identification of peptide-RNA cross-links, including automatic localization of the cross-links at amino acid resolution and localization scoring. It comes with a customized integrated cross-link fragment spectrum viewer for convenient manual inspection and validation of the results.

  8. Normal Databases for the Relative Quantification of Myocardial Perfusion

    PubMed Central

    Rubeaux, Mathieu; Xu, Yuan; Germano, Guido; Berman, Daniel S.; Slomka, Piotr J.

    2016-01-01

    Purpose of review Myocardial perfusion imaging (MPI) with SPECT is performed clinically worldwide to detect and monitor coronary artery disease (CAD). MPI allows an objective quantification of myocardial perfusion at stress and rest. This established technique relies on normal databases to compare patient scans against reference normal limits. In this review, we aim to introduce the process of MPI quantification with normal databases and describe the associated quantitative perfusion measures. Recent findings New equipment and new software reconstruction algorithms have been introduced which require the development of new normal limits. The appearance and regional count variations of a normal MPI scan may differ between these new scanners and standard Anger cameras. Therefore, these new systems may require the determination of new normal limits to achieve optimal accuracy in relative myocardial perfusion quantification. Accurate diagnostic and prognostic results rivaling those obtained by expert readers can be obtained by this widely used technique. Summary Throughout this review, we emphasize the importance of the different normal databases and the need for databases specific to distinct imaging procedures. Use of appropriate normal limits allows optimal quantification of MPI by taking into account subtle image differences due to the hardware and software used, and the population studied. PMID:28138354

  9. CalQuo: automated, simultaneous single-cell and population-level quantification of global intracellular Ca2+ responses.

    PubMed

    Fritzsche, Marco; Fernandes, Ricardo A; Colin-York, Huw; Santos, Ana M; Lee, Steven F; Lagerholm, B Christoffer; Davis, Simon J; Eggeling, Christian

    2015-11-13

    Detecting intracellular calcium signaling with fluorescent calcium indicator dyes is often coupled with microscopy techniques to follow the activation state of non-excitable cells, including lymphocytes. However, the analysis of global intracellular calcium responses both at the single-cell level and in large ensembles simultaneously has yet to be automated. Here, we present a new software package, CalQuo (Calcium Quantification), which allows the automated analysis and simultaneous monitoring of global fluorescent calcium reporter-based signaling responses in up to 1000 single cells per experiment, at temporal resolutions of sub-seconds to seconds. CalQuo quantifies the number and fraction of responding cells, the temporal dependence of calcium signaling and provides global and individual calcium-reporter fluorescence intensity profiles. We demonstrate the utility of the new method by comparing the calcium-based signaling responses of genetically manipulated human lymphocytic cell lines.

  10. Π4U: A high performance computing framework for Bayesian uncertainty quantification of complex models

    NASA Astrophysics Data System (ADS)

    Hadjidoukas, P. E.; Angelikopoulos, P.; Papadimitriou, C.; Koumoutsakos, P.

    2015-03-01

    We present Π4U, an extensible framework for non-intrusive Bayesian Uncertainty Quantification and Propagation (UQ+P) of complex and computationally demanding physical models that can exploit massively parallel computer architectures. The framework incorporates Laplace asymptotic approximations as well as stochastic algorithms, along with distributed numerical differentiation and task-based parallelism for heterogeneous clusters. Sampling is based on the Transitional Markov Chain Monte Carlo (TMCMC) algorithm and its variants. The optimization tasks associated with the asymptotic approximations are treated via the Covariance Matrix Adaptation Evolution Strategy (CMA-ES). A modified subset simulation method is used for posterior reliability measurements of rare events. The framework accommodates scheduling of multiple physical model evaluations based on an adaptive load balancing library and shows excellent scalability. In addition to the software framework, we also provide guidelines on the applicability and efficiency of Bayesian tools when applied to computationally demanding physical models. Theoretical and computational developments are demonstrated with applications drawn from molecular dynamics, structural dynamics and granular flow.
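    TMCMC builds on the standard Metropolis accept/reject step, annealing from the prior to the posterior through a sequence of intermediate distributions. A minimal sketch of the underlying building block only, a plain random-walk Metropolis sampler on a one-dimensional Gaussian target (not the TMCMC algorithm itself, and far from the framework's parallel machinery):

```python
import math
import random

def metropolis(log_post, x0, steps, step_size, seed=7):
    """Plain random-walk Metropolis: propose a Gaussian step, accept
    with probability min(1, post(cand)/post(x))."""
    rng = random.Random(seed)
    x, lp = x0, log_post(x0)
    chain = []
    for _ in range(steps):
        cand = x + rng.gauss(0.0, step_size)
        lp_cand = log_post(cand)
        if math.log(rng.random()) < lp_cand - lp:   # accept/reject
            x, lp = cand, lp_cand
        chain.append(x)
    return chain

# Target: standard normal log-density (up to an additive constant).
chain = metropolis(lambda x: -0.5 * x * x, x0=0.0,
                   steps=20000, step_size=1.0)
mean = sum(chain) / len(chain)
print(abs(mean) < 0.15)  # True: sample mean near the target mean 0
```

    In a UQ setting `log_post` would wrap an expensive physical-model evaluation, which is why Π4U schedules many such evaluations concurrently.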

  11. ATAQS: A computational software tool for high throughput transition optimization and validation for selected reaction monitoring mass spectrometry

    PubMed Central

    2011-01-01

    Background Since its inception, proteomics has essentially operated in a discovery mode with the goal of identifying and quantifying the maximal number of proteins in a sample. Increasingly, proteomic measurements are also supporting hypothesis-driven studies, in which a predetermined set of proteins is consistently detected and quantified in multiple samples. Selected reaction monitoring (SRM) is a targeted mass spectrometric technique that supports the detection and quantification of specific proteins in complex samples at high sensitivity and reproducibility. Here, we describe ATAQS, an integrated software platform that supports all stages of targeted, SRM-based proteomics experiments, including target selection, transition optimization and post-acquisition data analysis. This software will significantly facilitate the use of targeted proteomic techniques and contribute to the generation of highly sensitive, reproducible and complete datasets that are particularly critical for the discovery and validation of targets in hypothesis-driven studies in systems biology. Result We introduce a new open-source software pipeline, ATAQS (Automated and Targeted Analysis with Quantitative SRM), which consists of a number of modules that collectively support the SRM assay development workflow for targeted proteomic experiments (project management; generation of proteins, peptides and transitions; and validation of peptide detection by SRM). ATAQS provides a flexible pipeline for end-users by allowing the workflow to start or end at any point of the pipeline, and for computational biologists by enabling the easy extension of Java algorithm classes for their own algorithm plug-ins or connection via an external web site. This integrated system supports all steps in an SRM-based experiment and provides a user-friendly GUI that can be run on any operating system that allows the installation of the Mozilla Firefox web browser.
Conclusions Targeted proteomics via SRM is a powerful new technique that enables the reproducible and accurate identification and quantification of sets of proteins of interest. ATAQS is the first open-source software that supports all steps of the targeted proteomics workflow. ATAQS also provides software API (Application Program Interface) documentation that enables the addition of new algorithms to each of the workflow steps. The software, installation guide and sample dataset can be found in http://tools.proteomecenter.org/ATAQS/ATAQS.html PMID:21414234

  12. Multiplex quantification of four DNA targets in one reaction with Bio-Rad droplet digital PCR system for GMO detection

    NASA Astrophysics Data System (ADS)

    Dobnik, David; Štebih, Dejan; Blejec, Andrej; Morisset, Dany; Žel, Jana

    2016-10-01

    The advantages of digital PCR technology are by now well documented. One way to achieve better cost efficiency with the technique is to use it in a multiplexing strategy. Droplet digital PCR platforms, which include two fluorescence filters, support at least duplex reactions, and with some development and optimization higher multiplexing is possible. The present study not only shows the development of multiplex assays in droplet digital PCR, but also presents the first thorough evaluation of several parameters in such multiplex digital PCR. Two 4-plex assays were developed for quantification of 8 different DNA targets (7 genetically modified maize events and a maize endogene). Per assay, two of the targets were labelled with one fluorophore and two with another. As current analysis software does not support analysis beyond duplex reactions, a new R- and Shiny-based web application analysis tool (http://bit.ly/ddPCRmulti) was developed that automates the analysis of 4-plex results. In conclusion, the two developed multiplex assays are suitable for quantification of GMO maize events, and the same approach can be used in any other field with a need for accurate and reliable quantification of multiple DNA targets.
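    Independently of the multiplexing level, droplet digital PCR converts droplet counts to a concentration through a Poisson correction, since a positive droplet may contain more than one target copy. A minimal sketch with hypothetical droplet counts and a nominal droplet volume of 0.85 nL (an assumption for illustration; instrument software uses its own calibrated volume):

```python
import math

def ddpcr_copies_per_ul(positive, total, droplet_nl=0.85):
    """Poisson-corrected target concentration from droplet counts:
    lambda = -ln(fraction of negative droplets) is the mean number of
    copies per droplet; dividing by droplet volume gives copies/uL."""
    lam = -math.log((total - positive) / total)
    return lam / (droplet_nl * 1e-3)   # nL -> uL

# Hypothetical counts: 5000 positive droplets out of 15000.
conc = ddpcr_copies_per_ul(5000, 15000)
print(round(conc))  # 477 copies per uL of reaction
```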

  13. Multiplex quantification of four DNA targets in one reaction with Bio-Rad droplet digital PCR system for GMO detection.

    PubMed

    Dobnik, David; Štebih, Dejan; Blejec, Andrej; Morisset, Dany; Žel, Jana

    2016-10-14

    The advantages of digital PCR technology are by now well documented. One way to achieve better cost efficiency of the technique is to use it in a multiplexing strategy. Droplet digital PCR platforms, which include two fluorescence filters, support at least duplex reactions, and with some development and optimization higher multiplexing is possible. The present study not only shows the development of multiplex assays in droplet digital PCR, but also presents a first thorough evaluation of several parameters in such multiplex digital PCR. Two 4-plex assays were developed for quantification of 8 different DNA targets (7 genetically modified maize events and a maize endogene). In each assay, two of the targets were labelled with one fluorophore and two with another. As current analysis software does not support analysis beyond duplex reactions, a new R- and Shiny-based web application analysis tool (http://bit.ly/ddPCRmulti) was developed that automates the analysis of 4-plex results. In conclusion, the two developed multiplex assays are suitable for quantification of GMO maize events, and the same approach can be used in any other field with a need for accurate and reliable quantification of multiple DNA targets.

  14. Development and Evaluation of a Parallel Reaction Monitoring Strategy for Large-Scale Targeted Metabolomics Quantification.

    PubMed

    Zhou, Juntuo; Liu, Huiying; Liu, Yang; Liu, Jia; Zhao, Xuyang; Yin, Yuxin

    2016-04-19

    Recent advances in mass spectrometers, which have yielded higher resolution and faster scanning speeds, have expanded their application in metabolomics of diverse diseases. Using a quadrupole-Orbitrap LC-MS system, we developed an efficient large-scale quantitative method targeting 237 metabolites involved in various metabolic pathways using scheduled parallel reaction monitoring (PRM). We assessed the dynamic range, linearity, reproducibility, and system suitability of the PRM assay by measuring concentration curves, biological samples, and clinical serum samples. We compared the quantification performance of the PRM assay on the Q Exactive with that of MS1-based quantification on the same instrument and with an MRM assay on a QTRAP 6500. The PRM assay monitoring 237 polar metabolites showed greater reproducibility and quantitative accuracy than MS1-based quantification, and greater flexibility in post-acquisition assay refinement than the MRM assay on the QTRAP 6500. We present a workflow for convenient PRM data processing using the freely available Skyline software. In this study we have established a reliable PRM methodology on a quadrupole-Orbitrap platform for large-scale targeted metabolomics, which provides a new choice for basic and clinical metabolomics studies.

  15. Ariadne's Thread: A Robust Software Solution Leading to Automated Absolute and Relative Quantification of SRM Data.

    PubMed

    Nasso, Sara; Goetze, Sandra; Martens, Lennart

    2015-09-04

    Selected reaction monitoring (SRM) MS is a highly selective and sensitive technique to quantify protein abundances in complex biological samples. To enhance the pace of large SRM studies, a validated, robust method to fully automate absolute quantification and substitute for interactive evaluation would be valuable. To address this demand, we present Ariadne, a MATLAB-based software tool. To quantify monitored targets, Ariadne exploits metadata imported from the transition lists, and targets can be filtered according to mProphet output. Signal processing and statistical learning approaches are combined to compute peptide quantifications. To robustly estimate absolute abundances, the external calibration curve method is applied, ensuring linearity over the measured dynamic range. Ariadne was benchmarked against mProphet and Skyline by comparing its quantification performance on three different dilution series, featuring either noisy or smooth traces without background, or smooth traces with complex background. Results, evaluated as efficiency, linearity, accuracy, and precision of quantification, showed that Ariadne's performance is independent of data smoothness and of the presence of complex background, that Ariadne outperforms mProphet on the noisier data set, and that it improves Skyline's accuracy and precision 2-fold for the lowest-abundance dilution with complex background. Remarkably, Ariadne could statistically distinguish all the different abundances from each other, discriminating dilutions as low as 0.1 and 0.2 fmol. These results suggest that Ariadne offers reliable and automated analysis of large-scale SRM differential expression studies.
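
    The external calibration curve method mentioned above can be sketched in a few lines: fit a line to the signals of known dilutions, then invert it to convert an observed peak area into an absolute amount. This is a minimal illustration with made-up, perfectly linear toy data, not Ariadne's actual implementation:

```python
def linear_fit(x, y):
    """Ordinary least-squares fit of y = a*x + b; returns (a, b)."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    a = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)
    return a, my - a * mx

def absolute_amount(peak_area, slope, intercept):
    """Invert the calibration curve: amount = (signal - intercept) / slope."""
    return (peak_area - intercept) / slope

# Calibration points: known spiked amounts (fmol) vs. integrated SRM peak area
fmol = [0.1, 0.5, 1.0, 5.0, 10.0]
area = [210.0, 1050.0, 2100.0, 10500.0, 21000.0]   # toy, perfectly linear data
slope, intercept = linear_fit(fmol, area)
amount = absolute_amount(4200.0, slope, intercept)  # → 2.0 fmol
```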

  16. An open tool for input function estimation and quantification of dynamic PET FDG brain scans.

    PubMed

    Bertrán, Martín; Martínez, Natalia; Carbajal, Guillermo; Fernández, Alicia; Gómez, Álvaro

    2016-08-01

    Positron emission tomography (PET) analysis of clinical studies is mostly restricted to qualitative evaluation. Quantitative analysis of PET studies is highly desirable to be able to compute an objective measurement of the process of interest in order to evaluate treatment response and/or compare patient data. But implementation of quantitative analysis generally requires the determination of the input function: the arterial blood or plasma activity which indicates how much tracer is available for uptake in the brain. The purpose of our work was to share with the community an open software tool that can assist in the estimation of this input function, and the derivation of a quantitative map from the dynamic PET study. Arterial blood sampling during the PET study is the gold standard method to get the input function, but is uncomfortable and risky for the patient so it is rarely used in routine studies. To overcome the lack of a direct input function, different alternatives have been devised and are available in the literature. These alternatives derive the input function from the PET image itself (image-derived input function) or from data gathered from previous similar studies (population-based input function). In this article, we present ongoing work that includes the development of a software tool that integrates several methods with novel strategies for the segmentation of blood pools and parameter estimation. The tool is available as an extension to the 3D Slicer software. Tests on phantoms were conducted in order to validate the implemented methods. We evaluated the segmentation algorithms over a range of acquisition conditions and vasculature size. Input function estimation algorithms were evaluated against ground truth of the phantoms, as well as on their impact over the final quantification map. End-to-end use of the tool yields quantification maps with [Formula: see text] relative error in the estimated influx versus ground truth on phantoms. 
The main contribution of this article is the development of an open-source, free to use tool that encapsulates several well-known methods for the estimation of the input function and the quantification of dynamic PET FDG studies. Some alternative strategies are also proposed and implemented in the tool for the segmentation of blood pools and parameter estimation. The tool was tested on phantoms with encouraging results that suggest that even bloodless estimators could provide a viable alternative to blood sampling for quantification using graphical analysis. The open tool is a promising opportunity for collaboration among investigators and further validation on real studies.
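
    As a concrete example of the graphical analysis mentioned above, the Patlak plot estimates the FDG influx constant Ki as the slope of Ct(t)/Cp(t) versus the normalized integral of the input function. The sketch below (hypothetical function names; synthetic, noise-free curves) recovers a known Ki from simulated data:

```python
import math

def patlak_ki(t, cp, ct):
    """Influx constant Ki from the Patlak plot: the slope of Ct/Cp versus
    the time-integral of the input function Cp divided by Cp."""
    icp, acc = [0.0], 0.0
    for i in range(1, len(t)):
        acc += 0.5 * (cp[i] + cp[i - 1]) * (t[i] - t[i - 1])  # trapezoid rule
        icp.append(acc)
    x = [icp[i] / cp[i] for i in range(len(t))]
    y = [ct[i] / cp[i] for i in range(len(t))]
    mx, my = sum(x) / len(x), sum(y) / len(y)
    return sum((xi - mx) * (yi - my) for xi, yi in zip(x, y)) / \
        sum((xi - mx) ** 2 for xi in x)

# Synthetic check: build a tissue curve with known Ki = 0.05 and V0 = 0.3
ts = [float(i) for i in range(1, 31)]
cp = [10.0 * math.exp(-0.1 * ti) + 1.0 for ti in ts]
icp, acc = [0.0], 0.0
for i in range(1, len(ts)):
    acc += 0.5 * (cp[i] + cp[i - 1]) * (ts[i] - ts[i - 1])
    icp.append(acc)
ct = [0.05 * icp[i] + 0.3 * cp[i] for i in range(len(ts))]
ki = patlak_ki(ts, cp, ct)   # recovers ~0.05
```

    The quality of any such estimate hinges on the input function Cp, which is exactly why the tool's image-derived and population-based alternatives to arterial sampling matter.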

  17. FracPaQ: a MATLAB™ Toolbox for the Quantification of Fracture Patterns

    NASA Astrophysics Data System (ADS)

    Healy, D.; Rizzo, R. E.; Cornwell, D. G.; Timms, N.; Farrell, N. J.; Watkins, H.; Gomez-Rivas, E.; Smith, M.

    2016-12-01

    The patterns of fractures in deformed rocks are rarely uniform or random. Fracture orientations, sizes, shapes and spatial distributions often exhibit some kind of order. In detail, there may be relationships among the different fracture attributes e.g. small fractures dominated by one orientation, larger fractures by another. These relationships are important because the mechanical (e.g. strength, anisotropy) and transport (e.g. fluids, heat) properties of rock depend on these fracture patterns and fracture attributes. This presentation describes an open source toolbox to quantify fracture patterns, including distributions in fracture attributes and their spatial variation. Software has been developed to quantify fracture patterns from 2-D digital images, such as thin section micrographs, geological maps, outcrop or aerial photographs or satellite images. The toolbox comprises a suite of MATLAB™ scripts based on published quantitative methods for the analysis of fracture attributes: orientations, lengths, intensity, density and connectivity. An estimate of permeability in 2-D is made using a parallel plate model. The software provides an objective and consistent methodology for quantifying fracture patterns and their variations in 2-D across a wide range of length scales. Our current focus for the application of the software is on quantifying the fracture patterns in and around fault zones. There is a large body of published work on the quantification of relatively simple joint patterns, but fault zones present a bigger, and arguably more important, challenge. The method presented is inherently scale independent, and a key task will be to analyse and integrate quantitative fracture pattern data from micro- to macro-scales. Planned future releases will incorporate multi-scale analyses based on a wavelet method to look for scale transitions, and combining fracture traces from multiple 2-D images to derive the statistically equivalent 3-D fracture pattern.
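
    The basic attribute calculations such a toolbox performs on 2-D trace maps are straightforward to sketch. The following illustration (in Python rather than MATLAB, with hypothetical names; not the FracPaQ code itself) computes trace lengths, orientations and the areal fracture intensity P21, i.e. total trace length per unit area:

```python
import math

def trace_stats(segments, area):
    """Lengths, orientations (degrees from the x-axis, in [0, 180)) and
    areal fracture intensity P21 = total trace length / mapped area."""
    lengths, orientations = [], []
    for (x1, y1), (x2, y2) in segments:
        lengths.append(math.hypot(x2 - x1, y2 - y1))
        orientations.append(math.degrees(math.atan2(y2 - y1, x2 - x1)) % 180.0)
    return lengths, orientations, sum(lengths) / area

# Three digitized fracture traces on a 10 x 10 map
segs = [((0, 0), (3, 4)), ((1, 1), (1, 6)), ((0, 2), (8, 2))]
lengths, orients, p21 = trace_stats(segs, area=100.0)   # p21 = 0.18
```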

  18. CognitionMaster: an object-based image analysis framework

    PubMed Central

    2013-01-01

    Background Automated image analysis methods are becoming more and more important for extracting and quantifying image features in microscopy-based biomedical studies, and several commercial and open-source tools are available. However, most of these approaches rely on pixel-wise operations, a concept that has limitations when high-level object features and relationships between objects are studied and when user interactivity on the object level is desired. Results In this paper we present open-source software that facilitates the analysis of content features and object relationships by using objects, instead of individual pixels, as the basic processing unit. Our approach also enables users without programming knowledge to compose “analysis pipelines” that exploit the object-level approach. We demonstrate the design and use of example pipelines for immunohistochemistry-based cell proliferation quantification in breast cancer and for two-photon fluorescence microscopy data on bone-osteoclast interaction, which underline the advantages of the object-based concept. Conclusions We introduce an open-source software system that offers object-based image analysis. The object-based concept allows for straightforward development of object-related interactive or fully automated image analysis solutions. The presented software may therefore serve as a basis for various applications in the field of digital image analysis. PMID:23445542
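
    The object-based concept can be illustrated with a minimal sketch: label connected components of a binary segmentation mask, then compute features per object rather than per pixel. This is a generic illustration (CognitionMaster itself is not written this way) using 4-connected breadth-first labeling:

```python
from collections import deque

def label_objects(mask):
    """4-connected component labeling of a binary mask (list of 0/1 rows);
    returns {label: [(row, col), ...]}, one entry per object."""
    rows, cols = len(mask), len(mask[0])
    seen, objects, label = set(), {}, 0
    for r in range(rows):
        for c in range(cols):
            if mask[r][c] and (r, c) not in seen:
                label += 1
                queue, pixels = deque([(r, c)]), []
                seen.add((r, c))
                while queue:
                    cr, cc = queue.popleft()
                    pixels.append((cr, cc))
                    for nr, nc in ((cr-1, cc), (cr+1, cc), (cr, cc-1), (cr, cc+1)):
                        if 0 <= nr < rows and 0 <= nc < cols \
                                and mask[nr][nc] and (nr, nc) not in seen:
                            seen.add((nr, nc))
                            queue.append((nr, nc))
                objects[label] = pixels
    return objects

def object_features(objects):
    """Object-level features (area, centroid): the processing unit is the
    object, not the pixel."""
    return {lab: {"area": len(px),
                  "centroid": (sum(p[0] for p in px) / len(px),
                               sum(p[1] for p in px) / len(px))}
            for lab, px in objects.items()}

mask = [[1, 1, 0, 0],
        [1, 0, 0, 1],
        [0, 0, 1, 1]]
feats = object_features(label_objects(mask))   # two objects, each of area 3
```

    Higher-level relationships (e.g. nearest-neighbor distances between objects) follow naturally once each object carries its own feature record.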

  19. Dakota Graphical User Interface v. 1.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Friedman-Hill, Ernest; Glickman, Matthew; Gibson, Marcus

    The Dakota GUI is an interactive graphical analysis environment for creating, running, and interpreting optimization and uncertainty quantification studies with Sandia’s Dakota software. It includes problem (Dakota study) set-up, option specification, simulation interfacing, analysis execution, and results visualization. Through the use of wizards, templates, and views, the Dakota GUI helps users navigate Dakota’s complex capability landscape.

  20. LipidMiner: A Software for Automated Identification and Quantification of Lipids from Multiple Liquid Chromatography-Mass Spectrometry Data Files

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Meng, Da; Zhang, Qibin; Gao, Xiaoli

    2014-04-30

    We have developed a tool for automated, high-throughput analysis of LC-MS/MS data files, which greatly simplifies LC-MS based lipidomics analysis. Our results showed that LipidMiner is accurate and comprehensive in the identification and quantification of lipid molecular species. In addition, the workflow implemented in LipidMiner is not limited to the identification and quantification of lipids. If a suitable metabolite library is implemented in the library matching module, LipidMiner could be reconfigured as a tool for general metabolomics data analysis. It is of note that LipidMiner is currently limited to singly charged ions, although this is adequate for the purpose of lipidomics, since lipids are rarely multiply charged [14], even the polyphosphoinositides. LipidMiner also only processes the file format generated by Thermo mass spectrometers, i.e. the .RAW format. In the future, we plan to accommodate file formats generated by mass spectrometers from other major instrument vendors to make this tool more universal.

  1. Quantification of 235U and 238U activity concentrations for undeclared nuclear materials by a digital gamma-gamma coincidence spectroscopy.

    PubMed

    Zhang, Weihua; Yi, Jing; Mekarski, Pawel; Ungar, Kurt; Hauck, Barry; Kramer, Gary H

    2011-06-01

    The purpose of this study is to investigate the possibility of verifying depleted uranium (DU), natural uranium (NU), low enriched uranium (LEU) and high enriched uranium (HEU) with a digital gamma-gamma coincidence spectroscopy system. The system consists of two NaI(Tl) scintillators and the XIA LLC Digital Gamma Finder (DGF)/Pixie-4 software and card package. The results demonstrate that the system provides an effective method of (235)U and (238)U quantification based on the count rate of their gamma-gamma coincidence counting signatures. The main advantages of this approach over conventional gamma spectrometry include the low background continuum near the coincidence signatures of (235)U and (238)U, reduced interference from other radionuclides in gamma-gamma coincidence counting, and region-of-interest (ROI) image analysis for uranium enrichment determination. The method also offers the advantage of requiring minimal calibrations for (235)U and (238)U quantification at different sample geometries.
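
    The quantification principle, converting a net coincidence count rate into an activity, can be sketched generically: for a single gamma-gamma cascade, A = R / (eps1 * eps2 * p_cascade), where eps1 and eps2 are the detector photopeak efficiencies and p_cascade is the cascade emission probability. The numbers below are purely illustrative, not calibration values from the study:

```python
def activity_from_coincidence(net_rate_cps, eff1, eff2, p_cascade):
    """Source activity (Bq) from a net gamma-gamma coincidence count rate,
    assuming a single cascade: A = R / (eff1 * eff2 * p_cascade)."""
    return net_rate_cps / (eff1 * eff2 * p_cascade)

# Illustrative numbers only: 0.42 cps net coincidence rate, 5% photopeak
# efficiency per detector, 10% cascade emission probability
a_bq = activity_from_coincidence(0.42, 0.05, 0.05, 0.10)   # 1680 Bq
```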

  2. MRMPROBS suite for metabolomics using large-scale MRM assays.

    PubMed

    Tsugawa, Hiroshi; Kanazawa, Mitsuhiro; Ogiwara, Atsushi; Arita, Masanori

    2014-08-15

    We developed a new software environment for the metabolome analysis of large-scale multiple reaction monitoring (MRM) assays. It supports the data formats of four major mass spectrometer vendors as well as the mzML common data format, and provides a processing pipeline from raw-format import to high-dimensional statistical analyses. The novel aspect is graphical user interface-based visualization to perform peak quantification, interpolate missing values and normalize peaks interactively based on quality control samples. Together with the software platform, an MRM standard library of 301 metabolites with 775 transitions is also available, which contributes to reliable peak identification using retention time and ion abundances. MRMPROBS is available for Windows under the Creative Commons Attribution license at http://prime.psc.riken.jp.
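
    One common way to normalize peaks against quality control samples, as the interactive step above allows, is to estimate instrument drift by interpolating QC intensities across the injection sequence and rescaling each sample accordingly. A minimal sketch with a hypothetical function name (not the MRMPROBS implementation):

```python
def qc_drift_correct(positions, values, qc_positions, qc_values):
    """Rescale sample intensities by the QC drift interpolated at each
    sample's injection position, relative to the mean QC intensity."""
    qc_ref = sum(qc_values) / len(qc_values)
    corrected = []
    for p, v in zip(positions, values):
        if p <= qc_positions[0]:           # clamp outside the QC range
            drift = qc_values[0]
        elif p >= qc_positions[-1]:
            drift = qc_values[-1]
        else:                              # linear interpolation between QCs
            j = max(i for i in range(len(qc_positions)) if qc_positions[i] <= p)
            f = (p - qc_positions[j]) / (qc_positions[j + 1] - qc_positions[j])
            drift = qc_values[j] + f * (qc_values[j + 1] - qc_values[j])
        corrected.append(v * qc_ref / drift)
    return corrected

# QC injections at positions 0, 10, 20 show a steady downward drift;
# samples at positions 5 and 15 are rescaled onto the mean QC level
corrected = qc_drift_correct([5, 15], [90.0, 70.0], [0, 10, 20], [100.0, 80.0, 60.0])
```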

  3. P-MartCancer-Interactive Online Software to Enable Analysis of Shotgun Cancer Proteomic Datasets.

    PubMed

    Webb-Robertson, Bobbie-Jo M; Bramer, Lisa M; Jensen, Jeffrey L; Kobold, Markus A; Stratton, Kelly G; White, Amanda M; Rodland, Karin D

    2017-11-01

    P-MartCancer is an interactive web-based software environment that enables statistical analyses of peptide or protein data, quantified from mass spectrometry-based global proteomics experiments, without requiring in-depth knowledge of statistical programming. P-MartCancer offers a series of statistical modules associated with quality assessment, peptide and protein statistics, protein quantification, and exploratory data analyses, driven by the user via customized workflows and interactive visualization. Currently, P-MartCancer offers access to, and the capability to analyze, multiple cancer proteomic datasets generated through the Clinical Proteomic Tumor Analysis Consortium at the peptide, gene, and protein levels. P-MartCancer is deployed as a web service (https://pmart.labworks.org/cptac.html) and is alternatively available via Docker Hub (https://hub.docker.com/r/pnnl/pmart-web/). Cancer Res; 77(21); e47-50.

  4. UMI-tools: modeling sequencing errors in Unique Molecular Identifiers to improve quantification accuracy

    PubMed Central

    2017-01-01

    Unique Molecular Identifiers (UMIs) are random oligonucleotide barcodes that are increasingly used in high-throughput sequencing experiments. Through a UMI, identical copies arising from distinct molecules can be distinguished from those arising through PCR amplification of the same molecule. However, bioinformatic methods to leverage the information from UMIs have yet to be formalized. In particular, sequencing errors in the UMI sequence are often ignored, or else resolved in an ad hoc manner. We show that errors in the UMI sequence are common and introduce network-based methods to account for these errors when identifying PCR duplicates. Using these methods, we demonstrate improved quantification accuracy both under simulated conditions and on real iCLIP and single-cell RNA-seq data sets. Reproducibility between iCLIP replicates and single-cell RNA-seq clustering are both improved using our proposed network-based method, demonstrating the value of properly accounting for errors in UMIs. These methods are implemented in the open-source UMI-tools software package. PMID:28100584
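
    A widely used network-based approach of the kind described, UMI-tools' "directional" adjacency method, links a UMI to a neighbor one mismatch away when the more abundant UMI has at least twice the neighbor's count (minus one), then treats each connected component as one molecule. A simplified sketch (quadratic-time pair comparison, suitable for small examples only):

```python
def hamming(a, b):
    """Number of mismatched positions between two equal-length UMIs."""
    return sum(x != y for x, y in zip(a, b))

def directional_dedup(umi_counts):
    """Directional-style deduplication: link UMIs one mismatch apart when
    count(a) >= 2*count(b) - 1 (a more abundant), then count connected
    components as distinct molecules."""
    umis = sorted(umi_counts, key=umi_counts.get, reverse=True)
    parent = {u: u for u in umis}
    def find(u):
        while parent[u] != u:
            parent[u] = parent[parent[u]]
            u = parent[u]
        return u
    for i, a in enumerate(umis):
        for b in umis[i + 1:]:
            if hamming(a, b) == 1 and umi_counts[a] >= 2 * umi_counts[b] - 1:
                parent[find(b)] = find(a)   # merge the likely error UMI into a
    return len({find(u) for u in umis})

# "AAAT" is a likely sequencing error of the abundant "AAAA"; "CCCC" is real
n_molecules = directional_dedup({"AAAA": 100, "AAAT": 2, "CCCC": 50})   # → 2
```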

  5. Multiplex quantification of four DNA targets in one reaction with Bio-Rad droplet digital PCR system for GMO detection

    PubMed Central

    Dobnik, David; Štebih, Dejan; Blejec, Andrej; Morisset, Dany; Žel, Jana

    2016-01-01

    The advantages of digital PCR technology are by now well documented. One way to achieve better cost efficiency of the technique is to use it in a multiplexing strategy. Droplet digital PCR platforms, which include two fluorescence filters, support at least duplex reactions, and with some development and optimization higher multiplexing is possible. The present study not only shows the development of multiplex assays in droplet digital PCR, but also presents a first thorough evaluation of several parameters in such multiplex digital PCR. Two 4-plex assays were developed for quantification of 8 different DNA targets (7 genetically modified maize events and a maize endogene). In each assay, two of the targets were labelled with one fluorophore and two with another. As current analysis software does not support analysis beyond duplex reactions, a new R- and Shiny-based web application analysis tool (http://bit.ly/ddPCRmulti) was developed that automates the analysis of 4-plex results. In conclusion, the two developed multiplex assays are suitable for quantification of GMO maize events, and the same approach can be used in any other field with a need for accurate and reliable quantification of multiple DNA targets. PMID:27739510

  6. MRM-DIFF: data processing strategy for differential analysis in large scale MRM-based lipidomics studies.

    PubMed

    Tsugawa, Hiroshi; Ohta, Erika; Izumi, Yoshihiro; Ogiwara, Atsushi; Yukihira, Daichi; Bamba, Takeshi; Fukusaki, Eiichiro; Arita, Masanori

    2014-01-01

    Based on theoretically calculated comprehensive lipid libraries, as many as 1,000 multiple reaction monitoring (MRM) transitions can be monitored in a single lipidomics run. On the other hand, lipid analysis from each MRM chromatogram requires tremendous manual effort to identify and quantify lipid species, and isotopic peaks differing by up to a few atomic masses further complicate the analysis. To accelerate the identification and quantification process, we developed novel software, MRM-DIFF, for the differential analysis of large-scale MRM assays. It supports a correlation optimized warping (COW) algorithm to align MRM chromatograms and utilizes quality control (QC) sample datasets to automatically adjust the alignment parameters. Moreover, user-defined reference libraries that include the molecular formula, retention time, and MRM transition can be used to identify target lipids and to correct peak abundances by considering isotopic peaks. Here, we demonstrate the software pipeline and introduce key points for MRM-based lipidomics research to reduce the mis-identification and overestimation of lipid profiles. The MRM-DIFF program, an example data set and tutorials are downloadable from the "Standalone software" section of the PRIMe (Platform for RIKEN Metabolomics, http://prime.psc.riken.jp/) database website.
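
    Chromatogram alignment of the sort COW performs can be illustrated, in a much simplified form, by finding the integer scan shift that best overlays a trace on a reference. COW additionally stretches and compresses segments; this sketch only translates, and the names are hypothetical:

```python
def best_shift(reference, trace, max_shift=5):
    """Integer scan shift of `trace` that maximizes overlap (dot product)
    with `reference`; a translation-only stand-in for segment warping."""
    def overlap(shift):
        return sum(r * trace[i + shift]
                   for i, r in enumerate(reference)
                   if 0 <= i + shift < len(trace))
    return max(range(-max_shift, max_shift + 1), key=overlap)

peak = [0, 0, 1, 4, 9, 4, 1, 0, 0, 0]          # reference chromatogram
late = [0, 0, 0, 0, 1, 4, 9, 4, 1, 0]          # same peak eluting 2 scans later
shift = best_shift(peak, late)                 # → 2
```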

  7. A novel informatics concept for high-throughput shotgun lipidomics based on the molecular fragmentation query language

    PubMed Central

    2011-01-01

    Shotgun lipidome profiling relies on direct mass spectrometric analysis of total lipid extracts from cells, tissues or organisms and is a powerful tool to elucidate the molecular composition of lipidomes. We present a novel informatics concept of the molecular fragmentation query language implemented within the LipidXplorer open source software kit that supports accurate quantification of individual species of any ionizable lipid class in shotgun spectra acquired on any mass spectrometry platform. PMID:21247462

  8. A neurite quality index and machine vision software for improved quantification of neurodegeneration.

    PubMed

    Romero, Peggy; Miller, Ted; Garakani, Arman

    2009-12-01

    Current methods to assess neurodegeneration in dorsal root ganglion cultures as a model for neurodegenerative diseases are imprecise and time-consuming. Here we describe two new methods to quantify neuroprotection in these cultures. The neurite quality index (NQI) builds upon earlier manual methods, incorporating additional morphological criteria to increase sensitivity for the detection of early degeneration events. Neurosight is a machine vision-based method that recapitulates many of the strengths of the NQI while enabling high-throughput screening applications at decreased cost.

  9. Development and validation of an open source quantification tool for DSC-MRI studies.

    PubMed

    Gordaliza, P M; Mateos-Pérez, J M; Montesinos, P; Guzmán-de-Villoria, J A; Desco, M; Vaquero, J J

    2015-03-01

    This work presents the development of an open-source tool for the quantification of dynamic susceptibility-weighted contrast-enhanced (DSC) perfusion studies. The development of this tool is motivated by the lack of open-source tools implemented on open platforms that allow external developers to add their own quantification methods easily and without paying for a development license. The quantification tool was developed as a plugin for the ImageJ image analysis platform using the Java programming language. A modular approach was used in the implementation of the components, so that new methods can be added without breaking any of the existing functionality. For the validation process, images from seven patients with brain tumors were acquired and quantified with the presented tool and with a widely used clinical software package, and the resulting perfusion parameters were compared. Perfusion parameters and the corresponding parametric images were obtained. When no gamma-fitting is used, excellent agreement with the tool used as the gold standard was obtained (R(2)>0.8, and values are within the 95% CI limits in Bland-Altman plots). An open-source tool that performs quantification of perfusion studies using magnetic resonance imaging has been developed and validated against a clinical software package. It works as an ImageJ plugin and the source code has been published under an open-source license.
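
    The standard DSC-MRI quantification chain that such a plugin implements starts by converting the signal drop during bolus passage into a contrast-agent concentration, C(t) = -ln(S(t)/S0)/TE, and integrating it to obtain a relative cerebral blood volume. A minimal sketch with toy numbers (not the plugin's Java code):

```python
import math

def dsc_concentration(signal, s0, te):
    """Contrast-agent concentration (arbitrary units) from the DSC-MRI
    signal drop: C(t) = -ln(S(t)/S0) / TE."""
    return [-math.log(s / s0) / te for s in signal]

def relative_cbv(conc, dt):
    """Relative cerebral blood volume: area under the concentration curve
    (trapezoid rule)."""
    return sum(0.5 * (conc[i] + conc[i + 1]) * dt for i in range(len(conc) - 1))

s0, te, dt = 100.0, 0.03, 1.5        # baseline signal, TE (s), frame spacing (s)
bolus = [100.0, 90.0, 60.0, 40.0, 60.0, 90.0, 100.0]
cbv = relative_cbv(dsc_concentration(bolus, s0, te), dt)
```

    Absolute quantification would further require an arterial input function and deconvolution, which is where a gamma-variate fit optionally enters the pipeline.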

  10. SU-C-201-04: Quantification of Perfusion Heterogeneity Based On Texture Analysis for Fully Automatic Detection of Ischemic Deficits From Myocardial Perfusion Imaging

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Fang, Y; Huang, H; Su, T

    Purpose: Texture-based quantification of image heterogeneity has been a popular topic for imaging studies in recent years. As previous studies mainly focus on oncological applications, we report our recent efforts to apply such techniques to cardiac perfusion imaging. A fully automated procedure has been developed to perform texture analysis for measuring image heterogeneity, and clinical data were used to evaluate its preliminary performance. Methods: Myocardial perfusion images of Thallium-201 scans were collected from 293 patients with suspected coronary artery disease. Each subject underwent a Tl-201 scan and a percutaneous coronary intervention (PCI) within three months. The PCI result was used as the gold standard for coronary ischemia, defined as more than 70% stenosis. Each Tl-201 scan was spatially normalized to an image template for fully automatic segmentation of the LV. The segmented voxel intensities were then carried into the texture analysis with our open-source software Chang Gung Image Texture Analysis toolbox (CGITA). To evaluate the clinical performance of the image heterogeneity for detecting coronary stenosis, receiver operating characteristic (ROC) analysis was used to compute the overall accuracy, sensitivity and specificity as well as the area under the curve (AUC). Those indices were compared to those obtained from the commercially available semi-automatic software QPS. Results: With the fully automatic procedure to quantify heterogeneity from Tl-201 scans, we achieved good discrimination, with accuracy of 74%, sensitivity of 73%, specificity of 77% and an AUC of 0.82. This performance is similar to that of the semi-automatic QPS software, which gives a sensitivity of 71% and a specificity of 77%.
    Conclusion: Based on fully automatic data processing procedures, our preliminary data indicate that the image heterogeneity of myocardial perfusion imaging can provide useful information for automatic determination of myocardial ischemia.
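
    Texture-based heterogeneity measures of the kind used here are typically derived from a gray-level co-occurrence matrix (GLCM); for example, GLCM contrast grows as neighboring voxels differ more in intensity. A minimal 2-D sketch (illustrative only; CGITA computes many more features):

```python
def glcm(img, dr=0, dc=1, levels=4):
    """Normalized gray-level co-occurrence matrix for one pixel offset."""
    counts = [[0.0] * levels for _ in range(levels)]
    total = 0
    for r in range(len(img)):
        for c in range(len(img[0])):
            r2, c2 = r + dr, c + dc
            if 0 <= r2 < len(img) and 0 <= c2 < len(img[0]):
                counts[img[r][c]][img[r2][c2]] += 1
                total += 1
    return [[v / total for v in row] for row in counts]

def contrast(p):
    """GLCM contrast: large when neighboring intensities differ strongly,
    i.e. when the uptake pattern is heterogeneous."""
    return sum(p[i][j] * (i - j) ** 2
               for i in range(len(p)) for j in range(len(p)))

uniform = [[1, 1], [1, 1]]    # homogeneous uptake -> contrast 0
mottled = [[0, 3], [3, 0]]    # heterogeneous uptake -> contrast 9
```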

  11. Intramyocellular lipid quantification: repeatability with 1H MR spectroscopy.

    PubMed

    Torriani, Martin; Thomas, Bijoy J; Halpern, Elkan F; Jensen, Megan E; Rosenthal, Daniel I; Palmer, William E

    2005-08-01

    To prospectively determine the repeatability and variability of tibialis anterior intramyocellular lipid (IMCL) quantifications performed by using 1.5-T hydrogen 1 (1H) magnetic resonance (MR) spectroscopy in healthy subjects. Institutional review board approval and written informed consent were obtained for this Health Insurance Portability and Accountability Act-compliant study. The authors examined the anterior tibial muscles of 27 healthy subjects aged 19-48 years (12 men, 15 women; mean age, 25 years) by using single-voxel short-echo-time point-resolved 1H MR spectroscopy. During a first visit, the subjects underwent 1H MR spectroscopy before and after being repositioned in the magnet bore, with voxels carefully placed on the basis of osseous landmarks. Measurements were repeated after a mean interval of 12 days. All spectra were fitted by using Java-based MR user interface (jMRUI) and LCModel software, and lipid peaks were scaled to the unsuppressed water peak (at 4.7 ppm) and the total creatine peak (at approximately 3.0 ppm). A one-way random-effects variance components model was used to determine intraday and intervisit coefficients of variation (CVs). A power analysis was performed to determine the detectable percentage change in lipid measurements for two subject sample sizes. Measurements of the IMCL methylene protons peak at a resonance of 1.3 ppm scaled to the unsuppressed water peak (IMCL(W)) that were obtained by using jMRUI software yielded the lowest CVs overall (intraday and intervisit CVs, 13.4% and 14.4%, respectively). The random-effects variance components model revealed that nonbiologic factors (equipment and repositioning) accounted for 50% of the total variability in IMCL quantifications. Power analysis for a sample size of 20 subjects revealed that changes in IMCL(W) of greater than 15% could be confidently detected between 1H MR spectroscopic measurements obtained on different days. 
1H MR spectroscopy is feasible for repeatable quantification of IMCL concentrations in longitudinal studies of muscle metabolism.

  12. Designing Control System Application Software for Change

    NASA Technical Reports Server (NTRS)

    Boulanger, Richard

    2001-01-01

    The Unified Modeling Language (UML) was used to design the Environmental Systems Test Stand (ESTS) control system software. The UML was chosen for its ability to facilitate a clear dialog between software designer and customer, from which requirements are discovered and documented in a manner that transposes directly to program objects. Applying the UML to control system software design has resulted in a baseline set of documents from which change, and the effort of that change, can be accurately measured. As the Environmental Systems Test Stand evolves, accurate estimates of the time and effort required to change the control system software will be made. Accurate quantification of the cost of software change can be made before implementation, improving schedule and budget accuracy.

  13. The Infeasibility of Quantifying the Reliability of Life-Critical Real-Time Software

    NASA Technical Reports Server (NTRS)

    Butler, Ricky W.; Finelli, George B.

    1991-01-01

    This paper affirms that the quantification of life-critical software reliability is infeasible using statistical methods, whether applied to standard software or fault-tolerant software. The classical methods of estimating reliability are shown to lead to exorbitant amounts of testing when applied to life-critical software. Reliability growth models are examined and also shown to be incapable of overcoming the need for excessive amounts of testing. The key assumption of software fault tolerance, that separately programmed versions fail independently, is shown to be problematic. This assumption cannot be justified by experimentation in the ultrareliability region, and subjective arguments in its favor are not sufficiently strong to justify it as an axiom. The implications of the recent multi-version software experiments also support this affirmation.
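
    The amount of testing at issue is easy to quantify: under an exponential failure model, demonstrating a failure-rate bound lambda at a given confidence with zero observed failures requires roughly t = -ln(1 - confidence)/lambda hours of testing. A quick illustration of why this is infeasible for a 1e-9 per hour requirement (a standard back-of-the-envelope argument, not the paper's exact derivation):

```python
import math

def failure_free_hours(rate_bound, confidence=0.99):
    """Failure-free test time needed to support lambda < rate_bound at the
    given confidence under an exponential failure model."""
    return -math.log(1.0 - confidence) / rate_bound

hours = failure_free_hours(1e-9)        # ~4.6e9 hours of failure-free testing
years = hours / (24 * 365)              # roughly half a million years
```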

  14. Statistical modeling of software reliability

    NASA Technical Reports Server (NTRS)

    Miller, Douglas R.

    1992-01-01

    This working paper discusses the statistical simulation part of a controlled software development experiment being conducted under the direction of the System Validation Methods Branch, Information Systems Division, NASA Langley Research Center. The experiment uses guidance and control software (GCS) aboard a fictitious planetary landing spacecraft: real-time control software operating on a transient mission. Software execution is simulated to study the statistical aspects of reliability and other failure characteristics of the software during development, testing, and random usage. Quantification of software reliability is a major goal. Various reliability concepts are discussed. Experiments are described for performing simulations and collecting appropriate simulated software performance and failure data. This data is then used to make statistical inferences about the quality of the software development and verification processes as well as inferences about the reliability of software versions and reliability growth under random testing and debugging.

  15. Mapping proteins in the presence of paralogs using units of coevolution

    PubMed Central

    2013-01-01

    Background We study the problem of mapping proteins between two protein families in the presence of paralogs. This problem occurs as a difficult subproblem in coevolution-based computational approaches for protein-protein interaction prediction. Results Similar to prior approaches, our method is based on the idea that coevolution implies equal rates of sequence evolution among the interacting proteins, and we provide a first attempt to quantify this notion in a formal statistical manner. We call the units that are central to this quantification scheme the units of coevolution. A unit consists of two mapped protein pairs and its score quantifies the coevolution of the pairs. This quantification allows us to provide a maximum likelihood formulation of the paralog mapping problem and to cast it into a binary quadratic programming formulation. Conclusion CUPID, our software tool based on a Lagrangian relaxation of this formulation, makes it, for the first time, possible to compute state-of-the-art quality pairings in a few minutes of runtime. In summary, we suggest a novel alternative to the earlier available approaches, which is statistically sound and computationally feasible. PMID:24564758
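    The notion of "equal rates of sequence evolution" behind a unit of coevolution can be sketched as a penalty on diverging evolutionary distances. This is a simplified illustration under assumed inputs (tuple-keyed distance dictionaries), not CUPID's actual likelihood model.

```python
import itertools

def unit_scores(dist_a, dist_b, mapping):
    """For each unit (two mapped protein pairs), score how closely the
    evolutionary distances agree between the two families; 0 means the
    mapped pairs evolved at identical rates.
    dist_a, dist_b: distances keyed by sorted protein-name tuples.
    mapping: dict from family-A proteins to their family-B partners."""
    scores = {}
    for i, j in itertools.combinations(sorted(mapping), 2):
        scores[(i, j)] = abs(dist_a[(i, j)] - dist_b[(mapping[i], mapping[j])])
    return scores
```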

  16. HTAPP: High-Throughput Autonomous Proteomic Pipeline

    PubMed Central

    Yu, Kebing; Salomon, Arthur R.

    2011-01-01

    Recent advances in the speed and sensitivity of mass spectrometers and in analytical methods, the exponential acceleration of computer processing speeds, and the availability of genomic databases from an array of species and protein information databases have led to a deluge of proteomic data. The development of a lab-based automated proteomic software platform for the automated collection, processing, storage, and visualization of expansive proteomic datasets is critically important. The high-throughput autonomous proteomic pipeline (HTAPP) described here is designed from the ground up to provide critically important flexibility for diverse proteomic workflows and to streamline the total analysis of a complex proteomic sample. This tool is comprised of software that controls the acquisition of mass spectral data along with automation of post-acquisition tasks such as peptide quantification, clustered MS/MS spectral database searching, statistical validation, and data exploration within a user-configurable lab-based relational database. The software design of HTAPP focuses on accommodating diverse workflows and providing missing software functionality to a wide range of proteomic researchers to accelerate the extraction of biological meaning from immense proteomic data sets. Although individual software modules in our integrated technology platform may have some similarities to existing tools, the true novelty of the approach described here is in the synergistic and flexible combination of these tools to provide an integrated and efficient analysis of proteomic samples. PMID:20336676

  17. pyQms enables universal and accurate quantification of mass spectrometry data.

    PubMed

    Leufken, Johannes; Niehues, Anna; Sarin, L Peter; Wessel, Florian; Hippler, Michael; Leidel, Sebastian A; Fufezan, Christian

    2017-10-01

    Quantitative mass spectrometry (MS) is a key technique in many research areas (1), including proteomics, metabolomics, glycomics, and lipidomics. Because all of the corresponding molecules can be described by chemical formulas, universal quantification tools are highly desirable. Here, we present pyQms, an open-source software for accurate quantification of all types of molecules measurable by MS. pyQms uses isotope pattern matching that offers an accurate quality assessment of all quantifications and the ability to directly incorporate mass spectrometer accuracy. pyQms is, due to its universal design, applicable to every research field, labeling strategy, and acquisition technique. This opens ultimate flexibility for researchers to design experiments employing innovative and hitherto unexplored labeling strategies. Importantly, pyQms performs very well to accurately quantify partially labeled proteomes in large scale and high throughput, the most challenging task for a quantification algorithm. © 2017 by The American Society for Biochemistry and Molecular Biology, Inc.
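    The core of isotope pattern matching is comparing a measured isotope envelope against the theoretical pattern derived from a chemical formula. The cosine similarity below is a hedged stand-in for pyQms's actual matching score, whose exact definition differs.

```python
import math

def pattern_match_score(measured, theoretical):
    """Cosine similarity between a measured isotope envelope and the
    theoretical pattern; 1.0 means the intensity ratios match exactly,
    independent of absolute scale."""
    dot = sum(m * t for m, t in zip(measured, theoretical))
    norm = (math.sqrt(sum(m * m for m in measured))
            * math.sqrt(sum(t * t for t in theoretical)))
    return dot / norm if norm else 0.0
```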

  18. freeQuant: A Mass Spectrometry Label-Free Quantification Software Tool for Complex Proteome Analysis.

    PubMed

    Deng, Ning; Li, Zhenye; Pan, Chao; Duan, Huilong

    2015-01-01

    The study of complex proteomes places greater demands on mass spectrometry-based quantification methods. In this paper, we present a mass spectrometry label-free quantification tool for complex proteomes, called freeQuant, which effectively integrates quantification with functional analysis. freeQuant consists of two well-integrated modules: label-free quantification and functional analysis with biomedical knowledge. freeQuant supports label-free quantitative analysis which makes full use of tandem mass spectrometry (MS/MS) spectral count, protein sequence length, shared peptides, and ion intensity. It adopts spectral counting for quantitative analysis and introduces a new method for shared peptides to accurately evaluate the abundance of isoforms. For proteins with low abundance, MS/MS total ion count coupled with spectral count is included to ensure accurate protein quantification. Furthermore, freeQuant supports large-scale functional annotation for complex proteomes. Mitochondrial proteomes from the mouse heart, the mouse liver, and the human heart were used to evaluate the usability and performance of freeQuant. The evaluation showed that the quantitative algorithms implemented in freeQuant improve the accuracy of quantification with a better dynamic range.
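    Combining spectral counts with protein sequence length is the idea behind the classical normalized spectral abundance factor (NSAF), sketched below as an illustration; freeQuant's own formulas additionally handle shared peptides and ion intensity and are not reproduced here.

```python
def nsaf(spectral_counts, lengths):
    """Normalized spectral abundance factor: each protein's spectral
    count divided by its sequence length, normalized so the values
    sum to 1 across the sample."""
    saf = {p: spectral_counts[p] / lengths[p] for p in spectral_counts}
    total = sum(saf.values())
    return {p: v / total for p, v in saf.items()}
```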

  19. A new semiquantitative method for evaluation of metastasis progression.

    PubMed

    Volarevic, A; Ljujic, B; Volarevic, V; Milovanovic, M; Kanjevac, T; Lukic, A; Arsenijevic, N

    2012-01-01

    Although recent technical advancements are directed toward developing novel assays and methods for detection of micro and macro metastasis, there are still no reports of reliable, simple to use imaging software that could be used for the detection and quantification of metastasis in tissue sections. We herein report a new semiquantitative method for evaluation of metastasis progression in a well established 4T1 orthotopic mouse model of breast cancer metastasis. The new semiquantitative method presented here was implemented by using the Autodesk AutoCAD 2012 program, a computer-aided design program used primarily for preparing technical drawings in 2 dimensions. By using the Autodesk AutoCAD 2012 software- aided graphical evaluation we managed to detect each metastatic lesion and we precisely calculated the average percentage of lung and liver tissue parenchyma with metastasis in 4T1 tumor-bearing mice. The data were highly specific and relevant to descriptive histological analysis, confirming reliability and accuracy of the AutoCAD 2012 software as new method for quantification of metastatic lesions. The new semiquantitative method using AutoCAD 2012 software provides a novel approach for the estimation of metastatic progression in histological tissue sections.

  20. Segmentation and quantification of subcellular structures in fluorescence microscopy images using Squassh.

    PubMed

    Rizk, Aurélien; Paul, Grégory; Incardona, Pietro; Bugarski, Milica; Mansouri, Maysam; Niemann, Axel; Ziegler, Urs; Berger, Philipp; Sbalzarini, Ivo F

    2014-03-01

    Detection and quantification of fluorescently labeled molecules in subcellular compartments is a key step in the analysis of many cell biological processes. Pixel-wise colocalization analyses, however, are not always suitable, because they do not provide object-specific information, and they are vulnerable to noise and background fluorescence. Here we present a versatile protocol for a method named 'Squassh' (segmentation and quantification of subcellular shapes), which is used for detecting, delineating and quantifying subcellular structures in fluorescence microscopy images. The workflow is implemented in freely available, user-friendly software. It works on both 2D and 3D images, accounts for the microscope optics and for uneven image background, computes cell masks and provides subpixel accuracy. The Squassh software enables both colocalization and shape analyses. The protocol can be applied in batch, on desktop computers or computer clusters, and it usually requires <1 min and <5 min for 2D and 3D images, respectively. Basic computer-user skills and some experience with fluorescence microscopy are recommended to successfully use the protocol.

  1. An overview of the mathematical and statistical analysis component of RICIS

    NASA Technical Reports Server (NTRS)

    Hallum, Cecil R.

    1987-01-01

    Mathematical and statistical analysis components of RICIS (Research Institute for Computing and Information Systems) can be used in the following problem areas: (1) quantification and measurement of software reliability; (2) assessment of changes in software reliability over time (reliability growth); (3) analysis of software-failure data; and (4) decision logic for whether to continue or stop testing software. Other areas of interest to NASA/JSC where mathematical and statistical analysis can be successfully employed include: math modeling of physical systems, simulation, statistical data reduction, evaluation methods, optimization, algorithm development, and mathematical methods in signal processing.

  2. Multidimensional Recurrence Quantification Analysis (MdRQA) for the Analysis of Multidimensional Time-Series: A Software Implementation in MATLAB and Its Application to Group-Level Data in Joint Action

    PubMed Central

    Wallot, Sebastian; Roepstorff, Andreas; Mønster, Dan

    2016-01-01

    We introduce Multidimensional Recurrence Quantification Analysis (MdRQA) as a tool to analyze multidimensional time-series data. We show how MdRQA can be used to capture the dynamics of high-dimensional signals, and how MdRQA can be used to assess coupling between two or more variables. In particular, we describe applications of the method in research on joint and collective action, as it provides a coherent analysis framework to systematically investigate dynamics at different group levels—from individual dynamics, to dyadic dynamics, up to global group-level of arbitrary size. The Appendix in Supplementary Material contains a software implementation in MATLAB to calculate MdRQA measures. PMID:27920748

  3. Multidimensional Recurrence Quantification Analysis (MdRQA) for the Analysis of Multidimensional Time-Series: A Software Implementation in MATLAB and Its Application to Group-Level Data in Joint Action.

    PubMed

    Wallot, Sebastian; Roepstorff, Andreas; Mønster, Dan

    2016-01-01

    We introduce Multidimensional Recurrence Quantification Analysis (MdRQA) as a tool to analyze multidimensional time-series data. We show how MdRQA can be used to capture the dynamics of high-dimensional signals, and how MdRQA can be used to assess coupling between two or more variables. In particular, we describe applications of the method in research on joint and collective action, as it provides a coherent analysis framework to systematically investigate dynamics at different group levels, from individual dynamics, to dyadic dynamics, up to global group-level of arbitrary size. The Appendix in Supplementary Material contains a software implementation in MATLAB to calculate MdRQA measures.
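    The starting point of any recurrence quantification analysis is a binary recurrence matrix over the (here multidimensional) state vectors. The sketch below shows the matrix and the most basic measure, recurrence rate; the reference implementation in the paper is in MATLAB, so this Python version is only an illustrative translation of the concept.

```python
import math

def recurrence_matrix(series, radius):
    """Binary recurrence matrix for a multidimensional time series:
    entry (i, j) is 1 when the d-dimensional states at times i and j
    lie within `radius` of each other (Euclidean norm)."""
    n = len(series)
    return [[1 if math.dist(series[i], series[j]) <= radius else 0
             for j in range(n)] for i in range(n)]

def recurrence_rate(rm):
    """Fraction of recurrent points in the matrix."""
    n = len(rm)
    return sum(map(sum, rm)) / (n * n)
```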

  4. The software-cycle model for re-engineering and reuse

    NASA Technical Reports Server (NTRS)

    Bailey, John W.; Basili, Victor R.

    1992-01-01

    This paper reports on the progress of a study which will contribute to our ability to perform high-level, component-based programming by describing means to obtain useful components, methods for the configuration and integration of those components, and an underlying economic model of the costs and benefits associated with this approach to reuse. One goal of the study is to develop and demonstrate methods to recover reusable components from domain-specific software through a combination of tools, to perform the identification, extraction, and re-engineering of components, and domain experts, to direct the applications of those tools. A second goal of the study is to enable the reuse of those components by identifying techniques for configuring and recombining the re-engineered software. This component-recovery or software-cycle model addresses not only the selection and re-engineering of components, but also their recombination into new programs. Once a model of reuse activities has been developed, the quantification of the costs and benefits of various reuse options will enable the development of an adaptable economic model of reuse, which is the principal goal of the overall study. This paper reports on the conception of the software-cycle model and on several supporting techniques of software recovery, measurement, and reuse which will lead to the development of the desired economic model.

  5. Mapping and Quantification of Vascular Branching in Plants, Animals and Humans by VESGEN Software

    NASA Technical Reports Server (NTRS)

    Parsons-Wingerter, Patricia A.; Vickerman, Mary B.; Keith, Patricia A.

    2010-01-01

    Humans face daunting challenges in the successful exploration and colonization of space, including adverse alterations in gravity and radiation. The Earth-determined biology of humans, animals and plants is significantly modified in such extraterrestrial environments. One physiological requirement shared by humans with larger plants and animals is a complex, highly branching vascular system that is dynamically responsive to cellular metabolism, immunological protection and specialized cellular/tissue function. The VESsel GENeration (VESGEN) Analysis has been developed as a mature beta version, pre-release research software for mapping and quantification of the fractal-based complexity of vascular branching. Alterations in vascular branching pattern can provide informative read-outs of altered vascular regulation. Originally developed for biomedical applications in angiogenesis, VESGEN 2D has provided novel insights into the cytokine, transgenic and therapeutic regulation of angiogenesis, lymphangiogenesis and other microvascular remodeling phenomena. Vascular trees, networks and tree-network composites are mapped and quantified. Applications include disease progression from clinical ophthalmic images of the human retina; experimental regulation of vascular remodeling in the mouse retina; avian and mouse coronary vasculature, and other experimental models in vivo. We envision that altered branching in the leaves of plants studied on the ISS, such as Arabidopsis thaliana, can also be analyzed.

  6. Mapping and Quantification of Vascular Branching in Plants, Animals and Humans by VESGEN Software

    NASA Technical Reports Server (NTRS)

    Parsons-Wingerter, P. A.; Vickerman, M. B.; Keith, P. A.

    2010-01-01

    Humans face daunting challenges in the successful exploration and colonization of space, including adverse alterations in gravity and radiation. The Earth-determined biology of plants, animals and humans is significantly modified in such extraterrestrial environments. One physiological requirement shared by larger plants and animals with humans is a complex, highly branching vascular system that is dynamically responsive to cellular metabolism, immunological protection and specialized cellular/tissue function. VESsel GENeration (VESGEN) Analysis has been developed as a mature beta version, pre-release research software for mapping and quantification of the fractal-based complexity of vascular branching. Alterations in vascular branching pattern can provide informative read-outs of altered vascular regulation. Originally developed for biomedical applications in angiogenesis, VESGEN 2D has provided novel insights into the cytokine, transgenic and therapeutic regulation of angiogenesis, lymphangiogenesis and other microvascular remodeling phenomena. Vascular trees, networks and tree-network composites are mapped and quantified. Applications include disease progression from clinical ophthalmic images of the human retina; experimental regulation of vascular remodeling in the mouse retina; avian and mouse coronary vasculature, and other experimental models in vivo. We envision that altered branching in the leaves of plants studied on the ISS, such as Arabidopsis thaliana, can also be analyzed.

  7. Application of synchrotron radiation computed microtomography for quantification of bone microstructure in human and rat bones

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Parreiras Nogueira, Liebert; Barroso, Regina Cely; Pereira de Almeida, Andre

    2012-05-17

    This work aims to evaluate histomorphometric quantification by synchrotron radiation computed microtomography in bones of human and rat specimens. Bone specimens are classified as normal or pathological (for human samples) and irradiated or non-irradiated (for rat samples). The human specimens were either affected by some injury or unaffected; the rat specimens were either irradiated, simulating radiotherapy procedures, or untreated. Images were obtained on the SYRMEP beamline at the Elettra Synchrotron Laboratory in Trieste, Italy. The system generated 14 μm tomographic images. The quantification of bone structures was performed directly on the 3D rendered images using home-made software. The resolution achieved was excellent, which facilitated the quantification of bone microstructures.

  8. Statistical Validation for Clinical Measures: Repeatability and Agreement of Kinect™-Based Software.

    PubMed

    Lopez, Natalia; Perez, Elisa; Tello, Emanuel; Rodrigo, Alejandro; Valentinuzzi, Max E

    2018-01-01

    The rehabilitation process is a fundamental stage for recovery of people's capabilities. However, the evaluation of the process is performed by physiatrists and medical doctors, mostly based on their observations, that is, a subjective appreciation of the patient's evolution. This paper proposes a tracking platform of the movement made by an individual's upper limb using Kinect sensor(s) to be applied for the patient during the rehabilitation process. The main contribution is the development of quantifying software and the statistical validation of its performance, repeatability, and clinical use in the rehabilitation process. The software determines joint angles and upper limb trajectories for the construction of a specific rehabilitation protocol and quantifies the treatment evolution. In turn, the information is presented via a graphical interface that allows the recording, storage, and report of the patient's data. For clinical purposes, the software information is statistically validated with three different methodologies, comparing the measures with a goniometer in terms of agreement and repeatability. The agreement of joint angles measured with the proposed software and goniometer is evaluated with Bland-Altman plots; all measurements fell well within the limits of agreement, meaning interchangeability of both techniques. Additionally, the results of Bland-Altman analysis of repeatability show 95% confidence. Finally, the physiotherapists' qualitative assessment shows encouraging results for the clinical use. The main conclusion is that the software is capable of offering a clinical history of the patient and is useful for quantification of the rehabilitation success. The simplicity, low cost, and visualization possibilities enhance the use of the software Kinect for rehabilitation and other applications, and the expert's opinion endorses the choice of our approach for clinical practice. Comparison of the new measurement technique with established goniometric methods determines that the proposed software agrees sufficiently to be used interchangeably.
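    The Bland-Altman limits of agreement used in this validation can be computed directly from paired measurements, as in this minimal sketch (the input values are hypothetical):

```python
import statistics

def bland_altman_limits(method_a, method_b):
    """Bias and 95% limits of agreement between two paired measurement
    methods (e.g. sensor-derived joint angles vs. a goniometer)."""
    diffs = [a - b for a, b in zip(method_a, method_b)]
    bias = statistics.mean(diffs)
    sd = statistics.stdev(diffs)
    return bias, bias - 1.96 * sd, bias + 1.96 * sd
```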

  9. Practical considerations of image analysis and quantification of signal transduction IHC staining.

    PubMed

    Grunkin, Michael; Raundahl, Jakob; Foged, Niels T

    2011-01-01

    The dramatic increase in computer processing power in combination with the availability of high-quality digital cameras during the last 10 years has fertilized the grounds for quantitative microscopy based on digital image analysis. With the present introduction of robust scanners for whole slide imaging in both research and routine, the benefits of automation and objectivity in the analysis of tissue sections will be even more obvious. For in situ studies of signal transduction, the combination of tissue microarrays, immunohistochemistry, digital imaging, and quantitative image analysis will be central operations. However, immunohistochemistry is a multistep procedure including a lot of technical pitfalls leading to intra- and interlaboratory variability of its outcome. The resulting variations in staining intensity and disruption of original morphology are an extra challenge for the image analysis software, which therefore preferably should be dedicated to the detection and quantification of histomorphometrical end points.

  10. MaxReport: An Enhanced Proteomic Result Reporting Tool for MaxQuant.

    PubMed

    Zhou, Tao; Li, Chuyu; Zhao, Wene; Wang, Xinru; Wang, Fuqiang; Sha, Jiahao

    2016-01-01

    MaxQuant is proteomics software widely used for large-scale tandem mass spectrometry data. We have designed and developed an enhanced result-reporting tool for MaxQuant, named MaxReport. This tool optimizes the results of MaxQuant and provides additional functions for result interpretation. MaxReport can generate report tables for protein N-terminal modifications. It also supports isobaric-labeling-based relative quantification at the protein, peptide or site level. To provide an overview of the results, MaxReport performs general descriptive statistical analyses of both identification and quantification results. The output of MaxReport is well organized and therefore helps proteomic users better understand and share their data. The script of MaxReport, which is freely available at http://websdoor.net/bioinfo/maxreport/, is written in Python and is compatible across multiple systems including Windows and Linux.

  11. Remote sensing research for agricultural applications. [San Joaquin County, California and Snake River Plain and Twin Falls area, Idaho

    NASA Technical Reports Server (NTRS)

    Colwell, R. N. (Principal Investigator); Wall, S. L.; Beck, L. H.; Degloria, S. D.; Ritter, P. R.; Thomas, R. W.; Travlos, A. J.; Fakhoury, E.

    1984-01-01

    Materials and methods used to characterize selected soil properties and agricultural crops in San Joaquin County, California are described. Results show that: (1) the location and widths of TM bands are suitable for detecting differences in selected soil properties; (2) the number of TM spectral bands allows the quantification of soil spectral curve form and magnitude; and (3) the spatial and geometric quality of TM data allows for the discrimination and quantification of within field variability of soil properties. The design of the LANDSAT based multiple crop acreage estimation experiment for the Idaho Department of Water Resources is described including the use of U.C. Berkeley's Survey Modeling Planning Model. Progress made on Peditor software development on MIDAS, and cooperative computing using local and remote systems is reported as well as development of MIDAS microcomputer systems.

  12. TRIC: an automated alignment strategy for reproducible protein quantification in targeted proteomics

    PubMed Central

    Röst, Hannes L.; Liu, Yansheng; D’Agostino, Giuseppe; Zanella, Matteo; Navarro, Pedro; Rosenberger, George; Collins, Ben C.; Gillet, Ludovic; Testa, Giuseppe; Malmström, Lars; Aebersold, Ruedi

    2016-01-01

    Large scale, quantitative proteomic studies have become essential for the analysis of clinical cohorts, large perturbation experiments and systems biology studies. While next-generation mass spectrometric techniques such as SWATH-MS have substantially increased throughput and reproducibility, ensuring consistent quantification of thousands of peptide analytes across multiple LC-MS/MS runs remains a challenging and laborious manual process. To produce highly consistent and quantitatively accurate proteomics data matrices in an automated fashion, we have developed the TRIC software which utilizes fragment ion data to perform cross-run alignment, consistent peak-picking and quantification for high throughput targeted proteomics. TRIC uses a graph-based alignment strategy based on non-linear retention time correction to integrate peak elution information from all LC-MS/MS runs acquired in a study. When compared to state-of-the-art SWATH-MS data analysis, the algorithm was able to reduce the identification error by more than 3-fold at constant recall, while correcting for highly non-linear chromatographic effects. On a pulsed-SILAC experiment performed on human induced pluripotent stem (iPS) cells, TRIC was able to automatically align and quantify thousands of light and heavy isotopic peak groups and substantially increased the quantitative completeness and biological information in the data, providing insights into protein dynamics of iPS cells. Overall, this study demonstrates the importance of consistent quantification in highly challenging experimental setups, and proposes an algorithm to automate this task, constituting the last missing piece in a pipeline for automated analysis of massively parallel targeted proteomics datasets. PMID:27479329
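    Non-linear retention time correction of the kind TRIC performs can be sketched as piecewise-linear interpolation between anchor peptides shared across runs. This is a simplified stand-in with hypothetical anchor values, not TRIC's actual graph-based algorithm.

```python
import bisect

def align_rt(anchors, rt):
    """Transfer a retention time from run A to run B by piecewise-linear
    interpolation between shared anchor peptides.
    anchors: list of (rt_in_run_a, rt_in_run_b) pairs."""
    anchors = sorted(anchors)
    xs = [a for a, _ in anchors]
    ys = [b for _, b in anchors]
    i = bisect.bisect_right(xs, rt)
    i = max(1, min(i, len(xs) - 1))  # clamp to the outermost segment
    frac = (rt - xs[i - 1]) / (xs[i] - xs[i - 1])
    return ys[i - 1] + frac * (ys[i] - ys[i - 1])
```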

  13. P-MartCancer–Interactive Online Software to Enable Analysis of Shotgun Cancer Proteomic Datasets

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Webb-Robertson, Bobbie-Jo M.; Bramer, Lisa M.; Jensen, Jeffrey L.

    P-MartCancer is a new interactive web-based software environment that enables biomedical and biological scientists to perform in-depth analyses of global proteomics data without requiring direct interaction with the data or with statistical software. P-MartCancer offers a series of statistical modules associated with quality assessment, peptide and protein statistics, protein quantification and exploratory data analyses driven by the user via customized workflows and interactive visualization. Currently, P-MartCancer offers access to multiple cancer proteomic datasets generated through the Clinical Proteomics Tumor Analysis Consortium (CPTAC) at the peptide, gene and protein levels. P-MartCancer is deployed using Azure technologies (http://pmart.labworks.org/cptac.html), the web service is alternatively available via Docker Hub (https://hub.docker.com/r/pnnl/pmart-web/), and many statistical functions can be utilized directly from an R package available on GitHub (https://github.com/pmartR).

  14. Development of image analysis software for quantification of viable cells in microchips.

    PubMed

    Georg, Maximilian; Fernández-Cabada, Tamara; Bourguignon, Natalia; Karp, Paola; Peñaherrera, Ana B; Helguera, Gustavo; Lerner, Betiana; Pérez, Maximiliano S; Mertelsmann, Roland

    2018-01-01

    Over the past few years, image analysis has emerged as a powerful tool for analyzing various cell biology parameters in an unprecedented and highly specific manner. The amount of data that is generated requires automated methods for the processing and analysis of all the resulting information. The software available so far is suitable for processing fluorescence and phase-contrast images, but often does not perform well on transmitted-light microscopy images, due to the intrinsic variability of the image acquisition technique itself (brightness/contrast adjustment, for instance) and the variability introduced by different operators and equipment. In this contribution, we present image processing software, Python-based image analysis for cell growth (PIACG), that can calculate the total area of the well occupied by cells with fusiform and rounded morphology in response to different concentrations of fetal bovine serum in microfluidic chips, from transmitted-light microscopy images, in a highly efficient way.
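    The area-quantification step can be sketched as a simple intensity threshold over a grayscale image, where cells appear darker than the background in transmitted light. This is a minimal illustration under assumed inputs (nested lists of pixel intensities), not PIACG's actual pipeline.

```python
def occupied_fraction(image, threshold):
    """Fraction of the well area occupied by cells, estimated as the
    proportion of pixels darker than `threshold` in a grayscale
    transmitted-light image given as nested lists of intensities."""
    pixels = [p for row in image for p in row]
    return sum(1 for p in pixels if p < threshold) / len(pixels)
```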

  15. The Effects of Word Processing Software on User Satisfaction: An Empirical Study of Micro, Mini, and Mainframe Computers Using an Interactive Artificial Intelligence Expert-System.

    ERIC Educational Resources Information Center

    Rushinek, Avi; Rushinek, Sara

    1984-01-01

    Describes results of a system rating study in which users responded to WPS (word processing software) questions. Study objectives were data collection and evaluation of variables; statistical quantification of WPS's contribution (along with other variables) to user satisfaction; design of an expert system to evaluate WPS; and database update and…

  16. An Excel‐based implementation of the spectral method of action potential alternans analysis

    PubMed Central

    Pearman, Charles M.

    2014-01-01

    Abstract Action potential (AP) alternans has been well established as a mechanism of arrhythmogenesis and sudden cardiac death. Proper interpretation of AP alternans requires a robust method of alternans quantification. Traditional methods of alternans analysis neglect higher order periodicities that may have greater pro‐arrhythmic potential than classical 2:1 alternans. The spectral method of alternans analysis, already widely used in the related study of microvolt T‐wave alternans, has also been used to study AP alternans. Software to meet the specific needs of AP alternans analysis is not currently available in the public domain. An AP analysis tool is implemented here, written in Visual Basic for Applications and using Microsoft Excel as a shell. This performs a sophisticated analysis of alternans behavior allowing reliable distinction of alternans from random fluctuations, quantification of alternans magnitude, and identification of which phases of the AP are most affected. In addition, the spectral method has been adapted to allow detection and quantification of higher order regular oscillations. Analysis of action potential morphology is also performed. A simple user interface enables easy import, analysis, and export of collated results. PMID:25501439
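    The spectral method rests on the fact that beat-to-beat alternation concentrates power at 0.5 cycles/beat in the power spectrum of the APD series. The discrete-Fourier-transform sketch below illustrates this (the APD values in the test are hypothetical); the paper's Excel/VBA tool additionally distinguishes alternans from noise and handles higher-order periodicities.

```python
import cmath

def alternans_spectrum(apd):
    """Power spectrum of a beat-to-beat action potential duration
    series. Classical 2:1 alternans concentrates power at 0.5
    cycles/beat, i.e. spectral index n // 2."""
    n = len(apd)
    mean = sum(apd) / n
    x = [v - mean for v in apd]
    return [abs(sum(x[t] * cmath.exp(-2j * cmath.pi * k * t / n)
                    for t in range(n))) ** 2 / n
            for k in range(n // 2 + 1)]
```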

  17. FracPaQ: a MATLAB™ toolbox for the quantification of fracture patterns

    NASA Astrophysics Data System (ADS)

    Healy, David; Rizzo, Roberto; Farrell, Natalie; Watkins, Hannah; Cornwell, David; Gomez-Rivas, Enrique; Timms, Nick

    2017-04-01

    The patterns of fractures in deformed rocks are rarely uniform or random. Fracture orientations, sizes, shapes and spatial distributions often exhibit some kind of order. In detail, there may be relationships among the different fracture attributes e.g. small fractures dominated by one orientation, larger fractures by another. These relationships are important because the mechanical (e.g. strength, anisotropy) and transport (e.g. fluids, heat) properties of rock depend on these fracture patterns and fracture attributes. This presentation describes an open source toolbox to quantify fracture patterns, including distributions in fracture attributes and their spatial variation. Software has been developed to quantify fracture patterns from 2-D digital images, such as thin section micrographs, geological maps, outcrop or aerial photographs or satellite images. The toolbox comprises a suite of MATLAB™ scripts based on published quantitative methods for the analysis of fracture attributes: orientations, lengths, intensity, density and connectivity. An estimate of permeability in 2-D is made using a parallel plate model. The software provides an objective and consistent methodology for quantifying fracture patterns and their variations in 2-D across a wide range of length scales. Our current focus for the application of the software is on quantifying crack and fracture patterns in and around fault zones. There is a large body of published work on the quantification of relatively simple joint patterns, but fault zones present a bigger, and arguably more important, challenge. The methods presented are inherently scale independent, and a key task will be to analyse and integrate quantitative fracture pattern data from micro- to macro-scales. 
New features in this release include multi-scale analyses based on a wavelet method to look for scale transitions, support for multi-colour traces in the input file processed as separate fracture sets, and combining fracture traces from multiple 2-D images to derive the statistically equivalent 3-D fracture pattern expressed as a 2nd rank crack tensor.
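The per-trace attribute calculations named in the abstract (orientations, lengths, and intensity) can be sketched in a few lines. This is an illustrative Python outline, not FracPaQ itself; it treats each fracture trace as a straight segment and uses the common P21 intensity measure (total trace length per unit area):

```python
import math

def trace_attributes(segments, area):
    """Basic 2-D fracture-trace attributes: lengths, orientations
    (degrees from the x-axis, folded into 0-180), and P21 intensity
    (total trace length per unit sample area)."""
    lengths, orientations = [], []
    for (x1, y1), (x2, y2) in segments:
        dx, dy = x2 - x1, y2 - y1
        lengths.append(math.hypot(dx, dy))
        orientations.append(math.degrees(math.atan2(dy, dx)) % 180.0)
    intensity = sum(lengths) / area  # P21: length per unit area
    return lengths, orientations, intensity

# Two traces in a hypothetical 10 x 10 sample window
segs = [((0, 0), (3, 4)), ((0, 0), (0, 5))]
lengths, orients, p21 = trace_attributes(segs, area=100.0)
```

Density (traces per area) and connectivity (topology of trace intersections) would build on the same segment representation.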

  18. Dakota, a multilevel parallel object-oriented framework for design optimization, parameter estimation, uncertainty quantification, and sensitivity analysis :

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Adams, Brian M.; Ebeida, Mohamed Salah; Eldred, Michael S.

The Dakota (Design Analysis Kit for Optimization and Terascale Applications) toolkit provides a flexible and extensible interface between simulation codes and iterative analysis methods. Dakota contains algorithms for optimization with gradient and nongradient-based methods; uncertainty quantification with sampling, reliability, and stochastic expansion methods; parameter estimation with nonlinear least squares methods; and sensitivity/variance analysis with design of experiments and parameter study methods. These capabilities may be used on their own or as components within advanced strategies such as surrogate-based optimization, mixed integer nonlinear programming, or optimization under uncertainty. By employing object-oriented design to implement abstractions of the key components required for iterative systems analyses, the Dakota toolkit provides a flexible and extensible problem-solving environment for design and performance analysis of computational models on high performance computers. This report serves as a user's manual for the Dakota software and provides capability overviews and procedures for software execution, as well as a variety of example studies.
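As a minimal illustration of the sampling-based uncertainty quantification that Dakota automates, the sketch below propagates uniform input uncertainty through a stand-in "simulation code" by plain Monte Carlo sampling; the response function is hypothetical, and Dakota's actual samplers (LHS, stochastic expansions, etc.) are far more sophisticated:

```python
import random

def model(x1, x2):
    # Stand-in for a black-box simulation code: a simple response function
    return x1 ** 2 + 2.0 * x2

def sampling_uq(n, seed=0):
    """Propagate uniform [0,1] input uncertainty through the model by
    plain Monte Carlo and report the output mean and sample variance."""
    rng = random.Random(seed)
    ys = [model(rng.uniform(0, 1), rng.uniform(0, 1)) for _ in range(n)]
    mean = sum(ys) / n
    var = sum((y - mean) ** 2 for y in ys) / (n - 1)
    return mean, var

mean, var = sampling_uq(5000)  # mean should approach E[x1^2] + E[2*x2] = 4/3
```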

  19. jmzTab: a java interface to the mzTab data standard.

    PubMed

    Xu, Qing-Wei; Griss, Johannes; Wang, Rui; Jones, Andrew R; Hermjakob, Henning; Vizcaíno, Juan Antonio

    2014-06-01

mzTab is the most recent standard format developed by the Proteomics Standards Initiative. mzTab is a flexible tab-delimited file format that can capture identification and quantification results coming from MS-based proteomics and metabolomics approaches. Here we present an open-source Java application programming interface for mzTab called jmzTab. The software allows the efficient processing of mzTab files, providing read and write capabilities, and is designed to be embedded in other software packages. The second key feature of the jmzTab model is that it provides a flexible framework to maintain the logical integrity between the metadata and the table-based sections in the mzTab files. In this article, as two example implementations, we also describe two stand-alone tools that can be used to validate mzTab files and to convert PRIDE XML files to mzTab. The library is freely available at http://mztab.googlecode.com. © 2014 The Authors PROTEOMICS Published by Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.
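mzTab's tab-delimited layout keys each line by a leading section code (e.g. MTD for metadata, PRH/PRT for the protein table header and rows). A minimal reader can therefore group lines by that code; the Python sketch below assumes well-formed input and skips the standard's full validation rules, which is exactly what jmzTab provides on top of raw parsing:

```python
def parse_mztab(lines):
    """Group mzTab lines by their leading section code
    (e.g. MTD = metadata, PRT = protein table rows).
    Fields within a line are tab-separated."""
    sections = {}
    for line in lines:
        line = line.rstrip("\n")
        if not line:
            continue
        code, _, rest = line.partition("\t")
        sections.setdefault(code, []).append(rest.split("\t"))
    return sections

# A hypothetical three-line fragment
demo = [
    "MTD\tmzTab-version\t1.0.0",
    "PRH\taccession\tdescription",
    "PRT\tP12345\texample protein",
]
sections = parse_mztab(demo)
```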

  20. A novel method for automated tracking and quantification of adult zebrafish behaviour during anxiety.

    PubMed

    Nema, Shubham; Hasan, Whidul; Bhargava, Anamika; Bhargava, Yogesh

    2016-09-15

Behavioural neuroscience relies on software-driven methods for behavioural assessment, but the field lacks cost-effective, robust, open-source software for behavioural analysis. Here we propose a novel method, which we call ZebraTrack. It includes a cost-effective imaging setup for distraction-free behavioural acquisition, automated tracking using open-source ImageJ software and a workflow for extraction of behavioural endpoints. Our ImageJ algorithm is capable of providing control to users at key steps while maintaining automation in tracking, without the need for the installation of external plugins. We have validated this method by testing novelty-induced anxiety behaviour in adult zebrafish. Our results, in agreement with established findings, showed that during state anxiety, zebrafish exhibited reduced distance travelled, increased thigmotaxis and more freezing events. Furthermore, we propose a method to represent both the spatial and temporal distribution of choice-based behaviour, which is currently not possible with simple videograms. The ZebraTrack method is simple and economical, yet robust enough to give results comparable with those obtained from costly proprietary software such as EthoVision XT. We have developed and validated a novel cost-effective method for behavioural analysis of adult zebrafish using open-source ImageJ software. Copyright © 2016 Elsevier B.V. All rights reserved.
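The behavioural endpoints named above (distance travelled, thigmotaxis, freezing events) can all be derived from the centroid track alone. A simplified Python sketch with hypothetical tank dimensions and thresholds, not the ZebraTrack workflow itself:

```python
import math

def behavioural_endpoints(track, tank_w, tank_h, margin, freeze_thresh):
    """From a list of (x, y) centroids (one per frame), compute total
    distance travelled, the fraction of frames spent near the walls
    (thigmotaxis), and the number of freezing events (runs of
    near-zero frame-to-frame movement)."""
    dist = 0.0
    wall_frames = 0
    freezes, frozen = 0, False
    for i, (x, y) in enumerate(track):
        near_wall = (x < margin or x > tank_w - margin or
                     y < margin or y > tank_h - margin)
        wall_frames += near_wall
        if i:
            step = math.hypot(x - track[i - 1][0], y - track[i - 1][1])
            dist += step
            if step < freeze_thresh:
                if not frozen:          # start of a new freezing bout
                    freezes += 1
                frozen = True
            else:
                frozen = False
    return dist, wall_frames / len(track), freezes

# Five frames in a hypothetical 10 x 10 arena, 2-unit wall margin
track = [(1, 1), (1, 1), (5, 5), (5, 5), (5, 5)]
dist, thigmo, freezes = behavioural_endpoints(track, 10, 10, 2, 0.01)
```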

  1. Partition resampling and extrapolation averaging: approximation methods for quantifying gene expression in large numbers of short oligonucleotide arrays.

    PubMed

    Goldstein, Darlene R

    2006-10-01

    Studies of gene expression using high-density short oligonucleotide arrays have become a standard in a variety of biological contexts. Of the expression measures that have been proposed to quantify expression in these arrays, multi-chip-based measures have been shown to perform well. As gene expression studies increase in size, however, utilizing multi-chip expression measures is more challenging in terms of computing memory requirements and time. A strategic alternative to exact multi-chip quantification on a full large chip set is to approximate expression values based on subsets of chips. This paper introduces an extrapolation method, Extrapolation Averaging (EA), and a resampling method, Partition Resampling (PR), to approximate expression in large studies. An examination of properties indicates that subset-based methods can perform well compared with exact expression quantification. The focus is on short oligonucleotide chips, but the same ideas apply equally well to any array type for which expression is quantified using an entire set of arrays, rather than for only a single array at a time. Software implementing Partition Resampling and Extrapolation Averaging is under development as an R package for the BioConductor project.
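The subset-based idea behind Partition Resampling can be illustrated with a toy stand-in for the expensive multi-chip measure: quantify on random equal-size chip subsets and average the subset estimates. The `quantify` function here is a deliberately trivial placeholder for a real multi-chip expression measure:

```python
import random

def quantify(chips):
    # Stand-in for an expensive multi-chip expression measure:
    # here simply the mean signal across the chips in the subset.
    return sum(chips) / len(chips)

def partition_resampling(chips, subset_size, n_partitions, seed=0):
    """Approximate a multi-chip expression measure by averaging the
    measure computed on random disjoint equal-size subsets (partitions)
    of the full chip set, avoiding one huge joint computation."""
    rng = random.Random(seed)
    estimates = []
    for _ in range(n_partitions):
        order = chips[:]
        rng.shuffle(order)
        for i in range(0, len(order) - subset_size + 1, subset_size):
            estimates.append(quantify(order[i:i + subset_size]))
    return sum(estimates) / len(estimates)

signals = [1.0, 2.0, 3.0, 4.0, 5.0, 6.0]
approx = partition_resampling(signals, subset_size=3, n_partitions=10)
```

Because the placeholder measure is linear, the approximation here is exact; for real nonlinear multi-chip measures the subset average is only an estimate, which is the paper's point.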

  2. Existing Fortran interfaces to Trilinos in preparation for exascale ForTrilinos development

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Evans, Katherine J.; Young, Mitchell T.; Collins, Benjamin S.

This report summarizes the current state of Fortran interfaces to the Trilinos library within several key applications of the Exascale Computing Program (ECP), with the aim of informing developers about strategies to develop ForTrilinos, an exascale-ready Fortran interface software package within Trilinos. The two software projects assessed within are the DOE Office of Science's Accelerated Climate Model for Energy (ACME) atmosphere component, CAM, and the DOE Office of Nuclear Energy's core-simulator portion of VERA, a nuclear reactor simulation code. Trilinos is an object-oriented, C++ based software project, and spans a collection of algorithms and other enabling technologies such as uncertainty quantification and mesh generation. To date, Trilinos has enabled these codes to achieve large-scale simulation results; however, the simulation needs of CAM and VERA-CS will approach exascale over the next five years. A Fortran interface to Trilinos that enables efficient use of programming models and more advanced algorithms is necessary. Where appropriate, the needs of the CAM and VERA-CS software to achieve their simulation goals are called out specifically. With this report, a design document and execution plan for ForTrilinos development can proceed.

  3. IPLaminator: an ImageJ plugin for automated binning and quantification of retinal lamination.

    PubMed

    Li, Shuai; Woodfin, Michael; Long, Seth S; Fuerst, Peter G

    2016-01-16

Information in the brain is often segregated into spatially organized layers that reflect the function of the embedded circuits. This is perhaps best exemplified in the layering, or lamination, of the retinal inner plexiform layer (IPL). The neurites of the retinal ganglion, amacrine and bipolar cell subtypes that form synapses in the IPL are precisely organized in highly refined strata within the IPL. Studies focused on developmental organization and cell morphology often use this layered stratification to characterize cells and identify the function of genes in development of the retina. A current limitation to such analysis is the lack of standardized tools to quantitatively analyze this complex structure. Most previous work on neuron stratification in the IPL is qualitative and descriptive. In this study we report the development of an intuitive platform to rapidly and reproducibly assay IPL lamination. The novel ImageJ-based software plugin we developed, IPLaminator, rapidly analyzes neurite stratification patterns in the retina and other neural tissues. A range of user options allows researchers to bin IPL stratification based on fixed points, such as the neurites of cholinergic amacrine cells, or to define a number of bins into which the IPL will be divided. Options to analyze tissues such as cortex were also added. Statistical analysis of the output then allows a quantitative value to be assigned to differences in laminar patterning observed in different models, genotypes or across developmental time. IPLaminator is an easy-to-use software application that will greatly speed and standardize quantification of neuron organization.
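Fixed-bin quantification of lamination reduces to a histogram of normalised neurite depths across the IPL. An illustrative Python outline of that binning step, not the IPLaminator plugin itself:

```python
def bin_stratification(depths, n_bins):
    """Bin normalised neurite depths (0 = inner edge, 1 = outer edge
    of the IPL) into n_bins equal strata and return the fraction of
    signal per bin; a simple stand-in for fixed-landmark binning."""
    counts = [0] * n_bins
    for d in depths:
        i = min(int(d * n_bins), n_bins - 1)  # clamp d == 1.0 into top bin
        counts[i] += 1
    total = len(depths)
    return [c / total for c in counts]

# Hypothetical depths showing three distinct strata
profile = bin_stratification([0.05, 0.1, 0.45, 0.5, 0.95, 1.0], n_bins=5)
```

Comparing such per-bin profiles across genotypes or developmental stages is then an ordinary statistical comparison of vectors.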

  4. Comparison of the manual, semiautomatic, and automatic selection and leveling of hot spots in whole slide images for Ki-67 quantification in meningiomas.

    PubMed

    Swiderska, Zaneta; Korzynska, Anna; Markiewicz, Tomasz; Lorent, Malgorzata; Zak, Jakub; Wesolowska, Anna; Roszkowiak, Lukasz; Slodkowska, Janina; Grala, Bartlomiej

    2015-01-01

Background. This paper presents the study concerning hot-spot selection in the assessment of whole slide images of tissue sections collected from meningioma patients. The samples were immunohistochemically stained to determine the Ki-67/MIB-1 proliferation index used for prognosis and treatment planning. Objective. The observer performance was examined by comparing results of the proposed method of automatic hot-spot selection in whole slide images, results of traditional scoring under a microscope, and results of a pathologist's manual hot-spot selection. Methods. The results of scoring the Ki-67 index using optical scoring under a microscope, software for Ki-67 index quantification based on hot spots selected by two pathologists (once and three times, respectively), and the same software but on hot spots selected by the proposed automatic methods were compared using Kendall's tau-b statistics. Results. Results show intra- and interobserver agreement. The agreement between Ki-67 scoring with manual and automatic hot-spot selection is high, while agreement between Ki-67 index scoring results in whole slide images and traditional microscopic examination is lower. Conclusions. The agreement observed for the three scoring methods shows that automation of area selection is an effective tool in supporting physicians and in increasing the reliability of Ki-67 scoring in meningioma.
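Kendall's tau-b, the agreement statistic used above, can be computed directly from paired scores. A minimal Python implementation (which excludes pairs tied in both variables, rather than applying the full two-sided tie correction) with toy score lists:

```python
import math
from itertools import combinations

def kendall_tau_b(x, y):
    """Kendall's tau-b rank correlation with a simple tie correction,
    as used to compare scores from different readers or methods."""
    c = d = tx = ty = 0
    for (x1, y1), (x2, y2) in combinations(zip(x, y), 2):
        dx, dy = x1 - x2, y1 - y2
        if dx == 0 and dy == 0:
            continue                    # tied in both: excluded here
        elif dx == 0:
            tx += 1                     # tied in x only
        elif dy == 0:
            ty += 1                     # tied in y only
        elif dx * dy > 0:
            c += 1                      # concordant pair
        else:
            d += 1                      # discordant pair
    return (c - d) / math.sqrt((c + d + tx) * (c + d + ty))

# Toy scores from two hypothetical readers
tau = kendall_tau_b([1, 2, 3, 4], [1, 3, 2, 4])
```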

  5. Digital quantification of fibrosis in liver biopsy sections: description of a new method by Photoshop software.

    PubMed

    Dahab, Gamal M; Kheriza, Mohamed M; El-Beltagi, Hussien M; Fouda, Abdel-Motaal M; El-Din, Osama A Sharaf

    2004-01-01

The precise quantification of fibrous tissue in liver biopsy sections is extremely important in the classification, diagnosis and grading of chronic liver disease, as well as in evaluating the response to antifibrotic therapy. Because the recently described methods of digital image analysis of fibrosis in liver biopsy sections have major flaws, including the use of out-dated techniques in image processing, inadequate precision and inability to detect and quantify perisinusoidal fibrosis, we developed a new technique in computerized image analysis of liver biopsy sections based on Adobe Photoshop software. We prepared an experimental model of liver fibrosis involving treatment of rats with oral CCl4 for 6 weeks. After staining liver sections with Masson's trichrome, a series of computer operations were performed including (i) reconstitution of seamless widefield images from a number of acquired fields of liver sections; (ii) image size and resolution adjustment; (iii) color correction; (iv) digital selection of a specified color range representing all fibrous tissue in the image; and (v) extraction and calculation. This technique is fully computerized with no manual interference at any step, and thus could be very reliable for objectively quantifying any pattern of fibrosis in liver biopsy sections and in assessing the response to antifibrotic therapy. It could also be a valuable tool in the precise assessment of antifibrotic therapy in other tissues, regardless of the pattern of tissue or fibrosis.
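Step (iv), selecting a colour range and computing the fibrous-tissue fraction, reduces to counting pixels that satisfy a colour predicate. An illustrative Python sketch; the blue-dominance threshold is an assumption for demonstration, not the authors' calibrated Photoshop selection:

```python
def fibrosis_area_fraction(pixels, is_fibrosis):
    """Fraction of pixels whose colour falls in the selected range.
    pixels is a flat list of (r, g, b) tuples; is_fibrosis is a
    predicate standing in for a colour-range selection."""
    hits = sum(1 for p in pixels if is_fibrosis(p))
    return hits / len(pixels)

# Masson's trichrome stains collagen blue: select blue-dominant pixels
blue_dominant = lambda p: p[2] > 120 and p[2] > p[0] and p[2] > p[1]

# Toy 4-pixel "image": three red-stained pixels, one blue collagen pixel
img = [(200, 80, 60)] * 3 + [(60, 70, 180)]
fraction = fibrosis_area_fraction(img, blue_dominant)
```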

  6. Comparison of the Manual, Semiautomatic, and Automatic Selection and Leveling of Hot Spots in Whole Slide Images for Ki-67 Quantification in Meningiomas

    PubMed Central

    Swiderska, Zaneta; Korzynska, Anna; Markiewicz, Tomasz; Lorent, Malgorzata; Zak, Jakub; Wesolowska, Anna; Roszkowiak, Lukasz; Slodkowska, Janina; Grala, Bartlomiej

    2015-01-01

Background. This paper presents the study concerning hot-spot selection in the assessment of whole slide images of tissue sections collected from meningioma patients. The samples were immunohistochemically stained to determine the Ki-67/MIB-1 proliferation index used for prognosis and treatment planning. Objective. The observer performance was examined by comparing results of the proposed method of automatic hot-spot selection in whole slide images, results of traditional scoring under a microscope, and results of a pathologist's manual hot-spot selection. Methods. The results of scoring the Ki-67 index using optical scoring under a microscope, software for Ki-67 index quantification based on hot spots selected by two pathologists (once and three times, respectively), and the same software but on hot spots selected by the proposed automatic methods were compared using Kendall's tau-b statistics. Results. Results show intra- and interobserver agreement. The agreement between Ki-67 scoring with manual and automatic hot-spot selection is high, while agreement between Ki-67 index scoring results in whole slide images and traditional microscopic examination is lower. Conclusions. The agreement observed for the three scoring methods shows that automation of area selection is an effective tool in supporting physicians and in increasing the reliability of Ki-67 scoring in meningioma. PMID:26240787

  7. MSiReader v1.0: Evolving Open-Source Mass Spectrometry Imaging Software for Targeted and Untargeted Analyses.

    PubMed

    Bokhart, Mark T; Nazari, Milad; Garrard, Kenneth P; Muddiman, David C

    2018-01-01

A major update to the mass spectrometry imaging (MSI) software MSiReader is presented, offering a multitude of newly added features critical to MSI analyses. MSiReader is a free, open-source, and vendor-neutral software written in the MATLAB platform and is capable of analyzing most common MSI data formats. A standalone version of the software, which does not require a MATLAB license, is also distributed. The newly incorporated data analysis features expand the utility of MSiReader beyond simple visualization of molecular distributions. The MSiQuantification tool allows researchers to calculate absolute concentrations from quantification MSI experiments exclusively through MSiReader software, significantly reducing data analysis time. An image overlay feature allows complementary imaging modalities to be displayed with the MSI data. A polarity filter has also been incorporated into the data loading step, allowing the facile analysis of polarity switching experiments without the need for data parsing prior to loading the data file into MSiReader. A quality assurance feature to generate a mass measurement accuracy (MMA) heatmap for an analyte of interest has also been added to allow for the investigation of MMA across the imaging experiment. Most importantly, as new features have been added, performance has not degraded; in fact, it has been dramatically improved. These new tools and the performance improvements in MSiReader v1.0 enable the MSI community to evaluate their data in greater depth and in less time.

  8. A tool for design of primers for microRNA-specific quantitative RT-qPCR.

    PubMed

    Busk, Peter K

    2014-01-28

    MicroRNAs are small but biologically important RNA molecules. Although different methods can be used for quantification of microRNAs, quantitative PCR is regarded as the reference that is used to validate other methods. Several commercial qPCR assays are available but they often come at a high price and the sequences of the primers are not disclosed. An alternative to commercial assays is to manually design primers but this work is tedious and, hence, not practical for the design of primers for a larger number of targets. I have developed the software miRprimer for automatic design of primers for the method miR-specific RT-qPCR, which is one of the best performing microRNA qPCR methods available. The algorithm is based on an implementation of the previously published rules for manual design of miR-specific primers with the additional feature of evaluating the propensity of formation of secondary structures and primer dimers. Testing of the primers showed that 76 out of 79 primers (96%) worked for quantification of microRNAs by miR-specific RT-qPCR of mammalian RNA samples. This success rate corresponds to the success rate of manual primer design. Furthermore, primers designed by this method have been distributed to several labs and used successfully in published studies. The software miRprimer is an automatic and easy method for design of functional primers for miR-specific RT-qPCR. The application is available as stand-alone software that will work on the MS Windows platform and in a developer version written in the Ruby programming language.

  9. Study the effect of reservoir spatial heterogeneity on CO2 sequestration under an uncertainty quantification (UQ) software framework

    NASA Astrophysics Data System (ADS)

    Fang, Y.; Hou, J.; Engel, D.; Lin, G.; Yin, J.; Han, B.; Fang, Z.; Fountoulakis, V.

    2011-12-01

In this study, we introduce an uncertainty quantification (UQ) software framework for carbon sequestration, with a focus on the effect of spatial heterogeneity of reservoir properties on CO2 migration. We use a sequential Gaussian simulation method (SGSIM) to generate realizations of permeability fields with various spatial statistical attributes. To deal with the computational difficulties, we integrate the following ideas/approaches: (1) we use three different sampling approaches (probabilistic collocation, quasi-Monte Carlo, and adaptive sampling) to reduce the required forward calculations while exploring the parameter space and quantifying the input uncertainty; (2) we use eSTOMP as the forward modeling simulator. eSTOMP is implemented using the Global Arrays toolkit (GA), which is based on one-sided inter-processor communication, supports a shared-memory programming style on distributed-memory platforms, and provides highly scalable performance. It uses a data model to partition most of the large-scale data structures into a relatively small number of distinct classes. The lower-level simulator infrastructure (e.g. meshing support, associated data structures, and data mapping to processors) is separated from the higher-level physics and chemistry algorithmic routines using a grid component interface; and (3) in addition to the faster model and more efficient algorithms that speed up the forward calculation, we built an adaptive system infrastructure to select the best possible data transfer mechanisms, to optimally allocate system resources to improve performance, and to integrate software packages and data for composing carbon sequestration simulation, computation, analysis, estimation and visualization. We demonstrate the framework with a given CO2 injection scenario in a heterogeneous sandstone reservoir.
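Of the three sampling approaches listed, quasi-Monte Carlo is the simplest to sketch: low-discrepancy points (here a Halton sequence) cover the input space more evenly than random draws, so fewer expensive forward simulations are needed for the same coverage. An illustrative Python outline:

```python
def halton(i, base):
    """i-th element (1-indexed) of the van der Corput sequence in base."""
    f, r = 1.0, 0.0
    while i > 0:
        f /= base
        r += f * (i % base)
        i //= base
    return r

def qmc_samples(n, bases=(2, 3)):
    """n low-discrepancy (Halton) points in the unit square; each point
    would parameterize one forward run of the reservoir simulator."""
    return [tuple(halton(i, b) for b in bases) for i in range(1, n + 1)]

pts = qmc_samples(8)
```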

  10. Workflow for high-content, individual cell quantification of fluorescent markers from universal microscope data, supported by open source software.

    PubMed

    Stockwell, Simon R; Mittnacht, Sibylle

    2014-12-16

Advances in understanding the control mechanisms governing the behavior of cells in adherent mammalian tissue culture models are becoming increasingly dependent on modes of single-cell analysis. Methods which deliver composite data reflecting the mean values of biomarkers from cell populations risk losing the subpopulation dynamics that reflect the heterogeneity of the studied biological system. In keeping with this, traditional approaches are being replaced by, or supported with, more sophisticated forms of cellular assay developed to allow assessment by high-content microscopy. These assays potentially generate large numbers of images of fluorescent biomarkers which, enabled by accompanying proprietary software packages, allow multi-parametric measurements per cell. However, the relatively high capital costs and overspecialization of many of these devices have prevented their accessibility to many investigators. Described here is a universally applicable workflow for the quantification of multiple fluorescent marker intensities from specific subcellular regions of individual cells, suitable for use with images from most fluorescent microscopes. Key to this workflow is the implementation of the freely available CellProfiler software(1) to distinguish individual cells in these images, segment them into defined subcellular regions and deliver fluorescence marker intensity values specific to these regions. The extraction of individual cell intensity values from image data is the central purpose of this workflow and will be illustrated with the analysis of control data from a siRNA screen for G1 checkpoint regulators in adherent human cells. However, the workflow presented here can be applied to the analysis of data from other means of cell perturbation (e.g., compound screens) and other forms of fluorescence-based cellular markers and thus should be useful for a wide range of laboratories.
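The core of such a workflow, per-cell intensity values from a segmented image, can be sketched given an integer label mask (of the kind CellProfiler's segmentation produces) and a matching fluorescence image. This toy Python version uses plain nested lists in place of real image arrays:

```python
def per_cell_intensity(labels, intensity):
    """Mean fluorescence intensity per segmented cell. labels holds an
    integer cell id per pixel (0 = background); intensity holds the
    matching fluorescence image. Both are 2-D lists of equal shape."""
    sums, counts = {}, {}
    for lab_row, int_row in zip(labels, intensity):
        for lab, val in zip(lab_row, int_row):
            if lab == 0:
                continue                      # skip background pixels
            sums[lab] = sums.get(lab, 0.0) + val
            counts[lab] = counts.get(lab, 0) + 1
    return {lab: sums[lab] / counts[lab] for lab in sums}

# Toy 2 x 3 image with two segmented cells (ids 1 and 2)
labels = [[1, 1, 0],
          [0, 2, 2]]
img = [[10.0, 20.0, 5.0],
       [5.0, 30.0, 50.0]]
means = per_cell_intensity(labels, img)
```

Restricting `labels` to a nuclear or cytoplasmic sub-mask yields the per-region measurements the workflow describes.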

  11. Quantification of myocardial fibrosis by digital image analysis and interactive stereology

    PubMed Central

    2014-01-01

    Background Cardiac fibrosis disrupts the normal myocardial structure and has a direct impact on heart function and survival. Despite already available digital methods, the pathologist’s visual score is still widely considered as ground truth and used as a primary method in histomorphometric evaluations. The aim of this study was to compare the accuracy of digital image analysis tools and the pathologist’s visual scoring for evaluating fibrosis in human myocardial biopsies, based on reference data obtained by point counting performed on the same images. Methods Endomyocardial biopsy material from 38 patients diagnosed with inflammatory dilated cardiomyopathy was used. The extent of total cardiac fibrosis was assessed by image analysis on Masson’s trichrome-stained tissue specimens using automated Colocalization and Genie software, by Stereology grid count and manually by Pathologist’s visual score. Results A total of 116 slides were analyzed. The mean results obtained by the Colocalization software (13.72 ± 12.24%) were closest to the reference value of stereology (RVS), while the Genie software and Pathologist score gave a slight underestimation. RVS values correlated strongly with values obtained using the Colocalization and Genie (r > 0.9, p < 0.001) software as well as the pathologist visual score. Differences in fibrosis quantification by Colocalization and RVS were statistically insignificant. However, significant bias was found in the results obtained by using Genie versus RVS and pathologist score versus RVS with mean difference values of: -1.61% and 2.24%. Bland-Altman plots showed a bidirectional bias dependent on the magnitude of the measurement: Colocalization software overestimated the area fraction of fibrosis in the lower end, and underestimated in the higher end of the RVS values. Meanwhile, Genie software as well as the pathologist score showed more uniform results throughout the values, with a slight underestimation in the mid-range for both. 
Conclusion Both applied digital image analysis methods revealed almost perfect correlation with the criterion standard obtained by stereology grid count and, in terms of accuracy, outperformed the pathologist’s visual score. Genie algorithm proved to be the method of choice with the only drawback of a slight underestimation bias, which is considered acceptable for both clinical and research evaluations. Virtual slides The virtual slide(s) for this article can be found here: http://www.diagnosticpathology.diagnomx.eu/vs/9857909611227193 PMID:24912374
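The Bland-Altman analysis used above reduces to the mean difference (bias) between a method and the reference, plus 95% limits of agreement. A minimal Python sketch with illustrative values, not the study's data:

```python
def bland_altman(method, reference):
    """Bland-Altman agreement statistics: mean difference (bias) and
    95% limits of agreement (bias +/- 1.96 * SD of the differences)."""
    diffs = [m - r for m, r in zip(method, reference)]
    n = len(diffs)
    bias = sum(diffs) / n
    sd = (sum((d - bias) ** 2 for d in diffs) / (n - 1)) ** 0.5
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# Hypothetical fibrosis area fractions (%) from a method vs. stereology
genie = [10.0, 18.5, 29.0, 38.0]
rvs = [11.5, 20.0, 30.5, 40.0]
bias, loa = bland_altman(genie, rvs)
```

A plot of each pair's difference against its mean (the Bland-Altman plot) then reveals magnitude-dependent bias of the kind reported above.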

  12. Quantification of myocardial fibrosis by digital image analysis and interactive stereology.

    PubMed

    Daunoravicius, Dainius; Besusparis, Justinas; Zurauskas, Edvardas; Laurinaviciene, Aida; Bironaite, Daiva; Pankuweit, Sabine; Plancoulaine, Benoit; Herlin, Paulette; Bogomolovas, Julius; Grabauskiene, Virginija; Laurinavicius, Arvydas

    2014-06-09

    Cardiac fibrosis disrupts the normal myocardial structure and has a direct impact on heart function and survival. Despite already available digital methods, the pathologist's visual score is still widely considered as ground truth and used as a primary method in histomorphometric evaluations. The aim of this study was to compare the accuracy of digital image analysis tools and the pathologist's visual scoring for evaluating fibrosis in human myocardial biopsies, based on reference data obtained by point counting performed on the same images. Endomyocardial biopsy material from 38 patients diagnosed with inflammatory dilated cardiomyopathy was used. The extent of total cardiac fibrosis was assessed by image analysis on Masson's trichrome-stained tissue specimens using automated Colocalization and Genie software, by Stereology grid count and manually by Pathologist's visual score. A total of 116 slides were analyzed. The mean results obtained by the Colocalization software (13.72 ± 12.24%) were closest to the reference value of stereology (RVS), while the Genie software and Pathologist score gave a slight underestimation. RVS values correlated strongly with values obtained using the Colocalization and Genie (r>0.9, p<0.001) software as well as the pathologist visual score. Differences in fibrosis quantification by Colocalization and RVS were statistically insignificant. However, significant bias was found in the results obtained by using Genie versus RVS and pathologist score versus RVS with mean difference values of: -1.61% and 2.24%. Bland-Altman plots showed a bidirectional bias dependent on the magnitude of the measurement: Colocalization software overestimated the area fraction of fibrosis in the lower end, and underestimated in the higher end of the RVS values. Meanwhile, Genie software as well as the pathologist score showed more uniform results throughout the values, with a slight underestimation in the mid-range for both. 
Both applied digital image analysis methods revealed almost perfect correlation with the criterion standard obtained by stereology grid count and, in terms of accuracy, outperformed the pathologist's visual score. Genie algorithm proved to be the method of choice with the only drawback of a slight underestimation bias, which is considered acceptable for both clinical and research evaluations. The virtual slide(s) for this article can be found here: http://www.diagnosticpathology.diagnomx.eu/vs/9857909611227193.

  13. Enhanced Visualization of Subtle Outer Retinal Pathology by En Face Optical Coherence Tomography and Correlation with Multi-Modal Imaging

    PubMed Central

    Chew, Avenell L.; Lamey, Tina; McLaren, Terri; De Roach, John

    2016-01-01

    Purpose To present en face optical coherence tomography (OCT) images generated by graph-search theory algorithm-based custom software and examine correlation with other imaging modalities. Methods En face OCT images derived from high density OCT volumetric scans of 3 healthy subjects and 4 patients using a custom algorithm (graph-search theory) and commercial software (Heidelberg Eye Explorer software (Heidelberg Engineering)) were compared and correlated with near infrared reflectance, fundus autofluorescence, adaptive optics flood-illumination ophthalmoscopy (AO-FIO) and microperimetry. Results Commercial software was unable to generate accurate en face OCT images in eyes with retinal pigment epithelium (RPE) pathology due to segmentation error at the level of Bruch’s membrane (BM). Accurate segmentation of the basal RPE and BM was achieved using custom software. The en face OCT images from eyes with isolated interdigitation or ellipsoid zone pathology were of similar quality between custom software and Heidelberg Eye Explorer software in the absence of any other significant outer retinal pathology. En face OCT images demonstrated angioid streaks, lesions of acute macular neuroretinopathy, hydroxychloroquine toxicity and Bietti crystalline deposits that correlated with other imaging modalities. Conclusions Graph-search theory algorithm helps to overcome the limitations of outer retinal segmentation inaccuracies in commercial software. En face OCT images can provide detailed topography of the reflectivity within a specific layer of the retina which correlates with other forms of fundus imaging. Our results highlight the need for standardization of image reflectivity to facilitate quantification of en face OCT images and longitudinal analysis. PMID:27959968

  14. Radio-frequency energy quantification in magnetic resonance imaging

    NASA Astrophysics Data System (ADS)

    Alon, Leeor

Mapping of radio frequency (RF) energy deposition has been challenging for 50+ years, especially when scanning patients in the magnetic resonance imaging (MRI) environment. As a result, electromagnetic simulation software is often used for estimating the specific absorption rate (SAR), the rate of RF energy deposition in tissue. This thesis presents challenges associated with aligning information provided by electromagnetic simulation and MRI experiments. Because of the limitations of simulations, experimental methods for the quantification of SAR were established. A system for quantification of the total RF energy deposition was developed for parallel transmit MRI (a system that uses multiple antennas to excite and image the body). The system is capable of monitoring and predicting channel-by-channel RF energy deposition and whole-body SAR, and of tracking potential hardware failures that occur in the transmit chain and may cause the deposition of excessive energy into patients. Similarly, we demonstrated that local RF power deposition can be mapped and predicted for parallel transmit systems based on a series of MRI temperature mapping acquisitions. Resulting from this work, we developed tools for optimal reconstruction of temperature maps from MRI acquisitions. The tools developed for temperature mapping paved the way for utilizing MRI as a diagnostic tool for the evaluation of RF/microwave-emitting device safety. Quantification of the RF energy was demonstrated for both MRI-compatible and non-MRI-compatible devices (such as cell phones), while having the advantage of being noninvasive and of providing millimeter resolution and high accuracy.
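The quantity being mapped follows directly from tissue properties and the local electric field: for an RMS field, SAR = σ|E|²/ρ. A one-line Python sketch with illustrative (not measured) tissue values:

```python
def local_sar(e_rms, sigma, rho):
    """Local specific absorption rate, SAR = sigma * |E_rms|^2 / rho
    (W/kg), from the RMS electric field (V/m), tissue conductivity
    (S/m) and tissue mass density (kg/m^3)."""
    return sigma * e_rms ** 2 / rho

# Illustrative muscle-like tissue values
sar = local_sar(e_rms=50.0, sigma=0.7, rho=1050.0)
```

Temperature-mapping approaches, as described above, instead infer the deposited power from the measured heating rate rather than from the field directly.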

  15. Effects of RNA integrity on transcript quantification by total RNA sequencing of clinically collected human placental samples.

    PubMed

    Reiman, Mario; Laan, Maris; Rull, Kristiina; Sõber, Siim

    2017-08-01

    RNA degradation is a ubiquitous process that occurs in living and dead cells, as well as during handling and storage of extracted RNA. Reduced RNA quality caused by degradation is an established source of uncertainty for all RNA-based gene expression quantification techniques. RNA sequencing is an increasingly preferred method for transcriptome analyses, and the dependence of its results on input RNA integrity is of significant practical importance. This study aimed to characterize the effects of varying input RNA integrity [estimated as RNA integrity number (RIN)] on transcript level estimates and to delineate the characteristic differences between transcripts that differ in degradation rate. The study used ribodepleted total RNA sequencing data from a real-life clinically collected set (n = 32) of human solid tissue (placenta) samples. RIN-dependent alterations in gene expression profiles were quantified using DESeq2 software. Our results indicate that small differences in RNA integrity affect gene expression quantification by introducing a moderate and pervasive bias in expression level estimates that significantly affected 8.1% of studied genes. The rapidly degrading transcript pool was enriched in pseudogenes, short noncoding RNAs, and transcripts with extended 3' untranslated regions. Typical slowly degrading transcripts (median length, 2389 nt) represented protein-coding genes with 4-10 exons and high guanine-cytosine content. © FASEB.
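    The RIN-dependence screen described above can be caricatured as a per-gene regression of log-expression on RIN. The study itself used DESeq2; this plain least-squares sketch on synthetic data is only illustrative.

    ```python
    import numpy as np

    # Synthetic cohort: 32 samples (as in the study) with RIN values between
    # 6 and 9, one RIN-stable gene and one rapidly degrading gene whose
    # log2 expression depends on RIN with an assumed slope of 0.8.
    rng = np.random.default_rng(0)
    rin = rng.uniform(6.0, 9.0, size=32)
    stable = 8.0 + rng.normal(0.0, 0.1, size=32)          # no RIN dependence
    degrading = 4.0 + 0.8 * rin + rng.normal(0.0, 0.1, size=32)

    def rin_slope(log_expr, rin_values):
        """Least-squares slope of log2 expression vs. RIN for one gene."""
        slope, _intercept = np.polyfit(rin_values, log_expr, 1)
        return slope

    print(round(rin_slope(degrading, rin), 2))  # ~0.8: flagged as RIN-biased
    print(round(rin_slope(stable, rin), 2))     # ~0.0: unaffected
    ```

    A genome-wide version of this test (with proper count modeling, as in DESeq2) is what yields statements like "8.1% of genes significantly affected".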

  16. Design and Use of a Full Flow Sampling System (FFS) for the Quantification of Methane Emissions

    PubMed Central

    Johnson, Derek R.; Covington, April N.; Clark, Nigel N.

    2016-01-01

    The use of natural gas continues to grow with increased discovery and production of unconventional shale resources. At the same time, the natural gas industry faces continued scrutiny for methane emissions from across the supply chain, due to methane's relatively high global warming potential (25-84x that of carbon dioxide, according to the Energy Information Administration). A variety of techniques with varying uncertainties exists to measure or estimate methane emissions from components or facilities, but only one commercial system is currently available for quantification of component-level emissions, and recent reports have highlighted its weaknesses. In order to improve accuracy and increase measurement flexibility, we have designed, developed, and implemented a novel full flow sampling system (FFS) for quantification of methane and other greenhouse gas emissions based on transportation emissions measurement principles. The FFS is a modular system that consists of an explosion-proof blower(s), mass airflow sensor(s) (MAF), thermocouple, sample probe, constant volume sampling pump, laser-based greenhouse gas sensor, data acquisition device, and analysis software. Depending on the blower and hose configuration employed, the current FFS is able to achieve a flow rate ranging from 40 to 1,500 standard cubic feet per minute (SCFM). Utilization of laser-based sensors mitigates interference from higher hydrocarbons (C2+), and co-measurement of water vapor allows for humidity correction. The system is portable, with multiple configurations for a variety of applications ranging from being carried by a person to being mounted in a hand-drawn cart, an on-road vehicle bed, or the bed of a utility terrain vehicle (UTV). The FFS is able to quantify methane emission rates with a relative uncertainty of ±4.4%, and has proven real-world operation for the quantification of methane emissions occurring in conventional and remote facilities.
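    The quantification principle behind a full flow sampling system — emission rate as the product of total captured flow and the background-corrected methane concentration — can be sketched as follows. The density constant and sample values are assumptions for illustration, not the FFS calibration.

    ```python
    # Sketch of full-flow quantification: mass emission rate from total
    # captured flow (SCFM) and methane mole fraction (ppm above background).

    CH4_DENSITY_KG_M3 = 0.668   # approx. methane density at 15 degC, 1 atm (assumed)
    SCF_TO_M3 = 0.0283168       # standard cubic feet -> cubic metres

    def methane_rate_kg_h(flow_scfm, sample_ppm, background_ppm):
        """Mass emission rate (kg/h) from flow and concentration readings."""
        mole_frac = (sample_ppm - background_ppm) * 1e-6
        flow_m3_h = flow_scfm * SCF_TO_M3 * 60.0
        return flow_m3_h * mole_frac * CH4_DENSITY_KG_M3

    # Hypothetical reading: 500 SCFM captured flow, 150 ppm methane in the
    # sampled stream against a 2 ppm background.
    rate = methane_rate_kg_h(flow_scfm=500.0, sample_ppm=150.0, background_ppm=2.0)
    print(round(rate, 4))  # ~0.084 kg/h
    ```

    The real system additionally applies humidity and temperature corrections from the co-measured water vapor and thermocouple data.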

  18. Leveraging OpenStudio's Application Programming Interfaces: Preprint

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Long, N.; Ball, B.; Goldwasser, D.

    2013-11-01

    OpenStudio development efforts have been focused on providing Application Programming Interfaces (APIs) so that users are able to extend OpenStudio without the need to compile the open source libraries. This paper will discuss the basic purposes and functionalities of the core libraries that have been wrapped with APIs, including the Building Model, Results Processing, Advanced Analysis, Uncertainty Quantification, and Data Interoperability through Translators. Several building energy modeling applications have been produced using OpenStudio's API and Software Development Kits (SDK), including the United States Department of Energy's Asset Score Calculator, a mobile-based audit tool, an energy design assistance reporting protocol, and a portfolio-scale incentive optimization analysis methodology. Each of these software applications will be discussed briefly, along with how the APIs were leveraged for various uses including high-level modeling, data transformations from detailed building audits, error checking/quality assurance of models, and use of high-performance computing for mass simulations.

  19. The Effect of the Ill-posed Problem on Quantitative Error Assessment in Digital Image Correlation

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Lehoucq, R. B.; Reu, P. L.; Turner, D. Z.

    Here, this work explores the effect of the ill-posed problem on uncertainty quantification for motion estimation using digital image correlation (DIC) (Sutton et al. 2009). We develop a correction factor for standard uncertainty estimates based on the cosine of the angle between the true motion and the image gradients, in an integral sense over a subregion of the image. This correction factor accounts for variability in the DIC solution previously unaccounted for when considering only image noise, interpolation bias, contrast, and the software settings such as subset size and spacing.
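    A minimal sketch of the cosine idea above: average, over a subset, the squared cosine of the angle between an assumed motion direction and the local image gradients. A value near 1 means the motion is well constrained by the image texture; a value near 0 is the ill-posed case. The function and the uniform weighting are illustrative assumptions, not the authors' exact correction factor.

    ```python
    import numpy as np

    def gradient_alignment(subset, motion):
        """Mean squared cosine between a motion direction (x, y) and the
        image gradients over a subset; 1 = well-posed, 0 = ill-posed."""
        gy, gx = np.gradient(subset.astype(float))       # d/dy, d/dx
        g = np.stack([gx.ravel(), gy.ravel()], axis=1)   # gradient vectors (x, y)
        m = np.asarray(motion, dtype=float)
        m /= np.linalg.norm(m)
        norms = np.linalg.norm(g, axis=1)
        mask = norms > 0                                 # ignore flat pixels
        cos = (g[mask] @ m) / norms[mask]
        return float(np.mean(cos ** 2))

    ramp = np.tile(np.arange(8.0), (8, 1))   # intensity varies along x only
    print(gradient_alignment(ramp, (1.0, 0.0)))   # 1.0: motion along the gradient
    print(gradient_alignment(ramp, (0.0, 1.0)))   # 0.0: motion invisible to DIC
    ```

    The second case is exactly the aperture-style ambiguity the paper corrects for: motion perpendicular to all gradients produces no intensity change, so its uncertainty is underestimated by noise-only models.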

  1. Tutorial examples for uncertainty quantification methods.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    De Bord, Sarah

    2015-08-01

    This report details the work accomplished during my 2015 SULI summer internship at Sandia National Laboratories in Livermore, CA. During this internship, I worked on multiple tasks with the common goal of making uncertainty quantification (UQ) methods more accessible to the general scientific community. As part of my work, I created a comprehensive numerical integration example to incorporate into the user manual of a UQ software package. Further, I developed examples involving heat transfer through a window to incorporate into tutorial lectures that serve as an introduction to UQ methods.
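    A heat-transfer-through-a-window example of the kind described can be reduced to propagating input uncertainty through Q = U·A·ΔT by Monte Carlo sampling; the distributions below are illustrative assumptions, not the tutorial's values.

    ```python
    import random

    # Toy uncertainty quantification: sample uncertain inputs, push each
    # sample through the model Q = U * A * dT, and summarize the output.
    random.seed(42)

    def sample_heat_flow(n=100_000):
        samples = []
        for _ in range(n):
            u = random.gauss(2.8, 0.2)      # U-value, W/(m^2 K)  (assumed)
            area = random.gauss(1.5, 0.05)  # window area, m^2    (assumed)
            dt = random.gauss(20.0, 2.0)    # temperature difference, K
            samples.append(u * area * dt)
        mean = sum(samples) / n
        var = sum((q - mean) ** 2 for q in samples) / (n - 1)
        return mean, var ** 0.5

    mean_q, sd_q = sample_heat_flow()
    print(f"Q = {mean_q:.1f} +/- {sd_q:.1f} W")   # roughly 84 +/- 11 W
    ```

    Packages like the one mentioned in the report automate exactly this loop, plus smarter sampling schemes than brute-force Monte Carlo.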

  2. Statistical Validation for Clinical Measures: Repeatability and Agreement of Kinect™-Based Software

    PubMed Central

    Tello, Emanuel; Rodrigo, Alejandro; Valentinuzzi, Max E.

    2018-01-01

    Background The rehabilitation process is a fundamental stage in the recovery of people's capabilities. However, the evaluation of the process is performed by physiatrists and medical doctors, mostly based on their observations, that is, a subjective appreciation of the patient's evolution. This paper proposes a platform for tracking the movement of an individual's upper limb using Kinect sensor(s), to be applied to the patient during the rehabilitation process. The main contribution is the development of quantifying software and the statistical validation of its performance, repeatability, and clinical use in the rehabilitation process. Methods The software determines joint angles and upper limb trajectories for the construction of a specific rehabilitation protocol and quantifies the treatment evolution. In turn, the information is presented via a graphical interface that allows the recording, storage, and reporting of the patient's data. For clinical purposes, the software information is statistically validated with three different methodologies, comparing the measures with a goniometer in terms of agreement and repeatability. Results The agreement of joint angles measured with the proposed software and a goniometer is evaluated with Bland-Altman plots; all measurements fell well within the limits of agreement, meaning interchangeability of both techniques. Additionally, the results of Bland-Altman analysis of repeatability show 95% confidence. Finally, the physiotherapists' qualitative assessment shows encouraging results for clinical use. Conclusion The main conclusion is that the software is capable of offering a clinical history of the patient and is useful for quantification of the rehabilitation success. The simplicity, low cost, and visualization possibilities enhance the use of the Kinect-based software for rehabilitation and other applications, and the experts' opinion endorses the choice of our approach for clinical practice. Comparison of the new measurement technique with established goniometric methods shows that the proposed software agrees sufficiently to be used interchangeably. PMID:29750166
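    The Bland-Altman limits of agreement used in the validation reduce to the bias of the paired differences plus or minus 1.96 standard deviations; the joint-angle values below are made up for illustration.

    ```python
    # Minimal Bland-Altman agreement analysis between two measurement
    # techniques; returns bias and the 95% limits of agreement.

    def bland_altman(a, b):
        diffs = [x - y for x, y in zip(a, b)]
        n = len(diffs)
        bias = sum(diffs) / n
        sd = (sum((d - bias) ** 2 for d in diffs) / (n - 1)) ** 0.5
        return bias, bias - 1.96 * sd, bias + 1.96 * sd

    # Hypothetical paired joint-angle readings in degrees.
    goniometer = [30.0, 45.0, 60.0, 90.0, 120.0]
    kinect =     [31.0, 44.0, 61.0, 89.0, 121.0]
    bias, lo, hi = bland_altman(kinect, goniometer)
    print(round(bias, 2), round(lo, 2), round(hi, 2))
    ```

    Interchangeability is then judged by checking that nearly all differences fall within [lo, hi] and that the limits are clinically acceptable.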

  3. Large-scale time-lapse microscopy of Oct4 expression in human embryonic stem cell colonies.

    PubMed

    Bhadriraju, Kiran; Halter, Michael; Amelot, Julien; Bajcsy, Peter; Chalfoun, Joe; Vandecreme, Antoine; Mallon, Barbara S; Park, Kye-Yoon; Sista, Subhash; Elliott, John T; Plant, Anne L

    2016-07-01

    Identification and quantification of the characteristics of stem cell preparations is critical for understanding stem cell biology and for the development and manufacturing of stem cell-based therapies. We have developed image analysis and visualization software that allows effective use of time-lapse microscopy to provide spatial and dynamic information from large numbers of human embryonic stem cell colonies. To achieve statistically relevant sampling, we examined >680 colonies from 3 different preparations of cells over 5 days each, generating a total experimental dataset of 0.9 terabyte (TB). The 0.5-gigapixel images at each time point were represented by multi-resolution pyramids and visualized using the Deep Zoom JavaScript library, extended to support viewing gigapixel images over time and extracting data on individual colonies. We present a methodology that enables quantification of variations in nominally-identical preparations and between colonies, correlation of colony characteristics with Oct4 expression, and identification of rare events. Copyright © 2016. Published by Elsevier B.V.
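    The multi-resolution pyramid representation mentioned above can be sketched as repeated 2×2 mean pooling; toy image sizes stand in for the 0.5-gigapixel mosaics.

    ```python
    import numpy as np

    # Build a multi-resolution image pyramid by 2x2 mean pooling, the idea
    # behind serving gigapixel mosaics to Deep Zoom style viewers.

    def build_pyramid(img, levels=3):
        pyr = [img]
        for _ in range(levels - 1):
            a = pyr[-1]
            h, w = a.shape
            a = a[: h - h % 2, : w - w % 2]        # trim odd edges
            pooled = a.reshape(a.shape[0] // 2, 2, a.shape[1] // 2, 2).mean(axis=(1, 3))
            pyr.append(pooled)
        return pyr

    levels = build_pyramid(np.ones((64, 64)))
    print([a.shape for a in levels])  # [(64, 64), (32, 32), (16, 16)]
    ```

    A viewer then fetches only the pyramid level (and tile) matching the current zoom, which is what makes terabyte-scale time-lapse data browsable.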

  4. An Excel-based implementation of the spectral method of action potential alternans analysis.

    PubMed

    Pearman, Charles M

    2014-12-01

    Action potential (AP) alternans has been well established as a mechanism of arrhythmogenesis and sudden cardiac death. Proper interpretation of AP alternans requires a robust method of alternans quantification. Traditional methods of alternans analysis neglect higher order periodicities that may have greater pro-arrhythmic potential than classical 2:1 alternans. The spectral method of alternans analysis, already widely used in the related study of microvolt T-wave alternans, has also been used to study AP alternans. Software to meet the specific needs of AP alternans analysis is not currently available in the public domain. An AP analysis tool is implemented here, written in Visual Basic for Applications and using Microsoft Excel as a shell. This performs a sophisticated analysis of alternans behavior allowing reliable distinction of alternans from random fluctuations, quantification of alternans magnitude, and identification of which phases of the AP are most affected. In addition, the spectral method has been adapted to allow detection and quantification of higher order regular oscillations. Analysis of action potential morphology is also performed. A simple user interface enables easy import, analysis, and export of collated results. © 2014 The Author. Physiological Reports published by Wiley Periodicals, Inc. on behalf of the American Physiological Society and The Physiological Society.
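    The core of the spectral method is a Fourier transform of the beat-to-beat series of some AP measure, with alternans appearing at 0.5 cycles/beat (the last spectral bin); this pure-Python DFT sketch is illustrative, not the Excel/VBA tool itself.

    ```python
    import cmath

    def alternans_power(series):
        """Power spectrum of a beat-to-beat series; the last bin is
        0.5 cycles/beat, where 2:1 alternans concentrates its power."""
        n = len(series)
        mean = sum(series) / n
        x = [v - mean for v in series]
        spectrum = []
        for k in range(n // 2 + 1):
            s = sum(v * cmath.exp(-2j * cmath.pi * k * i / n)
                    for i, v in enumerate(x))
            spectrum.append(abs(s) ** 2 / n)
        return spectrum

    apd_ms = [200.0, 210.0] * 16        # 32 beats of perfect 2:1 APD alternans
    spec = alternans_power(apd_ms)
    peak_bin = max(range(len(spec)), key=spec.__getitem__)
    print(peak_bin, round(spec[peak_bin], 1))  # 16 800.0
    ```

    Higher-order periodicities (e.g. 3:1 or 4:1 patterns) show up as peaks at other bins of the same spectrum, which is how the adapted method detects them.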

  5. MIXING QUANTIFICATION BY VISUAL IMAGING ANALYSIS

    EPA Science Inventory

    This paper reports on development of a method for quantifying two measures of mixing, the scale and intensity of segregation, through flow visualization, video recording, and software analysis. This non-intrusive method analyzes a planar cross section of a flowing system from an ...

  6. MSiReader v1.0: Evolving Open-Source Mass Spectrometry Imaging Software for Targeted and Untargeted Analyses

    NASA Astrophysics Data System (ADS)

    Bokhart, Mark T.; Nazari, Milad; Garrard, Kenneth P.; Muddiman, David C.

    2018-01-01

    A major update to the mass spectrometry imaging (MSI) software MSiReader is presented, offering a multitude of newly added features critical to MSI analyses. MSiReader is a free, open-source, and vendor-neutral software written in the MATLAB platform and is capable of analyzing most common MSI data formats. A standalone version of the software, which does not require a MATLAB license, is also distributed. The newly incorporated data analysis features expand the utility of MSiReader beyond simple visualization of molecular distributions. The MSiQuantification tool allows researchers to calculate absolute concentrations from quantification MSI experiments exclusively through MSiReader software, significantly reducing data analysis time. An image overlay feature allows complementary imaging modalities to be displayed with the MSI data. A polarity filter has also been incorporated into the data loading step, allowing the facile analysis of polarity switching experiments without the need for data parsing prior to loading the data file into MSiReader. A quality assurance feature to generate a mass measurement accuracy (MMA) heatmap for an analyte of interest has also been added to allow for the investigation of MMA across the imaging experiment. Most importantly, as new features have been added, performance has not degraded; in fact, it has improved dramatically. These new tools and performance improvements in MSiReader v1.0 enable the MSI community to evaluate their data in greater depth and in less time.
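    The quantity behind the MMA heatmap is simply the mass error in parts per million at each pixel; a minimal sketch with illustrative m/z values follows.

    ```python
    # Mass measurement accuracy in ppm for one analyte at one pixel.
    # The m/z values below are illustrative, not from the paper.

    def mma_ppm(measured_mz, theoretical_mz):
        """Signed mass error in parts per million."""
        return (measured_mz - theoretical_mz) / theoretical_mz * 1e6

    print(round(mma_ppm(760.5855, 760.5851), 2))  # ~0.53 ppm
    ```

    Evaluating this per pixel across the image, and color-coding the result, yields the described heatmap and exposes any spatial drift in calibration.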

  7. DynamicRoots: A Software Platform for the Reconstruction and Analysis of Growing Plant Roots.

    PubMed

    Symonova, Olga; Topp, Christopher N; Edelsbrunner, Herbert

    2015-01-01

    We present a software platform for reconstructing and analyzing the growth of a plant root system from a time-series of 3D voxelized shapes. It aligns the shapes with each other, constructs a geometric graph representation together with the function that records the time of growth, and organizes the branches into a hierarchy that reflects the order of creation. The software includes the automatic computation of structural and dynamic traits for each root in the system enabling the quantification of growth on fine-scale. These are important advances in plant phenotyping with applications to the study of genetic and environmental influences on growth.

  8. An online database for plant image analysis software tools.

    PubMed

    Lobet, Guillaume; Draye, Xavier; Périlleux, Claire

    2013-10-09

    Recent years have seen an increase in methods for plant phenotyping using image analyses. These methods require new software solutions for data extraction and treatment. These solutions are instrumental in supporting various research pipelines, ranging from the localisation of cellular compounds to the quantification of tree canopies. However, due to the variety of existing tools and the lack of a central repository, it is challenging for researchers to identify the software best suited to their research. We present an online, manually curated database referencing more than 90 plant image analysis software solutions. The website, plant-image-analysis.org, presents each software package in a uniform and concise manner, enabling users to identify the available solutions for their experimental needs. The website also enables user feedback, evaluations and new software submissions. The plant-image-analysis.org database provides an overview of existing plant image analysis software. The aim of such a toolbox is to help users find solutions, and to provide developers with a way to exchange and communicate about their work.

  9. Random, double- and single-strand DNA breaks can be differentiated in the method of Comet assay by the shape of the comet image.

    PubMed

    Georgieva, Milena; Zagorchev, Plamen; Miloshev, George

    2015-10-01

    The Comet assay is an invaluable tool in DNA research. It is widely used to detect DNA damage as an indicator of exposure to genotoxic stress. A canonical set of parameters and specialized software programs exist for Comet assay data quantification and analysis. None of them so far has employed a computer-based algorithm that assesses the shape of the comet as an indicator of the exact mechanism by which the studied genotoxins cut the DNA molecule. Here, we present 14 unique measurements of the comet image based on comet morphology. Their mathematical derivation and statistical analysis allowed precise description of the shape of the comet image, which in turn discriminated the cause of genotoxic stress. This algorithm led to the development of the "CometShape" software, which allows easy discrimination among different genotoxins depending on the type of DNA damage they induce. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  10. VESGEN Software for Mapping and Quantification of Vascular Regulators

    NASA Technical Reports Server (NTRS)

    Parsons-Wingerter, Patricia A.; Vickerman, Mary B.; Keith, Patricia A.

    2012-01-01

    VESsel GENeration (VESGEN) Analysis is automated software that maps and quantifies the effects of vascular regulators on vascular morphology by analyzing important vessel parameters. Quantification parameters include vessel diameter, length, branch points, density, and fractal dimension. For vascular trees, measurements are reported as dependent functions of vessel branching generation. VESGEN maps and quantifies vascular morphological events according to fractal-based vascular branching generation. It also relies on careful imaging of branching and networked vascular form. It was developed as a plug-in for ImageJ (National Institutes of Health, USA). VESGEN uses the image-processing concepts of 8-neighbor pixel connectivity, skeleton, and distance map to analyze 2D, black-and-white (binary) images of vascular trees, networks, and tree-network composites. VESGEN typically maps 5 to 12 (or more) generations of vascular branching, starting from a single parent vessel. These generations are tracked and measured for critical vascular parameters that include vessel diameter, length, density and number, and tortuosity per branching generation. The effects of vascular therapeutics and regulators on vascular morphology and branching tested in human clinical or laboratory animal experimental studies are quantified by comparing vascular parameters with control groups. VESGEN provides a user interface that both guides and allows control over the user's vascular analysis process. An option is provided to select a morphological tissue type of vascular trees, networks or tree-network composites, which determines the general collection of algorithms, intermediate images, and output images and measurements that will be produced.
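    Of the listed parameters, fractal dimension is the least self-explanatory; a crude box-counting estimate on a synthetic binary vessel image (a generic stand-in, not VESGEN's algorithm) looks like this.

    ```python
    import numpy as np

    # Box-counting fractal dimension of a binary image: count occupied
    # boxes at several scales and fit log(count) vs. log(1/box size).

    def box_count_dimension(img, sizes=(1, 2, 4, 8, 16)):
        counts = []
        for s in sizes:
            h, w = img.shape
            trimmed = img[: h - h % s, : w - w % s]
            blocks = trimmed.reshape(trimmed.shape[0] // s, s,
                                     trimmed.shape[1] // s, s)
            counts.append(int(blocks.any(axis=(1, 3)).sum()))
        slope, _ = np.polyfit(np.log(1.0 / np.array(sizes)), np.log(counts), 1)
        return slope

    line = np.zeros((64, 64), dtype=bool)
    line[32, :] = True                 # a single straight vessel
    print(round(box_count_dimension(line), 2))  # ~1.0 for a line
    ```

    A space-filling vascular network would score closer to 2, which is why fractal dimension usefully summarizes branching density.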

  11. Beyond the proteome: Mass Spectrometry Special Interest Group (MS-SIG) at ISMB/ECCB 2013

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Ryu, Soyoung; Payne, Samuel H.; Schaab, Christoph

    2014-07-02

    The mass spectrometry special interest group (MS-SIG) aims to bring together experts from the global research community to discuss highlights and challenges in the field of mass spectrometry (MS)-based proteomics and computational biology. The rapid technological developments in MS-based proteomics have enabled the generation of a large amount of meaningful information on hundreds to thousands of proteins simultaneously from a biological sample; however, the complexity of the MS data requires sophisticated computational algorithms and software for data analysis and interpretation. This year's MS-SIG meeting theme was 'Beyond the Proteome', with major focuses on improving protein identification/quantification and using proteomics data to solve interesting problems in systems biology and clinical research.

  12. MRM-DIFF: data processing strategy for differential analysis in large scale MRM-based lipidomics studies

    PubMed Central

    Tsugawa, Hiroshi; Ohta, Erika; Izumi, Yoshihiro; Ogiwara, Atsushi; Yukihira, Daichi; Bamba, Takeshi; Fukusaki, Eiichiro; Arita, Masanori

    2015-01-01

    Based on theoretically calculated comprehensive lipid libraries, in lipidomics as many as 1000 multiple reaction monitoring (MRM) transitions can be monitored for each single run. On the other hand, lipid analysis from each MRM chromatogram requires tremendous manual efforts to identify and quantify lipid species. Isotopic peaks differing by up to a few atomic masses further complicate analysis. To accelerate the identification and quantification process we developed novel software, MRM-DIFF, for the differential analysis of large-scale MRM assays. It supports a correlation optimized warping (COW) algorithm to align MRM chromatograms and utilizes quality control (QC) sample datasets to automatically adjust the alignment parameters. Moreover, user-defined reference libraries that include the molecular formula, retention time, and MRM transition can be used to identify target lipids and to correct peak abundances by considering isotopic peaks. Here, we demonstrate the software pipeline and introduce key points for MRM-based lipidomics research to reduce the mis-identification and overestimation of lipid profiles. The MRM-DIFF program, example data set and the tutorials are downloadable at the “Standalone software” section of the PRIMe (Platform for RIKEN Metabolomics, http://prime.psc.riken.jp/) database website. PMID:25688256
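    The isotopic-peak complication mentioned above amounts to subtracting the overlap that a lighter, co-eluting species contributes to a peak a couple of mass units higher; the 9% abundance factor here is a made-up illustration, not MRM-DIFF's correction.

    ```python
    # Illustrative correction of an M+2 isotopic overlap between two lipids
    # two mass units apart. The m2_fraction (relative abundance of the light
    # species' M+2 isotopologue) is an assumed value for demonstration.

    def correct_overlap(area_light, area_heavy, m2_fraction=0.09):
        """Subtract the light species' M+2 isotope contribution from the
        heavy species' peak area."""
        return area_heavy - area_light * m2_fraction

    corrected = correct_overlap(area_light=10000.0, area_heavy=2000.0)
    print(corrected)  # the heavy peak shrinks from 2000 to ~1100
    ```

    Without such a correction, the heavy lipid's abundance would be overestimated by the full overlap, which is one of the overestimation pitfalls the abstract warns about.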

  13. MorphoGraphX: A platform for quantifying morphogenesis in 4D.

    PubMed

    Barbier de Reuille, Pierre; Routier-Kierzkowska, Anne-Lise; Kierzkowski, Daniel; Bassel, George W; Schüpbach, Thierry; Tauriello, Gerardo; Bajpai, Namrata; Strauss, Sören; Weber, Alain; Kiss, Annamaria; Burian, Agata; Hofhuis, Hugo; Sapala, Aleksandra; Lipowczan, Marcin; Heimlicher, Maria B; Robinson, Sarah; Bayer, Emmanuelle M; Basler, Konrad; Koumoutsakos, Petros; Roeder, Adrienne H K; Aegerter-Wilmsen, Tinri; Nakayama, Naomi; Tsiantis, Miltos; Hay, Angela; Kwiatkowska, Dorota; Xenarios, Ioannis; Kuhlemeier, Cris; Smith, Richard S

    2015-05-06

    Morphogenesis emerges from complex multiscale interactions between genetic and mechanical processes. To understand these processes, the evolution of cell shape, proliferation and gene expression must be quantified. This quantification is usually performed either in full 3D, which is computationally expensive and technically challenging, or on 2D planar projections, which introduces geometrical artifacts on highly curved organs. Here we present MorphoGraphX ( www.MorphoGraphX.org), a software that bridges this gap by working directly with curved surface images extracted from 3D data. In addition to traditional 3D image analysis, we have developed algorithms to operate on curved surfaces, such as cell segmentation, lineage tracking and fluorescence signal quantification. The software's modular design makes it easy to include existing libraries, or to implement new algorithms. Cell geometries extracted with MorphoGraphX can be exported and used as templates for simulation models, providing a powerful platform to investigate the interactions between shape, genes and growth.

  14. CCD Camera Detection of HIV Infection.

    PubMed

    Day, John R

    2017-01-01

    Rapid and precise quantification of the infectivity of HIV is important for molecular virologic studies, as well as for measuring the activities of antiviral drugs and neutralizing antibodies. An indicator cell line, a CCD camera, and image-analysis software are used to quantify HIV infectivity. The cells of the P4R5 line, which express the receptors for HIV infection as well as β-galactosidase under the control of the HIV-1 long terminal repeat, are infected with HIV and then incubated 2 days later with X-gal to stain the infected cells blue. Digital images of monolayers of the infected cells are captured using a high-resolution CCD video camera and a macro video zoom lens. A software program is developed to process the images and to count the blue-stained foci of infection. The described method allows for the rapid quantification of infected cells over a wide range of viral inocula, with reproducibility and accuracy, and at relatively low cost.
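    The focus-counting step reduces to thresholding followed by connected-component labelling; a pure-Python flood-fill sketch on a synthetic image (a generic illustration, not the authors' program) follows.

    ```python
    # Count stained foci: threshold the image, then count 4-connected
    # components of above-threshold pixels via iterative flood fill.

    def count_foci(image, threshold):
        h, w = len(image), len(image[0])
        seen = [[False] * w for _ in range(h)]
        count = 0
        for i in range(h):
            for j in range(w):
                if image[i][j] > threshold and not seen[i][j]:
                    count += 1                       # new focus found
                    stack = [(i, j)]
                    while stack:                     # mark its whole extent
                        y, x = stack.pop()
                        if 0 <= y < h and 0 <= x < w and not seen[y][x] \
                                and image[y][x] > threshold:
                            seen[y][x] = True
                            stack.extend([(y + 1, x), (y - 1, x),
                                          (y, x + 1), (y, x - 1)])
        return count

    img = [[0.0] * 50 for _ in range(50)]            # synthetic "stained" image
    for y in range(5, 8):
        for x in range(5, 8):
            img[y][x] = 1.0
    for y in range(20, 23):
        for x in range(30, 33):
            img[y][x] = 1.0
    img[40][10] = 1.0
    print(count_foci(img, 0.5))  # 3
    ```

    Production tools would add size filtering and focus splitting, but the core count-by-connectivity logic is the same.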

  15. Bioinformatics Pertinent to Lipid Analysis in Biological Samples.

    PubMed

    Ma, Justin; Arbelo, Ulises; Guerra, Yenifer; Aribindi, Katyayini; Bhattacharya, Sanjoy K; Pelaez, Daniel

    2017-01-01

    Electrospray ionization mass spectrometry has revolutionized the way lipids are studied. In this work, we present a tutorial for analyzing class-specific lipid spectra obtained from a triple quadrupole mass spectrometer. The open-source software MZmine 2.21 is used, coupled with LIPID MAPS databases. Here, we describe the steps for lipid identification, ratiometric quantification, and briefly address the differences to the analyses when using direct infusion versus tandem liquid chromatography-mass spectrometry (LC-MS). We also provide a tutorial and equations for quantification of lipid amounts using synthetic lipid standards and normalization to a protein amount.
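    Ratiometric quantification against a spiked-in standard, normalized to protein amount, can be written in one line; the intensities and amounts below are illustrative, not from the tutorial.

    ```python
    # Ratiometric lipid quantification: analyte amount is the intensity
    # ratio to a synthetic standard, scaled by the spiked standard amount
    # and normalized to the sample's protein content.

    def lipid_amount_nmol_per_mg(analyte_intensity, standard_intensity,
                                 standard_nmol, protein_mg):
        return (analyte_intensity / standard_intensity) * standard_nmol / protein_mg

    # Hypothetical run: analyte 5x the standard signal, 2 nmol standard
    # spiked, 0.5 mg protein in the sample.
    print(lipid_amount_nmol_per_mg(5.0e5, 1.0e5, 2.0, 0.5))  # 20.0
    ```

    The ratio cancels run-to-run ionization variability, which is why a class-matched synthetic standard is spiked into every sample.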

  16. Development of magnetic resonance technology for noninvasive boron quantification

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bradshaw, K.M.

    1990-11-01

    Boron magnetic resonance imaging (MRI) and spectroscopy (MRS) were developed in support of the noninvasive boron quantification task of the Idaho National Engineering Laboratory (INEL) Power Burst Facility/Boron Neutron Capture Therapy (PBF/BNCT) program. The hardware and software described in this report are modifications specific to a GE Signa{trademark} MRI system, release 3.X and are necessary for boron magnetic resonance operation. The technology developed in this task has been applied to obtaining animal pharmacokinetic data of boron compounds (drug time response) and the in-vivo localization of boron in animal tissue noninvasively. 9 refs., 21 figs.

  17. A novel strategy using MASCOT Distiller for analysis of cleavable isotope-coded affinity tag data to quantify protein changes in plasma.

    PubMed

    Leung, Kit-Yi; Lescuyer, Pierre; Campbell, James; Byers, Helen L; Allard, Laure; Sanchez, Jean-Charles; Ward, Malcolm A

    2005-08-01

    A novel strategy consisting of cleavable Isotope-Coded Affinity Tag (cICAT) labelling combined with MASCOT Distiller was evaluated as a tool for the quantification of proteins in "abnormal" patient plasma, prepared by pooling samples from patients with acute stroke. Quantification of all light and heavy cICAT-labelled peptide ion pairs was obtained using MASCOT Distiller combined with proprietary software. Peptides displaying differences were selected for identification by MS. These preliminary results show the promise of our approach for identifying potential biomarkers.

  18. Data Independent Acquisition analysis in ProHits 4.0.

    PubMed

    Liu, Guomin; Knight, James D R; Zhang, Jian Ping; Tsou, Chih-Chiang; Wang, Jian; Lambert, Jean-Philippe; Larsen, Brett; Tyers, Mike; Raught, Brian; Bandeira, Nuno; Nesvizhskii, Alexey I; Choi, Hyungwon; Gingras, Anne-Claude

    2016-10-21

    Affinity purification coupled with mass spectrometry (AP-MS) is a powerful technique for the identification and quantification of physical interactions. AP-MS requires careful experimental design, appropriate control selection and quantitative workflows to successfully identify bona fide interactors amongst a large background of contaminants. We previously introduced ProHits, a Laboratory Information Management System for interaction proteomics, which tracks all samples in a mass spectrometry facility, initiates database searches and provides visualization tools for spectral counting-based AP-MS approaches. More recently, we implemented Significance Analysis of INTeractome (SAINT) within ProHits to provide scoring of interactions based on spectral counts. Here, we provide an update to ProHits to support Data Independent Acquisition (DIA) with identification software (DIA-Umpire and MSPLIT-DIA), quantification tools (through DIA-Umpire, or externally via targeted extraction), and assessment of quantitative enrichment (through mapDIA) and scoring of interactions (through SAINT-intensity). With additional improvements, notably support of the iProphet pipeline, facilitated deposition into ProteomeXchange repositories and enhanced export and viewing functions, ProHits 4.0 offers a comprehensive suite of tools to facilitate affinity proteomics studies. It remains challenging to score, annotate and analyze proteomics data in a transparent manner. ProHits was previously introduced as a LIMS to enable storing, tracking and analysis of standard AP-MS data. In this revised version, we expand ProHits to include integration with a number of identification and quantification tools based on Data-Independent Acquisition (DIA). ProHits 4.0 also facilitates data deposition into public repositories, and the transfer of data to new visualization tools. Copyright © 2016 Elsevier B.V. All rights reserved.

  19. Optimization of monitoring networks based on uncertainty quantification of model predictions of contaminant transport

    NASA Astrophysics Data System (ADS)

    Vesselinov, V. V.; Harp, D.

    2010-12-01

    Decision making to protect groundwater resources requires a detailed estimation of the uncertainties in model predictions. Several factors contribute to predictive uncertainty when modeling a natural system: (1) measurement and computational errors; (2) uncertainties in the conceptual model and model-parameter estimates; and (3) simplifications in model setup and in the numerical representation of governing processes. Because these factors combine, the individual sources of predictive uncertainty are generally difficult to quantify. Decision support related to the optimal design of monitoring networks requires (1) detailed analysis of existing uncertainties in model predictions of groundwater flow and contaminant transport, and (2) optimization of the proposed monitoring network locations in terms of their efficiency in detecting contaminants and providing early warning. We apply existing and newly proposed methods to quantify predictive uncertainties and to optimize well locations. An important aspect of the analysis is the application of a newly developed optimization technique, based on coupling the Particle Swarm and Levenberg-Marquardt optimization methods, which proved to be robust and computationally efficient. These techniques and algorithms are bundled in a software package called MADS. MADS (Model Analyses for Decision Support) is an object-oriented code that is capable of performing various types of model analyses and supporting model-based decision making. The code can be executed in several computational modes: (1) sensitivity analysis (global and local), (2) Monte Carlo analysis, (3) model calibration, (4) parameter estimation, (5) uncertainty quantification, and (6) model selection.
The code can be externally coupled with any existing model simulator through integrated modules that read/write input and output files using a set of template and instruction files (consistent with the PEST I/O protocol). MADS can also be internally coupled with a series of built-in analytical simulators. MADS provides functionality to work directly with existing control files developed for the code PEST (Doherty 2009). To perform the computational modes mentioned above, the code utilizes (1) advanced Latin-Hypercube sampling techniques (including Improved Distributed Sampling), (2) various gradient-based Levenberg-Marquardt optimization methods, (3) advanced global optimization methods (including Particle Swarm Optimization), and (4) a selection of alternative objective functions. The code has been successfully applied to perform various model analyses related to environmental management of real contamination sites. Examples include source identification problems, quantification of uncertainty, model calibration, and optimization of monitoring networks. The methodology and software codes are demonstrated using synthetic and real case studies where monitoring networks are optimized taking into account the uncertainty in model predictions of contaminant transport.
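    The coupled global/local strategy described above can be sketched in simplified form: a Particle Swarm stage for global exploration followed by a local refinement stage. The sketch below is illustrative only, not the MADS implementation; the local stage uses plain finite-difference gradient descent as a stand-in for Levenberg-Marquardt, and the objective function, coefficients and names are all invented for the example.

```python
import random

def sphere(x):
    """Toy objective: squared distance to a hidden optimum (illustrative)."""
    target = (1.5, -0.5)
    return sum((xi - ti) ** 2 for xi, ti in zip(x, target))

def pso(f, dim=2, n_particles=20, iters=60, seed=0):
    """Global stage: a basic Particle Swarm Optimization loop."""
    rng = random.Random(seed)
    pos = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(n_particles)]
    vel = [[0.0] * dim for _ in range(n_particles)]
    pbest = [p[:] for p in pos]
    gbest = min(pbest, key=f)[:]
    for _ in range(iters):
        for i in range(n_particles):
            for d in range(dim):
                r1, r2 = rng.random(), rng.random()
                vel[i][d] = (0.7 * vel[i][d]
                             + 1.5 * r1 * (pbest[i][d] - pos[i][d])
                             + 1.5 * r2 * (gbest[d] - pos[i][d]))
                pos[i][d] += vel[i][d]
            if f(pos[i]) < f(pbest[i]):
                pbest[i] = pos[i][:]
                if f(pbest[i]) < f(gbest):
                    gbest = pbest[i][:]
    return gbest

def local_refine(f, x, step=0.05, iters=300, h=1e-6):
    """Local stage: finite-difference gradient descent, a simplified
    stand-in for the Levenberg-Marquardt refinement used by MADS."""
    x = list(x)
    for _ in range(iters):
        grad = []
        for d in range(len(x)):
            xp = x[:]
            xp[d] += h
            grad.append((f(xp) - f(x)) / h)
        x = [xi - step * gi for xi, gi in zip(x, grad)]
    return x

coarse = pso(sphere)                    # global exploration
refined = local_refine(sphere, coarse)  # local polishing
```

The global stage avoids local minima while the local stage sharpens the estimate; the real code replaces the descent step with Levenberg-Marquardt.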

  20. Aviation Environmental Design Tool (AEDT) : Uncertainty Quantification Supplemental Report : Version 2a Service Pack 2 (SP2)

    DOT National Transportation Integrated Search

    2014-05-01

    The Federal Aviation Administration, Office of Environment and Energy (FAA-AEE) has developed the Aviation Environmental Design Tool (AEDT) version 2a software system with the support of the following development team: FAA, National Aeronautics and S...

  1. Development of Two Analytical Methods Based on Reverse Phase Chromatographic and SDS-PAGE Gel for Assessment of Deglycosylation Yield in N-Glycan Mapping.

    PubMed

    Eckard, Anahita D; Dupont, David R; Young, Johnie K

    2018-01-01

    N-linked glycosylation is one of the critical quality attributes (CQAs) for biotherapeutics, impacting the safety and activity of the drug product. Changes in the pattern and level of glycosylation can significantly alter the intrinsic properties of the product and, therefore, have to be monitored throughout its lifecycle. A fast, precise, and unbiased N-glycan mapping assay is therefore desired. To ensure these qualities, analytical methods that evaluate the completeness of deglycosylation are necessary. For quantification of deglycosylation yield, methods such as reduced liquid chromatography-mass spectrometry (LC-MS) and reduced capillary gel electrophoresis (CGE) are commonly used. Here we present the development of two additional methods to evaluate deglycosylation yield: one based on LC with a reverse-phase (RP) column, and one based on reduced sodium dodecyl sulphate-polyacrylamide gel electrophoresis (SDS-PAGE) with offline software (GelAnalyzer). With rapid deglycosylation workflows for N-glycan profiling now on the market, replacing overnight incubation, we aimed to quantify the level of deglycosylation in a selected rapid deglycosylation workflow. Our results show well-resolved peaks of glycosylated and deglycosylated protein species with the RP-LC method, allowing simple, high-confidence quantification of the deglycosylation yield. Additionally, a good correlation (≥0.94) was found between the deglycosylation yields estimated by the RP-LC method and those from the reduced SDS-PAGE method with offline software. Evaluation of the rapid deglycosylation protocol from the GlycanAssure™ HyPerformance assay kit, performed on fetuin and RNase B, showed complete deglycosylation within the recommended protocol time when evaluated with these techniques. Using this kit, N-glycans from the NIST mAb were prepared in 1.4 hr and analyzed by hydrophilic interaction chromatography (HILIC) ultrahigh-performance LC (UHPLC) equipped with a fluorescence detector (FLD). Thirty-seven well-resolved peaks were detected. Excellent sample-preparation repeatability was found, with a relative standard deviation (RSD) of <5% for peaks with >0.5% relative area.
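    Peak-area-based yield and repeatability calculations of the kind described above can be sketched as follows; the two-peak model, function names and all numbers are illustrative assumptions, not data from the study.

```python
from statistics import mean, stdev

def deglycosylation_yield(area_deglycosylated, area_glycosylated):
    """Fraction deglycosylated, from two resolved RP-LC peak areas."""
    total = area_deglycosylated + area_glycosylated
    return area_deglycosylated / total

def relative_std_dev(values):
    """Relative standard deviation in percent (RSD%)."""
    return 100.0 * stdev(values) / mean(values)

# Illustrative numbers, not results from the study
yield_pct = 100.0 * deglycosylation_yield(981.0, 19.0)
replicate_areas = [4.9, 5.1, 5.0, 5.05, 4.95]   # repeat preparations
rsd = relative_std_dev(replicate_areas)          # checked against a <5% criterion
```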

  2. Comparison of two software systems for quantification of myocardial blood flow in patients with hypertrophic cardiomyopathy.

    PubMed

    Yalcin, Hulya; Valenta, Ines; Zhao, Min; Tahari, Abdel; Lu, Dai-Yin; Higuchi, Takahiro; Yalcin, Fatih; Kucukler, Nagehan; Soleimanifard, Yalda; Zhou, Yun; Pomper, Martin G; Abraham, Theodore P; Tsui, Ben; Lodge, Martin A; Schindler, Thomas H; Roselle Abraham, M

    2018-01-22

    Quantification of myocardial blood flow (MBF) by positron emission tomography (PET) is important for the investigation of angina in hypertrophic cardiomyopathy (HCM). Several software programs exist for MBF quantification, but they have mostly been evaluated in patients with normal cardiac geometry referred for evaluation of coronary artery disease (CAD). Software performance has not been evaluated in HCM patients, who frequently have hyperdynamic LV function, LV outflow tract (LVOT) obstruction, small LV cavity size, and variation in the degree/location of LV hypertrophy. We compared MBF results obtained using PMod, which permits manual segmentation, to those obtained by the FDA-approved QPET software, which has an automated segmentation algorithm. 13N-ammonia PET perfusion data were acquired in list mode at rest and during pharmacologic vasodilation in 76 HCM patients and 10 non-HCM patients referred for evaluation of CAD (CAD group). Data were resampled to create static, ECG-gated and 36-frame dynamic images. Myocardial flow reserve (MFR) and MBF (in ml/min/g) were calculated using the QPET and PMod software packages. All HCM patients had asymmetric septal hypertrophy, and 50% had evidence of LVOT obstruction, whereas non-HCM patients (CAD group) had normal wall thickness and ejection fraction. PMod yielded significantly higher values for global and regional stress-MBF and MFR than QPET in HCM. Reasonably fair correlation was observed for global rest-MBF, stress-MBF, and MFR between the two software packages (rest-MBF: r = 0.78; stress-MBF: r = 0.66; MFR: r = 0.70) in HCM patients. Agreement between global MBF and MFR values improved when HCM patients with high spillover fractions (> 0.65) were excluded from the analysis (rest-MBF: r = 0.84; stress-MBF: r = 0.72; MFR: r = 0.80). Regionally, the highest agreement between PMod and QPET was observed in the LAD territory (rest-MBF: r = 0.82; stress-MBF: r = 0.68), where the spillover fraction was the lowest. Unlike HCM patients, the non-HCM patients (CAD group) demonstrated excellent agreement in MBF/MFR values obtained by the two software packages when patients with high spillover fractions were excluded (rest-MBF: r = 0.95; stress-MBF: r = 0.92; MFR: r = 0.95). Anatomic characteristics specific to HCM hearts contribute to lower correlations between MBF/MFR values obtained by PMod and QPET, compared with non-HCM patients. These differences indicate that PMod and QPET cannot be used interchangeably for MBF/MFR analyses in HCM patients.
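    The software-to-software agreement reported above rests on Pearson correlation of paired MBF values; a minimal sketch with invented illustrative values (not the study's data):

```python
from math import sqrt

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two paired series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Illustrative paired global MBF values (ml/min/g) from two hypothetical tools
mbf_tool_a = [0.8, 1.1, 0.9, 1.4, 2.0, 1.7]
mbf_tool_b = [0.9, 1.2, 1.1, 1.5, 2.3, 1.8]
r = pearson_r(mbf_tool_a, mbf_tool_b)
```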

  3. Leaf-GP: an open and automated software application for measuring growth phenotypes for arabidopsis and wheat.

    PubMed

    Zhou, Ji; Applegate, Christopher; Alonso, Albor Dobon; Reynolds, Daniel; Orford, Simon; Mackiewicz, Michal; Griffiths, Simon; Penfield, Steven; Pullen, Nick

    2017-01-01

    Plants demonstrate dynamic growth phenotypes that are determined by genetic and environmental factors. Phenotypic analysis of growth features over time is a key approach to understand how plants interact with environmental change as well as respond to different treatments. Although the importance of measuring dynamic growth traits is widely recognised, available open software tools are limited in terms of batch image processing, multiple traits analyses, software usability and cross-referencing results between experiments, making automated phenotypic analysis problematic. Here, we present Leaf-GP (Growth Phenotypes), an easy-to-use and open software application that can be executed on different computing platforms. To facilitate diverse scientific communities, we provide three software versions, including a graphic user interface (GUI) for personal computer (PC) users, a command-line interface for high-performance computer (HPC) users, and a well-commented interactive Jupyter Notebook (also known as the iPython Notebook) for computational biologists and computer scientists. The software is capable of extracting multiple growth traits automatically from large image datasets. We have utilised it in Arabidopsis thaliana and wheat ( Triticum aestivum ) growth studies at the Norwich Research Park (NRP, UK). By quantifying a number of growth phenotypes over time, we have identified diverse plant growth patterns between different genotypes under several experimental conditions. As Leaf-GP has been evaluated with noisy image series acquired by different imaging devices (e.g. smartphones and digital cameras) and still produced reliable biological outputs, we therefore believe that our automated analysis workflow and customised computer vision based feature extraction software implementation can facilitate a broader plant research community for their growth and development studies. 
Furthermore, because we implemented Leaf-GP on open Python-based computer vision, image analysis and machine learning libraries, we believe that our software not only can contribute to biological research, but also demonstrates how to utilise existing open numeric and scientific libraries (e.g. Scikit-image, OpenCV, SciPy and Scikit-learn) to build sound plant phenomics analytic solutions in an efficient and effective way. Leaf-GP is a sophisticated software application that provides three approaches to quantify growth phenotypes from large image series. We demonstrate its usefulness and high accuracy based on two biological applications: (1) the quantification of growth traits for Arabidopsis genotypes under two temperature conditions; and (2) measuring wheat growth in the glasshouse over time. The software is easy to use and cross-platform, and can be executed on Mac OS, Windows and HPC, with open Python-based scientific libraries preinstalled. Our work demonstrates how to integrate computer vision, image analysis, machine learning and software engineering in plant phenomics software implementation. To serve the plant research community, our modular source code, detailed comments, executables (.exe for Windows; .app for Mac), and experimental results are freely available at https://github.com/Crop-Phenomics-Group/Leaf-GP/releases.
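    As a toy illustration of image-based trait extraction (far simpler than Leaf-GP's actual pipeline, which builds on Scikit-image and OpenCV), projected leaf area can be approximated by counting green-dominant pixels; the threshold rule and the tiny synthetic image below are invented for the example:

```python
def leaf_area_pixels(rgb_image):
    """Count pixels where green dominates red and blue - a crude
    vegetation mask, standing in for a real segmentation step."""
    return sum(
        1
        for row in rgb_image
        for (r, g, b) in row
        if g > r and g > b
    )

# Tiny synthetic 3x3 "image" with a few green leaf pixels (illustrative)
image = [
    [(200, 60, 40), (30, 180, 50), (210, 200, 190)],
    [(25, 160, 60), (220, 210, 205), (35, 150, 45)],
    [(240, 235, 230), (20, 170, 55), (200, 190, 180)],
]
area = leaf_area_pixels(image)
```

In a real workflow the pixel count would be converted to physical area via the imaging setup's pixel-to-mm calibration.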

  4. SDMtoolbox 2.0: the next generation Python-based GIS toolkit for landscape genetic, biogeographic and species distribution model analyses.

    PubMed

    Brown, Jason L; Bennett, Joseph R; French, Connor M

    2017-01-01

    SDMtoolbox 2.0 is a software package for spatial studies of ecology, evolution, and genetics. The release of SDMtoolbox 2.0 allows researchers to use the most current ArcGIS and MaxEnt software, and reduces the time spent developing common solutions. The central aim of this software is to automate complicated and repetitive spatial analyses in an intuitive graphical user interface. One core tenet is the careful parameterization of species distribution models (SDMs) to maximize each model's discriminatory ability and minimize overfitting; this includes careful processing of occurrence and environmental data as well as careful model parameterization. The program directly interfaces with MaxEnt, one of the most powerful and widely used species distribution modeling programs, although SDMtoolbox 2.0 is not limited to species distribution modeling or restricted to modeling in MaxEnt. Many of the SDM pre- and post-processing tools have 'universal' analogs for use with any modeling software. The current version contains a total of 79 scripts that harness the power of ArcGIS for macroecology, landscape genetics, and evolutionary studies. For example, these tools allow for biodiversity quantification (such as species richness or corrected weighted endemism), generation of least-cost paths and corridors among shared haplotypes, assessment of the significance of spatial randomizations, and enforcement of dispersal limitations on SDMs projected into future climates, to name only a few of the functions contained in SDMtoolbox 2.0. Lastly, dozens of generalized tools exist for batch processing and conversion of GIS data types or formats, which are broadly useful to any ArcMap user.
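    Least-cost path generation of the kind mentioned above is classically computed with Dijkstra's algorithm over a resistance raster; a minimal stdlib sketch (not SDMtoolbox's ArcGIS-based implementation), with an invented cost raster:

```python
import heapq

def least_cost_path(cost, start, goal):
    """Dijkstra over a cost raster (4-connected grid); the cost of
    entering each cell, including the start cell, is accumulated."""
    rows, cols = len(cost), len(cost[0])
    dist = {start: cost[start[0]][start[1]]}
    frontier = [(dist[start], start)]
    prev = {}
    while frontier:
        d, (r, c) = heapq.heappop(frontier)
        if (r, c) == goal:
            break
        if d > dist.get((r, c), float("inf")):
            continue  # stale queue entry
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            if 0 <= nr < rows and 0 <= nc < cols:
                nd = d + cost[nr][nc]
                if nd < dist.get((nr, nc), float("inf")):
                    dist[(nr, nc)] = nd
                    prev[(nr, nc)] = (r, c)
                    heapq.heappush(frontier, (nd, (nr, nc)))
    path, node = [goal], goal
    while node != start:
        node = prev[node]
        path.append(node)
    return path[::-1], dist[goal]

# Illustrative resistance raster (higher = harder to cross)
raster = [
    [1, 1, 9],
    [9, 1, 9],
    [9, 1, 1],
]
path, total = least_cost_path(raster, (0, 0), (2, 2))
```

The path hugs the low-resistance corridor; in a GIS workflow the raster would come from habitat-suitability or landscape-genetics layers.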

  5. Informed-Proteomics: open-source software package for top-down proteomics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Park, Jungkap; Piehowski, Paul D.; Wilkins, Christopher

    Top-down proteomics involves the analysis of intact proteins. This approach is very attractive as it allows for analyzing proteins in their endogenous form without proteolysis, preserving valuable information about post-translational modifications, isoforms and proteolytic processing, or their combinations, collectively called proteoforms. Moreover, the quality of top-down LC-MS/MS datasets is rapidly increasing due to advances in liquid chromatography and mass spectrometry instrumentation and in sample processing protocols. However, top-down mass spectra are substantially more complex compared to the more conventional bottom-up data. To take full advantage of the increasing quality of top-down LC-MS/MS datasets, there is an urgent need to develop algorithms and software tools for confident proteoform identification and quantification. In this study we present a new open-source software suite for top-down proteomics analysis consisting of an LC-MS feature-finding algorithm, a database search algorithm, and an interactive results viewer. The presented tool, along with several other popular tools, was evaluated using human-in-mouse xenograft luminal and basal breast tumor samples that are known to have significant differences in protein abundance based on bottom-up analysis.

  6. A highly efficient, high-throughput lipidomics platform for the quantitative detection of eicosanoids in human whole blood.

    PubMed

    Song, Jiao; Liu, Xuejun; Wu, Jiejun; Meehan, Michael J; Blevitt, Jonathan M; Dorrestein, Pieter C; Milla, Marcos E

    2013-02-15

    We have developed an ultra-performance liquid chromatography-multiple reaction monitoring/mass spectrometry (UPLC-MRM/MS)-based, high-content, high-throughput platform that enables simultaneous profiling of multiple lipids produced ex vivo in human whole blood (HWB) on treatment with calcium ionophore and its modulation with pharmacological agents. HWB samples were processed in a 96-well plate format compatible with high-throughput sample processing instrumentation. We employed a scheduled MRM (sMRM) method, with a triple-quadrupole mass spectrometer coupled to a UPLC system, to measure absolute amounts of 122 distinct eicosanoids using deuterated internal standards. In a 6.5-min run, we resolved and detected with high sensitivity (lower limit of quantification in the range of 0.4-460 pg) all targeted analytes from a very small HWB sample (2.5 μl). Approximately 90% of the analytes exhibited a dynamic range exceeding 1000. We also developed a tailored software package that dramatically sped up the overall data quantification and analysis process with superior consistency and accuracy. Matrix effects from HWB and precision of the calibration curve were evaluated using this newly developed automation tool. This platform was successfully applied to the global quantification of changes on all 122 eicosanoids in HWB samples from healthy donors in response to calcium ionophore stimulation. Copyright © 2012 Elsevier Inc. All rights reserved.
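    Quantification against deuterated internal standards, as used above, typically back-calculates analyte amounts from a linear fit of analyte/internal-standard peak-area ratios versus known spiked amounts; a minimal sketch (not the platform's tailored software) with invented calibration numbers:

```python
def linear_fit(xs, ys):
    """Ordinary least-squares slope and intercept."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
            sum((x - mx) ** 2 for x in xs)
    return slope, my - slope * mx

# Hypothetical calibration: analyte/internal-standard peak-area ratios
# measured for known analyte amounts (pg); all numbers illustrative.
amounts_pg = [1, 5, 10, 50, 100]
area_ratios = [0.021, 0.10, 0.20, 1.01, 1.99]
slope, intercept = linear_fit(amounts_pg, area_ratios)

def quantify(area_ratio):
    """Back-calculate analyte amount from a measured area ratio."""
    return (area_ratio - intercept) / slope

amount = quantify(0.50)  # amount corresponding to a mid-range ratio
```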

  7. Development of defined microbial population standards using fluorescence activated cell sorting for the absolute quantification of S. aureus using real-time PCR.

    PubMed

    Martinon, Alice; Cronin, Ultan P; Wilkinson, Martin G

    2012-01-01

    In this article, four types of standards were assessed in a SYBR Green-based real-time PCR procedure for the quantification of Staphylococcus aureus (S. aureus) in DNA samples. The standards were purified S. aureus genomic DNA (type A), circular plasmid DNA containing a thermonuclease (nuc) gene fragment (type B), and DNA extracted from defined populations of S. aureus cells generated by Fluorescence Activated Cell Sorting (FACS) technology with (type C) or without (type D) purification of DNA by boiling. The optimal efficiency of 2.016 was obtained with the Roche LightCycler® 4.1 software for type C standards, whereas the lowest efficiency (1.682) corresponded to type D standards. Type C standards appeared to be more suitable for quantitative real-time PCR because defined populations were used for the construction of standard curves. Overall, the Fieller Confidence Interval algorithm may be improved for replicates having a low standard deviation in Cycle Threshold values, such as found for type B and C standards. The stabilities of diluted PCR standards stored at -20°C were compared after 0, 7, 14 and 30 days, and were lower for type A and C standards than for type B standards. However, FACS-generated standards may be useful for bacterial quantification in real-time PCR assays once optimal storage and temperature conditions are defined.
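    Amplification efficiencies like the 2.016 and 1.682 quoted above are conventionally derived from the slope of a standard curve of Ct versus log10(template amount), with E = 10^(-1/slope) and E = 2 corresponding to perfect doubling per cycle; a sketch with an invented dilution series:

```python
from math import log10

def pcr_efficiency(copies, ct_values):
    """Amplification efficiency from a standard curve of Ct vs.
    log10(template copies): E = 10 ** (-1 / slope)."""
    xs = [log10(c) for c in copies]
    n = len(xs)
    mx, my = sum(xs) / n, sum(ct_values) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ct_values)) / \
            sum((x - mx) ** 2 for x in xs)
    return 10 ** (-1.0 / slope)

# Illustrative 10-fold dilution series; each dilution shifts Ct by ~3.32,
# the spacing expected for an ideal doubling reaction.
copies = [1e7, 1e6, 1e5, 1e4, 1e3]
cts = [15.0, 18.32, 21.64, 24.96, 28.28]
eff = pcr_efficiency(copies, cts)
```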

  8. Diagnostic evaluation of three cardiac software packages using a consecutive group of patients

    PubMed Central

    2011-01-01

    Purpose The aim of this study was to compare the diagnostic performance of the three software packages 4DMSPECT (4DM), Emory Cardiac Toolbox (ECTb), and Cedars Quantitative Perfusion SPECT (QPS) for quantification of myocardial perfusion scintigrams (MPS) using a large group of consecutive patients. Methods We studied 1,052 consecutive patients who underwent 2-day stress/rest 99mTc-sestamibi MPS studies. The reference/gold-standard classifications for the MPS studies were obtained from three physicians, each with more than 25 years of experience in nuclear cardiology, who re-evaluated all MPS images. Automatic processing was carried out using the 4DM, ECTb, and QPS software packages. Total stress defect extent (TDE) and summed stress score (SSS) based on a 17-segment model were obtained from the software packages. Receiver-operating characteristic (ROC) analysis was performed. Results A total of 734 patients were classified as normal and the remaining 318 as having infarction and/or ischemia. The performances of the software packages, calculated as the areas under the SSS ROC curves, were 0.87 for 4DM, 0.80 for QPS, and 0.76 for ECTb (QPS vs. ECTb p = 0.03; other differences p < 0.0001). The areas under the TDE ROC curves were 0.87 for 4DM, 0.82 for QPS, and 0.76 for ECTb (QPS vs. ECTb p = 0.0005; other differences p < 0.0001). Conclusion There are considerable differences in performance between the three software packages, with 4DM showing the best performance and ECTb the worst. These differences in performance should be taken into consideration when software packages are used in clinical routine or in clinical studies. PMID:22214226
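    The ROC areas reported above can be computed without explicit curve construction via the Mann-Whitney formulation of AUC; a minimal sketch with invented score groups (not the study's data):

```python
def roc_auc(scores_diseased, scores_normal):
    """Area under the ROC curve via the Mann-Whitney U statistic:
    the probability that a diseased case scores higher than a normal
    one, with ties counted as half."""
    wins = 0.0
    for sd in scores_diseased:
        for sn in scores_normal:
            if sd > sn:
                wins += 1.0
            elif sd == sn:
                wins += 0.5
    return wins / (len(scores_diseased) * len(scores_normal))

# Illustrative summed stress scores (SSS) for two small groups
sss_infarct = [9, 12, 7, 15, 4]
sss_normal = [0, 1, 3, 0, 2, 5]
auc = roc_auc(sss_infarct, sss_normal)
```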

  9. Application of Scion image software to the simultaneous determination of curcuminoids in turmeric (Curcuma longa).

    PubMed

    Sotanaphun, Uthai; Phattanawasin, Panadda; Sriphong, Lawan

    2009-01-01

    Curcumin, desmethoxycurcumin and bisdesmethoxycurcumin are bioactive constituents of turmeric (Curcuma longa). Owing to their different potencies, quality control of turmeric based on the content of each curcuminoid is more reliable than that based on total curcuminoids. However, a high-cost instrument is usually needed to perform such an assay. Our aim was to develop a simple and low-cost method for the simultaneous quantification of the three curcuminoids in turmeric using TLC and the public-domain software Scion Image. The image of a TLC chromatogram of turmeric extract was recorded using a digital scanner. The density of the TLC spot of each curcuminoid was analysed with the Scion Image software. The density value was transformed to concentration by comparison with the calibration curve of standard curcuminoids developed on the same TLC plate. The regression data for all curcuminoids showed a good linear relationship (R(2) > 0.99) in the concentration range of 0.375-6 microg/spot. The limits of detection and quantitation were 43-73 and 143-242 ng/spot, respectively. The method gave adequate precision, accuracy and recovery. The contents of each curcuminoid determined using this method were not significantly different from those determined using the TLC densitometric method. TLC image analysis using Scion Image is thus shown to be a reliable method for the simultaneous analysis of the content of each curcuminoid in turmeric.
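    Detection and quantitation limits of the kind reported are commonly estimated from the calibration curve as 3.3σ/S and 10σ/S, where σ is the residual standard deviation and S the slope (the ICH-style formulas); a sketch with invented calibration points, not the study's data:

```python
from math import sqrt

def lod_loq(xs, ys):
    """Limits of detection and quantitation from a calibration curve:
    LOD = 3.3 * sigma / S, LOQ = 10 * sigma / S, where sigma is the
    residual standard deviation of the fit and S its slope."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = sum((x - mx) * (y - my) for x, y in zip(xs, ys)) / \
            sum((x - mx) ** 2 for x in xs)
    intercept = my - slope * mx
    residuals = [y - (slope * x + intercept) for x, y in zip(xs, ys)]
    sigma = sqrt(sum(r * r for r in residuals) / (n - 2))
    return 3.3 * sigma / slope, 10 * sigma / slope

# Illustrative TLC-densitometry calibration (concentration vs. spot density)
conc = [0.375, 0.75, 1.5, 3.0, 6.0]          # microgram per spot
density = [0.040, 0.078, 0.155, 0.310, 0.615]
lod, loq = lod_loq(conc, density)
```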

  10. Toward a Quantification of the Information/Communication Industries. Publication No. 74-2.

    ERIC Educational Resources Information Center

    Lavey, Warren G.

    A national survey was made to collect data about the information/communication industries in the United States today. Eleven industries were studied: television, radio, telephone, telegraph, postal service, newspaper, periodical, book publishing and printing, motion pictures, computer software, and cable television. The data collection scheme used…

  11. Performance of new automated transthoracic three-dimensional echocardiographic software for left ventricular volumes and function assessment in routine clinical practice: Comparison with 3 Tesla cardiac magnetic resonance.

    PubMed

    Levy, Franck; Dan Schouver, Elie; Iacuzio, Laura; Civaia, Filippo; Rusek, Stephane; Dommerc, Carinne; Marechaux, Sylvestre; Dor, Vincent; Tribouilloy, Christophe; Dreyfus, Gilles

    2017-11-01

    Three-dimensional (3D) transthoracic echocardiography (TTE) is superior to the two-dimensional Simpson's method for assessment of left ventricular (LV) volumes and LV ejection fraction (LVEF). Nevertheless, 3D TTE is not incorporated into everyday practice, as current LV chamber quantification software products are time-consuming. Our aims were to evaluate the feasibility, accuracy and reproducibility of new fully automated fast 3D TTE software (HeartModel A.I.; Philips Healthcare, Andover, MA, USA) for quantification of LV volumes and LVEF in routine practice; to compare the 3D LV volumes and LVEF obtained with a cardiac magnetic resonance (CMR) reference; and to optimize the automated default border settings with CMR as reference. Sixty-three consecutive patients who had comprehensive 3D TTE and CMR examinations within 24 hours were eligible for inclusion. Nine patients (14%) were excluded because of insufficient echogenicity on 3D TTE; thus, 54 patients (40 men; mean age 63 ± 13 years) were prospectively included in the study. The inter- and intraobserver reproducibilities of 3D TTE were excellent (coefficient of variation < 10%) for end-diastolic volume (EDV), end-systolic volume (ESV) and LVEF. Despite a slight underestimation of EDV using 3D TTE compared with CMR (bias = -22 ± 34 mL; P < 0.0001), a significant correlation was found between the two measurements (r = 0.93; P = 0.0001). Enlarging the default border detection settings led to frequent volume overestimation in the general population, but improved agreement with CMR in patients with LVEF ≤ 50%. Correlations between 3D TTE and CMR for ESV and LVEF were excellent (r = 0.93 and r = 0.91, respectively; P < 0.0001). 3D TTE using new-generation fully automated software is a feasible, fast, reproducible and accurate imaging modality for LV volumetric quantification in routine practice. Optimization of border detection settings may increase agreement with CMR for EDV assessment in dilated ventricles. Copyright © 2017 Elsevier Masson SAS. All rights reserved.
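    Method-comparison statistics such as the bias quoted above (-22 ± 34 mL) are typically summarized Bland-Altman style as the mean difference plus 95% limits of agreement; a minimal sketch with invented paired volumes, not the study's measurements:

```python
from statistics import mean, stdev

def bland_altman(method_a, method_b):
    """Bias and 95% limits of agreement between two paired methods
    (e.g. EDV by 3D TTE vs. CMR)."""
    diffs = [a - b for a, b in zip(method_a, method_b)]
    bias = mean(diffs)
    sd = stdev(diffs)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# Illustrative paired end-diastolic volumes (mL); the first method
# systematically reads lower than the second
edv_tte = [110, 145, 98, 160, 130, 175]
edv_cmr = [130, 168, 115, 185, 150, 200]
bias, (lo, hi) = bland_altman(edv_tte, edv_cmr)
```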

  12. The Perseus computational platform for comprehensive analysis of (prote)omics data.

    PubMed

    Tyanova, Stefka; Temu, Tikira; Sinitcyn, Pavel; Carlson, Arthur; Hein, Marco Y; Geiger, Tamar; Mann, Matthias; Cox, Jürgen

    2016-09-01

    A main bottleneck in proteomics is the downstream biological analysis of highly multivariate quantitative protein abundance data generated using mass-spectrometry-based analysis. We developed the Perseus software platform (http://www.perseus-framework.org) to support biological and biomedical researchers in interpreting protein quantification, interaction and post-translational modification data. Perseus contains a comprehensive portfolio of statistical tools for high-dimensional omics data analysis covering normalization, pattern recognition, time-series analysis, cross-omics comparisons and multiple-hypothesis testing. A machine learning module supports the classification and validation of patient groups for diagnosis and prognosis, and it also detects predictive protein signatures. Central to Perseus is a user-friendly, interactive workflow environment that provides complete documentation of computational methods used in a publication. All activities in Perseus are realized as plugins, and users can extend the software by programming their own, which can be shared through a plugin store. We anticipate that Perseus's arsenal of algorithms and its intuitive usability will empower interdisciplinary analysis of complex large data sets.
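    Multiple-hypothesis testing of the kind Perseus provides commonly uses Benjamini-Hochberg FDR control; a minimal sketch of that standard procedure (not Perseus's implementation), with invented p-values:

```python
def benjamini_hochberg(pvalues, alpha=0.05):
    """Return a boolean list marking which p-values are significant
    under Benjamini-Hochberg FDR control at level alpha."""
    m = len(pvalues)
    order = sorted(range(m), key=lambda i: pvalues[i])
    # Largest rank k with p_(k) <= k * alpha / m
    k_max = 0
    for rank, i in enumerate(order, start=1):
        if pvalues[i] <= rank * alpha / m:
            k_max = rank
    significant = [False] * m
    for rank, i in enumerate(order, start=1):
        if rank <= k_max:
            significant[i] = True
    return significant

# Invented p-values, e.g. from per-protein differential-abundance tests
pvals = [0.001, 0.008, 0.039, 0.041, 0.042, 0.06, 0.074, 0.205]
flags = benjamini_hochberg(pvals, alpha=0.05)
```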

  13. Efficient visualization of high-throughput targeted proteomics experiments: TAPIR.

    PubMed

    Röst, Hannes L; Rosenberger, George; Aebersold, Ruedi; Malmström, Lars

    2015-07-15

    Targeted mass spectrometry comprises a set of powerful methods to obtain accurate and consistent protein quantification in complex samples. To fully exploit these techniques, a cross-platform and open-source software stack based on standardized data exchange formats is required. We present TAPIR, a fast and efficient Python visualization software for chromatograms and peaks identified in targeted proteomics experiments. The input formats are open, community-driven standardized data formats (mzML for raw data storage and TraML encoding the hierarchical relationships between transitions, peptides and proteins). TAPIR is scalable to proteome-wide targeted proteomics studies (as enabled by SWATH-MS), allowing researchers to visualize high-throughput datasets. The framework integrates well with existing automated analysis pipelines and can be extended beyond targeted proteomics to other types of analyses. TAPIR is available for all computing platforms under the 3-clause BSD license at https://github.com/msproteomicstools/msproteomicstools. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.

  14. Tracking and Quantifying Developmental Processes in C. elegans Using Open-source Tools.

    PubMed

    Dutta, Priyanka; Lehmann, Christina; Odedra, Devang; Singh, Deepika; Pohl, Christian

    2015-12-16

    Quantitatively capturing developmental processes is crucial to derive mechanistic models and key to identify and describe mutant phenotypes. Here protocols are presented for preparing embryos and adult C. elegans animals for short- and long-term time-lapse microscopy and methods for tracking and quantification of developmental processes. The methods presented are all based on C. elegans strains available from the Caenorhabditis Genetics Center and on open-source software that can be easily implemented in any laboratory independently of the microscopy system used. A reconstruction of a 3D cell-shape model using the modelling software IMOD, manual tracking of fluorescently-labeled subcellular structures using the multi-purpose image analysis program Endrov, and an analysis of cortical contractile flow using PIVlab (Time-Resolved Digital Particle Image Velocimetry Tool for MATLAB) are shown. It is discussed how these methods can also be deployed to quantitatively capture other developmental processes in different models, e.g., cell tracking and lineage tracing, tracking of vesicle flow.

  15. CSE database: extended annotations and new recommendations for ECG software testing.

    PubMed

    Smíšek, Radovan; Maršánová, Lucie; Němcová, Andrea; Vítek, Martin; Kozumplík, Jiří; Nováková, Marie

    2017-08-01

    Nowadays, cardiovascular diseases represent the most common cause of death in western countries. Among various examination techniques, electrocardiography (ECG) is still a highly valuable tool used for the diagnosis of many cardiovascular disorders. To diagnose a patient based on the ECG, cardiologists can use automatic diagnostic algorithms, and research in this area is still necessary. To compare various algorithms correctly, it is necessary to test them on standard annotated databases, such as the Common Standards for Quantitative Electrocardiography (CSE) database. According to Scopus, the CSE database is the second most cited standard database. There were two main objectives in this work. First, new diagnoses were added to the CSE database, which extended its original annotations. Second, new recommendations for diagnostic software quality estimation were established. The ECG recordings were diagnosed independently by five new cardiologists, and in total, 59 different diagnoses were found. Such a large number of diagnoses is unique, even among standard databases. Based on the cardiologists' diagnoses, a four-round consensus (4R consensus) was established. The 4R consensus represents the correct final diagnosis, which should ideally be the output of any tested classification software. The accuracy of the cardiologists' diagnoses compared with the 4R consensus was the basis for establishing accuracy recommendations, determined as sensitivity = 79.20-86.81%, positive predictive value = 79.10-87.11%, and Jaccard coefficient = 72.21-81.14%. Within these ranges, the accuracy of the software is comparable with that of cardiologists. Such quantification of correct-classification accuracy is unique; diagnostic software developers can objectively evaluate the success of their algorithms and promote their further development.
The annotations and recommendations proposed in this work will allow for faster development and testing of classification software. As a result, this might facilitate cardiologists' work and lead to faster diagnoses and earlier treatment.
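
    The agreement figures reported above are set-based comparisons between each diagnosis list and the 4R consensus. A minimal sketch of how such per-recording metrics can be computed, assuming diagnoses are represented as label sets (the labels below are hypothetical examples, not the study's actual coding):

```python
def diagnosis_agreement(predicted, consensus):
    """Set-based agreement between predicted diagnoses and the 4R-consensus
    diagnoses for one ECG recording (labels are hypothetical examples)."""
    predicted, consensus = set(predicted), set(consensus)
    tp = len(predicted & consensus)            # diagnoses found by both
    sensitivity = tp / len(consensus)          # consensus diagnoses recovered
    ppv = tp / len(predicted)                  # predictions that are correct
    jaccard = tp / len(predicted | consensus)  # overlap over union
    return sensitivity, ppv, jaccard
```

    Averaging such per-recording values over the whole database would yield summary figures comparable with the 79-87% ranges quoted above.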

  16. ddpcr: an R package and web application for analysis of droplet digital PCR data.

    PubMed

    Attali, Dean; Bidshahri, Roza; Haynes, Charles; Bryan, Jennifer

    2016-01-01

    Droplet digital polymerase chain reaction (ddPCR) is a novel platform for exact quantification of DNA, which holds great promise in clinical diagnostics. It is increasingly popular due to its digital nature, which provides more accurate quantification and higher sensitivity than traditional real-time PCR. However, clinical adoption has been slowed in part by the lack of software tools available for analyzing ddPCR data. Here, we present ddpcr, a new R package for ddPCR visualization and analysis. In addition, ddpcr includes a web application (powered by the Shiny R package) that allows users to analyze ddPCR data through an interactive graphical interface.
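
    The "digital" accuracy the abstract refers to rests on Poisson statistics over droplet counts rather than on amplification curves. The ddpcr package itself is written in R; the following Python sketch shows only the standard underlying arithmetic, not the package's API, and the 0.85 nL nominal droplet volume is an assumption:

```python
import math

def ddpcr_concentration(positive_droplets, total_droplets, droplet_nl=0.85):
    """Estimate target concentration (copies/uL) from droplet counts.

    With target molecules distributed randomly across droplets, the mean
    copies per droplet is lambda = -ln(1 - p), where p is the fraction of
    positive droplets; dividing by droplet volume gives concentration.
    """
    p = positive_droplets / total_droplets
    lam = -math.log(1.0 - p)              # mean copies per droplet
    return lam * 1000.0 / droplet_nl      # convert per-nL to per-uL
```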

  17. Analysis of lipid experiments (ALEX): a software framework for analysis of high-resolution shotgun lipidomics data.

    PubMed

    Husen, Peter; Tarasov, Kirill; Katafiasz, Maciej; Sokol, Elena; Vogt, Johannes; Baumgart, Jan; Nitsch, Robert; Ekroos, Kim; Ejsing, Christer S

    2013-01-01

    Global lipidomics analysis across large sample sizes produces high-content datasets that require dedicated software tools supporting lipid identification and quantification, efficient data management and lipidome visualization. Here we present a novel software-based platform for streamlined data processing, management and visualization of shotgun lipidomics data acquired using high-resolution Orbitrap mass spectrometry. The platform features the ALEX framework designed for automated identification and export of lipid species intensity directly from proprietary mass spectral data files, and an auxiliary workflow using database exploration tools for integration of sample information, computation of lipid abundance and lipidome visualization. A key feature of the platform is the organization of lipidomics data in "database table format" which provides the user with an unsurpassed flexibility for rapid lipidome navigation using selected features within the dataset. To demonstrate the efficacy of the platform, we present a comparative neurolipidomics study of cerebellum, hippocampus and somatosensory barrel cortex (S1BF) from wild-type and knockout mice devoid of the putative lipid phosphate phosphatase PRG-1 (plasticity related gene-1). The presented framework is generic, extendable to processing and integration of other lipidomic data structures, can be interfaced with post-processing protocols supporting statistical testing and multivariate analysis, and can serve as an avenue for disseminating lipidomics data within the scientific community. The ALEX software is available at www.msLipidomics.info.

  18. LFQuant: a label-free fast quantitative analysis tool for high-resolution LC-MS/MS proteomics data.

    PubMed

    Zhang, Wei; Zhang, Jiyang; Xu, Changming; Li, Ning; Liu, Hui; Ma, Jie; Zhu, Yunping; Xie, Hongwei

    2012-12-01

    Database searching based methods for label-free quantification aim to reconstruct the peptide extracted ion chromatogram based on the identification information, which can limit the search space and thus make the data processing much faster. The random effect of the MS/MS sampling can be remedied by cross-assignment among different runs. Here, we present a new label-free fast quantitative analysis tool, LFQuant, for high-resolution LC-MS/MS proteomics data based on database searching. It is designed to accept raw data in two common formats (mzXML and Thermo RAW), and database search results from mainstream tools (MASCOT, SEQUEST, and X!Tandem), as input data. LFQuant can handle large-scale label-free data with fractionation such as SDS-PAGE and 2D LC. It is easy to use and provides handy user interfaces for data loading, parameter setting, quantitative analysis, and quantitative data visualization. LFQuant was compared with two common quantification software packages, MaxQuant and IDEAL-Q, on the replication data set and the UPS1 standard data set. The results show that LFQuant performs better than them in terms of both precision and accuracy, and consumes significantly less processing time. LFQuant is freely available under the GNU General Public License v3.0 at http://sourceforge.net/projects/lfquant/. © 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
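
    The XIC-reconstruction step the abstract describes (summing signal in a narrow m/z window around each identified peptide, scan by scan) can be sketched as follows; the scan layout and ppm tolerance are illustrative assumptions, not LFQuant's internal format:

```python
def extract_xic(scans, target_mz, ppm_tol=10.0):
    """Reconstruct an extracted ion chromatogram for one identified peptide.

    `scans` is a list of (retention_time, [(mz, intensity), ...]) tuples
    (a hypothetical layout); peaks within +/- ppm_tol of the peptide's m/z
    are summed per scan.
    """
    tol = target_mz * ppm_tol / 1e6
    return [(rt, sum(i for mz, i in peaks if abs(mz - target_mz) <= tol))
            for rt, peaks in scans]
```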

  19. 3D visualization and quantification of bone and teeth mineralization for the study of osteo/dentinogenesis in mice models

    NASA Astrophysics Data System (ADS)

    Marchadier, A.; Vidal, C.; Ordureau, S.; Lédée, R.; Léger, C.; Young, M.; Goldberg, M.

    2011-03-01

    Research on bone and teeth mineralization in animal models is critical for understanding human pathologies. Genetically modified mice represent highly valuable models for the study of osteo/dentinogenesis defects and osteoporosis. Current investigations of mouse dental and skeletal phenotypes use destructive and time-consuming methods such as histology and scanning microscopy. Micro-CT imaging is quicker and provides high-resolution qualitative phenotypic description; however, reliable quantification of mineralization processes in mouse bone and teeth is still lacking. We have established novel CT imaging-based software for accurate qualitative and quantitative analysis of mouse mandibular bone and molars. Data were obtained from mandibles of mice lacking the Fibromodulin gene, which is involved in mineralization processes. Mandibles were imaged with a micro-CT originally devoted to industrial applications (Viscom X8060 NDT). Advanced 3D visualization was performed using the VoxBox software (UsefulProgress) with ray casting algorithms. Comparison between control and defective mouse mandibles was made by applying the same transfer function to each 3D dataset, allowing shape, colour and density discrepancies to be detected. The 2D images of transverse slices of mandible and teeth were similar to, and even more accurate than, those obtained with scanning electron microscopy. Image processing of the molars allowed 3D reconstruction of the pulp chamber, providing a unique tool for the quantitative evaluation of dentinogenesis. This new method is highly powerful for the study of oro-facial mineralization defects in mouse models, complementary to and even competitive with current histological and scanning microscopy approaches.

  20. Statistical analysis to assess automated level of suspicion scoring methods in breast ultrasound

    NASA Astrophysics Data System (ADS)

    Galperin, Michael

    2003-05-01

    A well-defined rule-based system has been developed for scoring the Level of Suspicion (LOS) on a 0-5 scale based on a qualitative lexicon describing the ultrasound appearance of breast lesions. The purpose of this research was to assess and select one of the automated quantitative LOS scoring methods developed during preliminary studies on reducing benign biopsies. The study used a Computer Aided Imaging System (CAIS) to improve the uniformity and accuracy of applying the LOS scheme by automatically detecting, analyzing and comparing breast masses. The overall goal is to reduce biopsies on masses with lower levels of suspicion, rather than to increase the accuracy of cancer diagnosis (cancers will require biopsy anyway). On complex cyst and fibroadenoma cases, experienced radiologists were up to 50% less certain in true negatives than CAIS. Full correlation analysis was applied to determine which of the proposed LOS quantification methods serves CAIS accuracy best. This paper presents current results of applying statistical analysis to automated LOS scoring for breast masses with known biopsy results. The First Order Ranking method was found to yield the most accurate results. The CAIS system (Image Companion, Data Companion software), developed by Almen Laboratories, was used to achieve these results.

  1. Quantification of Abdominal Fat in Obese and Healthy Adolescents Using 3 Tesla Magnetic Resonance Imaging and Free Software for Image Analysis.

    PubMed

    Eloi, Juliana Cristina; Epifanio, Matias; de Gonçalves, Marília Maia; Pellicioli, Augusto; Vieira, Patricia Froelich Giora; Dias, Henrique Bregolin; Bruscato, Neide; Soder, Ricardo Bernardi; Santana, João Carlos Batista; Mouzaki, Marialena; Baldisserotto, Matteo

    2017-01-01

    Computed tomography, which uses ionizing radiation and expensive software packages for analysis of scans, can be used to quantify abdominal fat. The objective of this study was to measure abdominal fat with 3T MRI using free software for image analysis and to correlate these findings with anthropometric and laboratory parameters in adolescents. This prospective observational study included 24 overweight/obese and 33 healthy adolescents (mean age 16.55 years). All participants underwent abdominal MRI exams. Visceral and subcutaneous fat area and percentage were correlated with anthropometric parameters, lipid profile, glucose metabolism, and insulin resistance. Student's t test and the Mann-Whitney test were applied, and Pearson's chi-square test was used to compare proportions. To determine associations, Pearson's linear correlation or Spearman's correlation was used. In both groups, waist circumference (WC) was associated with visceral fat area (P = 0.001 and P = 0.01, respectively), and triglycerides were associated with fat percentage (P = 0.046 and P = 0.071, respectively). In obese individuals, total cholesterol/HDL ratio was associated with visceral fat area (P = 0.03) and percentage (P = 0.09), and insulin and HOMA-IR were associated with visceral fat area (P = 0.001) and percentage (P = 0.005). 3T MRI can provide reliable, good-quality images for quantification of visceral and subcutaneous fat using a free software package. The results demonstrate that WC is a good predictor of visceral fat in obese adolescents, and that visceral fat area is associated with total cholesterol/HDL ratio, insulin and HOMA-IR.

  2. EasyLCMS: an asynchronous web application for the automated quantification of LC-MS data

    PubMed Central

    2012-01-01

    Background Downstream applications in metabolomics, as well as mathematical modelling, require data in a quantitative format, which may also necessitate the automated and simultaneous quantification of numerous metabolites. Although numerous applications have previously been developed for metabolomics data handling, automated calibration and calculation of concentrations in μmol terms have not been carried out. Moreover, most metabolomics applications are designed for GC-MS and are not suitable for LC-MS, since in LC the deviation in retention time is not linear, which these applications do not take into account. In addition, only a few are web-based applications, which can improve on stand-alone software in terms of compatibility, sharing capabilities and hardware requirements, although sufficient bandwidth is required. Furthermore, none of them incorporate asynchronous communication to allow real-time interaction with pre-processed results. Findings Here, we present EasyLCMS (http://www.easylcms.es/), a new application for automated quantification, which was validated using more than 1000 concentration comparisons against manual operation in real samples. The results showed that only 1% of the quantifications presented a relative error higher than 15%. Using clustering analysis, the metabolites with the highest relative error distributions were identified and studied to resolve recurrent mistakes. Conclusions EasyLCMS is a new web application designed to quantify numerous metabolites simultaneously, integrating LC distortions and asynchronous web technology to present a visual interface with dynamic interaction which allows checking and correction of LC-MS raw data pre-processing results. Moreover, quantified data obtained with EasyLCMS are fully compatible with numerous downstream applications, as well as with mathematical modelling in the systems biology field. PMID:22884039

  3. Improved correlation between CT emphysema quantification and pulmonary function test by density correction of volumetric CT data based on air and aortic density.

    PubMed

    Kim, Song Soo; Seo, Joon Beom; Kim, Namkug; Chae, Eun Jin; Lee, Young Kyung; Oh, Yeon Mok; Lee, Sang Do

    2014-01-01

    To determine the improvement in emphysema quantification achieved with density correction, and to determine the optimal site for air density correction on volumetric computed tomography (CT). Seventy-eight CT scans of COPD patients (GOLD II-IV, smoking history 39.2±25.3 pack-years) were obtained from several single-vendor 16-MDCT scanners. After density measurement of the aorta and of tracheal and external air, volumetric CT density correction was conducted (reference values: air, -1,000 HU; blood, +50 HU). Using in-house software, the emphysema index (EI) and mean lung density (MLD) were calculated. Differences in air densities, MLD and EI before and after density correction were evaluated (paired t-test), and the correlations of these parameters with FEV1 and FEV1/FVC were compared (age- and sex-adjusted partial correlation analysis). Measured densities (HU) of tracheal and external air differed significantly (-990 ± 14 vs. -1016 ± 9, P<0.001). MLD and EI on the original CT data and after density correction using tracheal and external air also differed significantly (MLD: -874.9 ± 27.6 vs. -882.3 ± 24.9 vs. -860.5 ± 26.6; EI: 16.8 ± 13.4 vs. 21.1 ± 14.5 vs. 9.7 ± 10.5, respectively; P<0.001). The correlation coefficients between the CT quantification indices and FEV1 and FEV1/FVC increased after density correction, and tracheal air correction showed better results than external air correction. Density correction of volumetric CT data can improve the correlation between emphysema quantification and pulmonary function tests. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.
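
    The density correction described above is, in essence, a two-point linear rescaling that maps the measured air and aortic-blood densities onto the reference values. A minimal sketch (the study's exact implementation may differ):

```python
def correct_hu(hu, measured_air, measured_blood,
               ref_air=-1000.0, ref_blood=50.0):
    """Rescale a CT attenuation value (HU) so that the measured air and
    blood densities map onto -1000 HU and +50 HU, respectively."""
    slope = (ref_blood - ref_air) / (measured_blood - measured_air)
    return ref_air + slope * (hu - measured_air)
```

    With tracheal air measured around -990 HU, lung voxels are shifted toward more negative values, consistent with the higher emphysema index after tracheal-air correction reported above.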

  4. Absolute myocardial flow quantification with (82)Rb PET/CT: comparison of different software packages and methods.

    PubMed

    Tahari, Abdel K; Lee, Andy; Rajaram, Mahadevan; Fukushima, Kenji; Lodge, Martin A; Lee, Benjamin C; Ficaro, Edward P; Nekolla, Stephan; Klein, Ran; deKemp, Robert A; Wahl, Richard L; Bengel, Frank M; Bravo, Paco E

    2014-01-01

    In clinical cardiac (82)Rb PET, globally impaired coronary flow reserve (CFR) is a relevant marker for predicting short-term cardiovascular events. However, there are limited data on the impact of different software and methods for estimation of myocardial blood flow (MBF) and CFR. Our objective was to compare quantitative results obtained from previously validated software tools. We retrospectively analyzed cardiac (82)Rb PET/CT data from 25 subjects (group 1, 62 ± 11 years) with low-to-intermediate probability of coronary artery disease (CAD) and 26 patients (group 2, 57 ± 10 years; P=0.07) with known CAD. Resting and vasodilator-stress MBF and CFR were derived using four methods implemented in three software applications: (1) Corridor4DM (4DM) based on factor analysis (FA) and kinetic modeling, (2) 4DM based on region-of-interest (ROI) analysis and kinetic modeling, (3) MunichHeart (MH), which uses a simplified ROI-based retention model approach, and (4) FlowQuant (FQ) based on ROI analysis and compartmental modeling with constant distribution volume. Resting and stress MBF values (in milliliters per minute per gram) derived using the different methods were significantly different: using 4DM-FA, 4DM-ROI, FQ, and MH, resting MBF values were 1.47 ± 0.59, 1.16 ± 0.51, 0.91 ± 0.39, and 0.90 ± 0.44, respectively (P<0.001), and stress MBF values were 3.05 ± 1.66, 2.26 ± 1.01, 1.90 ± 0.82, and 1.83 ± 0.81, respectively (P<0.001). However, there were no statistically significant differences among the CFR values (2.15 ± 1.08, 2.05 ± 0.83, 2.23 ± 0.89, and 2.21 ± 0.90, respectively; P=0.17). Regional MBF and CFR according to vascular territories showed similar results. The linear correlation coefficient for global CFR varied between 0.71 (MH vs. 4DM-ROI) and 0.90 (FQ vs. 4DM-ROI). Using a cut-off value of 2.0 for abnormal CFR, the agreement among the software programs ranged between 76% (MH vs. FQ) and 90% (FQ vs. 4DM-ROI). Interobserver agreement was in general excellent with all software packages. 
Quantitative assessment of resting and stress MBF with (82)Rb PET is dependent on the software and methods used, whereas CFR appears to be more comparable. Follow-up and treatment assessment should be done with the same software and method.
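
    The convergence of CFR despite divergent absolute MBF values has a simple arithmetic explanation: CFR is a ratio, so any package-specific multiplicative bias in MBF cancels. A toy illustration (values hypothetical):

```python
def cfr(stress_mbf, rest_mbf):
    """Coronary flow reserve: ratio of vasodilator-stress to resting
    myocardial blood flow (mL/min/g)."""
    return stress_mbf / rest_mbf

# A package-specific scale factor k biases both MBF estimates,
# but cancels in the ratio: cfr(k * stress, k * rest) == cfr(stress, rest).
```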

  5. Space station data system analysis/architecture study. Task 3: Trade studies, DR-5, volume 1

    NASA Technical Reports Server (NTRS)

    1985-01-01

    The primary objective of Task 3 is to provide additional analysis and insight necessary to support key design/programmatic decision for options quantification and selection for system definition. This includes: (1) the identification of key trade study topics; (2) the definition of a trade study procedure for each topic (issues to be resolved, key inputs, criteria/weighting, methodology); (3) conduct tradeoff and sensitivity analysis; and (4) the review/verification of results within the context of evolving system design and definition. The trade study topics addressed in this volume include space autonomy and function automation, software transportability, system network topology, communications standardization, onboard local area networking, distributed operating system, software configuration management, and the software development environment facility.

  6. MorphoGraphX: A platform for quantifying morphogenesis in 4D

    PubMed Central

    Barbier de Reuille, Pierre; Routier-Kierzkowska, Anne-Lise; Kierzkowski, Daniel; Bassel, George W; Schüpbach, Thierry; Tauriello, Gerardo; Bajpai, Namrata; Strauss, Sören; Weber, Alain; Kiss, Annamaria; Burian, Agata; Hofhuis, Hugo; Sapala, Aleksandra; Lipowczan, Marcin; Heimlicher, Maria B; Robinson, Sarah; Bayer, Emmanuelle M; Basler, Konrad; Koumoutsakos, Petros; Roeder, Adrienne HK; Aegerter-Wilmsen, Tinri; Nakayama, Naomi; Tsiantis, Miltos; Hay, Angela; Kwiatkowska, Dorota; Xenarios, Ioannis; Kuhlemeier, Cris; Smith, Richard S

    2015-01-01

    Morphogenesis emerges from complex multiscale interactions between genetic and mechanical processes. To understand these processes, the evolution of cell shape, proliferation and gene expression must be quantified. This quantification is usually performed either in full 3D, which is computationally expensive and technically challenging, or on 2D planar projections, which introduces geometrical artifacts on highly curved organs. Here we present MorphoGraphX (www.MorphoGraphX.org), a software that bridges this gap by working directly with curved surface images extracted from 3D data. In addition to traditional 3D image analysis, we have developed algorithms to operate on curved surfaces, such as cell segmentation, lineage tracking and fluorescence signal quantification. The software's modular design makes it easy to include existing libraries, or to implement new algorithms. Cell geometries extracted with MorphoGraphX can be exported and used as templates for simulation models, providing a powerful platform to investigate the interactions between shape, genes and growth. DOI: http://dx.doi.org/10.7554/eLife.05864.001 PMID:25946108

  7. Detection and quantification of proteins and cells by use of elemental mass spectrometry: progress and challenges.

    PubMed

    Yan, Xiaowen; Yang, Limin; Wang, Qiuquan

    2013-07-01

    Much progress has been made in identification of the proteins in proteomes, and quantification of these proteins has attracted much interest. In addition to popular tandem mass spectrometric methods based on soft ionization, inductively coupled plasma mass spectrometry (ICPMS), a typical example of mass spectrometry based on hard ionization and usually used for elemental analysis, has unique advantages for absolute quantification of proteins through determination of an element with a definite stoichiometry in a protein or attached to the protein. In this Trends article, we briefly describe state-of-the-art ICPMS-based methods for quantification of proteins, emphasizing protein-labeling and element-tagging strategies developed on the basis of chemically selective reactions and/or biospecific interactions. Recent progress from protein to cell quantification by use of ICPMS is also described, and the possibilities and challenges of ICPMS-based quantification for universal, selective, or targeted quantification of proteins and cells in a biological sample are discussed critically. We believe ICPMS-based protein quantification will become ever more important in targeted quantitative proteomics and bioanalysis in the near future.
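
    The stoichiometric principle behind ICPMS-based absolute quantification reduces to a single division: if each protein molecule carries a known, fixed number of atoms of the measured element (natively, or via an element tag), the protein concentration follows directly from the element concentration. A sketch with hypothetical numbers:

```python
def protein_concentration(element_conc, atoms_per_protein):
    """Absolute protein concentration from a measured element concentration
    (same molar units), given a definite element:protein stoichiometry."""
    return element_conc / atoms_per_protein

# e.g. a hypothetical protein carrying 4 sulfur atoms:
# 40 nM measured sulfur implies 10 nM protein.
```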

  8. Combined Uncertainty and A-Posteriori Error Bound Estimates for General CFD Calculations: Theory and Software Implementation

    NASA Technical Reports Server (NTRS)

    Barth, Timothy J.

    2014-01-01

    This workshop presentation discusses the design and implementation of numerical methods for the quantification of statistical uncertainty, including a-posteriori error bounds, for output quantities computed using CFD methods. Hydrodynamic realizations often contain numerical error arising from finite-dimensional approximation (e.g. numerical methods using grids, basis functions, particles) and statistical uncertainty arising from incomplete information and/or statistical characterization of model parameters and random fields. The first task at hand is to derive formal error bounds for statistics given realizations containing finite-dimensional numerical error [1]. The error in computed output statistics contains contributions from both realization error and the error resulting from the calculation of statistics integrals using a numerical method. A second task is to devise computable a-posteriori error bounds by numerically approximating all terms arising in the error bound estimates. For the same reason that CFD calculations including error bounds but omitting uncertainty modeling are only of limited value, CFD calculations including uncertainty modeling but omitting error bounds are only of limited value. To gain maximum value from CFD calculations, a general software package for uncertainty quantification with quantified error bounds has been developed at NASA. The package provides implementations for a suite of numerical methods used in uncertainty quantification: dense tensorization basis methods [3] and a subscale recovery variant [1] for non-smooth data, sparse tensorization methods [2] utilizing node-nested hierarchies, and sampling methods [4] for high-dimensional random variable spaces.
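
    As one concrete instance of the sampling methods mentioned, a plain Monte Carlo estimator can report an output statistic together with a standard-error-based sampling bound. This toy version ignores the per-realization numerical error that the package described above also bounds:

```python
import math
import random

def mc_mean_with_bound(f, draw, n=20000, seed=0):
    """Estimate E[f(X)] by Monte Carlo and return (estimate, standard error);
    the standard error serves as a probabilistic sampling-error bound."""
    rng = random.Random(seed)
    vals = [f(draw(rng)) for _ in range(n)]
    mean = sum(vals) / n
    var = sum((v - mean) ** 2 for v in vals) / (n - 1)
    return mean, math.sqrt(var / n)

# Output statistic of a toy "model" x -> x**2 under a uniform random input;
# the exact mean is 1/3.
mean, se = mc_mean_with_bound(lambda x: x * x, lambda rng: rng.random())
```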

  9. Quantification of dsDNA using the Hitachi F-7000 Fluorescence Spectrophotometer and PicoGreen dye.

    PubMed

    Moreno, Luis A; Cox, Kendra L

    2010-11-05

    Quantification of DNA, especially in small concentrations, is an important task with a wide range of biological applications including standard molecular biology assays such as synthesis and purification of DNA, diagnostic applications such as quantification of DNA amplification products, and detection of DNA molecules in drug preparations. During this video we will demonstrate the capability of the Hitachi F-7000 Fluorescence Spectrophotometer equipped with a Micro Plate Reader accessory to perform dsDNA quantification using Molecular Probes Quant-it PicoGreen dye reagent kit. The F-7000 Fluorescence Spectrophotometer offers high sensitivity and high speed measurements. It is a highly flexible system capable of measuring fluorescence, luminescence, and phosphorescence. Several measuring modes are available, including wavelength scan, time scan, photometry and 3-D scan measurement. The spectrophotometer has sensitivity in the range of 50 picomoles of fluorescein when using a 300 μL sample volume in the microplate, and is capable of measuring scan speeds of 60,000 nm/minute. It also has a wide dynamic range of up to 5 orders of magnitude which allows for the use of calibration curves over a wide range of concentrations. The optical system uses all reflective optics for maximum energy and sensitivity. The standard wavelength range is 200 to 750 nm, and can be extended to 900 nm when using one of the optional near infrared photomultipliers. The system allows optional temperature control for the plate reader from 5 to 60 degrees Celsius using an optional external temperature controlled liquid circulator. The microplate reader allows for the use of 96 well microplates, and the measuring speed for 96 wells is less than 60 seconds when using the kinetics mode. Software controls for the F-7000 and Microplate Reader are also highly flexible. Samples may be set in either column or row formats, and any combination of wells may be chosen for sample measurements. 
This allows for optimal utilization of the microplate. Additionally, the software allows importing microplate sample configurations created in Excel and saved in comma-separated values ("csv") format. Microplate measuring configurations can be saved and recalled by the software for convenience and increased productivity. Data results can be output to a standard report, to Excel, or to an optional Report Generator Program.
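
    The quantification workflow implied above (measure DNA standards, fit a calibration curve, invert it for unknowns) can be sketched as a least-squares line fit; all concentrations and fluorescence counts below are hypothetical:

```python
def fit_line(xs, ys):
    """Ordinary least-squares slope and intercept for y = a*x + b."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

standards = [0.0, 1.0, 10.0, 100.0]   # ng/mL dsDNA (hypothetical)
signal = [2.0, 55.0, 520.0, 5150.0]   # fluorescence counts (hypothetical)
slope, intercept = fit_line(standards, signal)

# Invert the calibration curve to quantify an unknown sample reading.
unknown_conc = (2600.0 - intercept) / slope   # about 50 ng/mL here
```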

  10. Quantification of dsDNA using the Hitachi F-7000 Fluorescence Spectrophotometer and PicoGreen Dye

    PubMed Central

    Moreno, Luis A.; Cox, Kendra L.

    2010-01-01

    Quantification of DNA, especially in small concentrations, is an important task with a wide range of biological applications including standard molecular biology assays such as synthesis and purification of DNA, diagnostic applications such as quantification of DNA amplification products, and detection of DNA molecules in drug preparations. During this video we will demonstrate the capability of the Hitachi F-7000 Fluorescence Spectrophotometer equipped with a Micro Plate Reader accessory to perform dsDNA quantification using Molecular Probes Quant-it PicoGreen dye reagent kit. The F-7000 Fluorescence Spectrophotometer offers high sensitivity and high speed measurements. It is a highly flexible system capable of measuring fluorescence, luminescence, and phosphorescence. Several measuring modes are available, including wavelength scan, time scan, photometry and 3-D scan measurement. The spectrophotometer has sensitivity in the range of 50 picomoles of fluorescein when using a 300 μL sample volume in the microplate, and is capable of measuring scan speeds of 60,000 nm/minute. It also has a wide dynamic range of up to 5 orders of magnitude which allows for the use of calibration curves over a wide range of concentrations. The optical system uses all reflective optics for maximum energy and sensitivity. The standard wavelength range is 200 to 750 nm, and can be extended to 900 nm when using one of the optional near infrared photomultipliers. The system allows optional temperature control for the plate reader from 5 to 60 degrees Celsius using an optional external temperature controlled liquid circulator. The microplate reader allows for the use of 96 well microplates, and the measuring speed for 96 wells is less than 60 seconds when using the kinetics mode. Software controls for the F-7000 and Microplate Reader are also highly flexible. Samples may be set in either column or row formats, and any combination of wells may be chosen for sample measurements. 
This allows for optimal utilization of the microplate. Additionally, the software allows importing microplate sample configurations created in Excel and saved in comma-separated values ("csv") format. Microplate measuring configurations can be saved and recalled by the software for convenience and increased productivity. Data results can be output to a standard report, to Excel, or to an optional Report Generator Program. PMID:21189464

  11. SDMtoolbox 2.0: the next generation Python-based GIS toolkit for landscape genetic, biogeographic and species distribution model analyses

    PubMed Central

    Bennett, Joseph R.; French, Connor M.

    2017-01-01

    SDMtoolbox 2.0 is a software package for spatial studies of ecology, evolution, and genetics. The release of SDMtoolbox 2.0 allows researchers to use the most current ArcGIS and MaxEnt software, and reduces the time spent developing common solutions. The central aim of this software is to automate complicated and repetitive spatial analyses in an intuitive graphical user interface. One core tenet facilitates careful parameterization of species distribution models (SDMs) to maximize each model's discriminatory ability and minimize overfitting. This includes careful processing of occurrence data, environmental data, and model parameterization. This program directly interfaces with MaxEnt, one of the most powerful and widely used species distribution modeling programs, although SDMtoolbox 2.0 is not limited to species distribution modeling or restricted to modeling in MaxEnt. Many of the SDM pre- and post-processing tools have 'universal' analogs for use with any modeling software. The current version contains a total of 79 scripts that harness the power of ArcGIS for macroecology, landscape genetics, and evolutionary studies. For example, these tools allow for biodiversity quantification (such as species richness or corrected weighted endemism), generation of least-cost paths and corridors among shared haplotypes, assessment of the significance of spatial randomizations, and enforcement of dispersal limitations of SDMs projected into future climates, to name only a few of the functions contained in SDMtoolbox 2.0. Lastly, dozens of generalized tools exist for batch processing and conversion of GIS data types or formats, which are broadly useful to any ArcMap user. PMID:29230356

  12. Contrast-enhanced ultrasonography for the detection of joint vascularity in arthritis--subjective grading versus computer-aided objective quantification.

    PubMed

    Klauser, A S; Franz, M; Bellmann Weiler, R; Gruber, J; Hartig, F; Mur, E; Wick, M C; Jaschke, W

    2011-12-01

    To compare joint inflammation assessment using subjective grading of power Doppler ultrasonography (PDUS) and contrast-enhanced ultrasonography (CEUS) versus computer-aided objective CEUS quantification. 37 joints of 28 patients with arthritis of different etiologies underwent B-mode ultrasonography, PDUS, and CEUS using a second-generation contrast agent. Synovial thickness, extent of vascularized pannus and intensity of vascularization were included in a 4-point PDUS and CEUS grading system. Subjective CEUS and PDUS scores were compared to computer-aided objective CEUS quantification using Qontrast® software for the calculation of the signal intensity (SI) and the ratio of SI for contrast enhancement. The interobserver agreement for subjective scoring was good to excellent (κ = 0.8 - 1.0; P < 0.0001). Computer-aided objective CEUS quantification correlated statistically significantly with subjective CEUS (P < 0.001) and PDUS grading (P < 0.05). The Qontrast® SI ratio correlated with subjective CEUS (P < 0.02) and PDUS grading (P < 0.03). Clinical activity did not correlate with vascularity or synovial thickening (P = N. S.) and no correlation between synovial thickening and vascularity extent could be found, neither using PDUS nor CEUS (P = N. S.). Both subjective CEUS grading and objective CEUS quantification are valuable for assessing joint vascularity in arthritis and computer-aided CEUS quantification may be a suitable objective tool for therapy follow-up in arthritis. © Georg Thieme Verlag KG Stuttgart · New York.

  13. Selective Distance-Based K+ Quantification on Paper-Based Microfluidics.

    PubMed

    Gerold, Chase T; Bakker, Eric; Henry, Charles S

    2018-04-03

    In this study, paper-based microfluidic devices (μPADs) capable of K+ quantification in aqueous samples, as well as in human serum, using both colorimetric and distance-based methods are described. A lipophilic phase containing potassium ionophore I (valinomycin) was utilized to achieve highly selective quantification of K+ in the presence of Na+, Li+, and Mg2+ ions. Successful addition of a suspended lipophilic phase to a wax-printed paper-based device is described and offers a solution to current approaches that rely on organic solvents, which damage wax barriers. The approach provides an avenue for future alkali/alkaline quantification utilizing μPADs. Colorimetric spot tests allowed for K+ quantification from 0.1-5.0 mM using only 3.00 μL of sample solution. Selective distance-based quantification required small sample volumes (6.00 μL) and gave responses sensitive enough to distinguish between 1.0 and 2.5 mM of sample K+. μPADs using distance-based methods were also capable of differentiating between 4.3 and 6.9 mM K+ in human serum samples. Distance-based methods required no digital analysis, electronic hardware, or pumps; any steps required for quantification could be carried out using the naked eye.
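The distance-based readout described above reduces to inverting a calibration curve. A hedged sketch, assuming a roughly log-linear distance response; the standard concentrations and migration distances below are invented, not the paper's data:

```python
import numpy as np

# Hypothetical calibration: distance migrated (mm) vs. K+ standards (mM).
# Distance-based uPADs often respond roughly linearly in log concentration,
# so we fit distance = a*log10(C) + b and invert it to read unknowns.
conc = np.array([0.5, 1.0, 2.5, 5.0])    # mM (assumed standards)
dist = np.array([4.0, 7.0, 11.0, 14.0])  # mm (assumed readings)

a, b = np.polyfit(np.log10(conc), dist, 1)

def estimate_conc(distance_mm):
    """Invert the calibration line to estimate concentration."""
    return 10 ** ((distance_mm - b) / a)
```

A 7 mm migration then reads back as roughly the 1.0 mM standard it came from.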

  14. DICOM image quantification secondary capture (DICOM IQSC) integrated with numeric results, regions, and curves: implementation and applications in nuclear medicine

    NASA Astrophysics Data System (ADS)

    Cao, Xinhua; Xu, Xiaoyin; Voss, Stephan

    2017-03-01

    In this paper, we describe an enhanced DICOM Secondary Capture (SC) that integrates Image Quantification (IQ) results, Regions of Interest (ROIs), and Time Activity Curves (TACs) with screenshots by embedding extra medical imaging information into a standard DICOM header. A software toolkit for DICOM IQSC has been developed to implement the SC-centered information integration of quantitative analysis for the routine practice of nuclear medicine. Initial experiments show that the DICOM IQSC method is simple and easy to implement, seamlessly integrating post-processing workstations with PACS for archiving and retrieving IQ information. Additional DICOM IQSC applications in routine nuclear medicine and clinical research are also discussed.
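The quantification side of this workflow (ROI statistics and a time-activity curve) can be sketched in a few lines. The DICOM header embedding itself is omitted here and stood in for by a JSON payload; the dynamic image stack and ROI are synthetic:

```python
import json
import numpy as np

# Hypothetical dynamic study: 5 frames of a 16x16 image with a square ROI.
frames = np.zeros((5, 16, 16))
for t in range(5):
    frames[t, 4:8, 4:8] = t + 1          # activity rising over time in ROI

roi = np.zeros((16, 16), dtype=bool)
roi[4:8, 4:8] = True

# Time-activity curve: mean counts inside the ROI per frame.
tac = [float(frames[t][roi].mean()) for t in range(5)]

# Package the numeric results the way an IQSC object carries them alongside
# a screenshot: structured data serialized into (here) a JSON payload.
payload = json.dumps({"roi_pixels": int(roi.sum()), "tac": tac})
```

In a real implementation the payload would be written into DICOM header elements rather than JSON.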

  15. Quantitative contrast-enhanced ultrasound for monitoring vedolizumab therapy in inflammatory bowel disease patients: a pilot study.

    PubMed

    Goertz, Ruediger S; Klett, Daniel; Wildner, Dane; Atreya, Raja; Neurath, Markus F; Strobel, Deike

    2018-01-01

    Background: Microvascularization of the bowel wall can be visualized and quantified non-invasively by software-assisted analysis of derived time-intensity curves. Purpose: To perform software-based quantification of bowel wall perfusion using quantitative contrast-enhanced ultrasound (CEUS) according to clinical response in patients with inflammatory bowel disease treated with vedolizumab. Material and Methods: In a prospective study, in 18 out of 34 patients, high-frequency ultrasound of bowel wall thickness using color Doppler flow combined with CEUS was performed at baseline and after 14 weeks of treatment with vedolizumab. Clinical activity scores at week 14 were used to differentiate between responders and non-responders. CEUS parameters were calculated by software analysis of the video loops. Results: Of the 18 patients (11 with Crohn's disease and seven with ulcerative colitis), nine showed response to treatment with vedolizumab. Overall, the responder group showed a significant decrease in the semi-quantitative color Doppler vascularization score. Amplitude-derived CEUS parameters of mural microvascularization such as peak enhancement or wash-in rate decreased in responders, in contrast with non-responders. Time-derived parameters remained stable or increased during treatment in all patients. Conclusion: Analysis of bowel microvascularization by CEUS shows statistically significant changes in the wash-in rate related to response to vedolizumab therapy.
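The amplitude- and time-derived CEUS parameters named above (peak enhancement, wash-in rate, time to peak) are read off a time-intensity curve. A minimal sketch on a synthetic gamma-variate-shaped curve, not the commercial software's implementation:

```python
import numpy as np

# Hypothetical time-intensity curve (TIC) from a CEUS bolus: gamma-variate
# shaped enhancement sampled at 1 Hz, peaking at t = 10 s.
t = np.arange(0, 60.0, 1.0)
tic = 100.0 * (t / 10.0) * np.exp(1 - t / 10.0)

peak_enhancement = tic.max()                 # amplitude-derived
time_to_peak = t[tic.argmax()]               # time-derived
# Wash-in rate: maximum slope on the rising limb of the curve.
rising = tic[: tic.argmax() + 1]
wash_in_rate = np.diff(rising).max()
```

Comparing such parameters at baseline and week 14 is what separates responders from non-responders in the study above.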

  16. Algorithm Diversity for Resilient Systems

    DTIC Science & Technology

    2016-06-27

    ...data structures. Subject terms: computer security, software diversity, program transformation. ...systematic method for transforming Datalog rules with general universal and existential quantification into efficient algorithms with precise complexity...worst case in the size of the ground rules. There are numerous choices during the transformation that lead to diverse algorithms and different...

  17. An Open-Source Standard T-Wave Alternans Detector for Benchmarking.

    PubMed

    Khaustov, A; Nemati, S; Clifford, Gd

    2008-09-14

    We describe an open-source algorithm suite for T-Wave Alternans (TWA) detection and quantification. The software consists of Matlab implementations of the widely used Spectral Method and Modified Moving Average (MMA) method, with libraries to read both WFDB and ASCII data under Windows and Linux. The software suite can run in both batch mode and with a provided graphical user interface to aid waveform exploration. Our software suite was calibrated using an open-source TWA model, described in a partner paper [1] by Clifford and Sameni. For the PhysioNet/CinC Challenge 2008 we obtained a score of 0.881 for the Spectral Method and 0.400 for the MMA method. However, our objective was not to provide the best TWA detector, but rather a basis for detailed discussion of algorithms.
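The Spectral Method detects alternans as power at 0.5 cycles/beat in a beat-to-beat T-wave amplitude series, summarized by a k-score. A toy sketch on a synthetic series; the noise band follows the common textbook formulation, not necessarily this suite's exact parameters:

```python
import numpy as np

rng = np.random.default_rng(0)
n_beats = 128
# 25 µV ABAB alternans pattern (amplitudes in mV) on top of a 1 mV T wave.
alternans = 25e-3 * (-1) ** np.arange(n_beats)
series = 1.0 + alternans + 1e-3 * rng.standard_normal(n_beats)

# Power spectrum of the beat-to-beat series; TWA appears at 0.5 cycles/beat.
spectrum = np.abs(np.fft.rfft(series - series.mean())) ** 2 / n_beats
freqs = np.fft.rfftfreq(n_beats, d=1.0)              # cycles/beat

alt_power = spectrum[freqs == 0.5][0]
noise = spectrum[(freqs >= 0.33) & (freqs <= 0.48)]  # noise reference band
k_score = (alt_power - noise.mean()) / noise.std()   # alternans k-score
```

A k-score well above the conventional threshold (around 3) indicates detectable alternans.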

  18. CometQ: An automated tool for the detection and quantification of DNA damage using comet assay image analysis.

    PubMed

    Ganapathy, Sreelatha; Muraleedharan, Aparna; Sathidevi, Puthumangalathu Savithri; Chand, Parkash; Rajkumar, Ravi Philip

    2016-09-01

    DNA damage analysis plays an important role in determining the approaches for treatment and prevention of various diseases like cancer, schizophrenia and other heritable diseases. Comet assay is a sensitive and versatile method for DNA damage analysis. The main objective of this work is to implement a fully automated tool for the detection and quantification of DNA damage by analysing comet assay images. The comet assay image analysis consists of four stages: (1) classification, (2) comet segmentation, (3) comet partitioning, and (4) comet quantification. Main features of the proposed software are the design and development of four comet segmentation methods, and the automatic routing of the input comet assay image to the most suitable one among these methods depending on the type of the image (silver-stained or fluorescent-stained) as well as the level of DNA damage (heavily damaged or lightly/moderately damaged). A classifier stage, based on a support vector machine (SVM), is designed and implemented at the front end to categorise the input image into one of the above four groups and ensure proper routing. Comet segmentation is followed by comet partitioning, which is implemented using a novel technique coined modified fuzzy clustering. Comet parameters are calculated in the comet quantification stage and are saved in an Excel file. Our dataset consists of 600 silver-stained images obtained from 40 schizophrenia patients with different levels of severity, admitted to a tertiary hospital in South India, and 56 fluorescent-stained images obtained from different internet sources. The performance of "CometQ", the proposed standalone application for automated analysis of comet assay images, is evaluated by a clinical expert and is also compared with that of a recent, closely related software tool, OpenComet. CometQ gave 90.26% positive predictive value (PPV) and 93.34% sensitivity, which are much higher than those of OpenComet, especially in the case of silver-stained images. 
The results are validated using confusion matrix and Jaccard index (JI). Comet assay images obtained after DNA damage repair by incubation in the nutrient medium were also analysed, and CometQ showed a significant change in all the comet parameters in most of the cases. Results show that CometQ is an accurate and efficient tool with good sensitivity and PPV for DNA damage analysis using comet assay images. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
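Comet quantification stages typically reduce to metrics such as the tail moment computed from an intensity profile. A hypothetical one-dimensional sketch, not CometQ's actual code; the profile values and head/tail boundary are invented:

```python
import numpy as np

# Hypothetical comet: 1D intensity profile along the comet axis, head at the
# left. Tail moment = tail DNA fraction x tail length (a common metric).
profile = np.array([50., 80., 100., 60., 30., 20., 10., 5.])
head_end = 3                       # assumed head/tail boundary (pixel index)

total = profile.sum()
tail = profile[head_end:]
tail_dna_fraction = tail.sum() / total
tail_length = len(tail)            # in pixels, for illustration
tail_moment = tail_dna_fraction * tail_length
```

Higher tail moments correspond to more migrated (damaged) DNA; a real tool would first segment the comet and locate the boundary automatically.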

  19. Development and intra-institutional and inter-institutional validation of a comprehensive new hepatobiliary software: Part 1--Liver and gallbladder function.

    PubMed

    Krishnamurthy, Gerbail T; Krishnamurthy, Shakuntala; Gambhir, Sanjiv Sam; Rodrigues, Cesar; Rosenberg, Jarrett; Schiepers, Christiaan; Buxton-Thomas, Muriel

    2009-12-01

    To develop a software tool for quantification of liver and gallbladder function, and to assess the repeatability and reproducibility of measurements made with it. The software tool, developed with the JAVA programming language, uses the JAVA2 Standard Edition framework. After manual selection of the regions of interest on a 99mTc hepatic iminodiacetic acid study, the program calculates differential hepatic bile flow, basal duodeno-gastric bile reflux (B-DGBR), hepatic extraction fraction (HEF) of both lobes with deconvolutional analysis, and excretion half-time with a nonlinear least squares fit. Gallbladder ejection fraction, ejection period (EP), ejection rate (ER), and postcholecystokinin (CCK) DGBR are calculated after stimulation with CCK-8. To assess intra-observer repeatability and inter-observer reproducibility, measurements from 10 normal participants were analyzed twice by three nuclear medicine technologists at the primary center. To assess inter-site reproducibility, measurements from a superset of 24 normal participants were also assessed once by three observers at the primary center and a single observer at three other sites. For the 24 control participants, mean+/-SD of hepatic bile flow into the gallbladder was 63.87+/-28.7%, HEF of the right lobe 100+/-0%, left lobe 99.43+/-2.63%, excretion half-time of the right lobe 21.50+/-6.98 min, left lobe 28.3+/-11.3 min. Basal DGBR was 1.2+/-1.0%. Gallbladder ejection fraction was 80+/-11%, EP 15.0+/-3.0 min, ER 5.8+/-1.6%/min, and DGBR-CCK 1.3+/-2.3%. Left and right lobe HEF was virtually identical across readers. All measures showed high repeatability except for gallbladder bile flow, basal DGBR, and EP, which exhibited marginal repeatability. Ejection fraction exhibited high reproducibility. There was high concordance among the three primary center observers except for basal DGBR, EP, and ER. Concordance between the primary site and one of the other sites was high, one was fair, and one was poor. 
New United States Food and Drug Administration-approved personal computer-based Krishnamurthy Hepato-Biliary Software for quantification of the liver and gallbladder function shows promise for consistently repeatable and reproducible results both within and between institutions, and may help to promote universal standardization of data acquisition and analysis in nuclear hepatology.
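The gallbladder ejection fraction reported above is simple arithmetic on a time-activity curve: the fractional drop from the pre-CCK baseline to the post-CCK minimum. A sketch with invented counts:

```python
# Gallbladder ejection fraction from background-corrected gallbladder counts
# sampled before and after CCK stimulation (hypothetical values).
counts = [1000, 980, 600, 250, 200]

baseline = counts[0]
minimum = min(counts)
ejection_fraction = 100.0 * (baseline - minimum) / baseline  # percent
```

These invented counts give an ejection fraction of 80%, in line with the normal mean of 80+/-11% reported in the abstract.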

  20. Adapting a standardised international 24 h dietary recall methodology (GloboDiet software) for research and dietary surveillance in Korea.

    PubMed

    Park, Min Kyung; Park, Jin Young; Nicolas, Geneviève; Paik, Hee Young; Kim, Jeongseon; Slimani, Nadia

    2015-06-14

    During the past decades, a rapid nutritional transition has been observed along with economic growth in the Republic of Korea. Since this dramatic change in diet has been frequently associated with cancer and other non-communicable diseases, dietary monitoring is essential to understand the association. Benefiting from pre-existing standardised dietary methodologies, the present study aimed to evaluate the feasibility and describe the development of a Korean version of the international computerised 24 h dietary recall method (GloboDiet software) and its complementary tools, developed at the International Agency for Research on Cancer (IARC), WHO. Following established international Standard Operating Procedures and guidelines, about seventy common and country-specific databases on foods, recipes, dietary supplements, quantification methods and coefficients were customised and translated. The main results of the present study highlight the specific adaptations made to adapt the GloboDiet software for research and dietary surveillance in Korea. New (sub-) subgroups were added into the existing common food classification, and new descriptors were added to the facets to classify and describe specific Korean foods. Quantification methods were critically evaluated and adapted considering the foods and food packages available in the Korean market. Furthermore, a picture book of foods/dishes was prepared including new pictures and food portion sizes relevant to Korean diet. The development of the Korean version of GloboDiet demonstrated that it was possible to adapt the IARC-WHO international dietary tool to an Asian context without compromising its concept of standardisation and software structure. It, thus, confirms that this international dietary methodology, used so far only in Europe, is flexible and robust enough to be customised for other regions worldwide.

  1. Sustaining an Online, Shared Community Resource for Models, Robust Open source Software Tools and Data for Volcanology - the Vhub Experience

    NASA Astrophysics Data System (ADS)

    Patra, A. K.; Valentine, G. A.; Bursik, M. I.; Connor, C.; Connor, L.; Jones, M.; Simakov, N.; Aghakhani, H.; Jones-Ivey, R.; Kosar, T.; Zhang, B.

    2015-12-01

    Over the last 5 years we have created a community collaboratory, Vhub.org [Palma et al, J. App. Volc. 3:2 doi:10.1186/2191-5040-3-2], as a place to find volcanology-related resources, a venue for users to disseminate tools, teaching resources, and data, and an online platform to support collaborative efforts. As the community (current active users > 6000, from an estimated community of comparable size) embeds the tools in the collaboratory into educational and research workflows, it became imperative to: a) redesign tools into robust, open-source, reusable software for online and offline usage/enhancement; b) share large datasets with remote collaborators and other users seamlessly and securely; c) support complex workflows for uncertainty analysis, validation and verification, and data assimilation with large data. The focus on tool development/redevelopment has been twofold: first, to use best practices in software engineering and new hardware such as multi-core and graphics processing units; second, to enhance capabilities to support inverse modeling, uncertainty quantification using large ensembles and design of experiments, calibration, and validation. The software engineering practices we follow include open-source development to facilitate community contributions, modularity, and reusability. Our initial targets are four popular tools on Vhub: TITAN2D, TEPHRA2, PUFF and LAVA. Use of tools like these requires many observation-driven data sets, e.g. digital elevation models of topography, satellite imagery, and field observations on deposits. These data are often maintained in private repositories and shared privately by "sneaker-net". As a partial solution to this we tested mechanisms using iRODS software for online sharing of private data with public metadata and access limits. Finally, we adopted workflow engines (e.g. Pegasus) to support the complex data and computing workflows needed for tasks such as uncertainty quantification for hazard analysis using physical models.

  2. Automatic Identification and Quantification of Extra-Well Fluorescence in Microarray Images.

    PubMed

    Rivera, Robert; Wang, Jie; Yu, Xiaobo; Demirkan, Gokhan; Hopper, Marika; Bian, Xiaofang; Tahsin, Tasnia; Magee, D Mitchell; Qiu, Ji; LaBaer, Joshua; Wallstrom, Garrick

    2017-11-03

    In recent studies involving NAPPA microarrays, extra-well fluorescence is used as a key measure for identifying disease biomarkers because there is evidence that it correlates better with strong antibody responses than statistical analysis of intra-spot intensity. Because this feature is not well quantified by traditional image analysis software, identification and quantification of extra-well fluorescence is performed manually, which is both time-consuming and highly susceptible to variation between raters. A system that could automate this task efficiently and effectively would greatly improve the process of data acquisition in microarray studies, thereby accelerating the discovery of disease biomarkers. In this study, we experimented with different machine learning methods, as well as novel heuristics, for identifying spots exhibiting extra-well fluorescence (rings) in microarray images and assigning each ring a grade of 1-5 based on its intensity and morphology. The sensitivity of our final system for identifying rings was found to be 72% at 99% specificity and 98% at 92% specificity. Our system performs this task significantly faster than a human rater, while maintaining high performance, and therefore represents a valuable tool for microarray image analysis.

  3. Contrast-Enhanced Ultrasound (CEUS) and Quantitative Perfusion Analysis in Patients with Suspicion for Prostate Cancer.

    PubMed

    Maxeiner, Andreas; Fischer, Thomas; Schwabe, Julia; Baur, Alexander Daniel Jacques; Stephan, Carsten; Peters, Robert; Slowinski, Torsten; von Laffert, Maximilian; Marticorena Garcia, Stephan Rodrigo; Hamm, Bernd; Jung, Ernst-Michael

    2018-06-06

     The aim of this study was to investigate contrast-enhanced ultrasound (CEUS) parameters acquired by software during magnetic resonance imaging (MRI)/US fusion-guided biopsy for prostate cancer (PCa) detection and discrimination. From 2012 to 2015, 158 out of 165 men with suspicion for PCa and with at least 1 negative biopsy of the prostate were included and consecutively underwent multi-parametric 3 Tesla MRI and MRI/US fusion-guided biopsy. CEUS was conducted during biopsy with intravenous bolus application of 2.4 mL of SonoVue® (Bracco, Milan, Italy). The CEUS clips were then investigated using quantitative perfusion analysis software (VueBox, Bracco). The area of strongest enhancement within the MRI pre-located region was investigated; all available parameters from the quantification toolbox were collected and analyzed for PCa, and further differentiation was based on the histopathological results. The overall detection rate was 74 (47 %) PCa cases in 158 included patients. Of these 74 PCa cases, 49 (66 %) were graded Gleason ≥ 3 + 4 = 7 (ISUP ≥ 2) PCa. Across all quantitative perfusion parameters, the best results for cancer detection were obtained with rise time (p = 0.026) and time to peak (p = 0.037). Within the subgroup analysis (> vs ≤ 3 + 4 = 7a (ISUP 2)), peak enhancement (p = 0.012), wash-in rate (p = 0.011), wash-out rate (p = 0.007) and wash-in perfusion index (p = 0.014) also showed statistical significance. The quantification of CEUS parameters was able to discriminate PCa aggressiveness during MRI/US fusion-guided prostate biopsy. © Georg Thieme Verlag KG Stuttgart · New York.

  4. Transthoracic 3D echocardiographic left heart chamber quantification in patients with bicuspid aortic valve disease.

    PubMed

    van den Hoven, Allard T; Mc-Ghie, Jackie S; Chelu, Raluca G; Duijnhouwer, Anthonie L; Baggen, Vivan J M; Coenen, Adriaan; Vletter, Wim B; Dijkshoorn, Marcel L; van den Bosch, Annemien E; Roos-Hesselink, Jolien W

    2017-12-01

    Integration of volumetric heart chamber quantification by 3D echocardiography into clinical practice has been hampered by several factors, which a new fully automated algorithm (Left Heart Model, LHM) may help overcome. This study therefore aims to evaluate the feasibility and accuracy of the LHM software in quantifying left atrial and left ventricular volumes and left ventricular ejection fraction in a cohort of patients with a bicuspid aortic valve. Patients with a bicuspid aortic valve were prospectively included. All patients underwent 2D and 3D transthoracic echocardiography and computed tomography. Left atrial and ventricular volumes were obtained using the automated program, which did not require manual contour detection. For comparison, manual and semi-automated measurements were performed using conventional 2D and 3D datasets. 53 patients were included; in four of those patients no 3D dataset could be acquired. Additionally, 12 patients were excluded based on poor imaging quality. Left ventricular end-diastolic and end-systolic volumes and ejection fraction calculated by the LHM correlated well with manual 2D and 3D measurements (Pearson's r between 0.43 and 0.97, p < 0.05). Left atrial volume (LAV) also correlated significantly, although LHM did estimate larger LAV compared to both 2DE and 3DE (Pearson's r between 0.61 and 0.81, p < 0.01). The fully automated software works well in a real-world setting and helps to overcome some of the major hurdles in integrating 3D analysis into daily practice, as it is user-independent and highly reproducible in a group of patients with a clearly defined and well-studied valvular abnormality.

  5. Psoas muscle area is not representative of total skeletal muscle area in the assessment of sarcopenia in ovarian cancer.

    PubMed

    Rutten, Iris J G; Ubachs, Jorne; Kruitwagen, Roy F P M; Beets-Tan, Regina G H; Olde Damink, Steven W M; Van Gorp, Toon

    2017-08-01

    Computed tomography measurements of total skeletal muscle area can detect changes and predict overall survival (OS) in patients with advanced ovarian cancer. This study investigates whether assessment of psoas muscle area reflects total muscle area and can be used to assess sarcopenia in ovarian cancer patients. Ovarian cancer patients (n = 150) treated with induction chemotherapy and interval debulking were enrolled retrospectively in this longitudinal study. Muscle was measured cross-sectionally with computed tomography in three ways: (i) software quantification of total skeletal muscle area (SMA); (ii) software quantification of psoas muscle area (PA); and (iii) manual measurement of length and width of the psoas muscle to derive the psoas surface area (PLW). Pearson correlation between the different methods was studied. Patients were divided into two groups based on the extent of change in muscle area, and agreement was measured with kappa coefficients. Cox regression was used to test predictors of OS. Correlation between SMA and both psoas muscle area measurements was poor (r = 0.52 and 0.39 for PA and PLW, respectively). After categorizing patients into muscle loss or gain, kappa agreement was also poor for all comparisons (all κ < 0.40). In regression analysis, SMA loss was predictive of poor OS (hazard ratio 1.698 (95% CI 1.038-2.778), P = 0.035). No relationship with OS was seen for PA or PLW loss. Change in psoas muscle area is not representative of total muscle area change and should not be used as a substitute for total skeletal muscle to predict survival in patients with ovarian cancer. © 2017 The Authors. Journal of Cachexia, Sarcopenia and Muscle published by John Wiley & Sons Ltd on behalf of the Society on Sarcopenia, Cachexia and Wasting Disorders.
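The kappa agreement used above can be computed from first principles. A sketch with invented loss/gain categorizations for two measurement methods (not the study's data):

```python
# Cohen's kappa: chance-corrected agreement between two categorizations,
# here hypothetical "loss"/"gain" calls from two muscle measurement methods.
def cohens_kappa(a, b):
    labels = sorted(set(a) | set(b))
    n = len(a)
    po = sum(x == y for x, y in zip(a, b)) / n                 # observed
    pe = sum((a.count(l) / n) * (b.count(l) / n) for l in labels)  # chance
    return (po - pe) / (1 - pe)

sma = ["loss", "loss", "gain", "gain", "loss", "gain"]   # hypothetical calls
pa  = ["loss", "gain", "gain", "loss", "loss", "gain"]
kappa = cohens_kappa(sma, pa)
```

Here the two methods agree on 4 of 6 patients, but after correcting for chance agreement the kappa is only about 0.33, i.e. "poor" by the study's κ < 0.40 criterion.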

  6. mapDIA: Preprocessing and statistical analysis of quantitative proteomics data from data independent acquisition mass spectrometry.

    PubMed

    Teo, Guoshou; Kim, Sinae; Tsou, Chih-Chiang; Collins, Ben; Gingras, Anne-Claude; Nesvizhskii, Alexey I; Choi, Hyungwon

    2015-11-03

    Data independent acquisition (DIA) mass spectrometry is an emerging technique that offers more complete detection and quantification of peptides and proteins across multiple samples. DIA allows fragment-level quantification, which can be considered as repeated measurements of the abundance of the corresponding peptides and proteins in the downstream statistical analysis. However, few statistical approaches are available for aggregating these complex fragment-level data into peptide- or protein-level statistical summaries. In this work, we describe a software package, mapDIA, for statistical analysis of differential protein expression using DIA fragment-level intensities. The workflow consists of three major steps: intensity normalization, peptide/fragment selection, and statistical analysis. First, mapDIA offers normalization of fragment-level intensities by total intensity sums as well as a novel alternative normalization by local intensity sums in retention time space. Second, mapDIA removes outlier observations and selects peptides/fragments that preserve the major quantitative patterns across all samples for each protein. Last, using the selected fragments and peptides, mapDIA performs model-based statistical significance analysis of protein-level differential expression between specified groups of samples. Using a comprehensive set of simulation datasets, we show that mapDIA detects differentially expressed proteins with accurate control of the false discovery rates. We also describe the analysis procedure in detail using two recently published DIA datasets generated for the 14-3-3β dynamic interaction network and the prostate cancer glycoproteome. The software was written in C++ and the source code is available for free through the SourceForge website http://sourceforge.net/projects/mapdia/. This article is part of a Special Issue entitled: Computational Proteomics. Copyright © 2015 Elsevier B.V. All rights reserved.
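The total-intensity-sum normalization step can be sketched as rescaling each sample's column so all samples carry the same total intensity. A toy numpy example with invented intensities (mapDIA itself is C++ and also offers a local, retention-time-windowed variant):

```python
import numpy as np

# Hypothetical fragment x sample intensity matrix from a DIA experiment.
intensities = np.array([[100., 200., 150.],
                        [300., 500., 450.],
                        [ 50., 100.,  75.]])

col_sums = intensities.sum(axis=0)       # total intensity per sample
target = np.median(col_sums)             # common scale to normalize to
normalized = intensities * (target / col_sums)
```

After normalization every sample column sums to the same total, removing sample-loading differences before fragment selection and significance testing.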

  7. TRIC: an automated alignment strategy for reproducible protein quantification in targeted proteomics.

    PubMed

    Röst, Hannes L; Liu, Yansheng; D'Agostino, Giuseppe; Zanella, Matteo; Navarro, Pedro; Rosenberger, George; Collins, Ben C; Gillet, Ludovic; Testa, Giuseppe; Malmström, Lars; Aebersold, Ruedi

    2016-09-01

    Next-generation mass spectrometric (MS) techniques such as SWATH-MS have substantially increased the throughput and reproducibility of proteomic analysis, but ensuring consistent quantification of thousands of peptide analytes across multiple liquid chromatography-tandem MS (LC-MS/MS) runs remains a challenging and laborious manual process. To produce highly consistent and quantitatively accurate proteomics data matrices in an automated fashion, we developed TRIC (http://proteomics.ethz.ch/tric/), a software tool that utilizes fragment-ion data to perform cross-run alignment, consistent peak-picking and quantification for high-throughput targeted proteomics. TRIC reduced the identification error compared to a state-of-the-art SWATH-MS analysis without alignment by more than threefold at constant recall while correcting for highly nonlinear chromatographic effects. On a pulsed-SILAC experiment performed on human induced pluripotent stem cells, TRIC was able to automatically align and quantify thousands of light and heavy isotopic peak groups. Thus, TRIC fills a gap in the pipeline for automated analysis of massively parallel targeted proteomics data sets.
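The core of cross-run alignment is mapping one run's retention-time (RT) scale onto another's via shared anchor signals. TRIC fits nonlinear transfer functions to correct the chromatographic effects mentioned above; the sketch below uses a simple linear fit on invented anchor times just to show the idea:

```python
import numpy as np

# Anchor peptides observed in both runs (hypothetical retention times, min).
rt_a = np.array([10.0, 25.0, 40.0, 55.0, 80.0])   # reference run A
rt_b = 0.9 * rt_a + 3.0                           # same anchors in run B

slope, intercept = np.polyfit(rt_b, rt_a, 1)      # fit B -> A mapping

def to_run_a(rt_in_b):
    """Transfer a peak picked in run B into run A's RT coordinate system."""
    return slope * rt_in_b + intercept
```

With the mapping in hand, a peak group picked in one run can be matched to candidate peaks at the corresponding RT in every other run.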

  8. Measuring the complexity of design in real-time imaging software

    NASA Astrophysics Data System (ADS)

    Sangwan, Raghvinder S.; Vercellone-Smith, Pamela; Laplante, Phillip A.

    2007-02-01

    Due to the intricacies of the algorithms involved, the design of imaging software is considered to be more complex than that of non-image-processing software (Sangwan et al, 2005). A recent investigation (Larsson and Laplante, 2006) examined the complexity of several image processing and non-image-processing software packages along a wide variety of metrics, including those postulated by McCabe (1976), Chidamber and Kemerer (1994), and Martin (2003). This work found that it was not always possible to quantitatively compare the complexity between imaging applications and non-image-processing systems. Newer research and an accompanying tool (Structure 101, 2006), however, provide a greatly simplified approach to measuring software complexity. Therefore it may be possible to definitively quantify the complexity differences between imaging and non-imaging software, between imaging and real-time imaging software, and between software programs of the same application type. In this paper, we review prior results and describe the methodology for measuring complexity in imaging systems. We then apply a new complexity measurement methodology to several sets of imaging and non-imaging code in order to compare the complexity differences between the two types of applications. The benefit of such quantification is far-reaching, for example, leading to more easily measured performance improvement and quality in real-time imaging code.
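McCabe's metric, one of those applied above, is cyclomatic complexity V(G) = number of decision points + 1. A rough Python estimator over an abstract syntax tree (the node selection is a simplification; production tools count a few more constructs):

```python
import ast

# Branching constructs that each add one decision point to V(G).
BRANCH_NODES = (ast.If, ast.For, ast.While, ast.BoolOp,
                ast.ExceptHandler, ast.IfExp)

def cyclomatic_complexity(source):
    """Estimate McCabe's V(G) = decision points + 1 for Python source."""
    tree = ast.parse(source)
    decisions = sum(isinstance(node, BRANCH_NODES)
                    for node in ast.walk(tree))
    return decisions + 1

sample = """
def clamp(x, lo, hi):
    if x < lo:
        return lo
    if x > hi:
        return hi
    return x
"""
```

The `clamp` example has two `if` statements, so its estimated complexity is 3.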

  9. An analytical tool that quantifies cellular morphology changes from three-dimensional fluorescence images.

    PubMed

    Haass-Koffler, Carolina L; Naeemuddin, Mohammad; Bartlett, Selena E

    2012-08-31

    The most common software analysis tools available for measuring fluorescence images are designed for two-dimensional (2D) data; they rely on manual settings for inclusion and exclusion of data points and on computer-aided pattern recognition to support the interpretation and findings of the analysis. It has become increasingly important to be able to measure fluorescence images constructed from three-dimensional (3D) datasets in order to capture the complexity of cellular dynamics and understand the basis of cellular plasticity within biological systems. Sophisticated microscopy instruments have permitted the visualization of 3D fluorescence images through the acquisition of multispectral fluorescence images and powerful analytical software that reconstructs the images from confocal stacks, which then provides a 3D representation of the collected 2D images. Advanced design-based stereology methods have progressed beyond the approximations and assumptions of the original model-based stereology, even in complex tissue sections. Despite these scientific advances in microscopy, a need remains for an automated analytic method that fully exploits the intrinsic 3D data to allow for the analysis and quantification of the complex changes in cell morphology, protein localization and receptor trafficking. Current techniques available to quantify fluorescence images include MetaMorph (Molecular Devices, Sunnyvale, CA) and ImageJ (NIH), which provide manual analysis. Imaris (Andor Technology, Belfast, Northern Ireland) software provides the feature MeasurementPro, which allows the manual creation of measurement points that can be placed in a volume image or drawn on a series of 2D slices to create a 3D object. This method is useful for single-click point measurements to measure a line distance between two objects or to create a polygon that encloses a region of interest, but it is difficult to apply to complex cellular network structures. 
Filament Tracer (Andor) allows automatic detection of 3D neuronal filament-like structures; however, this module has been developed to measure defined structures such as neurons, which are comprised of dendrites, axons and spines (tree-like structures). This module has been ingeniously utilized to make morphological measurements of non-neuronal cells; however, the output data provide information about an extended cellular network by using software that depends on a defined cell shape rather than an amorphous-shaped cellular model. To overcome the issue of analyzing amorphous-shaped cells and to make the software more suitable for biological applications, Imaris developed Imaris Cell. This was a scientific project with the Eidgenössische Technische Hochschule, developed to calculate the relationship between cells and organelles. While the software enables the detection of biological constraints, by forcing one nucleus per cell and using cell membranes to segment cells, it cannot be utilized to analyze fluorescence data that are not continuous, because ideally it builds the cell surface without void spaces. To our knowledge, at present no user-modifiable automated approach has been developed that provides morphometric information from 3D fluorescence images and achieves cellular spatial information for an undefined shape (Figure 1). We have developed an analytical platform using the Imaris core software module and Imaris XT interfaced to MATLAB (MathWorks, Inc.). These tools allow the 3D measurement of cells without a pre-defined shape and with inconsistent fluorescence network components. Furthermore, this method will allow researchers who have extensive expertise in biological systems, but little familiarity with computer applications, to perform quantification of morphological changes in cell dynamics.

  10. Overview of Probabilistic Methods for SAE G-11 Meeting for Reliability and Uncertainty Quantification for DoD TACOM Initiative with SAE G-11 Division

    NASA Technical Reports Server (NTRS)

    Singhal, Surendra N.

    2003-01-01

    The SAE G-11 RMSL Division and Probabilistic Methods Committee meeting during October 6-8 at the Best Western Sterling Inn, Sterling Heights (Detroit), Michigan is co-sponsored by the US Army Tank-automotive & Armaments Command (TACOM). The meeting will provide an industry/government/academia forum to review RMSL technology; reliability and probabilistic technology; reliability-based design methods; software reliability; and maintainability standards. With over 100 members, including members of national and international standing, the mission of the G-11's Probabilistic Methods Committee is to "enable/facilitate rapid deployment of probabilistic technology to enhance the competitiveness of our industries by better, faster, greener, smarter, affordable and reliable product development."

  11. The MaxQuant computational platform for mass spectrometry-based shotgun proteomics.

    PubMed

    Tyanova, Stefka; Temu, Tikira; Cox, Juergen

    2016-12-01

    MaxQuant is one of the most frequently used platforms for mass-spectrometry (MS)-based proteomics data analysis. Since its first release in 2008, it has grown substantially in functionality and can be used in conjunction with more MS platforms. Here we present an updated protocol covering the most important basic computational workflows, including those designed for quantitative label-free proteomics, MS1-level labeling and isobaric labeling techniques. This protocol presents a complete description of the parameters used in MaxQuant, as well as of the configuration options of its integrated search engine, Andromeda. This protocol update describes an adaptation of an existing protocol that substantially modifies the technique. Important concepts of shotgun proteomics and their implementation in MaxQuant are briefly reviewed, including different quantification strategies and the control of false-discovery rates (FDRs), as well as the analysis of post-translational modifications (PTMs). The MaxQuant output tables, which contain information about quantification of proteins and PTMs, are explained in detail. Furthermore, we provide a short version of the workflow that is applicable to data sets with simple and standard experimental designs. The MaxQuant algorithms are efficiently parallelized on multiple processors and scale well from desktop computers to servers with many cores. The software is written in C# and is freely available at http://www.maxquant.org.
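    The false-discovery-rate control mentioned above is commonly implemented with a target-decoy strategy: matches are ranked by score, and the fraction of decoy hits above a cutoff estimates the FDR at that cutoff. A minimal sketch of the idea (illustrative names and structure, not MaxQuant's actual implementation):

```python
def fdr_score_cutoff(matches, max_fdr=0.01):
    """Return the lowest score that keeps the estimated FDR below max_fdr.

    `matches` is a list of (score, is_decoy) pairs; higher score is better.
    FDR at a cutoff is estimated as decoys / targets among matches scoring
    at or above it.
    """
    ranked = sorted(matches, key=lambda m: m[0], reverse=True)
    decoys = targets = 0
    cutoff = None
    for score, is_decoy in ranked:
        if is_decoy:
            decoys += 1
        else:
            targets += 1
        if decoys / max(targets, 1) <= max_fdr:
            cutoff = score  # FDR still acceptable down to this score
    return cutoff
```

    Scanning from the best score downward lets the accepted set grow until the decoy fraction first exceeds the chosen FDR threshold.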

  12. IMS software developments for the detection of chemical warfare agent

    NASA Technical Reports Server (NTRS)

    Klepel, ST.; Graefenhain, U.; Lippe, R.; Stach, J.; Starrock, V.

    1995-01-01

    Interference compounds like gasoline, diesel, burning wood or fuel, etc. are presented in common battlefield situations. These compounds can cause detectors to respond as a false positive or interfere with the detector's ability to respond to target compounds such as chemical warfare agents. To ensure proper response of the ion mobility spectrometer to chemical warfare agents, two special software packages were developed and incorporated into the Bruker RAID-1. The programs suppress interferring signals caused by car exhaust or smoke gases resulting from burning materials and correct the influence of variable sample gas humidity which is important for detection and quantification of blister agents like mustard gas or lewisite.

  13. Uncertainty quantification in Rothermel's Model using an efficient sampling method

    Treesearch

    Edwin Jimenez; M. Yousuff Hussaini; Scott L. Goodrick

    2007-01-01

    The purpose of the present work is to quantify parametric uncertainty in Rothermel’s wildland fire spread model (implemented in software such as BehavePlus3 and FARSITE), which is undoubtedly among the most widely used fire spread models in the United States. This model consists of a nonlinear system of equations that relates environmental variables (input parameter...
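    Although the abstract is truncated, the approach it describes, propagating input-parameter uncertainty through a fire spread model by sampling, can be illustrated with plain Monte Carlo. The surrogate model and input ranges below are stand-ins for illustration only, not Rothermel's actual equations:

```python
import random
import statistics

def spread_rate(wind_speed, fuel_moisture):
    # Toy surrogate: spread increases with wind, drops with fuel moisture.
    # NOT Rothermel's equations -- an illustrative stand-in only.
    return 1.5 * wind_speed / (1.0 + 4.0 * fuel_moisture)

def propagate_uncertainty(n_samples=10000, seed=42):
    """Monte Carlo propagation of uniform input uncertainty."""
    rng = random.Random(seed)
    samples = [
        spread_rate(rng.uniform(2.0, 8.0), rng.uniform(0.05, 0.25))
        for _ in range(n_samples)
    ]
    return statistics.mean(samples), statistics.stdev(samples)
```

    Efficient sampling schemes such as the one in the paper aim to reach the same output statistics with far fewer model evaluations than this brute-force loop.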

  14. An Innovative Method for Obtaining Consistent Images and Quantification of Histochemically Stained Specimens

    PubMed Central

    Sedgewick, Gerald J.; Ericson, Marna

    2015-01-01

    Obtaining digital images of color brightfield microscopy is an important aspect of biomedical research and the clinical practice of diagnostic pathology. Although the field of digital pathology has had tremendous advances in whole-slide imaging systems, little effort has been directed toward standardizing color brightfield digital imaging to maintain image-to-image consistency and tonal linearity. Using a single camera and microscope to obtain digital images of three stains, we show that microscope and camera systems inherently produce image-to-image variation. Moreover, we demonstrate that post-processing with a widely used raster graphics editor software program does not completely correct for session-to-session inconsistency. We introduce a reliable method for creating consistent images with a hardware/software solution (ChromaCal™; Datacolor Inc., NJ) along with its features for creating color standardization, preserving linear tonal levels, providing automated white balancing and setting automated brightness to consistent levels. The resulting image consistency using this method will also streamline mean density and morphometry measurements, as images are easily segmented and single thresholds can be used. We suggest that this is a superior method for color brightfield imaging, which can be used for quantification and can be readily incorporated into workflows. PMID:25575568
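    One ingredient of such color standardization, neutralizing a session's color cast against a known gray calibration patch, can be sketched as follows (an illustrative gray-patch correction, not ChromaCal's proprietary algorithm):

```python
def white_balance(pixels, gray_patch):
    """Scale each channel so a measured neutral-gray patch becomes gray.

    pixels: list of (r, g, b) tuples; gray_patch: the (r, g, b) average
    measured on the known neutral calibration patch.
    """
    target = sum(gray_patch) / 3.0
    gains = [target / channel for channel in gray_patch]
    return [
        tuple(min(255.0, value * gain) for value, gain in zip(px, gains))
        for px in pixels
    ]
```

    Applying the same per-channel gains to every image of a session is what makes downstream thresholding and density measurements comparable image to image.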

  15. Comprehensive optical and data management infrastructure for high-throughput light-sheet microscopy of whole mouse brains.

    PubMed

    Müllenbroich, M Caroline; Silvestri, Ludovico; Onofri, Leonardo; Costantini, Irene; Hoff, Marcel Van't; Sacconi, Leonardo; Iannello, Giulio; Pavone, Francesco S

    2015-10-01

    Comprehensive mapping and quantification of neuronal projections in the central nervous system requires high-throughput imaging of large volumes with microscopic resolution. To this end, we have developed a confocal light-sheet microscope that has been optimized for three-dimensional (3-D) imaging of structurally intact clarified whole-mount mouse brains. We describe the optical and electromechanical arrangement of the microscope and give details on the organization of the microscope management software. The software orchestrates all components of the microscope, coordinates critical timing and synchronization, and has been written in a versatile and modular structure using the LabVIEW language. It can easily be adapted and integrated to other microscope systems and has been made freely available to the light-sheet community. The tremendous amount of data routinely generated by light-sheet microscopy further requires novel strategies for data handling and storage. To complete the full imaging pipeline of our high-throughput microscope, we further elaborate on big data management from streaming of raw images up to stitching of 3-D datasets. The mesoscale neuroanatomy imaged at micron-scale resolution in those datasets allows characterization and quantification of neuronal projections in unsectioned mouse brains.

  16. CytoSpectre: a tool for spectral analysis of oriented structures on cellular and subcellular levels.

    PubMed

    Kartasalo, Kimmo; Pölönen, Risto-Pekka; Ojala, Marisa; Rasku, Jyrki; Lekkala, Jukka; Aalto-Setälä, Katriina; Kallio, Pasi

    2015-10-26

    Orientation and the degree of isotropy are important in many biological systems such as the sarcomeres of cardiomyocytes and other fibrillar structures of the cytoskeleton. Image-based analysis of such structures is often limited to qualitative evaluation by human experts, hampering the throughput, repeatability and reliability of the analyses. Software tools are not readily available for this purpose and the existing methods typically rely at least partly on manual operation. We developed CytoSpectre, an automated tool based on spectral analysis, allowing the quantification of orientation and also size distributions of structures in microscopy images. CytoSpectre utilizes the Fourier transform to estimate the power spectrum of an image and, based on the spectrum, computes parameter values describing, among others, the mean orientation, isotropy and size of target structures. The analysis can be further tuned to focus on targets of particular size at cellular or subcellular scales. The software can be operated via a graphical user interface without any programming expertise. We analyzed the performance of CytoSpectre by extensive simulations using artificial images, by benchmarking against FibrilTool and by comparisons with manual measurements performed for real images by a panel of human experts. The software was found to be tolerant against noise and blurring and superior to FibrilTool when analyzing realistic targets with degraded image quality. The analysis of real images indicated generally good agreement between computational and manual results while also revealing notable expert-to-expert variation. Moreover, the experiment showed that CytoSpectre can handle images obtained of different cell types using different microscopy techniques. Finally, we studied the effect of mechanical stretching on cardiomyocytes to demonstrate the software in an actual experiment and observed changes in cellular orientation in response to stretching.
CytoSpectre, a versatile, easy-to-use software tool for spectral analysis of microscopy images, was developed. The tool is compatible with most 2D images and can be used to analyze targets at different scales. We expect the tool to be useful in diverse applications dealing with structures whose orientation and size distributions are of interest. While designed for the biological field, the software could also be useful in non-biological applications.
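    The core idea, reading the dominant orientation of image structures off the 2D power spectrum, can be sketched in a few lines (a simplified illustration, not CytoSpectre's published algorithm):

```python
import numpy as np

def dominant_orientation(image):
    """Estimate dominant structure orientation in degrees (0 = horizontal)."""
    spectrum = np.fft.fftshift(np.fft.fft2(image - image.mean()))
    power = np.abs(spectrum) ** 2
    h, w = image.shape
    y, x = np.indices(power.shape)
    # Angle of each spectral component relative to the spectrum center;
    # stripe energy concentrates perpendicular to the stripe direction.
    theta = np.arctan2(y - h // 2, x - w // 2)
    # Power-weighted circular mean over 180 degrees (orientations, not
    # directions, so angles are doubled before averaging).
    c = np.sum(power * np.cos(2.0 * theta))
    s = np.sum(power * np.sin(2.0 * theta))
    spectral_angle = 0.5 * np.degrees(np.arctan2(s, c))
    return (spectral_angle + 90.0) % 180.0  # rotate back to structure axis
```

    The doubling of angles before averaging is the standard trick for circular statistics on axial data, where 10° and 190° denote the same orientation.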

  17. IFDOTMETER: A New Software Application for Automated Immunofluorescence Analysis.

    PubMed

    Rodríguez-Arribas, Mario; Pizarro-Estrella, Elisa; Gómez-Sánchez, Rubén; Yakhine-Diop, S M S; Gragera-Hidalgo, Antonio; Cristo, Alejandro; Bravo-San Pedro, Jose M; González-Polo, Rosa A; Fuentes, José M

    2016-04-01

    Most laboratories interested in autophagy use different imaging software for managing and analyzing heterogeneous parameters in immunofluorescence experiments (e.g., LC3-puncta quantification and determination of the number and size of lysosomes). One solution would be software that works on a user's laptop or workstation that can access all image settings and provide quick and easy-to-use analysis of data. Thus, we have designed and implemented an application called IFDOTMETER, which can run on all major operating systems because it has been programmed using JAVA (Sun Microsystems). Briefly, IFDOTMETER software has been created to quantify a variety of biological hallmarks, including mitochondrial morphology and nuclear condensation. The program interface is intuitive and user-friendly, making it useful for users not familiar with computer handling. By setting previously defined parameters, the software can automatically analyze a large number of images without the supervision of the researcher. Once analysis is complete, the results are stored in a spreadsheet. Using software for high-throughput cell image analysis offers researchers the possibility of performing comprehensive and precise analysis of a high number of images in an automated manner, making this routine task easier. © 2015 Society for Laboratory Automation and Screening.
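    The kind of automated puncta quantification described above reduces, at its simplest, to thresholding followed by connected-component counting. A toy sketch (illustrative only; IFDOTMETER's actual pipeline is more elaborate):

```python
def count_puncta(image, threshold):
    """Count 4-connected bright regions in a 2D list of intensities."""
    h, w = len(image), len(image[0])
    seen = [[False] * w for _ in range(h)]
    puncta = 0
    for i in range(h):
        for j in range(w):
            if image[i][j] >= threshold and not seen[i][j]:
                puncta += 1
                stack = [(i, j)]  # flood-fill the whole punctum
                while stack:
                    y, x = stack.pop()
                    if (0 <= y < h and 0 <= x < w and not seen[y][x]
                            and image[y][x] >= threshold):
                        seen[y][x] = True
                        stack += [(y + 1, x), (y - 1, x), (y, x + 1), (y, x - 1)]
    return puncta
```

    Size filters (e.g., to separate LC3 puncta from lysosomes) can be added by recording the number of pixels filled per component.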

  18. QPROT: Statistical method for testing differential expression using protein-level intensity data in label-free quantitative proteomics.

    PubMed

    Choi, Hyungwon; Kim, Sinae; Fermin, Damian; Tsou, Chih-Chiang; Nesvizhskii, Alexey I

    2015-11-03

    We introduce QPROT, a statistical framework and computational tool for differential protein expression analysis using protein intensity data. QPROT is an extension of the QSPEC suite, originally developed for spectral count data, adapted for the analysis of continuously measured protein-level intensity data. QPROT offers a new intensity normalization procedure and model-based differential expression analysis, both of which account for missing data. Differential expression of each protein is determined from a standardized Z-statistic computed from the posterior distribution of the log fold change parameter, guided by the false discovery rate estimated by a well-known empirical Bayes method. We evaluated the classification performance of QPROT using the quantification calibration data from the clinical proteomic technology assessment for cancer (CPTAC) study and a recently published Escherichia coli benchmark dataset, with evaluation of FDR accuracy in the latter. QPROT is a statistical framework with an accompanying computational software tool for comparative quantitative proteomics analysis. It features various extensions of the QSPEC method, originally built for spectral count data analysis, including probabilistic treatment of missing values in protein intensity data. With the increasing popularity of label-free quantitative proteomics data, the proposed method and accompanying software suite will be immediately useful for many proteomics laboratories. This article is part of a Special Issue entitled: Computational Proteomics. Copyright © 2015 Elsevier B.V. All rights reserved.
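    The differential-expression call described above can be illustrated with a normal-posterior simplification: standardize the posterior log fold change and convert it to a two-sided tail probability (a sketch, not QPROT's full hierarchical model):

```python
import math

def z_statistic(posterior_mean_logfc, posterior_sd_logfc):
    """Standardized Z from a (simplified) normal posterior of the log FC."""
    return posterior_mean_logfc / posterior_sd_logfc

def two_sided_p(z):
    """Two-sided normal tail probability via the complementary error function."""
    return math.erfc(abs(z) / math.sqrt(2.0))
```

    In the full method, the posterior also absorbs the missing-data model, and significance is calibrated through the empirical Bayes FDR rather than raw p-values.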

  19. Development of computer tablet software for clinical quantification of lateral knee compartment translation during the pivot shift test.

    PubMed

    Muller, Bart; Hofbauer, Marcus; Rahnemai-Azar, Amir Ata; Wolf, Megan; Araki, Daisuke; Hoshino, Yuichi; Araujo, Paulo; Debski, Richard E; Irrgang, James J; Fu, Freddie H; Musahl, Volker

    2016-01-01

    The pivot shift test is a clinical examination commonly used by orthopedic surgeons to evaluate knee function following injury. However, the test can only be graded subjectively by the examiner. Therefore, the purpose of this study was to develop software for a computer tablet to quantify anterior translation of the lateral knee compartment during the pivot shift test. Based on a simple image analysis method, software for a computer tablet was developed with the following primary design constraint: the software should be easy to use in a clinical setting and should not slow down an outpatient visit. Translation of the lateral compartment of the intact knee was 2.0 ± 0.2 mm and of the anterior cruciate ligament-deficient knee was 8.9 ± 0.9 mm (p < 0.001). Intra-tester (ICC range = 0.913 to 0.999) and inter-tester (ICC = 0.949) reliability were excellent for the repeatability assessments. Overall, the average percent error of measuring simulated translation of the lateral knee compartment with the tablet parallel to the monitor increased from 2.8% at a distance of 50 cm to 7.7% at 200 cm. Deviation from the parallel position of the tablet did not have a significant effect until a tablet angle of 45°. The average percent error during anterior translation of the lateral knee compartment of 6 mm was 2.2%, compared to 6.2% for 2 mm of translation. The software provides reliable, objective, and quantitative data on translation of the lateral knee compartment during the pivot shift test and meets the design constraints posed by the clinical setting.

  20. A Graphical User Interface for Software-assisted Tracking of Protein Concentration in Dynamic Cellular Protrusions.

    PubMed

    Saha, Tanumoy; Rathmann, Isabel; Galic, Milos

    2017-07-11

    Filopodia are dynamic, finger-like cellular protrusions associated with migration and cell-cell communication. In order to better understand the complex signaling mechanisms underlying filopodial initiation, elongation and subsequent stabilization or retraction, it is crucial to determine the spatio-temporal protein activity in these dynamic structures. To analyze protein function in filopodia, we recently developed a semi-automated tracking algorithm that adapts to filopodial shape-changes, thus allowing parallel analysis of protrusion dynamics and relative protein concentration along the whole filopodial length. Here, we present a detailed step-by-step protocol for optimized cell handling, image acquisition and software analysis. We further provide instructions for the use of optional features during image analysis and data representation, as well as troubleshooting guidelines for all critical steps along the way. Finally, we also include a comparison of the described image analysis software with other programs available for filopodia quantification. Together, the presented protocol provides a framework for accurate analysis of protein dynamics in filopodial protrusions using image analysis software.

  1. Laser-induced photo emission detection: data acquisition based on light intensity counting

    NASA Astrophysics Data System (ADS)

    Yulianto, N.; Yudasari, N.; Putri, K. Y.

    2017-04-01

    Laser Induced Breakdown Detection (LIBD) is one of the quantification techniques for colloids. There are two modes of detection in LIBD: optical detection and acoustic detection. LIBD is based on the detection of plasma emission resulting from the interaction between a particle and the laser beam. In this research, the change of light intensity during plasma formation was detected by a photodiode sensor. A photoemission data acquisition system was built to collect the signals and transform them into digital counts. The real-time system used a National Instruments DAQ 6009 data acquisition device and LabVIEW software. The system was tested on distilled water and tap water samples. The result showed 99.8% accuracy of the counting technique in comparison to acoustic detection at a sample rate of 10 Hz; thus the acquisition system can be applied as an alternative to the existing LIBD acquisition systems.
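    The counting technique reduces to registering a breakdown event whenever the photodiode trace crosses an emission threshold on a rising edge. A minimal sketch (the threshold and trace are illustrative; the actual DAQ configuration is as described above):

```python
def count_breakdowns(samples, threshold):
    """Count rising-edge threshold crossings in a sampled intensity trace."""
    events = 0
    above = False
    for value in samples:
        if value >= threshold and not above:
            events += 1  # a new plasma flash begins here
        above = value >= threshold
    return events
```

    Tracking the rising edge rather than every above-threshold sample ensures that one sustained plasma flash is counted once, not once per sample.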

  2. CellCognition: time-resolved phenotype annotation in high-throughput live cell imaging.

    PubMed

    Held, Michael; Schmitz, Michael H A; Fischer, Bernd; Walter, Thomas; Neumann, Beate; Olma, Michael H; Peter, Matthias; Ellenberg, Jan; Gerlich, Daniel W

    2010-09-01

    Fluorescence time-lapse imaging has become a powerful tool to investigate complex dynamic processes such as cell division or intracellular trafficking. Automated microscopes generate time-resolved imaging data at high throughput, yet tools for quantification of large-scale movie data are largely missing. Here we present CellCognition, a computational framework to annotate complex cellular dynamics. We developed a machine-learning method that combines state-of-the-art classification with hidden Markov modeling for annotation of the progression through morphologically distinct biological states. Incorporation of time information into the annotation scheme was essential to suppress classification noise at state transitions and confusion between different functional states with similar morphology. We demonstrate generic applicability in different assays and perturbation conditions, including a candidate-based RNA interference screen for regulators of mitotic exit in human cells. CellCognition is published as open source software, enabling live-cell imaging-based screening with assays that directly score cellular dynamics.
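    The time-information step described above, combining per-frame classifier scores with a hidden Markov model to suppress spurious state flips, can be sketched with a compact Viterbi decoder (toy matrices, not CellCognition's trained model):

```python
import math

def viterbi(obs_loglik, trans_logprob):
    """Most likely state path; obs_loglik[t][s] is the per-frame
    log-likelihood of state s, trans_logprob[a][b] the log transition
    probability from state a to state b."""
    n = len(obs_loglik[0])
    best = list(obs_loglik[0])  # uniform prior over initial states
    backptr = []
    for t in range(1, len(obs_loglik)):
        ptr, new = [], []
        for s in range(n):
            prev = max(range(n), key=lambda p: best[p] + trans_logprob[p][s])
            ptr.append(prev)
            new.append(best[prev] + trans_logprob[prev][s] + obs_loglik[t][s])
        backptr.append(ptr)
        best = new
    state = max(range(n), key=lambda s: best[s])
    path = [state]
    for ptr in reversed(backptr):
        state = ptr[state]
        path.append(state)
    return path[::-1]
```

    With sticky self-transitions, a single frame weakly classified as a different state is overruled by its temporal context, which is exactly the noise suppression the abstract describes.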

  3. Exact and Approximate Probabilistic Symbolic Execution

    NASA Technical Reports Server (NTRS)

    Luckow, Kasper; Pasareanu, Corina S.; Dwyer, Matthew B.; Filieri, Antonio; Visser, Willem

    2014-01-01

    Probabilistic software analysis seeks to quantify the likelihood of reaching a target event under uncertain environments. Recent approaches compute probabilities of execution paths using symbolic execution, but do not support nondeterminism. Nondeterminism arises naturally when no suitable probabilistic model can capture a program behavior, e.g., for multithreading or distributed systems. In this work, we propose a technique, based on symbolic execution, to synthesize schedulers that resolve nondeterminism to maximize the probability of reaching a target event. To scale to large systems, we also introduce approximate algorithms to search for good schedulers, speeding up established random sampling and reinforcement learning results through the quantification of path probabilities based on symbolic execution. We implemented the techniques in Symbolic PathFinder and evaluated them on nondeterministic Java programs. We show that our algorithms significantly improve upon a state-of-the-art statistical model checking algorithm, originally developed for Markov Decision Processes.
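    The objective of scheduler synthesis, maximizing the probability of reaching a target, corresponds to maximum reachability in a Markov decision process, which a small value-iteration sketch makes concrete (a toy MDP, not the paper's symbolic-execution machinery):

```python
def max_reach_probability(states, actions, transitions, target, iterations=100):
    """Value iteration for the max probability of reaching `target` in an MDP.

    transitions[(state, action)] is a list of (next_state, probability);
    actions(state) returns the choices available in that state.
    """
    value = {s: (1.0 if s == target else 0.0) for s in states}
    for _ in range(iterations):
        for s in states:
            if s == target:
                continue  # absorbing target keeps value 1
            value[s] = max(
                sum(p * value[ns] for ns, p in transitions[(s, a)])
                for a in actions(s)
            )
    return value
```

    The maximizing action in each state defines the synthesized scheduler; in the paper the per-path probabilities feeding this optimization come from symbolic execution rather than an explicit transition table.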

  4. Application of Digital Anthropometry for Craniofacial Assessment

    PubMed Central

    Jayaratne, Yasas S. N.; Zwahlen, Roger A.

    2014-01-01

    Craniofacial anthropometry is an objective technique based on a series of measurements and proportions, which facilitate the characterization of phenotypic variation and quantification of dysmorphology. With the introduction of stereophotography, it is possible to acquire a lifelike three-dimensional (3D) image of the face with natural color and texture. Most of the traditional anthropometric landmarks can be identified on these 3D photographs using specialized software. Therefore, it has become possible to compute new digital measurements, which were not feasible with traditional instruments. The term “digital anthropometry” has been used by researchers based on such systems to separate their methods from conventional manual measurements. Anthropometry has been traditionally used as a research tool. With the advent of digital anthropometry, this technique can be employed in several disciplines as a noninvasive tool for quantifying facial morphology. The aim of this review is to provide a broad overview of digital anthropometry and discuss its clinical applications. PMID:25050146

  5. Quantitative fluorescence angiography for neurosurgical interventions.

    PubMed

    Weichelt, Claudia; Duscha, Philipp; Steinmeier, Ralf; Meyer, Tobias; Kuß, Julia; Cimalla, Peter; Kirsch, Matthias; Sobottka, Stephan B; Koch, Edmund; Schackert, Gabriele; Morgenstern, Ute

    2013-06-01

    Present methods for quantitative measurement of cerebral perfusion during neurosurgical operations require additional technology for measurement, data acquisition, and processing. This study used conventional fluorescence video angiography, an established method to visualize blood flow in brain vessels, enhanced by a quantifying perfusion software tool. For these purposes, the fluorescence dye indocyanine green is given intravenously, and after excitation by a near-infrared light source the fluorescence signal is recorded. Video data are analyzed by software algorithms to allow quantification of the blood flow. Additionally, perfusion is measured intraoperatively by a reference system. Furthermore, comparative reference measurements using a flow phantom were performed to verify the quantitative blood flow results of the software and to validate the software algorithm. Analysis of intraoperative video data provides characteristic biological parameters. These parameters were implemented in the special flow phantom for experimental validation of the developed software algorithms. Furthermore, various factors that influence the determination of perfusion parameters were analyzed by means of mathematical simulation. Comparing patient measurements, phantom experiments, and computer simulations under certain conditions (variable frame rate, vessel diameter, etc.), the results of the software algorithms are within the range of parameter accuracy of the reference methods. Therefore, the software algorithm for calculating cortical perfusion parameters from video data presents a helpful intraoperative tool that requires no complex additional measurement technology.
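    Perfusion quantification from video data typically starts from a region's time-intensity curve; two standard curve parameters can be sketched as follows (illustrative parameters, not the study's full algorithm):

```python
def perfusion_parameters(intensity, frame_rate):
    """Time-to-peak and maximum inflow slope from a time-intensity curve.

    intensity: mean fluorescence per video frame; frame_rate in Hz.
    """
    peak_frame = max(range(len(intensity)), key=lambda i: intensity[i])
    slopes = [(b - a) * frame_rate for a, b in zip(intensity, intensity[1:])]
    return {
        "time_to_peak": peak_frame / frame_rate,
        "max_slope": max(slopes),
    }
```

    The abstract's note on frame rate fits here directly: both parameters scale with the sampling interval, so an inaccurate frame rate biases the derived perfusion values.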

  6. User Guidelines and Best Practices for CASL VUQ Analysis Using Dakota

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Adams, Brian M.; Coleman, Kayla; Gilkey, Lindsay N.

    Sandia’s Dakota software (available at http://dakota.sandia.gov) supports science and engineering transformation through advanced exploration of simulations. Specifically, it manages and analyzes ensembles of simulations to provide broader and deeper perspective for analysts and decision makers. This enables them to enhance understanding of risk, improve products, and assess simulation credibility. In its simplest mode, Dakota can automate typical parameter variation studies through a generic interface to a physics-based computational model. This can lend efficiency and rigor to manual parameter perturbation studies already being conducted by analysts. However, Dakota also delivers advanced parametric analysis techniques enabling design exploration, optimization, model calibration, risk analysis, and quantification of margins and uncertainty with such models. It directly supports verification and validation activities. Dakota algorithms enrich complex science and engineering models, enabling an analyst to answer crucial questions of - Sensitivity: Which are the most important input factors or parameters entering the simulation, and how do they influence key outputs?; Uncertainty: What is the uncertainty or variability in simulation output, given uncertainties in input parameters? How safe, reliable, robust, or variable is my system? (Quantification of margins and uncertainty, QMU); Optimization: What parameter values yield the best performing design or operating condition, given constraints?; Calibration: What models and/or parameters best match experimental data? In general, Dakota is the Consortium for Advanced Simulation of Light Water Reactors (CASL) delivery vehicle for verification, validation, and uncertainty quantification (VUQ) algorithms. It permits ready application of the VUQ methods described above to simulation codes by CASL researchers, code developers, and application engineers.
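    Dakota's simplest mode, an automated parameter variation study against a black-box model, can be sketched generically (a hypothetical driver for illustration, not Dakota's actual interface):

```python
def parameter_study(model, nominal, deltas):
    """Centered parameter study: evaluate `model` at the nominal point
    and at one-at-a-time +/- perturbations of each input."""
    results = {"nominal": model(**nominal)}
    for name, delta in deltas.items():
        for sign, label in ((+1, "+"), (-1, "-")):
            point = dict(nominal)
            point[name] += sign * delta
            results[name + label] = model(**point)
    return results
```

    Comparing each perturbed response against the nominal one gives a crude one-at-a-time sensitivity measure, the starting point before the more advanced sampling and optimization methods listed above.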

  7. EpiProfile Quantifies Histone Peptides With Modifications by Extracting Retention Time and Intensity in High-resolution Mass Spectra*

    PubMed Central

    Yuan, Zuo-Fei; Lin, Shu; Molden, Rosalynn C.; Cao, Xing-Jun; Bhanu, Natarajan V.; Wang, Xiaoshi; Sidoli, Simone; Liu, Shichong; Garcia, Benjamin A.

    2015-01-01

    Histone post-translational modifications contribute to chromatin function through their chemical properties which influence chromatin structure and their ability to recruit chromatin interacting proteins. Nanoflow liquid chromatography coupled with high resolution tandem mass spectrometry (nanoLC-MS/MS) has emerged as the most suitable technology for global histone modification analysis because of the high sensitivity and the high mass accuracy of this approach that provides confident identification. However, analysis of histones with this method is even more challenging because of the large number and variety of isobaric histone peptides and the high dynamic range of histone peptide abundances. Here, we introduce EpiProfile, a software tool that discriminates isobaric histone peptides using the distinguishing fragment ions in their tandem mass spectra and extracts the chromatographic area under the curve using previous knowledge about peptide retention time. The accuracy of EpiProfile was evaluated by analysis of mixtures containing different ratios of synthetic histone peptides. In addition to label-free quantification of histone peptides, EpiProfile is flexible and can quantify different types of isotopically labeled histone peptides. EpiProfile is unique in generating layouts (i.e. relative retention time) of histone peptides when compared with manual quantification of the data and other programs (such as Skyline), filling the need of an automatic and freely available tool to quantify labeled and non-labeled modified histone peptides. In summary, EpiProfile is a valuable nanoflow liquid chromatography coupled with high resolution tandem mass spectrometry-based quantification tool for histone peptides, which can also be adapted to analyze nonhistone protein samples. PMID:25805797
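    The extraction step EpiProfile automates, integrating a chromatographic trace inside a known retention-time window, can be sketched with trapezoidal integration (the window handling is illustrative, not EpiProfile's code):

```python
def xic_area(times, intensities, rt_start, rt_end):
    """Trapezoidal area of an intensity trace between two retention times."""
    area = 0.0
    points = list(zip(times, intensities))
    for (t0, i0), (t1, i1) in zip(points, points[1:]):
        if rt_start <= t0 and t1 <= rt_end:  # segment fully inside window
            area += 0.5 * (i0 + i1) * (t1 - t0)
    return area
```

    Restricting integration to the expected retention-time window is what lets isobaric peptides that co-elute at different times be quantified separately.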

  8. An optimized protocol for generation and analysis of Ion Proton sequencing reads for RNA-Seq.

    PubMed

    Yuan, Yongxian; Xu, Huaiqian; Leung, Ross Ka-Kit

    2016-05-26

    Previous studies compared running cost, time and other performance measures of popular sequencing platforms. However, comprehensive assessment of library construction and analysis protocols for the Proton sequencing platform remains unexplored. Unlike reads from Illumina sequencing platforms, Proton reads are heterogeneous in length and quality, and combining sequencing data from different platforms results in read sets of varying length. Whether the commonly used software handles such data satisfactorily is unknown. Using universal human reference RNA as the initial material, the RNase III and chemical fragmentation methods of library construction showed similar results in the number of genes and junctions discovered and in the accuracy of expression level estimates. In contrast, sequencing quality, read length and the choice of software affected the mapping rate to a much larger extent. The unspliced aligner TMAP attained the highest mapping rate (97.27 % to genome, 86.46 % to transcriptome), though 47.83 % of mapped reads were clipped. Long reads could paradoxically reduce mapping across junctions. With a reference annotation guide, the mapping rate of TopHat2 increased significantly from 75.79 to 92.09 %, especially for long (>150 bp) reads. Sailfish, a k-mer based gene expression quantifier, attained results highly consistent with those of the TaqMan array and the highest sensitivity. We provide, for the first time, reference statistics for library preparation methods, gene detection and quantification, and junction discovery for RNA-Seq on the Ion Proton platform. Chemical fragmentation performed equally well as the enzyme-based method. The optimal Ion Proton sequencing options and analysis software have been evaluated.

  9. Cardio-PACs: a new opportunity

    NASA Astrophysics Data System (ADS)

    Heupler, Frederick A., Jr.; Thomas, James D.; Blume, Hartwig R.; Cecil, Robert A.; Heisler, Mary

    2000-05-01

    It is now possible to replace film-based image management in the cardiac catheterization laboratory with a Cardiology Picture Archiving and Communication System (Cardio-PACS) based on digital imaging technology. The first step in the conversion process is installation of a digital image acquisition system that is capable of generating high-quality DICOM-compatible images. The next three steps, which are the subject of this presentation, involve image display, distribution, and storage. Clinical requirements and associated cost considerations for these three steps are listed below: Image display: (1) Image quality equal to film, with DICOM format, lossless compression, image processing, desktop PC-based with color monitor, and physician-friendly imaging software; (2) Performance specifications include: acquire 30 frames/sec; replay 15 frames/sec; access to file server 5 seconds, and to archive 5 minutes; (3) Compatibility of image file, transmission, and processing formats; (4) Image manipulation: brightness, contrast, gray scale, zoom, biplane display, and quantification; (5) User-friendly control of image review. Image distribution: (1) Standard IP-based network between cardiac catheterization laboratories, file server, long-term archive, review stations, and remote sites; (2) Non-proprietary formats; (3) Bidirectional distribution. Image storage: (1) CD-ROM vs disk vs tape; (2) Verification of data integrity; (3) User-designated storage capacity for catheterization laboratory, file server, long-term archive. Costs: (1) Image acquisition equipment, file server, long-term archive; (2) Network infrastructure; (3) Review stations and software; (4) Maintenance and administration; (5) Future upgrades and expansion; (6) Personnel.

  10. Quantification of Pulmonary Fibrosis in a Bleomycin Mouse Model Using Automated Histological Image Analysis.

    PubMed

    Gilhodes, Jean-Claude; Julé, Yvon; Kreuz, Sebastian; Stierstorfer, Birgit; Stiller, Detlef; Wollin, Lutz

    2017-01-01

    Current literature on pulmonary fibrosis induced in animal models highlights the need for an accurate, reliable and reproducible histological quantitative analysis. A major limitation of histological scoring is that it is observer-dependent and consequently subject to variability, which may preclude comparative studies between different laboratories. To achieve a reliable and observer-independent quantification of lung fibrosis, we developed automated software for histological image analysis of digital images of entire lung sections. This automated analysis was compared to standard evaluation methods with regard to its validation as an end-point measure of fibrosis. Lung fibrosis was induced in mice by intratracheal administration of bleomycin (BLM) at 0.25, 0.5, 0.75 and 1 mg/kg. A detailed characterization of BLM-induced fibrosis was performed 14 days after BLM administration using lung function testing, micro-computed tomography and Ashcroft scoring analysis. Quantification of fibrosis by automated analysis was based on pulmonary tissue density measured from thousands of micro-tiles processed from digital images of entire lung sections. Prior to analysis, large bronchi and vessels were manually excluded from the original images. Fibrosis was expressed by two indexes: the mean pulmonary tissue density and the high pulmonary tissue density frequency. We showed that the tissue density indexes provided a very accurate and reliable quantification of the morphological changes induced by BLM, even at the lowest concentration used (0.25 mg/kg). A reconstructed 2D image of the entire lung section at high resolution (3.6 μm/pixel) was generated from the tissue density values, allowing visualization of their distribution throughout fibrotic and non-fibrotic regions. A significant correlation (p<0.0001) was found between the automated analysis and the above standard evaluation methods. This correlation establishes automated analysis as a novel end-point measure of BLM-induced lung fibrosis in mice, which will be very valuable for future preclinical drug explorations.
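
    Once per-tile tissue densities are available, the two indexes the abstract describes reduce to a mean and a thresholded frequency. A minimal sketch (the 0.6 cutoff and the tile values are illustrative assumptions, not the paper's):

```python
from statistics import mean

def density_indexes(tile_densities, high_threshold=0.6):
    """Compute the two fibrosis indexes from per-tile tissue-density
    values in [0, 1]:
      - mean pulmonary tissue density
      - high pulmonary tissue density frequency (fraction of tiles
        above a chosen threshold; 0.6 here is illustrative)."""
    mean_density = mean(tile_densities)
    high_freq = sum(d > high_threshold for d in tile_densities) / len(tile_densities)
    return mean_density, high_freq

# Toy tiles: mostly normal tissue with a few dense, fibrotic-like tiles
tiles = [0.2, 0.25, 0.3, 0.8, 0.9, 0.85, 0.3, 0.2]
mean_d, high_f = density_indexes(tiles)
```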

  11. Quantification of Pulmonary Fibrosis in a Bleomycin Mouse Model Using Automated Histological Image Analysis

    PubMed Central

    Gilhodes, Jean-Claude; Kreuz, Sebastian; Stierstorfer, Birgit; Stiller, Detlef; Wollin, Lutz

    2017-01-01

    Current literature on pulmonary fibrosis induced in animal models highlights the need of an accurate, reliable and reproducible histological quantitative analysis. One of the major limits of histological scoring concerns the fact that it is observer-dependent and consequently subject to variability, which may preclude comparative studies between different laboratories. To achieve a reliable and observer-independent quantification of lung fibrosis we developed an automated software histological image analysis performed from digital image of entire lung sections. This automated analysis was compared to standard evaluation methods with regard to its validation as an end-point measure of fibrosis. Lung fibrosis was induced in mice by intratracheal administration of bleomycin (BLM) at 0.25, 0.5, 0.75 and 1 mg/kg. A detailed characterization of BLM-induced fibrosis was performed 14 days after BLM administration using lung function testing, micro-computed tomography and Ashcroft scoring analysis. Quantification of fibrosis by automated analysis was assessed based on pulmonary tissue density measured from thousands of micro-tiles processed from digital images of entire lung sections. Prior to analysis, large bronchi and vessels were manually excluded from the original images. Measurement of fibrosis has been expressed by two indexes: the mean pulmonary tissue density and the high pulmonary tissue density frequency. We showed that tissue density indexes gave access to a very accurate and reliable quantification of morphological changes induced by BLM even for the lowest concentration used (0.25 mg/kg). A reconstructed 2D-image of the entire lung section at high resolution (3.6 μm/pixel) has been performed from tissue density values allowing the visualization of their distribution throughout fibrotic and non-fibrotic regions. A significant correlation (p<0.0001) was found between automated analysis and the above standard evaluation methods. 
This correlation establishes automated analysis as a novel end-point measure of BLM-induced lung fibrosis in mice, which will be very valuable for future preclinical drug explorations. PMID:28107543

  12. Quantifying errors without random sampling.

    PubMed

    Phillips, Carl V; LaPole, Luwanna M

    2003-06-12

    All quantifications of mortality, morbidity, and other health measures involve numerous sources of error. The routine quantification of random sampling error makes it easy to forget that other sources of error can and should be quantified. When a quantification does not involve sampling, error is almost never quantified and results are often reported in ways that dramatically overstate their precision. We argue that the precision implicit in typical reporting is problematic and sketch methods for quantifying the various sources of error, building up from simple examples that can be solved analytically to more complex cases. There are straightforward ways to partially quantify the uncertainty surrounding a parameter that is not characterized by random sampling, such as limiting reported significant figures. We present simple methods for doing such quantifications, and for incorporating them into calculations. More complicated methods become necessary when multiple sources of uncertainty must be combined. We demonstrate that Monte Carlo simulation, using available software, can estimate the uncertainty resulting from complicated calculations with many sources of uncertainty. We apply the method to the current estimate of the annual incidence of foodborne illness in the United States. Quantifying uncertainty from systematic errors is practical. Reporting this uncertainty would more honestly represent study results, help show the probability that estimated values fall within some critical range, and facilitate better targeting of further research.
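
    The Monte Carlo approach the authors describe amounts to repeatedly sampling each uncertain input from an assumed distribution, pushing the samples through the calculation, and summarizing the spread of the outputs. A stdlib sketch; the distributions and the rate-times-population model below are illustrative assumptions, not the paper's foodborne-illness inputs:

```python
import random

def monte_carlo_uncertainty(model, input_dists, n=20_000, seed=1):
    """Propagate input uncertainty through a calculation by sampling
    each input from its assumed distribution and recording the model
    output. Returns (mean, 2.5th percentile, 97.5th percentile)."""
    rng = random.Random(seed)
    outputs = []
    for _ in range(n):
        sample = {name: draw(rng) for name, draw in input_dists.items()}
        outputs.append(model(**sample))
    outputs.sort()
    lo = outputs[int(0.025 * n)]
    hi = outputs[int(0.975 * n)]
    return sum(outputs) / n, lo, hi

# Toy example: incidence = rate * population, both inputs uncertain
dists = {
    "rate": lambda r: r.lognormvariate(mu=-7.0, sigma=0.3),
    "population": lambda r: r.normalvariate(mu=1e6, sigma=5e4),
}
mean_est, lo, hi = monte_carlo_uncertainty(
    lambda rate, population: rate * population, dists)
```

    Reporting the (lo, hi) interval instead of a single point estimate is exactly the kind of honest precision statement the abstract argues for.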

  13. Detection of proteins using a colorimetric bio-barcode assay.

    PubMed

    Nam, Jwa-Min; Jang, Kyung-Jin; Groves, Jay T

    2007-01-01

    The colorimetric bio-barcode assay is a red-to-blue color-change-based protein detection method with ultrahigh sensitivity. The assay is based on both the bio-barcode amplification method, which allows detection of minuscule amounts of target with attomolar sensitivity, and the gold nanoparticle-based colorimetric DNA detection method, which allows simple and straightforward detection of biomolecules of interest (here we detect interleukin-2, an important biomarker (cytokine) for many immunodeficiency-related diseases and cancers). The protocol is composed of the following steps: (i) conjugation of target-capture molecules and barcode DNA strands onto silica microparticles, (ii) target capture with probes, (iii) separation and release of barcode DNA strands from the separated probes, (iv) detection of released barcode DNA using DNA-modified gold nanoparticle probes and (v) red-to-blue color-change analysis with graphics software. Actual target detection and quantification steps with premade probes take approximately 3 h (the whole protocol, including probe preparation, takes approximately 3 days).

  14. Multiscale Modeling and Uncertainty Quantification for Nuclear Fuel Performance

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Estep, Donald; El-Azab, Anter; Pernice, Michael

    2017-03-23

    In this project, we will address the challenges associated with constructing high fidelity multiscale models of nuclear fuel performance. We (*) propose a novel approach for coupling mesoscale and macroscale models, (*) devise efficient numerical methods for simulating the coupled system, and (*) devise and analyze effective numerical approaches for error and uncertainty quantification for the coupled multiscale system. As an integral part of the project, we will carry out analysis of the effects of upscaling and downscaling, investigate efficient methods for stochastic sensitivity analysis of the individual macroscale and mesoscale models, and carry out a posteriori error analysis for computed results. We will pursue development and implementation of solutions in software used at Idaho National Laboratories on models of interest to the Nuclear Energy Advanced Modeling and Simulation (NEAMS) program.

  15. Bayesian Treatment of Uncertainty in Environmental Modeling: Optimization, Sampling and Data Assimilation Using the DREAM Software Package

    NASA Astrophysics Data System (ADS)

    Vrugt, J. A.

    2012-12-01

    In the past decade much progress has been made in the treatment of uncertainty in earth systems modeling. Whereas initial approaches focused mostly on quantification of parameter and predictive uncertainty, recent methods attempt to disentangle the effects of parameter, forcing (input) data, model structural and calibration data errors. In this talk I will highlight some of our recent work involving theory, concepts and applications of Bayesian parameter and/or state estimation. In particular, new methods for sequential Monte Carlo (SMC) and Markov chain Monte Carlo (MCMC) simulation will be presented, with emphasis on massively parallel distributed computing and quantification of model structural errors. The theoretical and numerical developments will be illustrated using model-data synthesis problems in hydrology, hydrogeology and geophysics.
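
    DREAM itself is a multi-chain, differential-evolution-adaptive MCMC sampler; the basic building block underneath any such method is the accept/reject step of random-walk Metropolis. A minimal single-chain sketch on a toy one-dimensional posterior (not DREAM's algorithm, just the core idea):

```python
import math
import random

def metropolis(log_post, x0, n_steps=50_000, step=0.5, seed=7):
    """Random-walk Metropolis: propose a Gaussian perturbation, accept
    with probability min(1, exp(log_post(cand) - log_post(x)))."""
    rng = random.Random(seed)
    x, lp = x0, log_post(x0)
    samples = []
    for _ in range(n_steps):
        cand = x + rng.gauss(0.0, step)
        lp_cand = log_post(cand)
        if rng.random() < math.exp(min(0.0, lp_cand - lp)):  # accept
            x, lp = cand, lp_cand
        samples.append(x)
    return samples

# Toy target: standard normal posterior, log p(x) = -x^2/2 up to a constant
draws = metropolis(lambda x: -0.5 * x * x, x0=0.0)
post_mean = sum(draws) / len(draws)
```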

  16. KiT: a MATLAB package for kinetochore tracking.

    PubMed

    Armond, Jonathan W; Vladimirou, Elina; McAinsh, Andrew D; Burroughs, Nigel J

    2016-06-15

    During mitosis, chromosomes are attached to the mitotic spindle via large protein complexes called kinetochores. The motion of kinetochores throughout mitosis is intricate and automated quantitative tracking of their motion has already revealed many surprising facets of their behaviour. Here, we present 'KiT' (Kinetochore Tracking)-an easy-to-use, open-source software package for tracking kinetochores from live-cell fluorescent movies. KiT supports 2D, 3D and multi-colour movies, quantification of fluorescence, integrated deconvolution, parallel execution and multiple algorithms for particle localization. KiT is free, open-source software implemented in MATLAB and runs on all MATLAB supported platforms. KiT can be downloaded as a package from http://www.mechanochemistry.org/mcainsh/software.php The source repository is available at https://bitbucket.org/jarmond/kit and under continuing development. Supplementary data are available at Bioinformatics online. jonathan.armond@warwick.ac.uk. © The Author 2016. Published by Oxford University Press.

  17. Representation of Ice Geometry by Parametric Functions: Construction of Approximating NURBS Curves and Quantification of Ice Roughness--Year 1: Approximating NURBS Curves

    NASA Technical Reports Server (NTRS)

    Dill, Loren H.; Choo, Yung K. (Technical Monitor)

    2004-01-01

    Software was developed to construct approximating NURBS curves for iced airfoil geometries. Users specify a tolerance that determines the extent to which the approximating curve follows the rough ice. The user can therefore smooth the ice geometry in a controlled manner, thereby enabling the generation of grids suitable for numerical aerodynamic simulations. Ultimately, this ability to smooth the ice geometry will permit studies of the effects of smoothing upon the aerodynamics of iced airfoils. The software was applied to several different types of iced airfoil data collected in the Icing Research Tunnel at NASA Glenn Research Center, and in all cases was found to efficiently generate suitable approximating NURBS curves. This method is an improvement over the current "control point formulation" of Smaggice (v.1.2). In this report, we present the relevant theory of approximating NURBS curves and discuss typical results of the software.
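
    Proper NURBS fitting needs a geometry kernel, but the key user-facing idea here, a single tolerance controlling how closely the smoothed curve follows the rough ice data, can be illustrated with a different, much simpler technique: Ramer-Douglas-Peucker polyline simplification. This is a sketch of the tolerance concept only, not the report's NURBS method:

```python
import math

def rdp(points, tol):
    """Ramer-Douglas-Peucker: simplify a polyline so every discarded
    point lies within `tol` of the chord spanning its segment."""
    if len(points) < 3:
        return list(points)
    (x1, y1), (x2, y2) = points[0], points[-1]
    seg = math.hypot(x2 - x1, y2 - y1)

    def dist(p):  # perpendicular distance of p to the end-to-end chord
        if seg == 0:
            return math.hypot(p[0] - x1, p[1] - y1)
        return abs((x2 - x1) * (y1 - p[1]) - (x1 - p[0]) * (y2 - y1)) / seg

    i_max = max(range(1, len(points) - 1), key=lambda i: dist(points[i]))
    if dist(points[i_max]) > tol:  # keep the worst point, recurse on halves
        return rdp(points[:i_max + 1], tol)[:-1] + rdp(points[i_max:], tol)
    return [points[0], points[-1]]

# Rough "ice" profile: small bumps; a large tolerance smooths them away
profile = [(x, (x % 2) * 0.1) for x in range(11)]
smoothed = rdp(profile, tol=0.5)
```

    A small tolerance keeps the roughness; a large one yields the heavily smoothed geometry suitable for grid generation.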

  18. Reliable measurement of E. coli single cell fluorescence distribution using a standard microscope set-up.

    PubMed

    Cortesi, Marilisa; Bandiera, Lucia; Pasini, Alice; Bevilacqua, Alessandro; Gherardi, Alessandro; Furini, Simone; Giordano, Emanuele

    2017-01-01

    Quantifying gene expression at the single-cell level is fundamental for the complete characterization of synthetic gene circuits, due to the significant impact of noise and inter-cellular variability on the system's functionality. Commercial set-ups that allow the acquisition of fluorescent signal at the single-cell level (flow cytometers or quantitative microscopes) are expensive apparatuses that are hardly affordable by small laboratories. A protocol that makes a standard optical microscope able to acquire quantitative, single-cell, fluorescent data from a bacterial population transformed with synthetic gene circuitry is presented. Single-cell fluorescence values, acquired with a microscope set-up and processed with custom-made software, are compared with results obtained with a flow cytometer in a bacterial population transformed with the same gene circuitry. The high correlation between data from the two experimental set-ups, with a correlation coefficient computed over the tested dynamic range > 0.99, proves that a standard optical microscope, when coupled with appropriate software for image processing, can be used for quantitative single-cell fluorescence measurements. The calibration of the set-up, together with its validation, is described. The experimental protocol described in this paper makes quantitative measurement of single-cell fluorescence accessible to laboratories equipped with standard optical microscope set-ups. Our method allows for affordable quantification of intercellular variability; a better understanding of this phenomenon will improve our comprehension of cellular behaviors and the design of synthetic gene circuits. All the required software is freely available to the synthetic biology community (MUSIQ: Microscope flUorescence SIngle cell Quantification).
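
    The cross-platform comparison boils down to a correlation coefficient over paired per-cell intensities. Assuming Pearson's r (the abstract does not name the estimator), the computation is straightforward; the instrument readings below are illustrative numbers, not the paper's data:

```python
from math import sqrt

def pearson(xs, ys):
    """Pearson correlation coefficient between paired measurements,
    as used to compare microscope and flow-cytometer fluorescence."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Toy per-cell intensities from the two instruments (illustrative)
microscope = [10.0, 12.5, 15.1, 20.3, 24.8]
cytometer = [101, 124, 150, 205, 249]
r = pearson(microscope, cytometer)
```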

  19. cFinder: definition and quantification of multiple haplotypes in a mixed sample.

    PubMed

    Niklas, Norbert; Hafenscher, Julia; Barna, Agnes; Wiesinger, Karin; Pröll, Johannes; Dreiseitl, Stephan; Preuner-Stix, Sandra; Valent, Peter; Lion, Thomas; Gabriel, Christian

    2015-09-07

    Next-generation sequencing allows determination of the genetic composition of a mixed sample. For instance, when performing resistance testing for BCR-ABL1 it is necessary to identify clones and define compound mutations; together with an exact quantification, this may complement diagnosis and therapy decisions with additional information. Moreover, this applies not only to oncological questions but also to the determination of viral, bacterial or fungal infections. Retrieving multiple haplotypes (more than two) and their proportions from such data with conventional software is difficult, cumbersome and demands multiple manual steps. Therefore, we developed a tool called cFinder that is capable of automatic detection of haplotypes and their accurate quantification within one sample. BCR-ABL1 samples containing multiple clones were used for testing, and cFinder could identify all previously found clones together with their abundance and even refine some results. Additionally, reads were simulated using GemSIM with multiple haplotypes; the detection was very close to linear (R² = 0.96). Our aim is not to deduce haploblocks through statistics, but to characterize one sample's composition precisely. As a result, cFinder reports the connections of variants (haplotypes) with their read count and relative occurrence (percentage). Download is available at http://sourceforge.net/projects/cfinder/. cFinder is implemented as an efficient algorithm that can be run on a low-performance desktop computer. Furthermore, it considers paired-end information (if available) and is generally open to any current next-generation sequencing technology and alignment strategy. To our knowledge, this is the first software that enables researchers without extensive bioinformatic support to determine multiple haplotypes and how they constitute a sample.
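
    The reported output, variant combinations with read counts and percentages, can be sketched by grouping reads by the set of variants they carry. This is a toy illustration of that bookkeeping (the mutation labels are hypothetical examples, and cFinder's actual algorithm handles overlapping reads and pairing far more carefully):

```python
from collections import Counter

def quantify_haplotypes(reads):
    """Group reads by their variant combination (haplotype) and report
    each haplotype's read count and relative occurrence in percent."""
    counts = Counter(tuple(sorted(read)) for read in reads)
    total = sum(counts.values())
    return {hap: (n, 100.0 * n / total) for hap, n in counts.items()}

# Toy reads, each listing the variants it covers; the empty set is
# wild-type. Labels are illustrative BCR-ABL1-style names.
reads = [
    {"T315I"}, {"T315I"}, {"T315I", "E255K"},
    {"T315I", "E255K"}, {"T315I", "E255K"}, set(),
]
result = quantify_haplotypes(reads)
```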

  20. Quantification of micro-CT images of textile reinforcements

    NASA Astrophysics Data System (ADS)

    Straumit, Ilya; Lomov, Stepan V.; Wevers, Martine

    2017-10-01

    VoxTex software (KU Leuven) employs 3D image processing that uses local directionality information, retrieved by analysis of the local structure tensor. The processing results in a 3D voxel array, with each voxel carrying information on (1) material type (matrix; yarn/ply, with identification of the yarn/ply in the reinforcement architecture; void) and (2) fibre direction for fibrous yarns/plies. Knowledge of the material phase volumes and of the textile structure characterisation then allows assigning (3) fibre volume fraction to the voxels. This basic voxel model can be further used for different types of material analysis: internal geometry and characterisation of defects; permeability; micromechanics; meso-FE voxel models. Apart from the voxel-based analysis, approaches to reconstruction of the yarn paths are presented.
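
    The structure-tensor orientation estimate the abstract mentions can be sketched in 2D: accumulate products of image gradients into the tensor J = [[ΣIx², ΣIxIy], [ΣIxIy, ΣIy²]] and read off the dominant orientation as 0.5·atan2(2Jxy, Jxx − Jyy). This is a minimal global (unsmoothed, 2D) sketch, not VoxTex's 3D implementation:

```python
import math

def structure_tensor_orientation(img):
    """Dominant gradient orientation of a 2-D image from the structure
    tensor; the fibre direction is perpendicular to this orientation."""
    h, w = len(img), len(img[0])
    jxx = jxy = jyy = 0.0
    for y in range(1, h - 1):
        for x in range(1, w - 1):
            ix = (img[y][x + 1] - img[y][x - 1]) / 2.0  # central differences
            iy = (img[y + 1][x] - img[y - 1][x]) / 2.0
            jxx += ix * ix
            jxy += ix * iy
            jyy += iy * iy
    return 0.5 * math.atan2(2.0 * jxy, jxx - jyy)

# Toy image of vertical stripes: gradients are horizontal, so the
# gradient orientation is ~0 and the "fibre" direction is vertical.
stripes = [[float((x % 4) < 2) for x in range(16)] for y in range(16)]
theta = structure_tensor_orientation(stripes)
```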

  1. Learning the Task Management Space of an Aircraft Approach Model

    NASA Technical Reports Server (NTRS)

    Krall, Joseph; Menzies, Tim; Davies, Misty

    2014-01-01

    Validating models of airspace operations is a particular challenge. These models are often aimed at finding and exploring safety violations, and aim to be accurate representations of real-world behavior. However, the rules governing the behavior are quite complex: nonlinear physics, operational modes, human behavior, and stochastic environmental concerns all determine the responses of the system. In this paper, we present a study on aircraft runway approaches as modeled in Georgia Tech's Work Models that Compute (WMC) simulation. We use a new learner, Genetic-Active Learning for Search-Based Software Engineering (GALE) to discover the Pareto frontiers defined by cognitive structures. These cognitive structures organize the prioritization and assignment of tasks of each pilot during approaches. We discuss the benefits of our approach, and also discuss future work necessary to enable uncertainty quantification.
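
    At the core of any Pareto-frontier search such as GALE's is the non-domination test: a candidate survives unless another candidate is at least as good in every objective and strictly better in one. A minimal stdlib sketch of that filter (illustrative objective values, not GALE's algorithm or WMC data):

```python
def pareto_frontier(points):
    """Return the non-dominated points when every objective is
    minimized: p is kept unless some q is <= p in all objectives
    and < p in at least one."""
    def dominates(a, b):
        return (all(x <= y for x, y in zip(a, b))
                and any(x < y for x, y in zip(a, b)))
    return [p for p in points if not any(dominates(q, p) for q in points)]

# Toy bi-objective trade-off (e.g. pilot workload vs. path deviation)
pts = [(1, 9), (2, 7), (3, 8), (4, 4), (9, 1), (5, 5)]
front = pareto_frontier(pts)
```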

  2. Identification of potential internal control genes for real-time PCR analysis during stress response in Pyropia haitanensis

    NASA Astrophysics Data System (ADS)

    Wang, Xia; Feng, Jianhua; Huang, Aiyou; He, Linwen; Niu, Jianfeng; Wang, Guangce

    2017-11-01

    Pyropia haitanensis has prominent stress-resistance characteristics and is endemic to China. Studies of the stress responses in these algae could provide valuable information on the stress-response mechanisms of the intertidal Rhodophyta. Here, the effects of salinity and light intensity on the quantum yield of photosystem II in Py. haitanensis were investigated using pulse-amplitude-modulation fluorometry. Total RNA and genomic DNA of the samples under different stress conditions were isolated. By normalizing to the genomic DNA quantity, the RNA content in each sample was evaluated. The cDNA was synthesized and the expression levels of seven potential internal control genes were evaluated using the qRT-PCR method. Then, we used geNorm, a common statistical algorithm, to analyze the qRT-PCR data of the seven reference genes. Potential genes that may be stably expressed under different conditions were selected; these genes showed stable expression levels in samples under salinity treatment, while tubulin, glyceraldehyde-3-phosphate dehydrogenase and actin showed stability in samples stressed by strong light. Based on the results of the pulse-amplitude-modulation fluorometry, an absolute quantification was performed to obtain gene copy numbers in certain stress-treated samples. The stably expressed genes as determined by the absolute quantification in these samples conformed to the results of the geNorm screening. Based on the results of the software analysis and absolute quantification, we propose that elongation factor 3 and 18S ribosomal RNA could be used as internal control genes when Py. haitanensis blades are subjected to salinity stress, and that α-tubulin and 18S ribosomal RNA could be used as internal control genes when the stress is from strong light. In general, our findings provide a convenient reference for the selection of internal control genes when designing experiments related to stress responses in Py. haitanensis.

  3. Selection and evaluation of reference genes for expression studies with quantitative PCR in the model fungus Neurospora crassa under different environmental conditions in continuous culture.

    PubMed

    Cusick, Kathleen D; Fitzgerald, Lisa A; Pirlo, Russell K; Cockrell, Allison L; Petersen, Emily R; Biffinger, Justin C

    2014-01-01

    Neurospora crassa has served as a model organism for studying circadian pathways and more recently has gained attention in the biofuel industry due to its enhanced capacity for cellulase production. However, in order to optimize N. crassa for biotechnological applications, metabolic pathways during growth under different environmental conditions must be addressed. Reverse-transcription quantitative PCR (RT-qPCR) is a technique that provides a high-throughput platform from which to measure the expression of a large set of genes over time. The selection of a suitable reference gene is critical for gene expression studies using relative quantification, as this strategy is based on normalization of target gene expression to a reference gene whose expression is stable under the experimental conditions. This study evaluated twelve candidate reference genes for use with N. crassa when grown in continuous culture bioreactors under different light and temperature conditions. Based on combined stability values from the NormFinder and BestKeeper software packages, the following are the most appropriate reference genes under conditions of: (1) light/dark cycling: btl, asl, and vma1; (2) all-dark growth: btl, tbp, vma1, and vma2; (3) temperature flux: btl, vma1, act, and asl; (4) all conditions combined: vma1, vma2, tbp, and btl. Since N. crassa exists as different cell types (uni- or multi-nucleated), expression changes in a subset of the candidate genes were further assessed using absolute quantification. A strong negative correlation was found between ratio and threshold cycle (CT) values, demonstrating that CT changes serve as a reliable reflection of transcript, and not gene copy number, fluctuations. The results of this study identified genes that are appropriate for use as reference genes in RT-qPCR studies with N. crassa and demonstrated that, even with the presence of different cell types, relative quantification is an acceptable method for measuring gene expression changes during growth in bioreactors.
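
    The geNorm-style stability screening used in these reference-gene studies can be sketched directly from its definition: for each candidate, M is the average standard deviation of its log2 expression ratio to every other candidate across samples, and a lower M means a more stable reference. A stdlib sketch with toy data and illustrative gene names:

```python
import math
import statistics

def genorm_m(expr):
    """geNorm stability measure M for each candidate reference gene,
    given expr = {gene: [expression per sample]}. Lower M = more stable."""
    genes = list(expr)
    m = {}
    for j in genes:
        variations = []
        for k in genes:
            if k == j:
                continue
            ratios = [math.log2(a / b) for a, b in zip(expr[j], expr[k])]
            variations.append(statistics.stdev(ratios))
        m[j] = sum(variations) / len(variations)
    return m

# Toy data over four samples: 'stable' tracks 'stable2' perfectly,
# while 'wobbly' varies relative to both (illustrative values).
data = {
    "stable": [1.0, 2.0, 4.0, 8.0],
    "stable2": [2.0, 4.0, 8.0, 16.0],
    "wobbly": [1.0, 8.0, 2.0, 16.0],
}
m_values = genorm_m(data)
```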

  4. Rapid quantification and sex determination of forensic evidence materials.

    PubMed

    Andréasson, Hanna; Allen, Marie

    2003-11-01

    DNA quantification of forensic evidence is very valuable for an optimal use of the available biological material. Moreover, sex determination is of great importance as additional information in criminal investigations as well as in identification of missing persons, no suspect cases, and ancient DNA studies. While routine forensic DNA analysis based on short tandem repeat markers includes a marker for sex determination, analysis of samples containing scarce amounts of DNA is often based on mitochondrial DNA, and sex determination is not performed. In order to allow quantification and simultaneous sex determination on minute amounts of DNA, an assay based on real-time PCR analysis of a marker within the human amelogenin gene has been developed. The sex determination is based on melting curve analysis, while an externally standardized kinetic analysis allows quantification of the nuclear DNA copy number in the sample. This real-time DNA quantification assay has proven to be highly sensitive, enabling quantification of single DNA copies. Although certain limitations were apparent, the system is a rapid, cost-effective, and flexible assay for analysis of forensic casework samples.
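
    The externally standardized kinetic quantification the abstract describes rests on the standard-curve relation Ct = slope · log10(copies) + intercept, inverted to recover copy number from an observed Ct. A sketch; the slope and intercept below are hypothetical calibration values (a perfectly efficient PCR has a slope near -3.32), not the assay's actual curve:

```python
def copies_from_ct(ct, slope, intercept):
    """Absolute quantification from a real-time PCR standard curve:
    Ct = slope * log10(copies) + intercept, solved for copy number."""
    return 10 ** ((ct - intercept) / slope)

# Hypothetical external standard curve for the amelogenin assay
slope, intercept = -3.32, 38.0
single_copy_ct = 38.0            # Ct at which one copy is expected
ten_copies = copies_from_ct(38.0 - 3.32, slope, intercept)
```

    Sex determination in the assay comes separately, from melting-curve analysis of the amelogenin X/Y products, not from the copy-number arithmetic above.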

  5. Quantification of collagen contraction in three-dimensional cell culture.

    PubMed

    Kopanska, Katarzyna S; Bussonnier, Matthias; Geraldo, Sara; Simon, Anthony; Vignjevic, Danijela; Betz, Timo

    2015-01-01

    Many different cell types, including fibroblasts, smooth muscle cells, endothelial cells, and cancer cells, exert traction forces on the fibrous components of the extracellular matrix. This can be observed as matrix contraction, both macro- and microscopically, in three-dimensional (3D) tissue models such as collagen type I gels. The quantification of local contraction at the micron scale, including its directionality and speed, in correlation with other parameters such as cell invasion or local protein or gene expression, can provide useful information for studying wound healing, organism development, and cancer metastasis. In this article, we present a set of tools to quantify the flow dynamics of collagen contraction induced by cells migrating out of a multicellular cancer spheroid into a 3D collagen matrix. We adapted a pseudo-speckle technique that can be applied to bright-field and fluorescent microscopy time series. The image analysis presented here is based on in-house software developed in the Matlab (Mathworks) programming environment. The analysis program is freely available from GitHub following the link: http://dx.doi.org/10.5281/zenodo.10116. This tool provides an automated technique to measure collagen contraction that can be utilized in different 3D cellular systems. Copyright © 2015 Elsevier Inc. All rights reserved.

  6. Chemical Fingerprint Analysis and Quantitative Analysis of Rosa rugosa by UPLC-DAD.

    PubMed

    Mansur, Sanawar; Abdulla, Rahima; Ayupbec, Amatjan; Aisa, Haji Akbar

    2016-12-21

    A method based on ultra-performance liquid chromatography with a diode array detector (UPLC-DAD) was developed for quantitative analysis of five active compounds and chemical fingerprint analysis of Rosa rugosa. Ten batches of R. rugosa collected from different plantations in the Xinjiang region of China were used to establish the fingerprint. The feasibility and advantages of the UPLC fingerprint were verified for similarity evaluation by systematically comparing chromatograms with professional analytical software recommended by the State Food and Drug Administration (SFDA) of China. In quantitative analysis, the five compounds showed good regression (R² = 0.9995) within the test ranges, and the recovery of the method was in the range of 94.2%-103.8%. The similarities of the liquid chromatography fingerprints of the 10 batches of R. rugosa were more than 0.981. The developed UPLC fingerprint method is simple, reliable, and validated for the quality control and identification of R. rugosa. Additionally, simultaneous quantification of five major bioactive ingredients in the R. rugosa samples was conducted to interpret the consistency of the quality test. The results indicated that the UPLC fingerprint, as a characteristic distinguishing method combining similarity evaluation and quantification analysis, can be successfully used to assess the quality and to identify the authenticity of R. rugosa.
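
    The abstract does not specify the similarity metric in the SFDA-recommended software, but one common choice for comparing chromatographic fingerprints sampled on the same time grid is the cosine (congruence) coefficient. A sketch under that assumption, with illustrative peak intensities:

```python
from math import sqrt

def similarity(chrom_a, chrom_b):
    """Cosine similarity between two chromatographic fingerprints
    sampled at the same retention-time points."""
    dot = sum(a * b for a, b in zip(chrom_a, chrom_b))
    na = sqrt(sum(a * a for a in chrom_a))
    nb = sqrt(sum(b * b for b in chrom_b))
    return dot / (na * nb)

# Two toy fingerprints: same peaks at slightly different heights,
# as expected for batches of consistent quality
batch1 = [0.1, 2.0, 0.2, 5.0, 0.1, 3.0]
batch2 = [0.1, 2.1, 0.2, 4.8, 0.1, 3.1]
s = similarity(batch1, batch2)
```

    Batches whose similarity falls well below the cohort (here, the reported 0.981 floor) would be flagged for quality review.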

  7. Diversity of the Marine Cyanobacterium Trichodesmium: Characterization of the Woods Hole Culture Collection and Quantification of Field Populations

    DTIC Science & Technology

    2009-09-01

    [OCR fragments of the thesis acknowledgments and list of figures, including the figure titles "Phycobiliprotein absorption spectra" and "Image processing for automated cell counts".] Images were acquired with a digital camera and Axiovision 4.6.3 software; images were measured, and cell metrics were determined using the MATLAB image processing toolbox.

  8. A novel method to characterize silica bodies in grasses.

    PubMed

    Dabney, Clemon; Ostergaard, Jason; Watkins, Eric; Chen, Changbin

    2016-01-01

    The deposition of silicon into epidermal cells of grass species is thought to be an important mechanism that plants use as a defense against pests and environmental stresses. There are a number of techniques available to study the size, density and distribution pattern of silica bodies in grass leaves. However, none of these techniques provides a high-throughput analysis, especially for a large number of samples. We developed a method utilizing the autofluorescence of silica bodies to investigate their size and distribution, along with the number of carbon inclusions within the silica bodies, in the perennial grass species Koeleria macrantha. Fluorescence images were analyzed with the image software Adobe Photoshop CS5 or ImageJ, which remarkably facilitated the quantification of silica bodies in the dry ash. We observed three types of silica bodies or silica-body-related mineral structures. Silica bodies were detected on both the abaxial and adaxial epidermis of K. macrantha leaves, although their sizes, density, and distribution patterns were different. No autofluorescence was detected from carbon inclusions. The combination of fluorescence microscopy and image processing software proved efficient for the identification and quantification of silica bodies in K. macrantha leaf tissues, and should be applicable to biological, ecological and geological studies of grasses, including forage, turf grasses and cereal crops.
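
    Counting autofluorescent bodies in an image, the kind of particle counting ImageJ performs, reduces to thresholding followed by connected-component labeling. A stdlib sketch with a toy image (the threshold and pixel values are illustrative):

```python
def count_bright_bodies(img, threshold):
    """Count connected bright regions in a grayscale image by
    thresholding and 4-neighbour flood fill."""
    h, w = len(img), len(img[0])
    mask = [[img[y][x] > threshold for x in range(w)] for y in range(h)]
    count = 0
    for y0 in range(h):
        for x0 in range(w):
            if mask[y0][x0]:
                count += 1
                stack = [(y0, x0)]
                while stack:  # flood-fill this body so it is counted once
                    y, x = stack.pop()
                    if 0 <= y < h and 0 <= x < w and mask[y][x]:
                        mask[y][x] = False
                        stack += [(y + 1, x), (y - 1, x),
                                  (y, x + 1), (y, x - 1)]
    return count

# Toy image: two bright blobs on a dark background
img = [
    [0, 0, 0, 0, 0, 0],
    [0, 9, 9, 0, 0, 0],
    [0, 9, 0, 0, 8, 0],
    [0, 0, 0, 0, 8, 0],
]
n_bodies = count_bright_bodies(img, threshold=5)
```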

  9. Toward improved peptide feature detection in quantitative proteomics using stable isotope labeling.

    PubMed

    Nilse, Lars; Sigloch, Florian Christoph; Biniossek, Martin L; Schilling, Oliver

    2015-08-01

    Reliable detection of peptides in LC-MS data is a key algorithmic step in the analysis of quantitative proteomics experiments. While highly abundant peptides can be detected reliably by most modern software tools, there is much less agreement on medium and low-intensity peptides in a sample. The choice of software tools can have a big impact on the quantification of proteins, especially for proteins that appear in lower concentrations. However, in many experiments, it is precisely this region of less abundant but substantially regulated proteins that holds the biggest potential for discoveries. This is particularly true for discovery proteomics in the pharmacological sector with a specific interest in key regulatory proteins. In this viewpoint article, we discuss how the development of novel software algorithms allows us to study this region of the proteome with increased confidence. Reliable results are one of many aspects to be considered when deciding on a bioinformatics software platform. Deployment into existing IT infrastructures, compatibility with other software packages, scalability, automation, flexibility, and support need to be considered and are briefly addressed in this viewpoint article. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  10. Transmissivity interpolation using Fluid Flow Log data at different depth level in Liwa Aquifer, UAE.

    NASA Astrophysics Data System (ADS)

    Gülşen, Esra; Kurtulus, Bedri; Necati Yaylim, Tolga; Avsar, Ozgur

    2017-04-01

    In groundwater studies, quantification and detection of fluid flows in boreholes is an important part of assessing aquifer characteristics at different depths. Monitoring wells disturb the natural flow field, and this disturbance creates different flow paths within an aquifer. Vertical fluid flow analysis is one of the important techniques for detecting and quantifying these vertical flows in boreholes/monitoring wells. The Liwa region is located about 146 km southwest of Abu Dhabi city and about 36 km southwest of Madinat Zayed. The SWSR (Strategic Water Storage & Recovery) Project comprises three schemes (A, B and C); each scheme contains an infiltration basin in the center, 105 recovery wells and 10 clusters, and each cluster comprises 3 monitoring wells of different depths: shallow (~50 m), intermediate (~75 m) and deep (~100 m). The scope of this study is to calculate transmissivity values at different depths and evaluate the Fluid Flow Log (FFL) data for Scheme A (105 recovery wells) in order to understand the aquifer characteristics at different depths. The transmissivity values at different depth levels are calculated using the Razack and Huntley (1991) equation for vertical flow rates of 30 m3/h, 60 m3/h, 90 m3/h and 120 m3/h, and Empirical Bayesian Kriging is then used for interpolation in Scheme A using ArcGIS 10.2 software. FFLs are drawn with GeODin software, derivative analysis of the fluid flow data is performed in Microsoft Excel, and all statistical analyses are carried out in IBM SPSS. The interpolation results show that the transmissivity values are higher at the top of the aquifer; in other words, the aquifer is more productive in its upper part. We are very grateful to ZETAS Dubai Inc. for financial support and for providing the data.
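The transmissivity step can be sketched as follows. The coefficients a = 15.3 and b = 0.67 are the values commonly cited for the Razack and Huntley (1991) specific-capacity relation (T and Q/s in m²/day) and should be verified against the original paper; the drawdown value below is hypothetical:

```python
def transmissivity_razack_huntley(q_m3_per_h, drawdown_m, a=15.3, b=0.67):
    """Empirical transmissivity (m²/day) from specific capacity Q/s.

    Implements T = a * (Q/s)**b with Q/s in m²/day. The default a, b are
    the commonly cited Razack & Huntley (1991) coefficients (assumed here).
    """
    q_m3_per_day = q_m3_per_h * 24.0                  # pumping rate in m³/day
    specific_capacity = q_m3_per_day / drawdown_m     # Q/s in m²/day
    return a * specific_capacity ** b

# The four vertical flow rates used in the study, with a hypothetical drawdown.
rates = [30, 60, 90, 120]                             # m³/h
t_values = [transmissivity_razack_huntley(q, drawdown_m=5.0) for q in rates]
```

Because the exponent b is below 1, transmissivity grows sublinearly with the flow rate, which is why the power-law form is preferred over a simple proportionality.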

  11. A quantification of the effectiveness of EPID dosimetry and software-based plan verification systems in detecting incidents in radiotherapy

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Bojechko, Casey; Phillps, Mark; Kalet, Alan

    Purpose: Complex treatments in radiation therapy require robust verification in order to prevent errors that can adversely affect the patient. For this purpose, the authors estimate the effectiveness of detecting errors with a “defense in depth” system composed of electronic portal imaging device (EPID) based dosimetry and a software-based system composed of rules-based and Bayesian network verifications. Methods: The authors analyzed incidents with a high potential severity score, scored as a 3 or 4 on a 4 point scale, recorded in an in-house voluntary incident reporting system, collected from February 2012 to August 2014. The incidents were categorized into different failure modes. The detectability, defined as the number of detectable incidents divided by the total number of incidents, was calculated for each failure mode. Results: In total, 343 incidents were used in this study. Of the incidents, 67% were related to photon external beam therapy (EBRT). The majority of the EBRT incidents were related to patient positioning, and only a small number of these could be detected by EPID dosimetry when performed prior to treatment (6%). A large fraction could be detected by in vivo dosimetry performed during the first fraction (74%). Rules-based and Bayesian network verifications were found to be complementary to EPID dosimetry, able to detect errors related to patient prescriptions and documentation, and errors unrelated to photon EBRT. Combining all of the verification steps together, 91% of all EBRT incidents could be detected. Conclusions: This study shows that the defense in depth system is potentially able to detect a large majority of incidents. The most effective EPID-based dosimetry verification is in vivo measurement during the first fraction, complemented by rules-based and Bayesian network plan checking.
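The detectability metric defined above is a simple ratio; a minimal sketch with invented incidents (not the authors' data), where each incident records which verification layers would catch it:

```python
def detectability(incidents):
    """Fraction of incidents flagged by at least one verification layer.

    `incidents` is a list of sets naming the checks that would catch each
    incident; an empty set means the incident escapes all checks.
    """
    caught = sum(1 for checks in incidents if checks)
    return caught / len(incidents)

# Toy 'defense in depth': EPID in vivo dosimetry plus rules-based and
# Bayesian network plan checks (hypothetical incidents for illustration).
incidents = [
    {"epid_in_vivo"},                 # positioning error caught in fraction 1
    {"rules_based"},                  # prescription/documentation error
    {"epid_in_vivo", "bayesian"},     # caught by two independent layers
    set(),                            # escapes every check
]
rate = detectability(incidents)       # 3 of 4 incidents detected
```

The complementarity reported in the study shows up here as incidents that only one layer catches: removing any single layer lowers the combined rate.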

  12. Optimization by infusion of multiple reaction monitoring transitions for sensitive quantification of peptides by liquid chromatography/mass spectrometry.

    PubMed

    Alghanem, Bandar; Nikitin, Frédéric; Stricker, Thomas; Duchoslav, Eva; Luban, Jeremy; Strambio-De-Castillia, Caterina; Muller, Markus; Lisacek, Frédérique; Varesio, Emmanuel; Hopfgartner, Gérard

    2017-05-15

    In peptide quantification by liquid chromatography/mass spectrometry (LC/MS), the optimization of multiple reaction monitoring (MRM) parameters is essential for sensitive detection. We have compared different approaches to build MRM assays, based either on flow injection analysis (FIA) of isotopically labelled peptides, or on the knowledge and the prediction of the best settings for MRM transitions and collision energies (CE). In this context, we introduce MRMOptimizer, an open-source software tool that processes spectra and assists the user in selecting transitions in the FIA workflow. MS/MS spectral libraries with CE voltages from 10 to 70 V are automatically acquired in FIA mode for isotopically labelled peptides. MRMOptimizer then determines the optimal MRM settings for each peptide. To assess the quantitative performance of our approach, 155 peptides, representing 84 proteins, were analysed by LC/MRM-MS and the peak areas were compared between: (A) the MRMOptimizer-based workflow; (B1) the SRMAtlas transitions set used 'as-is'; and (B2) the same SRMAtlas set with CE parameters optimized by Skyline. 51% of the three most intense transitions per peptide were shown to be common to both the A and B1/B2 methods, and displayed similar sensitivity and peak area distributions. The peak areas obtained with MRMOptimizer for transitions sharing either the precursor ion charge state or the fragment ions with the SRMAtlas set were increased 1.8- to 2.3-fold for unique transitions. The gain in sensitivity using MRMOptimizer for transitions with different precursor ion charge states and fragment ions (8% of the total) reaches a ~11-fold increase. Isotopically labelled peptides can be used to optimize MRM transitions more efficiently in FIA than by searching databases. The MRMOptimizer software is MS-independent and enables the post-acquisition selection of MRM parameters. Coefficients of variation for optimal CE values are lower than those obtained with the SRMAtlas approach (B2), and one additional peptide was detected. Copyright © 2017 John Wiley & Sons, Ltd.
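The CE-optimization logic behind such a workflow can be sketched as follows; the data layout, transition names and intensities are hypothetical, not MRMOptimizer's actual formats:

```python
def best_settings(intensity_by_ce, top_n=3):
    """Pick the optimal collision energy for each transition and keep the
    top_n most intense transitions.

    intensity_by_ce: {transition: {ce_voltage: peak_intensity}}, as might be
    extracted from an FIA-acquired CE ramp (layout assumed for illustration).
    """
    optimal_ce = {t: max(ces, key=ces.get) for t, ces in intensity_by_ce.items()}
    best_int = {t: intensity_by_ce[t][optimal_ce[t]] for t in intensity_by_ce}
    ranked = sorted(best_int, key=best_int.get, reverse=True)[:top_n]
    return [(t, optimal_ce[t], best_int[t]) for t in ranked]

# Hypothetical CE ramp (10-70 V ramps reduced to three points for brevity).
scan = {
    "y7": {20: 1.0e5, 30: 2.5e5, 40: 1.8e5},
    "y5": {20: 3.0e5, 30: 2.0e5, 40: 1.0e5},
    "b4": {20: 0.5e5, 30: 0.9e5, 40: 0.7e5},
    "y3": {20: 0.2e5, 30: 0.3e5, 40: 0.1e5},
}
top = best_settings(scan)   # the three most intense transitions at their best CE
```

Selecting the CE per transition rather than per peptide is what distinguishes this kind of empirical optimization from applying a single predicted CE value.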

  13. Perfusion quantification in contrast-enhanced ultrasound (CEUS)--ready for research projects and routine clinical use.

    PubMed

    Tranquart, F; Mercier, L; Frinking, P; Gaud, E; Arditi, M

    2012-07-01

    With contrast-enhanced ultrasound (CEUS) now established as a valuable imaging modality for many applications, a more specific demand has recently emerged for quantifying perfusion and using measured parameters as objective indicators for various disease states. However, CEUS perfusion quantification remains challenging and is not well integrated in daily clinical practice. The development of VueBox™ alleviates existing limitations and enables quantification in a standardized way. VueBox™ operates as an off-line software application, after dynamic contrast-enhanced ultrasound (DCE-US) is performed. It enables linearization of DICOM clips, assessment of perfusion using patented curve-fitting models, and generation of parametric images by synthesizing perfusion information at the pixel level using color coding. VueBox™ is compatible with most of the available ultrasound platforms (nonlinear contrast-enabled), has the ability to process both bolus and disruption-replenishment kinetics loops, allows analysis results and their context to be saved, and generates analysis reports automatically. Specific features have been added to VueBox™, such as fully automatic in-plane motion compensation and an easy-to-use clip editor. Processing time has been reduced as a result of parallel programming optimized for multi-core processors. A long list of perfusion parameters is available for each of the two administration modes to address all possible demands currently reported in the literature for diagnosis or treatment monitoring. In conclusion, VueBox™ is a valid and robust quantification tool to be used for standardizing perfusion quantification and to improve the reproducibility of results across centers. © Georg Thieme Verlag KG Stuttgart · New York.
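The linearization step applied to log-compressed clips can be illustrated as below; the assumed 60 dB dynamic range and 8-bit pixel mapping are examples only, since actual scanner compression settings vary:

```python
def linearize(dn, dyn_range_db=60.0, dn_max=255.0):
    """Convert a log-compressed pixel value back to relative linear echo power.

    Assumes pixel value `dn` maps linearly onto [-dyn_range_db, 0] dB; real
    scanner mappings differ, so this only sketches the principle.
    """
    level_db = (dn / dn_max - 1.0) * dyn_range_db   # dn_max maps to 0 dB
    return 10.0 ** (level_db / 10.0)                # dB back to linear power

powers = [linearize(v) for v in (0.0, 127.0, 255.0)]
```

Perfusion parameters must be fitted on these linearized values: averaging or curve-fitting the log-compressed data directly would bias every time-intensity curve, because the mean of log-compressed samples is not the log of the mean echo power.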

  14. A refined methodology for modeling volume quantification performance in CT

    NASA Astrophysics Data System (ADS)

    Chen, Baiyu; Wilson, Joshua; Samei, Ehsan

    2014-03-01

    The utility of the CT lung nodule volume quantification technique depends on the precision of the quantification. To enable the evaluation of quantification precision, we previously developed a mathematical model that related precision to image resolution and noise properties in uniform backgrounds in terms of an estimability index (e'). The e' was shown to predict empirical precision across 54 imaging and reconstruction protocols, but with different correlation qualities for FBP and iterative reconstruction (IR) due to the non-linearity of IR as impacted by anatomical structure. To better account for the non-linearity of IR, this study aimed to refine the noise characterization of the model in the presence of textured backgrounds. Repeated scans of an anthropomorphic lung phantom were acquired. Subtracted images were used to measure the image quantum noise, which was then used to adjust the noise component of the e' calculation measured from a uniform region. In addition to the model refinement, the validation of the model was extended to 2 nodule sizes (5 and 10 mm) and 2 segmentation algorithms. Results showed that the magnitude of IR's quantum noise was significantly higher in structured backgrounds than in uniform backgrounds (ASiR, 30-50%; MBIR, 100-200%). With the refined model, the correlation between e' values and empirical precision no longer depended on the reconstruction algorithm. In conclusion, the model with refined noise characterization reflected the non-linearity of iterative reconstruction in structured backgrounds, and further showed successful prediction of quantification precision across a variety of nodule sizes, dose levels, slice thicknesses, reconstruction algorithms, and segmentation software.
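The subtraction-based noise measurement can be sketched with synthetic data: two scans share the same anatomical texture, so subtracting them cancels the anatomy and isolates the quantum noise (the numbers below are invented for illustration):

```python
import random

random.seed(0)
n = 4096
# Fixed background 'anatomy', identical in both repeated scans.
anatomy = [random.gauss(100.0, 20.0) for _ in range(n)]
# Each scan adds an independent quantum-noise realization (sigma = 5 HU here).
scan1 = [a + random.gauss(0.0, 5.0) for a in anatomy]
scan2 = [a + random.gauss(0.0, 5.0) for a in anatomy]

# The identical anatomy cancels in the difference image, which then contains
# two independent noise realizations: Var(diff) = 2 * Var(quantum noise).
diff = [s1 - s2 for s1, s2 in zip(scan1, scan2)]
mean = sum(diff) / n
sigma_q = (sum((d - mean) ** 2 for d in diff) / (n - 1)) ** 0.5 / 2 ** 0.5
```

The recovered sigma_q is close to the injected 5, even though the anatomy's own standard deviation (20) is four times larger, which is exactly why subtraction is needed in textured backgrounds.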

  15. Quantitative Imaging with a Mobile Phone Microscope

    PubMed Central

    Skandarajah, Arunan; Reber, Clay D.; Switz, Neil A.; Fletcher, Daniel A.

    2014-01-01

    Use of optical imaging for medical and scientific applications requires accurate quantification of features such as object size, color, and brightness. High pixel density cameras available on modern mobile phones have made photography simple and convenient for consumer applications; however, the camera hardware and software that enables this simplicity can present a barrier to accurate quantification of image data. This issue is exacerbated by automated settings, proprietary image processing algorithms, rapid phone evolution, and the diversity of manufacturers. If mobile phone cameras are to live up to their potential to increase access to healthcare in low-resource settings, limitations of mobile phone–based imaging must be fully understood and addressed with procedures that minimize their effects on image quantification. Here we focus on microscopic optical imaging using a custom mobile phone microscope that is compatible with phones from multiple manufacturers. We demonstrate that quantitative microscopy with micron-scale spatial resolution can be carried out with multiple phones and that image linearity, distortion, and color can be corrected as needed. Using all versions of the iPhone and a selection of Android phones released between 2007 and 2012, we show that phones with greater than 5 MP are capable of nearly diffraction-limited resolution over a broad range of magnifications, including those relevant for single cell imaging. We find that automatic focus, exposure, and color gain standard on mobile phones can degrade image resolution and reduce accuracy of color capture if uncorrected, and we devise procedures to avoid these barriers to quantitative imaging. By accommodating the differences between mobile phone cameras and scientific cameras, mobile phone microscopes can be reliably used to increase access to quantitative imaging for a variety of medical and scientific applications. PMID:24824072
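One standard way to restore intensity linearity across a camera's field of view is flat-field correction, sketched below with a tiny 1-D example (this is the generic technique, not the authors' specific calibration procedure):

```python
def flat_field_correct(raw, dark, flat):
    """Classic flat-field correction: corrected = (raw - dark) / (flat - dark),
    rescaled by the mean gain so values stay in the original intensity range.

    `dark` is an image with the light path blocked; `flat` images a uniform
    bright target, capturing vignetting and pixel-to-pixel gain variation.
    """
    gains = [f - d for f, d in zip(flat, dark)]
    mean_gain = sum(gains) / len(gains)
    return [(r - d) / g * mean_gain for r, d, g in zip(raw, dark, gains)]

dark = [10.0] * 4
flat = [60.0, 60.0, 110.0, 110.0]          # vignetting dims half the field
# A target of uniform 50% brightness, as the (non-uniform) camera records it.
raw = [d + (f - d) * 0.5 for d, f in zip(dark, flat)]
out = flat_field_correct(raw, dark, flat)  # uniform again after correction
```

After correction all four pixels report the same value, so brightness comparisons across the field become quantitative.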

  16. An inexpensive and worldwide available digital image analysis technique for histological fibrosis quantification in chronic hepatitis C.

    PubMed

    Campos, C F F; Paiva, D D; Perazzo, H; Moreira, P S; Areco, L F F; Terra, C; Perez, R; Figueiredo, F A F

    2014-03-01

    Hepatic fibrosis staging is based on semiquantitative scores. Digital image analysis (DIA) appears more accurate because fibrosis is quantified on a continuous scale. However, high cost, lack of standardization and worldwide unavailability restrict its use in clinical practice. We developed an inexpensive and widely available DIA technique for fibrosis quantification in hepatitis C; here, we evaluate its reproducibility and correlation with semiquantitative scores, and determine the fibrosis percentages associated with septal fibrosis and cirrhosis. 282 needle biopsies staged by Ishak and METAVIR scores were included. Images of trichrome-stained sections were captured and processed using Adobe(®) Photoshop(®) CS3 and Adobe(®) Bridge(®) software. The percentage of fibrosis (fibrosis index) was determined by the ratio between the fibrosis area and the total sample area, expressed in pixels and calculated in an automated way. An excellent correlation between the DIA fibrosis index and the Ishak and METAVIR scores was observed (Spearman's r = 0.95 and 0.92; P < 0.001, respectively). Excellent intra-observer reproducibility was observed in a randomly chosen subset of 39 biopsies, with an intraclass correlation index of 0.99 (95% CI, 0.95-0.99). The best cut-offs associated with septal fibrosis and cirrhosis were 6% (AUROC 0.97, 95% CI, 0.95-0.99) and 27% (AUROC 1.0, 95% CI, 0.99-1), respectively. This new DIA technique correlated highly with semiquantitative scores in hepatitis C. The method is reproducible, inexpensive and available worldwide, allowing its use in clinical practice. The incorporation of the DIA technique provides a more complete evaluation of fibrosis, adding quantification to architectural patterns. © 2013 John Wiley & Sons Ltd.
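The fibrosis index described above is a simple area ratio; a minimal sketch with hypothetical pixel counts, applying the cut-offs reported in the abstract:

```python
def fibrosis_index(fibrosis_area_px, total_area_px):
    """Fibrosis index: fibrotic area as a percentage of the total sample area,
    both expressed in pixels from the segmented trichrome-stained image."""
    return 100.0 * fibrosis_area_px / total_area_px

# Hypothetical pixel counts from two segmented biopsy images.
idx = fibrosis_index(800, 10_000)                  # 8.0%
septal = idx >= 6.0                                # 6% cut-off: septal fibrosis
cirrhosis = fibrosis_index(3_200, 10_000) >= 27.0  # 27% cut-off: cirrhosis
```

Because the index lives on a continuous scale, it can separate cases that a semiquantitative stage would lump together.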

  17. Leveraging transcript quantification for fast computation of alternative splicing profiles.

    PubMed

    Alamancos, Gael P; Pagès, Amadís; Trincado, Juan L; Bellora, Nicolás; Eyras, Eduardo

    2015-09-01

    Alternative splicing plays an essential role in many cellular processes and bears major relevance to the understanding of multiple diseases, including cancer. High-throughput RNA sequencing allows genome-wide analyses of splicing across multiple conditions. However, the increasing number of available data sets represents a major challenge in terms of computation time and storage requirements. We describe SUPPA, a computational tool to calculate relative inclusion values of alternative splicing events, exploiting fast transcript quantification. SUPPA's accuracy is comparable, and sometimes superior, to that of standard methods on simulated as well as real RNA-sequencing data, benchmarked against experimentally validated events. We assess the variability in terms of the choice of annotation and provide evidence that using complete transcripts rather than more transcripts per gene yields better estimates. Moreover, SUPPA coupled with de novo transcript reconstruction methods does not achieve accuracies as high as quantification of known transcripts, but remains comparable to existing methods. Finally, we show that SUPPA is more than 1000 times faster than standard methods. Coupled with fast transcript quantification, SUPPA provides inclusion values at a much higher speed than existing methods without compromising accuracy, thereby facilitating the systematic splicing analysis of large data sets with limited computational resources. The software is implemented in Python 2.7 and is available under the MIT license at https://bitbucket.org/regulatorygenomicsupf/suppa. © 2015 Alamancos et al.; Published by Cold Spring Harbor Laboratory Press for the RNA Society.
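The relative inclusion value (PSI) computed from transcript quantification is, in essence, a ratio of TPM sums over the transcripts that include versus skip an event; a simplified sketch of that definition (not SUPPA's code):

```python
def psi(inclusion_tpms, exclusion_tpms):
    """Percent spliced-in for one event: the fraction of the gene's transcript
    abundance (TPM) contributed by event-including transcripts.

    Returns None when the gene is not expressed (total TPM of zero), since
    the ratio is undefined in that case.
    """
    total = sum(inclusion_tpms) + sum(exclusion_tpms)
    if total == 0:
        return None
    return sum(inclusion_tpms) / total

# Two transcripts include the exon (6 + 4 TPM), one skips it (10 TPM).
value = psi([6.0, 4.0], [10.0])   # 10 / 20 = 0.5
```

This is why the approach is fast: once transcript abundances exist, each event's inclusion value is a constant-time arithmetic step rather than a read-level computation.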

  18. Visualization of LC-MS/MS proteomics data in MaxQuant.

    PubMed

    Tyanova, Stefka; Temu, Tikira; Carlson, Arthur; Sinitcyn, Pavel; Mann, Matthias; Cox, Juergen

    2015-04-01

    Modern software platforms enable the analysis of shotgun proteomics data in an automated fashion resulting in high quality identification and quantification results. Additional understanding of the underlying data can be gained with the help of advanced visualization tools that allow for easy navigation through large LC-MS/MS datasets potentially consisting of terabytes of raw data. The updated MaxQuant version has a map navigation component that steers the users through mass and retention time-dependent mass spectrometric signals. It can be used to monitor a peptide feature used in label-free quantification over many LC-MS runs and visualize it with advanced 3D graphic models. An expert annotation system aids the interpretation of the MS/MS spectra used for the identification of these peptide features. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  19. Automated analysis of flow cytometric data for measuring neutrophil CD64 expression using a multi-instrument compatible probability state model.

    PubMed

    Wong, Linda; Hill, Beth L; Hunsberger, Benjamin C; Bagwell, C Bruce; Curtis, Adam D; Davis, Bruce H

    2015-01-01

    Leuko64™ (Trillium Diagnostics) is a flow cytometric assay that measures neutrophil CD64 expression and serves as an in vitro indicator of infection/sepsis or the presence of a systemic acute inflammatory response. Leuko64 assay currently utilizes QuantiCALC, a semiautomated software that employs cluster algorithms to define cell populations. The software reduces subjective gating decisions, resulting in interanalyst variability of <5%. We evaluated a completely automated approach to measuring neutrophil CD64 expression using GemStone™ (Verity Software House) and probability state modeling (PSM). Four hundred and fifty-seven human blood samples were processed using the Leuko64 assay. Samples were analyzed on four different flow cytometer models: BD FACSCanto II, BD FACScan, BC Gallios/Navios, and BC FC500. A probability state model was designed to identify calibration beads and three leukocyte subpopulations based on differences in intensity levels of several parameters. PSM automatically calculates CD64 index values for each cell population using equations programmed into the model. GemStone software uses PSM that requires no operator intervention, thus totally automating data analysis and internal quality control flagging. Expert analysis with the predicate method (QuantiCALC) was performed. Interanalyst precision was evaluated for both methods of data analysis. PSM with GemStone correlates well with the expert manual analysis, r(2) = 0.99675 for the neutrophil CD64 index values with no intermethod bias detected. The average interanalyst imprecision for the QuantiCALC method was 1.06% (range 0.00-7.94%), which was reduced to 0.00% with the GemStone PSM. The operator-to-operator agreement in GemStone was a perfect correlation, r(2) = 1.000. Automated quantification of CD64 index values produced results that strongly correlate with expert analysis using a standard gate-based data analysis method. 
PSM successfully evaluated flow cytometric data generated by multiple instruments across multiple lots of the Leuko64 kit in all 457 cases. The probability-based method provides greater objectivity, higher data analysis speed, and allows for greater precision for in vitro diagnostic flow cytometric assays. © 2015 International Clinical Cytometry Society.

  20. Cytometric analysis of retinopathies in retinal trypsin digests

    NASA Astrophysics Data System (ADS)

    Ghanian, Zahra; Staniszewski, Kevin; Sorenson, Christine M.; Sheibani, Nader; Ranji, Mahsa

    2014-03-01

    The objective of this work was to design an automated image cytometry tool for determination of various retinal vascular parameters, including extraction of features relevant to postnatal retinal vascular development and the progression of diabetic retinopathy. To confirm the utility and accuracy of the software, retinal trypsin digests from TSP1-/- and diabetic Akita/+; TSP1-/- mice were analyzed. TSP1 is a critical inhibitor of the development of retinopathies, and lack of TSP1 exacerbates progression of early diabetic retinopathy. Loss of vascular cells and gain of acellular capillaries, two major signs of diabetic retinopathy, were used to classify a retina as normal or injured. This software allows quantification and high-throughput assessment of retinopathy changes associated with diabetes.

  1. Boiler: lossy compression of RNA-seq alignments using coverage vectors

    PubMed Central

    Pritt, Jacob; Langmead, Ben

    2016-01-01

    We describe Boiler, a new software tool for compressing and querying large collections of RNA-seq alignments. Boiler discards most per-read data, keeping only a genomic coverage vector plus a few empirical distributions summarizing the alignments. Since most per-read data is discarded, storage footprint is often much smaller than that achieved by other compression tools. Despite this, the most relevant per-read data can be recovered; we show that Boiler compression has only a slight negative impact on results given by downstream tools for isoform assembly and quantification. Boiler also allows the user to pose fast and useful queries without decompressing the entire file. Boiler is free open source software available from github.com/jpritt/boiler. PMID:27298258
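The core idea of storing a per-base coverage vector instead of per-read alignments can be illustrated with run-length encoding; Boiler's actual on-disk format is more elaborate, so treat this as a sketch of the principle:

```python
def rle_encode(coverage):
    """Run-length encode a per-base coverage vector: consecutive positions
    with equal read depth collapse into (depth, run_length) pairs."""
    runs = []
    for depth in coverage:
        if runs and runs[-1][0] == depth:
            runs[-1][1] += 1           # extend the current run
        else:
            runs.append([depth, 1])    # start a new run
    return [tuple(r) for r in runs]

def rle_decode(runs):
    """Recover the full coverage vector from its run-length encoding."""
    return [depth for depth, length in runs for _ in range(length)]

cov = [0, 0, 3, 3, 3, 1, 0, 0, 0]      # toy per-base read depths
runs = rle_encode(cov)                 # [(0, 2), (3, 3), (1, 1), (0, 3)]
```

The coverage vector itself is recovered exactly; what a lossy compressor gives up is which individual reads produced each unit of depth, which is why downstream tools that only need coverage are barely affected.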

  2. Sulfur-based absolute quantification of proteins using isotope dilution inductively coupled plasma mass spectrometry

    NASA Astrophysics Data System (ADS)

    Lee, Hyun-Seok; Heun Kim, Sook; Jeong, Ji-Seon; Lee, Yong-Moon; Yim, Yong-Hyeon

    2015-10-01

    An element-based reductive approach provides an effective means of realizing International System of Units (SI) traceability for high-purity biological standards. Here, we develop, for the first time, an absolute protein quantification method using double isotope dilution (ID) inductively coupled plasma mass spectrometry (ICP-MS) combined with microwave-assisted acid digestion. We validated the method and applied it to certify the candidate protein certified reference material (CRM) of human growth hormone (hGH). The concentration of hGH was determined by analysing the total amount of sulfur in hGH. Next, the size-exclusion chromatography method was used with ICP-MS to characterize and quantify sulfur-containing impurities. By subtracting the contribution of sulfur-containing impurities from the total sulfur content in the hGH CRM, we obtained a SI-traceable certification value. The quantification result obtained with the present method based on sulfur analysis was in excellent agreement with the result determined via a well-established protein quantification method based on amino acid analysis using conventional acid hydrolysis combined with ID liquid chromatography-tandem mass spectrometry. The element-based protein quantification method developed here can be generally used for SI-traceable absolute quantification of proteins, especially pure-protein standards.
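The subtraction-based bookkeeping reduces to simple stoichiometry. In the sketch below the sulfur contents are invented, and the sulfur-atom count and molecular weight are placeholders that would have to come from the actual protein sequence (its cysteine and methionine residues):

```python
def protein_conc_from_sulfur(total_s_mg_per_kg, impurity_s_mg_per_kg,
                             n_sulfur, protein_mw, s_mw=32.06):
    """Protein mass fraction (mg/kg) from measured sulfur content.

    Subtracts sulfur bound in impurities, converts the remaining sulfur to
    moles of protein via the per-molecule sulfur count, then back to mass.
    n_sulfur and protein_mw are illustrative inputs, not certified values.
    """
    protein_s = total_s_mg_per_kg - impurity_s_mg_per_kg   # sulfur in protein
    return protein_s / (n_sulfur * s_mw) * protein_mw      # mg/kg of protein

# Hypothetical numbers purely to show the unit bookkeeping.
conc = protein_conc_from_sulfur(10.0, 1.0, n_sulfur=7, protein_mw=22125.0)
```

Because every quantity traces back to an SI-traceable sulfur measurement and exact molecular constants, the result inherits SI traceability, which is the point of the element-based approach.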

  3. A phase quantification method based on EBSD data for a continuously cooled microalloyed steel

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Zhao, H.; Wynne, B.P.; Palmiere, E.J., E-mail: e.j

    2017-01-15

    Mechanical properties of steels depend on the phase constitutions of the final microstructures, which can be related to the processing parameters. Therefore, accurate quantification of the different phases is necessary to investigate the relationships between processing parameters, final microstructures and mechanical properties. Point counting on micrographs observed by optical or scanning electron microscopy is widely used as a phase quantification method, with different phases discriminated according to their morphological characteristics. However, it is difficult to differentiate some of the phase constituents with similar morphology. In contrast, for EBSD-based phase quantification methods, besides morphological characteristics, other parameters derived from the orientation information can also be used for discrimination. In this research, a phase quantification method based on EBSD data in the unit of grains was proposed to identify and quantify the complex phase constitutions of a microalloyed steel subjected to accelerated cooling. Characteristics of polygonal ferrite/quasi-polygonal ferrite, acicular ferrite and bainitic ferrite in terms of grain averaged misorientation (GAM) angles, aspect ratios, high angle grain boundary fractions and grain sizes were analysed and used to develop identification criteria for each phase. Comparing the results obtained by this EBSD-based method and point counting, it was found that the EBSD-based method can provide accurate and reliable phase quantification results for microstructures with relatively slow cooling rates. - Highlights: •A phase quantification method based on EBSD data in the unit of grains was proposed. •The critical grain area above which GAM angles are valid parameters was obtained. •Grain size and grain boundary misorientation were used to identify acicular ferrite. •High cooling rates deteriorate the accuracy of this EBSD based method.
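A grain-level, rule-based classification in the spirit of this method can be sketched as follows; the thresholds and rule structure are illustrative, not the published criteria:

```python
def classify_grain(gam_deg, aspect_ratio, hagb_fraction):
    """Toy ferrite classifier operating on per-grain EBSD descriptors:
    grain-averaged misorientation (GAM, degrees), grain aspect ratio, and
    the fraction of the grain boundary with high misorientation (HAGB).

    Thresholds below are invented for illustration; the paper derives its
    own criteria from measured distributions.
    """
    if gam_deg < 0.6 and hagb_fraction > 0.5:
        # Low internal misorientation, mostly high-angle boundaries.
        return "polygonal/quasi-polygonal ferrite"
    if aspect_ratio > 2.5 and gam_deg >= 0.6:
        # Elongated, substructured laths.
        return "bainitic ferrite"
    # Fine, irregular grains with intermediate descriptors.
    return "acicular ferrite"

labels = [
    classify_grain(0.3, 1.2, 0.8),   # equiaxed grain with a clean interior
    classify_grain(1.5, 4.0, 0.2),   # elongated lath
    classify_grain(1.0, 1.5, 0.4),   # fine, irregular grain
]
```

Working per grain rather than per pixel is what lets orientation-derived descriptors such as GAM be combined with morphological ones like the aspect ratio.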

  4. Streaming fragment assignment for real-time analysis of sequencing experiments

    PubMed Central

    Roberts, Adam; Pachter, Lior

    2013-01-01

    We present eXpress, a software package for highly efficient probabilistic assignment of ambiguously mapping sequenced fragments. eXpress uses a streaming algorithm with linear run time and constant memory use. It can determine abundances of sequenced molecules in real time, and can be applied to ChIP-seq, metagenomics and other large-scale sequencing data. We demonstrate its use on RNA-seq data, showing greater efficiency than other quantification methods. PMID:23160280
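The streaming idea, an online EM scheme in which each ambiguously mapping fragment is apportioned across its compatible targets in proportion to the current abundance estimates, which are then updated immediately, can be sketched as below (a deliberate simplification of the published algorithm, with an assumed step-size schedule and no likelihood terms):

```python
def stream_assign(fragments, n_targets, forget=0.85):
    """Online-EM sketch of streaming fragment assignment.

    Each fragment is a set of compatible target indices. Memory use is
    constant in the number of fragments: only per-target counts are kept.
    """
    counts = [1.0] * n_targets                # uniform pseudo-count prior
    for i, compat in enumerate(fragments, start=1):
        total = sum(counts[t] for t in compat)
        weight = i ** (-forget)               # decaying online-EM step size
        for t in compat:
            # Split the fragment in proportion to current abundance estimates.
            counts[t] += weight * counts[t] / total
    norm = sum(counts)
    return [c / norm for c in counts]

# Target 0 also receives unique fragments, so the ambiguous {0, 1} fragments
# are resolved mostly toward it.
frags = [{0}, {0}, {0, 1}, {0}, {0, 1}]
abund = stream_assign(frags, n_targets=2)
```

Each fragment is touched exactly once, which is what gives the linear run time and lets abundances be reported at any point mid-stream.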

  5. Open source software and low cost sensors for teaching UAV science

    NASA Astrophysics Data System (ADS)

    Kefauver, S. C.; Sanchez-Bragado, R.; El-Haddad, G.; Araus, J. L.

    2016-12-01

    Drones, also known as UASs (unmanned aerial systems), UAVs (unmanned aerial vehicles) or RPAS (remotely piloted aircraft systems), are both useful advanced scientific platforms and recreational toys that are appealing to younger generations. As such, they can make excellent education tools as well as low-cost scientific research project alternatives. However, the process of going from taking pretty pictures to remote sensing science can be daunting if one is presented with only expensive software and sensor options. A number of open-source tools and low-cost platform and sensor options are available that can provide excellent scientific research results and, by often requiring more user involvement than commercial software and sensors, provide even greater educational benefits. Scale-invariant feature transform (SIFT) algorithm implementations are available, such as the Microsoft Image Composite Editor (ICE), which can create quality 2D image mosaics with some motion and terrain adjustments, and VisualSFM (Structure from Motion), which can provide full image mosaicking with motion and orthorectification capabilities. RGB image quantification using alternate color space transforms, such as the BreedPix indices, can be calculated via plugins in the open-source software Fiji (http://fiji.sc/Fiji; http://github.com/george-haddad/CIMMYT). Recent analyses of aerial images from UAVs over different vegetation types and environments have shown that RGB metrics can outperform more costly commercial sensors. Specifically, hue-based pixel counts, the Triangle Greenness Index (TGI), and the Normalized Green Red Difference Index (NGRDI) consistently outperformed NDVI in estimating abiotic and biotic stress impacts on crop health. Also, simple kits are available for NDVI camera conversions.
Furthermore, multivariate analyses of the different RGB indices in the "R program for statistical computing", such as classification and regression trees, can allow for a more approachable interpretation of results in the classroom.
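The two RGB indices named above have simple closed forms. NGRDI = (G - R)/(G + R) is standard; the digital-number form of TGI used here, G - 0.39*R - 0.61*B, is a commonly cited simplification that should be checked against the index's original definition before use:

```python
def ngrdi(r, g, b):
    """Normalized Green Red Difference Index: (G - R) / (G + R)."""
    return (g - r) / (g + r)

def tgi(r, g, b):
    """Triangle Greenness Index, simplified form TGI = G - 0.39*R - 0.61*B
    (coefficients tied to the ~480/550/670 nm band centers; verify against
    the original definition)."""
    return g - 0.39 * r - 0.61 * b

# Reflectances in [0, 1]: a green-dominant (healthy canopy) pixel versus a
# neutral gray (soil-like) pixel.
canopy = (0.10, 0.30, 0.05)
soil = (0.30, 0.30, 0.30)
greenness = ngrdi(*canopy)        # 0.5 for the canopy pixel
```

Both indices are ratios or differences of bands from an ordinary RGB camera, which is what makes consumer-grade UAV imagery usable without a dedicated multispectral sensor.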

  6. Depth profiling and imaging capabilities of an ultrashort pulse laser ablation time of flight mass spectrometer

    PubMed Central

    Cui, Yang; Moore, Jerry F.; Milasinovic, Slobodan; Liu, Yaoming; Gordon, Robert J.; Hanley, Luke

    2012-01-01

    An ultrafast laser ablation time-of-flight mass spectrometer (AToF-MS) and associated data acquisition software that permits imaging at micron-scale resolution and sub-micron-scale depth profiling are described. The ion funnel-based source of this instrument can be operated at pressures ranging from 10−8 to ∼0.3 mbar. Mass spectra may be collected and stored at a rate of 1 kHz by the data acquisition system, allowing the instrument to be coupled with standard commercial Ti:sapphire lasers. The capabilities of the AToF-MS instrument are demonstrated on metal foils and semiconductor wafers using a Ti:sapphire laser emitting 800 nm, ∼75 fs pulses at 1 kHz. Results show that elemental quantification and depth profiling are feasible with this instrument. PMID:23020378

  7. Towards a supported common NEAMS software stack

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Cormac Garvey

    2012-04-01

    The NEAMS IPSCs are developing multidimensional, multiphysics, multiscale simulation codes based on first principles that will be capable of predicting all aspects of current and future nuclear reactor systems. This new breed of simulation codes will include rigorous verification, validation and uncertainty quantification checks to quantify the accuracy and quality of the simulation results. The resulting NEAMS IPSC simulation codes will be an invaluable tool in designing the next generation of nuclear reactors and will also contribute to a speedier process in the acquisition of licenses from the NRC for new reactor designs. Due to the high resolution of the models, the complexity of the physics and the added computational resources needed to quantify the accuracy/quality of the results, the NEAMS IPSC codes will require large HPC resources to carry out the production simulation runs.

  8. Value of PET/CT 3D visualization of head and neck squamous cell carcinoma extended to mandible.

    PubMed

    Lopez, R; Gantet, P; Julian, A; Hitzel, A; Herbault-Barres, B; Alshehri, S; Payoux, P

    2018-05-01

    To study an original 3D visualization of head and neck squamous cell carcinoma extending to the mandible by using [18F]-NaF PET/CT and [18F]-FDG PET/CT imaging, along with a new innovative FDG and NaF image analysis using dedicated software. The main interest of the 3D evaluation is to provide better visualization of bone extension in such cancers, which could also help avoid unsatisfactory surgical treatment later on. A prospective study was carried out from November 2016 to September 2017. Twenty patients with head and neck squamous cell carcinoma extending to the mandible (stage 4 in the UICC classification) underwent [18F]-NaF and [18F]-FDG PET/CT. We compared the delineation of 3D quantification obtained with [18F]-NaF and [18F]-FDG PET/CT. In order to carry out this comparison, a method of visualization and quantification of PET images was developed. This new approach was based on a process of quantification of radioactive activity within the mandibular bone that objectively defined the significant limits of this activity on PET images and on a 3D visualization. Furthermore, the spatial limits obtained by analysis of the PET/CT 3D images were compared to those obtained by histopathological examination of the mandibular resections, which confirmed intraosseous extension to the mandible. The [18F]-NaF PET/CT imaging confirmed the mandibular extension in 85% of cases, which was not shown by [18F]-FDG PET/CT imaging. The [18F]-NaF PET/CT was significantly more accurate than [18F]-FDG PET/CT in 3D assessment of intraosseous extension of head and neck squamous cell carcinoma. This new 3D information shows the importance of this imaging approach in cancer assessment. All cases of mandibular extension suspected on [18F]-NaF PET/CT imaging were confirmed by histopathological results as a reference. The [18F]-NaF PET/CT 3D visualization should be included in the pre-treatment workup of head and neck cancers. 
With the use of dedicated software that enables objective delineation of radioactive activity within the bone, it gives very encouraging results. The [18F]-FDG PET/CT appears insufficient to confirm mandibular extension. This new 3D simulation management is expected to avoid under-treatment of patients with intraosseous mandibular extension of head and neck cancers. However, a further study comparing the value of PET/CT and PET/MRI in this indication is also needed. Copyright © 2018 European Association for Cranio-Maxillo-Facial Surgery. Published by Elsevier Ltd. All rights reserved.

  9. Maui-VIA: A User-Friendly Software for Visual Identification, Alignment, Correction, and Quantification of Gas Chromatography–Mass Spectrometry Data

    PubMed Central

    Kuich, P. Henning J. L.; Hoffmann, Nils; Kempa, Stefan

    2015-01-01

    A current bottleneck in GC–MS metabolomics is the processing of raw machine data into a final data matrix that contains the quantities of the identified metabolites in each sample. While there are many bioinformatics tools available to aid the initial steps of the process, their use requires both significant technical expertise and a subsequent manual validation of identifications and alignments if high data quality is desired. The manual validation is tedious and time consuming, becoming prohibitively so as sample numbers increase. We have, therefore, developed Maui-VIA, a solution based on a visual interface that allows experts and non-experts to simultaneously and quickly process, inspect, and correct large numbers of GC–MS samples. It allows for the visual inspection of identifications and alignments, facilitating a unique and, due to its visualization and keyboard shortcuts, very fast interaction with the data. Maui-VIA therefore fills an important niche by (1) providing functionality that optimizes the component of data processing that is currently most labor intensive, saving time, and (2) lowering the threshold of expertise required to process GC–MS data. Maui-VIA projects are initiated with baseline-corrected raw data, peak lists, and a database of metabolite spectra and retention indices used for identification. It provides functionality for retention index calculation, a targeted library search, a visual annotation, alignment, and correction interface, and metabolite quantification, as well as export of the final data matrix. The high quality of the data produced by Maui-VIA is illustrated by comparison to data obtained manually by an expert using vendor software on a previously published dataset concerning the response of Chlamydomonas reinhardtii to salt stress. In conclusion, Maui-VIA provides the opportunity for fast, confident, and high-quality validation of the data processing of large numbers of GC–MS samples by non-experts. PMID:25654076
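
    The retention index calculation named among Maui-VIA's processing steps is, in the GC–MS literature, most commonly the van den Dool and Kratz linear interpolation between bracketing n-alkane retention times. The sketch below illustrates that standard formula only; it is not Maui-VIA's own Java code, and the function name and inputs are hypothetical:

```python
import numpy as np

def retention_index(t_unknown, alkane_times, alkane_carbons):
    """Linear (van den Dool & Kratz) retention index for
    temperature-programmed GC: interpolate between the two
    bracketing n-alkane retention times."""
    alkane_times = np.asarray(alkane_times, dtype=float)
    alkane_carbons = np.asarray(alkane_carbons, dtype=float)
    # index of the last alkane eluting at or before the unknown
    i = np.searchsorted(alkane_times, t_unknown) - 1
    i = int(np.clip(i, 0, len(alkane_times) - 2))
    t_n, t_n1 = alkane_times[i], alkane_times[i + 1]
    n = alkane_carbons[i]
    return 100.0 * (n + (t_unknown - t_n) / (t_n1 - t_n))
```

    For an unknown eluting at 12.5 min between the C10 (10.0 min) and C11 (15.0 min) alkanes, this gives a retention index of 1050.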

  10. Development and prospective evaluation of an automated software system for quality control of quantitative 99mTc-MAG3 renal studies.

    PubMed

    Folks, Russell D; Garcia, Ernest V; Taylor, Andrew T

    2007-03-01

    Quantitative nuclear renography has numerous potential sources of error. We previously reported the initial development of a computer software module for comprehensively addressing the issue of quality control (QC) in the analysis of radionuclide renal images. The objective of this study was to prospectively test the QC software. The QC software works in conjunction with standard quantitative renal image analysis using a renal quantification program. The software saves a text file that summarizes QC findings as possible errors in user-entered values, calculated values that may be unreliable because of the patient's clinical condition, and problems relating to acquisition or processing. To test the QC software, a technologist not involved in software development processed 83 consecutive nontransplant clinical studies. The QC findings of the software were then tabulated. QC events were defined as technical (study descriptors that were out of range or were entered and then changed, unusually sized or positioned regions of interest, or missing frames in the dynamic image set) or clinical (calculated functional values judged to be erroneous or unreliable). Technical QC events were identified in 36 (43%) of 83 studies. Clinical QC events were identified in 37 (45%) of 83 studies. Specific QC events included starting the camera after the bolus had reached the kidney, dose infiltration, oversubtraction of background activity, and missing frames in the dynamic image set. QC software has been developed to automatically verify user input, monitor calculation of renal functional parameters, summarize QC findings, and flag potentially unreliable values for the nuclear medicine physician. Incorporation of automated QC features into commercial or local renal software can reduce errors and improve technologist performance and should improve the efficiency and accuracy of image interpretation.

  11. Detection and quantification of soil-transmitted helminths in environmental samples: A review of current state-of-the-art and future perspectives.

    PubMed

    Amoah, Isaac Dennis; Singh, Gulshan; Stenström, Thor Axel; Reddy, Poovendhree

    2017-05-01

    It is estimated that over a billion people are infected with soil-transmitted helminths (STHs) globally, with the majority of infections occurring in tropical and subtropical regions of the world. The roundworm (Ascaris lumbricoides), whipworm (Trichuris trichiura), and hookworms (Ancylostoma duodenale and Necator americanus) are the main species infecting people. These infections are mostly acquired through exposure to faecally contaminated water, soil or food, and the reuse of wastewater and sludge in agriculture increases the risk of infection. Different methods have been developed for the detection and quantification of STH eggs in environmental samples. However, the lack of a universally accepted technique creates a challenge for comparative assessments of helminth egg concentrations, both across different sample matrices and between locations. This review presents a comparison of reported methodologies for the detection of STH eggs, an assessment of the relative performance of available detection methods, and a discussion of new emerging techniques that could be applied for detection and quantification. It is based on a literature search using PubMed and Science Direct considering all geographical locations. Original research articles were selected based on their methodology and results sections. Methods reported in these articles were grouped into conventional, molecular and emerging techniques, and the main steps in each method were then compared and discussed. The inclusion of a dissociation step aimed at detaching helminth eggs from particulate matter was found to improve the recovery of eggs. Additionally, the selection and application of flotation solutions that take into account the relative densities of the eggs of different STH species also result in higher egg recovery. In general, the conventional methods were shown to be laborious, time-consuming and prone to human error. The alternate use of nucleic acid-based techniques has improved the sensitivity of detection and made species-specific identification possible. However, these nucleic acid-based methods are expensive and less suitable in regions with limited resources and skills. The loop-mediated isothermal amplification method shows promise for application in these settings due to its simplicity and use of basic equipment. In addition, the development of imaging software for the detection and quantification of STHs shows promise to further reduce the human error associated with the analysis of environmental samples. It may be concluded that there is a need to comparatively assess the performance of different methods to determine their applicability in different settings as well as for use with different sample matrices (wastewater, sludge, compost, soil, vegetables, etc.). Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.

  12. PSEA-Quant: a protein set enrichment analysis on label-free and label-based protein quantification data.

    PubMed

    Lavallée-Adam, Mathieu; Rauniyar, Navin; McClatchy, Daniel B; Yates, John R

    2014-12-05

    The majority of large-scale proteomics quantification methods yield long lists of quantified proteins that are often difficult to interpret and poorly reproduced. Computational approaches are required to analyze such intricate quantitative proteomics data sets. We propose a statistical approach to computationally identify protein sets (e.g., Gene Ontology (GO) terms) that are significantly enriched with abundant proteins with reproducible quantification measurements across a set of replicates. To this end, we developed PSEA-Quant, a protein set enrichment analysis algorithm for label-free and label-based protein quantification data sets. It offers an alternative approach to classic GO analyses, models protein annotation biases, and allows the analysis of samples originating from a single condition, unlike analogous approaches such as GSEA and PSEA. We demonstrate that PSEA-Quant produces results complementary to GO analyses. We also show that PSEA-Quant provides valuable information about the biological processes involved in cystic fibrosis using label-free protein quantification of a cell line expressing a CFTR mutant. Finally, PSEA-Quant highlights the differences in the mechanisms taking place in the human, rat, and mouse brain frontal cortices based on tandem mass tag quantification. Our approach, which is available online, will thus improve the analysis of proteomics quantification data sets by providing meaningful biological insights.
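
    The over-representation idea behind protein set enrichment can be illustrated with the classic one-sided hypergeometric test. This is a minimal sketch of that generic statistic only, not the PSEA-Quant algorithm, which additionally models abundance, reproducibility across replicates, and annotation bias; the function name is hypothetical:

```python
from math import comb

def set_enrichment_pvalue(n_total, n_annotated, n_selected, n_overlap):
    """One-sided hypergeometric over-representation p-value:
    probability of seeing at least n_overlap annotated proteins
    among n_selected drawn at random from n_total."""
    denom = comb(n_total, n_selected)
    upper = min(n_annotated, n_selected)
    return sum(
        comb(n_annotated, k) * comb(n_total - n_annotated, n_selected - k)
        for k in range(n_overlap, upper + 1)
    ) / denom
```

    Drawing 5 proteins from a background of 10, of which 5 carry a given GO term, and observing all 5 annotated gives p = 1/252 ≈ 0.004.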

  13. PSEA-Quant: A Protein Set Enrichment Analysis on Label-Free and Label-Based Protein Quantification Data

    PubMed Central

    2015-01-01

    The majority of large-scale proteomics quantification methods yield long lists of quantified proteins that are often difficult to interpret and poorly reproduced. Computational approaches are required to analyze such intricate quantitative proteomics data sets. We propose a statistical approach to computationally identify protein sets (e.g., Gene Ontology (GO) terms) that are significantly enriched with abundant proteins with reproducible quantification measurements across a set of replicates. To this end, we developed PSEA-Quant, a protein set enrichment analysis algorithm for label-free and label-based protein quantification data sets. It offers an alternative approach to classic GO analyses, models protein annotation biases, and allows the analysis of samples originating from a single condition, unlike analogous approaches such as GSEA and PSEA. We demonstrate that PSEA-Quant produces results complementary to GO analyses. We also show that PSEA-Quant provides valuable information about the biological processes involved in cystic fibrosis using label-free protein quantification of a cell line expressing a CFTR mutant. Finally, PSEA-Quant highlights the differences in the mechanisms taking place in the human, rat, and mouse brain frontal cortices based on tandem mass tag quantification. Our approach, which is available online, will thus improve the analysis of proteomics quantification data sets by providing meaningful biological insights. PMID:25177766

  14. Inter- and intra-rater reliability of patellofemoral kinematic and contact area quantification by fast spin echo MRI and correlation with cartilage health by quantitative T1ρ MRI☆

    PubMed Central

    Lau, Brian C.; Thuillier, Daniel U.; Pedoia, Valentina; Chen, Ellison Y.; Zhang, Zhihong; Feeley, Brian T.; Souza, Richard B.

    2016-01-01

    Background: Patellar maltracking is a leading cause of patellofemoral pain syndrome (PFPS). The aim of this study was to determine the inter- and intra-rater reliability of a semi-automated program for magnetic resonance imaging (MRI)-based patellofemoral kinematics. Methods: Sixteen subjects (10 with PFPS [mean age 32.3; SD 5.2; eight females] and six controls without PFPS [mean age 28.6; SD 2.8; three females]) participated in the study. One set of T2-weighted, fat-saturated fast spin-echo (FSE) MRIs was acquired from each subject in full extension and 30° of knee flexion. MRI including axial T1ρ relaxation time mapping sequences was also performed on each knee. Following image acquisition, regions of interest for kinematic MRI, and for patellar and trochlear cartilage, were segmented and quantified with in-house designed spline-based MATLAB semi-automated software. Results: Intraclass correlation coefficients (ICC) of the calculated kinematic parameters were good to excellent (ICC > 0.8) for patellar flexion, rotation, tilt, and translation (anterior-posterior, medial-lateral, and superior-inferior), and for contact area translation. Only patellar tilt in the flexed position and in motion from the extended to the flexed state differed significantly between PFPS and control patients (p = 0.002 and p = 0.006, respectively). No significant correlations were identified between patellofemoral kinematics or contact area and T1ρ relaxation times. Conclusions: A semi-automated, spline-based kinematic MRI technique for patellofemoral kinematic and contact area quantification is highly reproducible, with the potential to help better understand the role of patellofemoral maltracking in PFPS and other knee disorders. PMID:26746045

  15. Inter- and intra-rater reliability of patellofemoral kinematic and contact area quantification by fast spin echo MRI and correlation with cartilage health by quantitative T1ρ MRI.

    PubMed

    Lau, Brian C; Thuillier, Daniel U; Pedoia, Valentina; Chen, Ellison Y; Zhang, Zhihong; Feeley, Brian T; Souza, Richard B

    2016-01-01

    Patellar maltracking is a leading cause of patellofemoral pain syndrome (PFPS). The aim of this study was to determine the inter- and intra-rater reliability of a semi-automated program for magnetic resonance imaging (MRI)-based patellofemoral kinematics. Sixteen subjects (10 with PFPS [mean age 32.3; SD 5.2; eight females] and six controls without PFPS [mean age 28.6; SD 2.8; three females]) participated in the study. One set of T2-weighted, fat-saturated fast spin-echo (FSE) MRIs was acquired from each subject in full extension and 30° of knee flexion. MRI including axial T1ρ relaxation time mapping sequences was also performed on each knee. Following image acquisition, regions of interest for kinematic MRI, and for patellar and trochlear cartilage, were segmented and quantified with in-house designed spline-based MATLAB semi-automated software. Intraclass correlation coefficients (ICC) of the calculated kinematic parameters were good to excellent (ICC > 0.8) for patellar flexion, rotation, tilt, and translation (anterior-posterior, medial-lateral, and superior-inferior), and for contact area translation. Only patellar tilt in the flexed position and in motion from the extended to the flexed state differed significantly between PFPS and control patients (p=0.002 and p=0.006, respectively). No significant correlations were identified between patellofemoral kinematics or contact area and T1ρ relaxation times. A semi-automated, spline-based kinematic MRI technique for patellofemoral kinematic and contact area quantification is highly reproducible, with the potential to help better understand the role of patellofemoral maltracking in PFPS and other knee disorders. Level IV. Published by Elsevier B.V.

  16. DOE Office of Scientific and Technical Information (OSTI.GOV)

    Marques da Silva, A; Narciso, L

    Purpose: Commercial workstations usually have their own software to calculate dynamic renal function. However, they typically offer low flexibility and involve subjectivity in delimiting the kidney and background areas. The aim of this paper is to present a public domain software tool, called RenalQuant, capable of semi-automatically drawing regions of interest on dynamic renal scintigraphies, extracting data, and generating renal function quantification parameters. Methods: The software was developed in Java and written as an ImageJ-based plugin. The preprocessing and segmentation steps include the user's selection of one time frame with higher activity in the kidney region, compared with background, and low activity in the liver. Next, the chosen time frame is smoothed using a Gaussian low-pass spatial filter (σ = 3) for noise reduction and better delimitation of the kidneys. The maximum entropy thresholding method is used for segmentation. A background area is automatically placed below each kidney, and the user confirms that these regions are correctly segmented and positioned. Quantitative data are extracted, and each renogram and relative renal function (RRF) value is calculated and displayed. Results: The RenalQuant plugin was validated using 20 patients' retrospective 99mTc-DTPA exams and compared with results produced by commercial workstation software, referred to as the reference. Intraclass correlation coefficients (ICC) were calculated for the renograms, and false-negative and false-positive RRF values were analyzed. The ICC values between the RenalQuant plugin and the reference software for both kidneys' renograms were higher than 0.75, showing excellent reliability. Conclusion: Our results indicated that the RenalQuant plugin can be reliably used to generate renograms, using DICOM dynamic renal scintigraphy exams as input. It is user friendly, and user interaction occurs at a minimum level. Further studies have to investigate how to increase RRF accuracy and explore how to overcome limitations in the segmentation step, mainly when the background region has higher activity than the kidneys. Financial support by CAPES.
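
    The maximum entropy thresholding named in record 16 is usually Kapur's method: choose the grey level that maximizes the summed Shannon entropies of the background and foreground histogram partitions. A minimal sketch of that published technique follows (not the RenalQuant ImageJ plugin itself; the function name and bin count are illustrative):

```python
import numpy as np

def max_entropy_threshold(image, bins=256):
    """Kapur maximum-entropy threshold: pick the grey level that
    maximizes the summed entropies of the background and
    foreground histogram partitions."""
    hist, edges = np.histogram(np.asarray(image).ravel(), bins=bins)
    p = hist.astype(float) / hist.sum()
    best_t, best_h = 0, -np.inf
    for t in range(1, bins):
        w0, w1 = p[:t].sum(), p[t:].sum()
        if w0 == 0 or w1 == 0:
            continue
        q0 = p[:t][p[:t] > 0] / w0   # background distribution
        q1 = p[t:][p[t:] > 0] / w1   # foreground distribution
        h = -(q0 * np.log(q0)).sum() - (q1 * np.log(q1)).sum()
        if h > best_h:
            best_t, best_h = t, h
    return edges[best_t]  # threshold value in image units
```

    On a bimodal image the selected threshold falls in the gap between the two intensity clusters, which is what makes the method attractive for separating kidney from background counts.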

  17. Computational solution verification and validation applied to a thermal model of a ruggedized instrumentation package

    DOE PAGES

    Scott, Sarah Nicole; Templeton, Jeremy Alan; Hough, Patricia Diane; ...

    2014-01-01

    This study details a methodology for quantification of the errors and uncertainties of a finite element heat transfer model applied to a Ruggedized Instrumentation Package (RIP). The proposed verification and validation (V&V) process includes solution verification, to examine errors associated with the code's solution techniques, and model validation, to assess the model's predictive capability for quantities of interest. The model was subjected to mesh resolution and numerical parameter sensitivity studies to determine reasonable parameter values and to understand how they change the overall model response and performance criteria. To facilitate quantification of the uncertainty associated with the mesh, automatic meshing and mesh refining/coarsening algorithms were created and implemented on the complex geometry of the RIP. Automated software to vary model inputs was also developed to determine the solution's sensitivity to numerical and physical parameters. The model was compared with an experiment to demonstrate its accuracy and determine the importance of both modelled and unmodelled physics in quantifying the uncertainty of the results. An emphasis is placed on automating the V&V process to enable uncertainty quantification within tight development schedules.

  18. A reliable and rapid tool for plasma quantification of 18 psychotropic drugs by ESI tandem mass spectrometry.

    PubMed

    Vecchione, Gennaro; Casetta, Bruno; Chiapparino, Antonella; Bertolino, Alessandro; Tomaiuolo, Michela; Cappucci, Filomena; Gatta, Raffaella; Margaglione, Maurizio; Grandone, Elvira

    2012-01-01

    A simple liquid chromatography tandem mass spectrometry (LC-MS/MS) method has been developed for the simultaneous analysis of 17 basic and one acidic psychotropic drug in human plasma. The method relies on a protein precipitation step for sample preparation and offers high sensitivity and wide linearity, without interference from endogenous matrix components. Chromatography was run on a reversed-phase column with an acetonitrile-H₂O mixture. The quantification of target compounds was performed in multiple reaction monitoring (MRM) mode, switching the ionization polarity within the analytical run. A further sensitivity increase was obtained by implementing the "scheduled multiple reaction monitoring" (sMRM) functionality offered by the recent version of the software package managing the instrument. The overall injection interval was less than 5.5 min. Regression coefficients of the calibration curves and limits of quantification (LOQ) showed good coverage of the supra-therapeutic, therapeutic and sub-therapeutic ranges. Recovery rates, measured as the percentage recovery of spiked plasma samples, were ≥ 94%. Precision and accuracy data were satisfactory for a therapeutic drug monitoring (TDM) service managing plasma samples from patients receiving psychopharmacological treatment. Copyright © 2012 Elsevier B.V. All rights reserved.
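
    Calibration curves fitted by weighted linear regression, as commonly used for the wide concentration ranges of TDM assays like this one, can be sketched as follows. The 1/x² weighting and the function names are assumptions for illustration, not details taken from this paper:

```python
import numpy as np

def weighted_calibration(conc, response, weights=None):
    """Fit response = slope*conc + intercept by weighted least
    squares; the default 1/conc**2 weighting down-weights the
    high end of a wide calibration range."""
    conc = np.asarray(conc, dtype=float)
    response = np.asarray(response, dtype=float)
    if weights is None:
        weights = 1.0 / conc**2
    w = np.sqrt(np.asarray(weights, dtype=float))
    A = np.column_stack([conc * w, w])
    slope, intercept = np.linalg.lstsq(A, response * w, rcond=None)[0]
    return slope, intercept

def back_calculate(response, slope, intercept):
    """Invert the calibration line to recover a concentration."""
    return (response - intercept) / slope
```

    With exact data response = 2·conc + 1, the fit recovers slope 2 and intercept 1, and back-calculation inverts the line to give the concentration of an unknown sample.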

  19. Trypanosoma cruzi infectivity assessment in "in vitro" culture systems by automated cell counting.

    PubMed

    Liempi, Ana; Castillo, Christian; Cerda, Mauricio; Droguett, Daniel; Duaso, Juan; Barahona, Katherine; Hernández, Ariane; Díaz-Luján, Cintia; Fretes, Ricardo; Härtel, Steffen; Kemmerling, Ulrike

    2015-03-01

    Chagas disease is an endemic, neglected tropical disease in Latin America that is caused by the protozoan parasite Trypanosoma cruzi. In vitro models constitute the first experimental approach to study the physiopathology of the disease and to assay potential new trypanocidal agents. Here, we report and clearly describe the use of commercial software (MATLAB®) to quantify T. cruzi amastigotes and infected mammalian cells (BeWo) and compare this analysis with manual counting. There was no statistically significant difference between the manual and the automatic quantification of the parasite; the two methods showed a correlation with an r² value of 0.9159. The most significant advantage of the automatic quantification was the efficiency of the analysis. The drawback of this automated cell counting method was that some parasites were assigned to the wrong BeWo cell; however, this error did not exceed 5% when adequate experimental conditions were chosen. We conclude that this quantification method constitutes an excellent tool for evaluating the parasite load in cells and therefore an easy and reliable way to study parasite infectivity. Copyright © 2014 Elsevier B.V. All rights reserved.

  20. Qualitative and quantitative analysis of pyrolysis oil by gas chromatography with flame ionization detection and comprehensive two-dimensional gas chromatography with time-of-flight mass spectrometry.

    PubMed

    Sfetsas, Themistoklis; Michailof, Chrysa; Lappas, Angelos; Li, Qiangyi; Kneale, Brian

    2011-05-27

    Pyrolysis oils have attracted a lot of interest, as they are liquid energy carriers and general sources of chemicals. In this work, gas chromatography with flame ionization detection (GC-FID) and comprehensive two-dimensional gas chromatography with time-of-flight mass spectrometry (GC×GC-TOFMS) were used to provide both qualitative and quantitative results for the analysis of three different pyrolysis oils. The chromatographic methods and parameters were optimized, and solvent choice and separation restrictions are discussed. Pyrolysis oil samples were diluted in a suitable organic solvent and analyzed by GC×GC-TOFMS. An average of 300 compounds were detected and identified in all three samples using the ChromaTOF (Leco) software. The deconvoluted spectra were compared with the NIST library for correct matching. Group-type classification was performed using the ChromaTOF software. The quantification of 11 selected compounds was performed by means of a multiple-point external calibration curve. Afterwards, the pyrolysis oils were extracted with water, and the aqueous phase was analyzed both by GC-FID and, after an appropriate change of solvent, by GC×GC-TOFMS. As before, the selected compounds were quantified by both techniques by means of multiple-point external calibration curves. The parameters of the calibration curves were calculated by weighted linear regression analysis. The limit of detection, limit of quantitation and linearity range for each standard compound with each method are presented. The potential of GC×GC-TOFMS for efficient mapping of pyrolysis oil is indisputable, and the possibility of using it for quantification as well has been demonstrated. On the other hand, GC-FID analysis provides reliable results that allow for rapid screening of pyrolysis oil. To the best of our knowledge, very few papers have reported quantification attempts on pyrolysis oil samples using GC×GC-TOFMS, most of which make use of the internal standard method. This work provides the ground for further analysis of pyrolysis oils from diverse sources, towards a rational design of both their production and utilization processes. Copyright © 2010 Elsevier B.V. All rights reserved.

  1. Objective detection of apoptosis in rat renal tissue sections using light microscopy and free image analysis software with subsequent machine learning: Detection of apoptosis in renal tissue.

    PubMed

    Macedo, Nayana Damiani; Buzin, Aline Rodrigues; de Araujo, Isabela Bastos Binotti Abreu; Nogueira, Breno Valentim; de Andrade, Tadeu Uggere; Endringer, Denise Coutinho; Lenz, Dominik

    2017-02-01

    The current study proposes an automated machine learning approach for the quantification of cells in cell death pathways according to DNA fragmentation. A total of 17 images of kidney histological slide samples from male Wistar rats were used. The slides were photographed using an Axio Zeiss Vert.A1 microscope with a 40x objective lens coupled with an Axio Cam MRC Zeiss camera and Zen 2012 software. The images were analyzed using CellProfiler (version 2.1.1) and CellProfiler Analyst open-source software. Out of the 10,378 objects, 4970 (47.9%) were identified as TUNEL positive, and 5408 (52.1%) were identified as TUNEL negative. On average, the sensitivity and specificity values of the machine learning approach were 0.80 and 0.77, respectively. Image cytometry provides a quantitative analytical alternative to the more traditional qualitative methods commonly used in such studies. Copyright © 2016 Elsevier Ltd. All rights reserved.
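
    The sensitivity and specificity reported above are the standard confusion-matrix ratios; a minimal, self-contained sketch follows (the function names are illustrative, not part of CellProfiler Analyst):

```python
def confusion_counts(y_true, y_pred):
    """Count TP, FN, TN, FP for binary labels (1 = TUNEL positive)."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    return tp, fn, tn, fp

def sensitivity_specificity(y_true, y_pred):
    """Sensitivity = TP/(TP+FN); specificity = TN/(TN+FP)."""
    tp, fn, tn, fp = confusion_counts(y_true, y_pred)
    return tp / (tp + fn), tn / (tn + fp)
```

    Here the "true" labels would come from the expert annotation and the "predicted" labels from the trained classifier.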

  2. PHASTpep: Analysis Software for Discovery of Cell-Selective Peptides via Phage Display and Next-Generation Sequencing

    PubMed Central

    Dasa, Siva Sai Krishna; Kelly, Kimberly A.

    2016-01-01

    Next-generation sequencing has enhanced the phage display process, allowing for the quantification of millions of sequences resulting from the biopanning process. In response, many valuable analysis programs focused on specificity and finding targeted motifs or consensus sequences were developed. For targeted drug delivery and molecular imaging, it is also necessary to find peptides that are selective—targeting only the cell type or tissue of interest. We present a new analysis strategy and accompanying software, PHage Analysis for Selective Targeted PEPtides (PHASTpep), which identifies highly specific and selective peptides. Using this process, we discovered and validated, both in vitro and in vivo in mice, two sequences (HTTIPKV and APPIMSV) targeted to pancreatic cancer-associated fibroblasts that escaped identification using previously existing software. Our selectivity analysis makes it possible to discover peptides that target a specific cell type and avoid other cell types, enhancing clinical translatability by circumventing complications with systemic use. PMID:27186887

  3. OPAD-EDIFIS Real-Time Processing

    NASA Technical Reports Server (NTRS)

    Katsinis, Constantine

    1997-01-01

    The Optical Plume Anomaly Detection (OPAD) system detects engine hardware degradation in flight vehicles through identification and quantification of the elemental species found in the plume, by analyzing the plume emission spectra in real time. Real-time performance of OPAD relies on extensive software, which must report metal amounts in the plume faster than once every 0.5 sec. OPAD software previously written by NASA scientists performed most necessary functions at speeds far below what is needed for real-time operation. The research presented in this report improved the execution speed of the software by optimizing the code without changing the algorithms and by converting it into a parallelized form executed on a shared-memory multiprocessor system. The resulting code was subjected to extensive timing analysis. The report also provides suggestions for further performance improvement by (1) identifying areas of algorithm optimization, (2) recommending commercially available multiprocessor architectures and operating systems to support real-time execution, and (3) presenting an initial study of fault-tolerance requirements.

  4. EEGgui: a program used to detect electroencephalogram anomalies after traumatic brain injury.

    PubMed

    Sick, Justin; Bray, Eric; Bregy, Amade; Dietrich, W Dalton; Bramlett, Helen M; Sick, Thomas

    2013-05-21

    Identifying and quantifying pathological changes in brain electrical activity is important for investigations of brain injury and neurological disease. An example is the development of epilepsy, a secondary consequence of traumatic brain injury. While certain epileptiform events can be identified visually from electroencephalographic (EEG) or electrocorticographic (ECoG) records, quantification of these pathological events has proved to be more difficult. In this study we developed MATLAB-based software to assist the detection of pathological brain electrical activity following traumatic brain injury (TBI), and we present our MATLAB code used for the analysis of the ECoG. The software was developed using MATLAB and features of the open-access EEGLAB. EEGgui is a graphical user interface on the MATLAB programming platform that allows scientists who are not proficient in computer programming to perform a number of elaborate analyses on ECoG signals. The different analyses include power spectral density (PSD), short-time Fourier analysis and spectral entropy (SE). The ECoG records used for demonstration of this software were derived from rats that had undergone traumatic brain injury one year earlier. The software described in this report provides a graphical user interface for displaying ECoG activity and calculating normalized power density, via the fast Fourier transform, for the major brain wave frequency bands (delta, theta, alpha, beta1, beta2 and gamma). The software further detects events in which the power density for these frequency bands exceeds normal ECoG by more than 4 standard deviations. We found that epileptic events could be identified and distinguished from a variety of ECoG phenomena associated with normal changes in behavior. We further found that analysis of spectral entropy was less effective in distinguishing epileptic from normal changes in ECoG activity. The software presented here is a successful modification of EEGLAB in the MATLAB environment that allows detection of epileptiform ECoG signals in animals after TBI. The code allows the import of large EEG or ECoG data records as standard text files and uses the fast Fourier transform as a basis for the detection of abnormal events. The software can also be used to monitor injury-induced changes in spectral entropy if required. We hope that the software will be useful to other investigators in the field of traumatic brain injury and will stimulate future advances in the quantitative analysis of brain electrical activity after neurological injury or disease.
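
    Spectral entropy as used in EEGgui is the Shannon entropy of the normalized power spectral density in a frequency band. The sketch below illustrates the idea in Python rather than the report's MATLAB code; the band limits and FFT length are illustrative assumptions:

```python
import numpy as np

def spectral_entropy(signal, fs, fmin=0.5, fmax=50.0, nfft=1024):
    """Normalized Shannon entropy of the PSD in a frequency band:
    near 1 for a flat (noise-like) spectrum, near 0 for one tone."""
    # periodogram-style PSD via the FFT (Welch averaging omitted)
    freqs = np.fft.rfftfreq(nfft, d=1.0 / fs)
    psd = np.abs(np.fft.rfft(signal, n=nfft)) ** 2
    band = (freqs >= fmin) & (freqs <= fmax)
    p = psd[band]
    n_bins = p.size
    p = p / p.sum()
    p = p[p > 0]
    return float(-(p * np.log(p)).sum() / np.log(n_bins))
```

    A single tone concentrates the PSD in one bin and yields an entropy near 0, while broadband noise spreads it across the band and yields a value near 1.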

  5. Feasibility and accuracy of dual-layer spectral detector computed tomography for quantification of gadolinium: a phantom study.

    PubMed

    van Hamersvelt, Robbert W; Willemink, Martin J; de Jong, Pim A; Milles, Julien; Vlassenbroek, Alain; Schilham, Arnold M R; Leiner, Tim

    2017-09-01

    The aim of this study was to evaluate the feasibility and accuracy of dual-layer spectral detector CT (SDCT) for the quantification of clinically encountered gadolinium concentrations. The cardiac chamber of an anthropomorphic thoracic phantom was equipped with 14 tubular inserts containing different gadolinium concentrations, ranging from 0 to 26.3 mg/mL (0.0, 0.1, 0.2, 0.4, 0.5, 1.0, 2.0, 3.0, 4.0, 5.1, 10.6, 15.7, 20.7 and 26.3 mg/mL). Images were acquired using a novel 64-detector row SDCT system at 120 and 140 kVp. Acquisitions were repeated five times to assess reproducibility. Regions of interest (ROIs) were drawn on three slices per insert. A spectral plot was extracted for every ROI and mean attenuation profiles were fitted to known attenuation profiles of water and pure gadolinium using in-house-developed software to calculate gadolinium concentrations. At both 120 and 140 kVp, excellent correlations between scan repetitions and true and measured gadolinium concentrations were found (R > 0.99, P < 0.001; ICCs > 0.99, CI 0.99-1.00). Relative mean measurement errors stayed below 10% down to 2.0 mg/mL true gadolinium concentration at 120 kVp and below 5% down to 1.0 mg/mL true gadolinium concentration at 140 kVp. SDCT allows for accurate quantification of gadolinium at both 120 and 140 kVp. Lowest measurement errors were found for 140 kVp acquisitions. • Gadolinium quantification may be useful in patients with contraindication to iodine. • Dual-layer spectral detector CT allows for overall accurate quantification of gadolinium. • Interscan variability of gadolinium quantification using SDCT material decomposition is excellent.
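    The profile-fitting step described above (fitting mean attenuation profiles to known profiles of water and pure gadolinium) amounts to a two-material least-squares decomposition. The sketch below is a generic illustration of that idea, not the authors' in-house software; the basis profiles would come from calibration data in practice.

```python
import numpy as np

def gadolinium_coefficient(measured, basis_water, basis_gd):
    """Least-squares fit of a measured spectral attenuation profile as
    c_w * water + c_gd * gadolinium; c_gd scales with concentration.
    All inputs are attenuation values sampled at the same energy levels."""
    A = np.column_stack([basis_water, basis_gd])
    coeffs, *_ = np.linalg.lstsq(A, measured, rcond=None)
    return coeffs[1]
```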

  6. Technical advances in proteomics: new developments in data-independent acquisition.

    PubMed

    Hu, Alex; Noble, William S; Wolf-Yadlin, Alejandro

    2016-01-01

    The ultimate aim of proteomics is to fully identify and quantify the entire complement of proteins and post-translational modifications in biological samples of interest. For the last 15 years, liquid chromatography-tandem mass spectrometry (LC-MS/MS) in data-dependent acquisition (DDA) mode has been the standard for proteomics when sampling breadth and discovery were the main objectives; multiple reaction monitoring (MRM) LC-MS/MS has been the standard for targeted proteomics when precise quantification, reproducibility, and validation were the main objectives. Recently, improvements in mass spectrometer design and bioinformatics algorithms have resulted in the rediscovery and development of another sampling method: data-independent acquisition (DIA). DIA comprehensively and repeatedly samples every peptide in a protein digest, producing a complex set of mass spectra that is difficult to interpret without external spectral libraries. Currently, DIA approaches the identification breadth of DDA while achieving the reproducible quantification characteristic of MRM or its newest version, parallel reaction monitoring (PRM). In comparative de novo identification and quantification studies in human cell lysates, DIA identified up to 89% of the proteins detected in a comparable DDA experiment while providing reproducible quantification of over 85% of them. DIA analysis aided by spectral libraries derived from prior DIA experiments or auxiliary DDA data produces identification and quantification as reproducible and precise as that achieved by MRM/PRM, except on low-abundance peptides that are obscured by stronger signals. DIA is still a work in progress toward the goal of sensitive, reproducible, and precise quantification without external spectral libraries. New software tools applied to DIA analysis have to deal with deconvolution of complex spectra as well as proper filtering of false positives and false negatives. However, the future outlook is positive, and various researchers are working on novel bioinformatics techniques to address these issues and increase the reproducibility, fidelity, and identification breadth of DIA.

  7. A Database for Propagation Models and Conversion to C++ Programming Language

    NASA Technical Reports Server (NTRS)

    Kantak, Anil V.; Angkasa, Krisjani; Rucker, James

    1996-01-01

    The telecommunications system design engineer generally needs a quantification of the effects of the propagation medium (a definition of the propagation channel) to design an optimal communications system. To obtain the definition of the channel, the systems engineer generally has a few choices. A search of the relevant publications, such as the IEEE Transactions, CCIR reports, the NASA propagation handbook, etc., may be conducted to find the desired channel values. This method may demand excessive time and effort on the systems engineer's part, and the search may not even yield the needed results. To help researchers and systems engineers, the conference participants of NASA Propagation Experimenters (NAPEX) XV (London, Ontario, Canada, June 28 and 29, 1991) recommended that software be produced containing propagation models and the necessary prediction methods for most propagation phenomena. Moreover, the software should be flexible enough for the user to make slight changes to the models without expending substantial programming effort. In the past few years, software was produced to fit these requirements as closely as possible. The software was distributed to all NAPEX participants for evaluation and use; participant reactions and suggestions were gathered and used to improve subsequent releases. The existing database program is implemented in the Microsoft Excel application software and works well within the guidelines of that environment; however, questions have recently been raised about the robustness and survivability of the Excel software in the ever-changing (hopefully improving) world of software packages.

  8. Comprehensive Design Reliability Activities for Aerospace Propulsion Systems

    NASA Technical Reports Server (NTRS)

    Christenson, R. L.; Whitley, M. R.; Knight, K. C.

    2000-01-01

    This technical publication describes the methodology, model, software tool, input data, and analysis results that support aerospace design reliability studies. The focus of these activities is on propulsion systems mechanical design reliability. The goal of these activities is to support design from a reliability perspective. Paralleling performance analyses in schedule and method, this requires the proper use of metrics in a validated reliability model useful for design, sensitivity, and trade studies. Design reliability analysis in this view is one of several critical design functions. A design reliability method is detailed and two example analyses are provided: one qualitative and the other quantitative. The use of aerospace and commercial data sources for quantification is discussed and sources are listed. A tool that was developed to support both types of analyses is presented. Finally, special topics discussed include the development of design criteria, issues of reliability quantification, quality control, and reliability verification.

  9. Mineralogical characterization of strata of the Meade Peak phosphatic shale member of the Permian Phosphoria Formation: channel and individual rock samples of measured section J and their relationship to measured sections A and B, central part of Rasmussen Ridge, Caribou County, Idaho

    USGS Publications Warehouse

    Knudsen, A.C.; Gunter, M.E.; Herring, J.R.; Grauch, R.I.

    2002-01-01

    The Permian Phosphoria Formation of southeastern Idaho hosts one of the largest phosphate deposits in the world. Despite the economic significance of the formation, the fine-grained nature of the phosphorite has discouraged detailed mineralogical characterization and quantification studies. Recently, selenium and other potentially toxic trace elements in mine wastes have drawn increased attention to the formation and motivated additional study. This study uses powder X-ray diffraction (XRD), with Rietveld quantification software, to quantify and characterize the mineralogy of composite channel samples and individual samples collected from the stratigraphic sections measured by the U.S. Geological Survey in the Meade Peak Member of the Permian Phosphoria Formation at the Enoch Valley mine on Rasmussen Ridge, approximately 15 miles northeast of Soda Springs, Idaho.

  10. AIRFIX: the first digital postoperative chest tube airflowmetry--a novel method to quantify air leakage after lung resection.

    PubMed

    Anegg, Udo; Lindenmann, Jorg; Matzi, Veronika; Mujkic, Dzenana; Maier, Alfred; Fritz, Lukas; Smolle-Jüttner, Freyja Maria

    2006-06-01

    Prolonged air leak after pulmonary resection is a common complication and a major limiting factor for early discharge from hospital. Currently there is little consensus on its management. The aim of this study was to develop and evaluate a measuring device that allows simple digital bed-side quantification of air leaks and is compatible with standard thoracic drainage systems. The measuring device (AIRFIX) is based upon a 'mass airflow' sensor with a specially designed software package that is connected to a thoracic suction drainage system. Its efficacy in detecting pulmonary air leaks was evaluated in a series of 204 patients; all postoperative measurements were done under standardized conditions; the patients were asked to cough, to take a deep breath, to breathe out against the resistance of a flutter valve, to hold their breath, and to breathe normally. As standard parameters, the leakage per breath or cough (ml/b) as well as the leakage per minute (ml/min) were displayed and recorded on the computer. Air leaks within a range of 0.25-45 ml/b and 5-900 ml/min were found. Removal of the chest tubes was done when the leakage volume on Heimlich valve was less than 1.0 ml/b or 20 ml/min. After drain removal based upon the data from chest tube airflowmetry, none of the patients needed re-drainage due to pneumothorax. The AIRFIX device for bed-side quantification of air leaks proved to be very simple and helpful in the diagnosis and management of air leaks after lung surgery, permitting drain removal without tentative clamping.

  11. chipPCR: an R package to pre-process raw data of amplification curves.

    PubMed

    Rödiger, Stefan; Burdukiewicz, Michał; Schierack, Peter

    2015-09-01

    Both the quantitative real-time polymerase chain reaction (qPCR) and quantitative isothermal amplification (qIA) are standard methods for nucleic acid quantification. Numerous real-time read-out technologies have been developed. Despite the continuous interest in amplification-based techniques, there are only a few tools for the pre-processing of amplification data. However, a transparent tool for precise control of raw data is indispensable in several scenarios, for example, during the development of new instruments. chipPCR is an R package for the pre-processing and quality analysis of raw data of amplification curves. The package takes advantage of R's S4 object model and offers an extensible environment. chipPCR contains tools for raw data exploration: normalization, baselining, imputation of missing values, a powerful wrapper for amplification curve smoothing and a function to detect the start and end of an amplification curve. The capabilities of the software are enhanced by the implementation of algorithms unavailable in R, such as a 5-point stencil for derivative interpolation. Simulation tools, statistical tests, plots for data quality management, amplification efficiency/quantification cycle calculation, and datasets from qPCR and qIA experiments are part of the package. Core functionalities are integrated in GUIs (web-based and standalone shiny applications), thus streamlining analysis and report generation. Availability: http://cran.r-project.org/web/packages/chipPCR. Source code: https://github.com/michbur/chipPCR. Contact: stefan.roediger@b-tu.de. Supplementary data are available at Bioinformatics online. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
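    The 5-point stencil mentioned above is a standard finite-difference formula for the first derivative, f'(x) ≈ (-f(x+2h) + 8f(x+h) - 8f(x-h) + f(x-2h)) / (12h). The sketch below illustrates that textbook stencil in Python; it is not the chipPCR implementation (chipPCR is an R package).

```python
import numpy as np

def five_point_derivative(y, h=1.0):
    """First derivative of evenly spaced samples via the 5-point central
    stencil: f'[i] ≈ (-y[i+2] + 8 y[i+1] - 8 y[i-1] + y[i-2]) / (12 h).
    The two samples at each end have no full stencil and are left as NaN."""
    y = np.asarray(y, dtype=float)
    d = np.full_like(y, np.nan)
    d[2:-2] = (-y[4:] + 8 * y[3:-1] - 8 * y[1:-3] + y[:-4]) / (12 * h)
    return d
```

The stencil is exact for polynomials up to degree four, which makes it well suited to locally smooth amplification curves.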

  12. Preconditioning of the background error covariance matrix in data assimilation for the Caspian Sea

    NASA Astrophysics Data System (ADS)

    Arcucci, Rossella; D'Amore, Luisa; Toumi, Ralf

    2017-06-01

    Data Assimilation (DA) is an uncertainty quantification technique used to improve numerical forecasts by incorporating observed data into prediction models. Since a crucial issue in DA models is the ill-conditioning of the covariance matrices involved, it is essential to introduce preconditioning methods into DA software. Here we present first studies concerning the introduction of two different preconditioning methods into the DA software we are developing (named S3DVAR), which implements a Scalable Three Dimensional Variational Data Assimilation model for assimilating sea surface temperature (SST) values collected in the Caspian Sea, using the Regional Ocean Modeling System (ROMS) with observations provided by the Group for High Resolution Sea Surface Temperature (GHRSST). We also present the algorithmic strategies we employ.
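    The abstract does not say which two preconditioning methods S3DVAR uses. A standard example of the idea is the control-variable transform: writing x = xb + Vv with B = VVᵀ turns the 3D-Var cost function J(x) = (x - xb)ᵀB⁻¹(x - xb) + (y - Hx)ᵀR⁻¹(y - Hx) into a well-conditioned problem in v with no B⁻¹ term. The sketch below solves the linear-Gaussian case in closed form; it is a generic illustration, not S3DVAR code.

```python
import numpy as np

def threedvar_preconditioned(xb, y, H, B, R):
    """3D-Var analysis via the control-variable transform x = xb + V v,
    with B = V V^T (Cholesky). Minimizing J(v) = v^T v + (d - Gv)^T R^-1 (d - Gv)
    gives (I + G^T R^-1 G) v = G^T R^-1 d, where d is the innovation."""
    V = np.linalg.cholesky(B)
    d = y - H @ xb                     # innovation vector
    G = H @ V                          # observation operator in control space
    A = np.eye(V.shape[1]) + G.T @ np.linalg.solve(R, G)
    v = np.linalg.solve(A, G.T @ np.linalg.solve(R, d))
    return xb + V @ v
```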

  13. Toward the S3DVAR data assimilation software for the Caspian Sea

    NASA Astrophysics Data System (ADS)

    Arcucci, Rossella; Celestino, Simone; Toumi, Ralf; Laccetti, Giuliano

    2017-07-01

    Data Assimilation (DA) is an uncertainty quantification technique used to incorporate observed data into a prediction model in order to improve numerical forecasts. The forecasting model used for producing oceanographic predictions for the Caspian Sea is the Regional Ocean Modeling System (ROMS). Here we discuss the computational issues we are facing in the DA software we are developing (named S3DVAR), which implements a Scalable Three Dimensional Variational Data Assimilation model for assimilating sea surface temperature (SST) values collected in the Caspian Sea, with observations provided by the Group for High Resolution Sea Surface Temperature (GHRSST). We present the algorithmic strategies we employ and the numerical issues on data collected in two of the months that show the most significant variability in water temperature: August and March.

  14. A software tool for automatic classification and segmentation of 2D/3D medical images

    NASA Astrophysics Data System (ADS)

    Strzelecki, Michal; Szczypinski, Piotr; Materka, Andrzej; Klepaczko, Artur

    2013-02-01

    Modern medical diagnosis utilizes techniques for visualization of human internal organs (CT, MRI) or of their metabolism (PET). However, evaluation of acquired images by human experts is usually subjective and qualitative only. Quantitative analysis of MR data, including tissue classification and segmentation, is necessary to perform e.g. attenuation compensation, motion detection, and correction of the partial volume effect in PET images acquired with PET/MR scanners. This article briefly presents the MaZda software package, which supports 2D and 3D medical image analysis aimed at quantification of image texture. MaZda implements procedures for evaluation, selection and extraction of highly discriminative texture attributes, combined with various classification, visualization and segmentation tools. Examples of MaZda applications in medical studies are also provided.
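    As a flavor of what a texture attribute is, the sketch below computes one classical example: the contrast of a gray-level co-occurrence matrix (GLCM) for horizontal pixel neighbors. MaZda computes many such attributes; this generic NumPy illustration is not MaZda code, and the quantization scheme is an assumption.

```python
import numpy as np

def glcm_contrast(image, levels=8):
    """Contrast of the gray-level co-occurrence matrix for horizontal
    neighbor pairs: sum over (i-j)^2 * P(i,j). Uniform regions give 0;
    abrupt intensity changes give large values."""
    img = np.asarray(image)
    # Quantize intensities into `levels` gray levels.
    q = np.minimum((img * levels / (img.max() + 1)).astype(int), levels - 1)
    glcm = np.zeros((levels, levels))
    for a, b in zip(q[:, :-1].ravel(), q[:, 1:].ravel()):
        glcm[a, b] += 1
    glcm /= glcm.sum()
    i, j = np.indices(glcm.shape)
    return float(((i - j) ** 2 * glcm).sum())
```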

  15. AVQS: attack route-based vulnerability quantification scheme for smart grid.

    PubMed

    Ko, Jongbin; Lim, Hyunwoo; Lee, Seokjun; Shon, Taeshik

    2014-01-01

    A smart grid is a large, consolidated electrical grid system that includes heterogeneous networks and systems. Because of this network connectivity, a smart grid system faces potential security threats. To address this problem, we develop and apply a novel scheme to measure vulnerability in a smart grid domain. Vulnerability quantification can be the first step in security analysis because it helps prioritize security problems. However, existing vulnerability quantification schemes are not suitable for smart grids because they do not consider network vulnerabilities. We propose a novel attack route-based vulnerability quantification scheme that uses a network vulnerability score and an end-to-end security score, depending on the specific smart grid network environment, to calculate the vulnerability score for a particular attack route. To evaluate the proposed approach, we derive several attack scenarios from the advanced metering infrastructure domain. The experimental results of the proposed approach and the existing common vulnerability scoring system clearly show that network connectivity must be considered for more optimized vulnerability quantification.
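    The abstract does not publish the AVQS formula, but the general shape of an attack-route score (per-hop network vulnerability combined with an end-to-end security score) can be sketched. The aggregation below is a placeholder invented for illustration, not the AVQS scheme itself.

```python
def route_vulnerability(hop_scores, e2e_security):
    """Illustrative attack-route score: average the per-hop network
    vulnerability scores (CVSS-like, 0-10), then discount by an
    end-to-end security score in [0, 1]. Higher means more vulnerable.
    NOTE: this weighting is a hypothetical stand-in, not the AVQS formula."""
    network = sum(hop_scores) / len(hop_scores)
    return network * (1.0 - e2e_security)
```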

  16. Recent advances in stable isotope labeling based techniques for proteome relative quantification.

    PubMed

    Zhou, Yuan; Shan, Yichu; Zhang, Lihua; Zhang, Yukui

    2014-10-24

    The large-scale relative quantification of all proteins expressed in biological samples under different states is of great importance for discovering proteins with important biological functions, as well as for screening disease-related biomarkers and drug targets. Therefore, the accurate quantification of proteins at the proteome level has become one of the key issues in protein science. Herein, recent advances in stable isotope labeling based techniques for proteome relative quantification are reviewed, covering metabolic labeling, chemical labeling and enzyme-catalyzed labeling. Furthermore, future research directions in this field are discussed. Copyright © 2014 Elsevier B.V. All rights reserved.

  17. Stereomicroscopic imaging technique for the quantification of cold flow in drug-in-adhesive type of transdermal drug delivery systems.

    PubMed

    Krishnaiah, Yellela S R; Katragadda, Usha; Khan, Mansoor A

    2014-05-01

    Cold flow is a phenomenon occurring in drug-in-adhesive transdermal drug delivery systems (DIA-TDDS) caused by migration of the DIA coat beyond the edge of the patch. Excessive cold flow can affect therapeutic effectiveness, make removal of the DIA-TDDS from its pouch difficult, and potentially decrease the available dose if any drug remains adhered to the pouch. There are no compendial or noncompendial methods available for quantification of this critical quality attribute. The objective was to develop a method for quantification of cold flow using a stereomicroscopic imaging technique. Cold flow was induced by applying 1 kg force on punched-out samples of a marketed estradiol DIA-TDDS (model product) stored at 25°C, 32°C, and 40°C/60% relative humidity (RH) for 1, 2, or 3 days. At the end of the testing period, the dimensional change in the area of the DIA-TDDS samples was measured using image analysis software and expressed as percent cold flow. The percent cold flow significantly decreased (p < 0.001) with increasing size of the punched-out DIA-TDDS samples and increased (p < 0.001) with increasing cold flow induction temperature and time. This first-ever report suggests that the dimensional change in the area of punched-out samples stored at 32°C/60%RH for 2 days under 1 kg force could be used for quantification of cold flow in DIA-TDDS. © 2014 Wiley Periodicals, Inc. and the American Pharmacists Association.
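    Taking "dimensional change in area expressed as percent cold flow" at face value, the metric is a relative area increase. This one-liner is the natural reading of that definition, not the authors' exact image-analysis software.

```python
def percent_cold_flow(initial_area_mm2, final_area_mm2):
    """Percent cold flow as the relative increase in patch area after
    stress, i.e. 100 * (A_final - A_initial) / A_initial."""
    return 100.0 * (final_area_mm2 - initial_area_mm2) / initial_area_mm2
```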

  18. Quantification of Impervious Surfaces Along the Wasatch Front, Utah: AN Object-Based Image Analysis Approach to Identifying AN Indicator for Wetland Stress

    NASA Astrophysics Data System (ADS)

    Leydsman-McGinty, E. I.; Ramsey, R. D.; McGinty, C.

    2013-12-01

    The Remote Sensing/GIS Laboratory at Utah State University, in cooperation with the United States Environmental Protection Agency, is quantifying impervious surfaces for three watershed sub-basins in Utah. The primary objective of developing watershed-scale quantifications of impervious surfaces is to provide an indicator of potential impacts to wetlands that occur within the Wasatch Front and along the Great Salt Lake. A geospatial layer of impervious surfaces can assist state agencies involved with Utah's Wetlands Program Plan (WPP) in understanding the impacts of impervious surfaces on wetlands, as well as support them in carrying out goals and actions identified in the WPP. The three watershed sub-basins, Lower Bear-Malad, Lower Weber, and Jordan, span the highly urbanized Wasatch Front and are consistent with focal areas in need of wetland monitoring and assessment as identified in Utah's WPP. Geospatial layers of impervious surface currently exist in the form of national and regional land cover datasets; however, these datasets are too coarse to be utilized in fine-scale analyses. In addition, the pixel-based image processing techniques used to develop these coarse datasets have proven insufficient in smaller scale or detailed studies, particularly when applied to high-resolution satellite imagery or aerial photography. Therefore, object-based image analysis techniques are being implemented to develop the geospatial layer of impervious surfaces. Object-based image analysis techniques employ a combination of both geospatial and image processing methods to extract meaningful information from high-resolution imagery. Spectral, spatial, textural, and contextual information is used to group pixels into image objects and then subsequently used to develop rule sets for image classification. eCognition, an object-based image analysis software program, is being utilized in conjunction with one-meter resolution National Agriculture Imagery Program (NAIP) aerial photography from 2011.

  19. Systematic Comparison of Label-Free, Metabolic Labeling, and Isobaric Chemical Labeling for Quantitative Proteomics on LTQ Orbitrap Velos

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Zhou; Adams, Rachel M; Chourey, Karuna

    2012-01-01

    A variety of quantitative proteomics methods have been developed, including label-free, metabolic labeling, and isobaric chemical labeling using iTRAQ or TMT. Here, these methods were compared in terms of the depth of proteome coverage, quantification accuracy, precision, and reproducibility using a high-performance hybrid mass spectrometer, LTQ Orbitrap Velos. Our results show that (1) the spectral counting method provides the deepest proteome coverage for identification, but its quantification performance is worse than labeling-based approaches, especially the quantification reproducibility; (2) metabolic labeling and isobaric chemical labeling are capable of accurate, precise, and reproducible quantification and provide deep proteome coverage for quantification. Isobaric chemical labeling surpasses metabolic labeling in terms of quantification precision and reproducibility; (3) iTRAQ and TMT perform similarly in all aspects compared in the current study using a CID-HCD dual scan configuration. Based on the unique advantages of each method, we provide guidance for selection of the appropriate method for a quantitative proteomics study.
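    The precision/reproducibility comparisons above are conventionally summarized by the coefficient of variation across replicate measurements. A minimal sketch of that standard readout (not the authors' analysis pipeline):

```python
import statistics

def cv_percent(replicate_intensities):
    """Coefficient of variation (%) of a protein's intensity across
    replicates: 100 * sample standard deviation / mean. Lower CV means
    more reproducible quantification."""
    m = statistics.mean(replicate_intensities)
    return 100.0 * statistics.stdev(replicate_intensities) / m
```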

  20. Data processing has major impact on the outcome of quantitative label-free LC-MS analysis.

    PubMed

    Chawade, Aakash; Sandin, Marianne; Teleman, Johan; Malmström, Johan; Levander, Fredrik

    2015-02-06

    High-throughput multiplexed protein quantification using mass spectrometry is steadily increasing in popularity, with the two major techniques being data-dependent acquisition (DDA) and targeted acquisition using selected reaction monitoring (SRM). However, both techniques involve extensive data processing, which can be performed by a multitude of different software solutions. Analysis of quantitative LC-MS/MS data is mainly performed in three major steps: processing of raw data, normalization, and statistical analysis. To evaluate the impact of data processing steps, we developed two new benchmark data sets, one each for DDA and SRM, with samples consisting of a long-range dilution series of synthetic peptides spiked in a total cell protein digest. The generated data were processed by eight different software workflows and three postprocessing steps. The results show that the choice of the raw data processing software and the postprocessing steps play an important role in the final outcome. Also, the linear dynamic range of the DDA data could be extended by an order of magnitude through feature alignment and a charge state merging algorithm proposed here. Furthermore, the benchmark data sets are made publicly available for further benchmarking and software developments.
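    Of the three processing stages named above (raw processing, normalization, statistical analysis), the normalization step is easy to illustrate. The sketch below shows one common choice, median normalization across runs; it is a generic example, not one of the eight benchmarked workflows.

```python
import statistics

def median_normalize(runs):
    """Median normalization across LC-MS runs: scale each run so its
    median feature intensity matches the global median, reducing
    run-to-run systematic intensity differences."""
    global_median = statistics.median(v for run in runs for v in run)
    return [[v * global_median / statistics.median(run) for v in run]
            for run in runs]
```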

  1. Evaluation of a new 3-dimensional color Doppler flow method to quantify flow across the mitral valve and in the left ventricular outflow tract: an in vitro study.

    PubMed

    Kimura, Sumito; Streiff, Cole; Zhu, Meihua; Shimada, Eriko; Datta, Saurabh; Ashraf, Muhammad; Sahn, David J

    2014-02-01

    The aim of this study was to assess the accuracy, feasibility, and reproducibility of determining stroke volume from a novel 3-dimensional (3D) color Doppler flow quantification method for mitral valve (MV) inflow and left ventricular outflow tract (LVOT) outflow at different stroke volumes when compared with the actual flow rate in a pumped porcine cardiac model. Thirteen freshly harvested pig hearts were studied in a water tank. We inserted a latex balloon into each left ventricle from the MV annulus to the LVOT, which were passively pumped at different stroke volumes (30-80 mL) using a calibrated piston pump at increments of 10 mL. Four-dimensional flow volumes were obtained without electrocardiographic gating. The digital imaging data were analyzed offline using prototype software. Two hemispheric flow-sampling planes for color Doppler velocity measurements were placed at the MV annulus and LVOT. The software computed the flow volumes at the MV annulus and LVOT within the user-defined volume and cardiac cycle. This novel 3D Doppler flow quantification method detected incremental increases in MV inflow and LVOT outflow in close agreement with pumped stroke volumes (MV inflow, r = 0.96; LVOT outflow, r = 0.96; P < .01). Bland-Altman analysis demonstrated overestimation of both (MV inflow, 5.42 mL; LVOT outflow, 4.46 mL) with 95% of points within 95% limits of agreement. Interobserver variability values showed good agreement for all stroke volumes at both the MV annulus and LVOT. This study has shown that the 3D color Doppler flow quantification method we used is able to compute stroke volumes accurately at the MV annulus and LVOT in the same cardiac cycle without electrocardiographic gating. This method may be valuable for assessment of cardiac output in clinical studies.
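    The Bland-Altman analysis used above (bias plus 95% limits of agreement) is a standard method-comparison computation. A minimal generic sketch, not the study's offline prototype software:

```python
import numpy as np

def bland_altman(measured, reference):
    """Bland-Altman comparison of two measurement methods: returns the
    mean difference (bias) and the 95% limits of agreement,
    bias ± 1.96 * SD of the differences."""
    diffs = np.asarray(measured, float) - np.asarray(reference, float)
    bias = diffs.mean()
    sd = diffs.std(ddof=1)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)
```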

  2. Computationally efficient and flexible modular modelling approach for river and urban drainage systems based on surrogate conceptual models

    NASA Astrophysics Data System (ADS)

    Wolfs, Vincent; Willems, Patrick

    2015-04-01

    Water managers rely increasingly on mathematical simulation models that represent individual parts of the water system, such as the river, sewer system or waste water treatment plant. The current evolution towards integral water management requires the integration of these distinct components, leading to an increased model scale and scope. Besides this growing model complexity, certain applications gained interest and importance, such as uncertainty and sensitivity analyses, auto-calibration of models and real time control. All these applications share the need for models with a very limited calculation time, either for performing a large number of simulations, or a long term simulation followed by a statistical post-processing of the results. The use of the commonly applied detailed models that solve (part of) the de Saint-Venant equations is infeasible for these applications or such integrated modelling due to several reasons, of which a too long simulation time and the inability to couple submodels made in different software environments are the main ones. Instead, practitioners must use simplified models for these purposes. These models are characterized by empirical relationships and sacrifice model detail and accuracy for increased computational efficiency. The presented research discusses the development of a flexible integral modelling platform that complies with the following three key requirements: (1) Include a modelling approach for water quantity predictions for rivers, floodplains, sewer systems and rainfall runoff routing that require a minimal calculation time; (2) A fast and semi-automatic model configuration, thereby making maximum use of data of existing detailed models and measurements; (3) Have a calculation scheme based on open source code to allow for future extensions or the coupling with other models. First, a novel and flexible modular modelling approach based on the storage cell concept was developed. 
This approach divides each subcomponent, such as the river, sewer or floodplain, into an arrangement of interconnected cells, thereby lumping processes in space and time. Depending on the behaviour of the system that needs to be emulated and the desired level of accuracy, variables of interest can be predicted by adopting and calibrating one of the predefined model structures, such as weir equations, transfer functions (Wolfs et al., 2013) and self-learning structures including neural networks, model trees and fuzzy systems (Wolfs and Willems, 2013, 2014). Next, a software tool was developed to facilitate and speed up model configuration. A close integration is foreseen with the MIKE (DHI) and InfoWorks (Innovyze) software. The created software tool also automatically sets up the calculation scheme in the C programming language. The developed modelling approach and software were tested extensively on multiple case studies, including uncertainty flood mapping along a river, real time control of hydraulic structures to prevent flooding, and the quantification of the effect on floods of retention basins in a coupled sewer-river system (De Vleeschauwer et al., 2014). Research is currently being done on the extension of the modelling approach and accompanying software tool with physicochemical water quality modules. Acknowledgments This research was supported by the Agency for Innovation by Science and Technology in Flanders (IWT). The authors would like to thank DHI and Innovyze for the MIKE and InfoWorks licenses. References • De Vleeschauwer, K., Weustenraad, J., Nolf, C., Wolfs, V., De Meulder, B., Shannon, K., Willems, P. (2014). Green - blue water in the city: quantification of impact of source control versus end-of-pipe solutions on sewer and river floods. Water Science and Technology, 70 (11), 1825-1837. • Wolfs, V., Villazon Gomez, M., Willems, P. (2013).
Development of a semi-automated model identification and calibration tool for conceptual modelling of sewer systems. Water Science and Technology, 68 (1), 167-175. • Wolfs, V., Willems, P. (2013). A data driven approach using Takagi-Sugeno models for computationally efficient lumped floodplain modeling. Journal of Hydrology, 503, 222-232. • Wolfs, V., Willems, P. (2014). Development of discharge-stage curves affected by hysteresis using time varying models, model tree and neural networks. Environmental Modelling & Software, 55, 107-119.
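    The storage cell concept described above can be illustrated with the simplest possible cell, a linear reservoir stepped forward in time. This is a generic stand-in for the paper's calibrated model structures (weir equations, transfer functions, self-learning structures), chosen only to show why such surrogate cells are cheap to evaluate.

```python
def linear_reservoir(inflow, k, dt=1.0, s0=0.0):
    """One storage cell as a linear reservoir: dS/dt = I - Q with Q = S/k,
    integrated by explicit Euler. Returns the outflow series. The cheap
    per-step arithmetic is what makes cell-based surrogates fast compared
    with solving the de Saint-Venant equations."""
    s, out = s0, []
    for i in inflow:
        q = s / k
        s += dt * (i - q)
        out.append(q)
    return out
```

Cells like this can be chained (outflow of one feeding the next) to emulate a river reach or sewer trunk.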

  3. Social, Environmental, and Health Vulnerability to Climate Change: The Case of the Municipalities of Minas Gerais, Brazil.

    PubMed

    Quintão, Ana Flávia; Brito, Isabela; Oliveira, Frederico; Madureira, Ana Paula; Confalonieri, Ulisses

    2017-01-01

    Vulnerability to climate change is a complex and dynamic phenomenon involving both social and physical/environmental aspects. A method is presented for quantifying the vulnerability of all municipalities of Minas Gerais, a state in southeastern Brazil. It is based on the aggregation of different kinds of environmental, climatic, social, institutional, and epidemiological variables to form a composite index. This index, named the "Index of Human Vulnerability", was calculated using software (SisVuClima®) specifically developed for this purpose. Social, environmental, and health data were combined with the climatic scenarios RCP 4.5 and 8.5, downscaled from ETA-HadGEM2-ES for each municipality. The Index of Human Vulnerability associated with RCP 8.5 showed higher vulnerability for municipalities in the southern and eastern parts of the state of Minas Gerais.
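    The abstract describes aggregating heterogeneous indicators into a composite index but does not publish the SisVuClima® formula. The sketch below shows one conventional construction (min-max normalization per indicator, then a weighted average across municipalities); the aggregation and weights are illustrative assumptions, not the actual index.

```python
def composite_index(indicators, weights):
    """Illustrative composite vulnerability index. `indicators` maps each
    indicator name to a list of values (one per municipality); each
    indicator is min-max normalized to [0, 1], then combined as a
    weighted average. NOTE: a generic construction, not SisVuClima's."""
    n = len(next(iter(indicators.values())))
    normed = {}
    for name, values in indicators.items():
        lo, hi = min(values), max(values)
        normed[name] = [(v - lo) / (hi - lo) if hi > lo else 0.0 for v in values]
    return [sum(weights[name] * normed[name][i] for name in indicators)
            for i in range(n)]
```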

  4. Social, Environmental, and Health Vulnerability to Climate Change: The Case of the Municipalities of Minas Gerais, Brazil

    PubMed Central

    Brito, Isabela; Oliveira, Frederico; Madureira, Ana Paula

    2017-01-01

    Vulnerability to climate change is a complex and dynamic phenomenon involving both social and physical/environmental aspects. We present a method for quantifying the vulnerability of all municipalities of Minas Gerais, a state in southeastern Brazil. It is based on the aggregation of different kinds of environmental, climatic, social, institutional, and epidemiological variables to form a composite index. This index, named the “Index of Human Vulnerability”, was calculated using software (SisVuClima®) specifically developed for this purpose. Social, environmental, and health data were combined with the climatic scenarios RCP 4.5 and 8.5, downscaled from ETA-HadGEM2-ES for each municipality. The Index of Human Vulnerability associated with RCP 8.5 showed higher vulnerability for municipalities in the southern and eastern parts of the state of Minas Gerais. PMID:28465693

  5. Inference of alternative splicing from RNA-Seq data with probabilistic splice graphs

    PubMed Central

    LeGault, Laura H.; Dewey, Colin N.

    2013-01-01

    Motivation: Alternative splicing and other processes that allow for different transcripts to be derived from the same gene are significant forces in the eukaryotic cell. RNA-Seq is a promising technology for analyzing alternative transcripts, as it does not require prior knowledge of transcript structures or genome sequences. However, analysis of RNA-Seq data in the presence of genes with large numbers of alternative transcripts is currently challenging due to efficiency, identifiability and representation issues. Results: We present RNA-Seq models and associated inference algorithms based on the concept of probabilistic splice graphs, which alleviate these issues. We prove that our models are often identifiable and demonstrate that our inference methods for quantification and differential processing detection are efficient and accurate. Availability: Software implementing our methods is available at http://deweylab.biostat.wisc.edu/psginfer. Contact: cdewey@biostat.wisc.edu Supplementary information: Supplementary data are available at Bioinformatics online. PMID:23846746

  6. Integration and global analysis of isothermal titration calorimetry data for studying macromolecular interactions.

    PubMed

    Brautigam, Chad A; Zhao, Huaying; Vargas, Carolyn; Keller, Sandro; Schuck, Peter

    2016-05-01

    Isothermal titration calorimetry (ITC) is a powerful and widely used method to measure the energetics of macromolecular interactions by recording a thermogram of differential heating power during a titration. However, traditional ITC analysis is limited by stochastic thermogram noise and by the limited information content of a single titration experiment. Here we present a protocol for bias-free thermogram integration based on automated shape analysis of the injection peaks, followed by combination of isotherms from different calorimetric titration experiments into a global analysis, statistical analysis of binding parameters and graphical presentation of the results. This is performed using the integrated public-domain software packages NITPIC, SEDPHAT and GUSSI. The recently developed low-noise thermogram integration approach and global analysis allow for more precise parameter estimates and more reliable quantification of multisite and multicomponent cooperative and competitive interactions. Titration experiments typically take 1-2.5 h each, and global analysis usually takes 10-20 min.

  7. Atomic Force Microscopy for Investigation of Ribosome-inactivating Proteins' Type II Tetramerization

    NASA Astrophysics Data System (ADS)

    Savvateev, M.; Kozlovskaya, N.; Moisenovich, M.; Tonevitsky, A.; Agapov, I.; Maluchenko, N.; Bykov, V.; Kirpichnikov, M.

    2003-12-01

    The biology of these toxins strongly depends on the organization of their carbohydrate-binding centres. Toxin tetramerization can both increase the number of lectin-binding centres and change their structural organization; the number and three-dimensional localization of such centres per molecule strongly influence the toxins' biological properties. Ricin was used to obtain AFM images of natural dimeric type II ribosome-inactivating protein (RIP II) structures, while Ricinus agglutinin was used to obtain AFM images of natural tetrameric RIP II forms. Viscumin (60 kDa) is known to form tetrameric structures depending on ambient conditions and its concentration. Using the ricin-agglutinin dimer-tetramer model made it possible to identify viscumin tetramers in AFM scans and to distinguish them from dimeric viscumin structures. Quantitative analysis performed with the NT-MDT software allowed the geometrical parameters of ricin, Ricinus agglutinin and viscumin molecules to be estimated.

  8. RNA-Seq-Based Transcript Structure Analysis with TrBorderExt.

    PubMed

    Wang, Yejun; Sun, Ming-An; White, Aaron P

    2018-01-01

    RNA-Seq has become a routine strategy for genome-wide gene expression comparisons in bacteria. Despite lower resolution in transcript border parsing compared with dRNA-Seq, TSS-EMOTE, Cappable-seq, Term-seq, and others, directional RNA-Seq still illustrates its advantages: low cost, quantification and transcript border analysis with a medium resolution (±10-20 nt). To facilitate mining of directional RNA-Seq datasets especially with respect to transcript structure analysis, we developed a tool, TrBorderExt, which can parse transcript start sites and termination sites accurately in bacteria. A detailed protocol is described in this chapter for how to use the software package step by step to identify bacterial transcript borders from raw RNA-Seq data. The package was developed with Perl and R programming languages, and is accessible freely through the website: http://www.szu-bioinf.org/TrBorderExt .

  9. Fuzzy sets, rough sets, and modeling evidence: Theory and Application. A Dempster-Shafer based approach to compromise decision making with multiattributes applied to product selection

    NASA Technical Reports Server (NTRS)

    Dekorvin, Andre

    1992-01-01

    The Dempster-Shafer theory of evidence is applied to a multiattribute decision making problem whereby the decision maker (DM) must compromise with available alternatives, none of which exactly satisfies his ideal. The decision mechanism is constrained by the uncertainty inherent in the determination of the relative importance of each attribute element and the classification of existing alternatives. The classification of alternatives is addressed through expert evaluation of the degree to which each element is contained in each available alternative. The relative importance of each attribute element is determined through pairwise comparisons of the elements by the decision maker and implementation of a ratio scale quantification method. Then the 'belief' and 'plausibility' that an alternative will satisfy the decision maker's ideal are calculated and combined to rank order the available alternatives. Application to the problem of selecting computer software is given.
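The 'belief' and 'plausibility' measures named above are standard Dempster-Shafer quantities and can be computed directly from a basic mass assignment over subsets of the frame of alternatives. The masses and alternative names below are hypothetical, chosen only to illustrate the calculation; this is not the paper's implementation.

```python
# Dempster-Shafer belief and plausibility from a basic mass assignment.
# Bel(A) sums masses of focal sets contained in A; Pl(A) sums masses of
# focal sets that intersect A. Masses below are illustrative only.

def belief(mass, hypothesis):
    """Bel(A): total mass committed to subsets of A."""
    return sum(m for focal, m in mass.items() if focal <= hypothesis)

def plausibility(mass, hypothesis):
    """Pl(A): total mass not contradicting A (focal sets meeting A)."""
    return sum(m for focal, m in mass.items() if focal & hypothesis)

# Hypothetical evidence over three alternatives {a, b, c}; the mass on the
# whole frame represents uncommitted evidence (ignorance).
mass = {
    frozenset({"a"}): 0.4,
    frozenset({"b"}): 0.2,
    frozenset({"a", "b", "c"}): 0.4,
}

bel_a = belief(mass, frozenset({"a"}))
pl_a = plausibility(mass, frozenset({"a"}))
```

Ranking alternatives by the interval [Bel, Pl] is what lets the decision maker order options under uncertainty rather than by a single point score.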

  10. Trends in data processing of comprehensive two-dimensional chromatography: state of the art.

    PubMed

    Matos, João T V; Duarte, Regina M B O; Duarte, Armando C

    2012-12-01

    The operation of advanced chromatographic systems, namely comprehensive two-dimensional (2D) chromatography coupled to multidimensional detectors, generates a wealth of data that needs careful processing in order to characterize and quantify the analytes under study as fully as possible. The aim of this review is to identify the main trends, research needs and gaps in techniques for processing the multidimensional data sets obtained from comprehensive 2D chromatography. The following topics have been identified as the most promising for new developments in the near future: data acquisition and handling, peak detection and quantification, measurement of overlapping of 2D peaks, and data analysis software for 2D chromatography. The rationale supporting most data processing techniques is a generalization of one-dimensional (1D) chromatography, although some algorithms, such as the inverted watershed algorithm, use the 2D chromatographic data as such. Processing more complex N-way data, however, requires more sophisticated techniques. Apart from applying concepts from 1D chromatography that have not yet been tested for 2D chromatography, there is still room for improvements and developments in algorithms and software for dealing with comprehensive 2D chromatographic data. Copyright © 2012 Elsevier B.V. All rights reserved.

  11. Peridigm summary report : lessons learned in development with agile components.

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Salinger, Andrew Gerhard; Mitchell, John Anthony; Littlewood, David John

    2011-09-01

    This report details efforts to deploy Agile Components for rapid development of a peridynamics code, Peridigm. The goal of Agile Components is to enable the efficient development of production-quality software by providing a well-defined, unifying interface to a powerful set of component-based software. Specifically, Agile Components facilitate interoperability among packages within the Trilinos Project, including data management, time integration, uncertainty quantification, and optimization. Development of the Peridigm code served as a testbed for Agile Components and resulted in a number of recommendations for future development. Agile Components successfully enabled rapid integration of Trilinos packages into Peridigm. A cost of this approach, however, was a set of restrictions on Peridigm's architecture which impacted the ability to track history-dependent material data, dynamically modify the model discretization, and interject user-defined routines into the time integration algorithm. These restrictions resulted in modifications to the Agile Components approach, as implemented in Peridigm, and in a set of recommendations for future Agile Components development. Specific recommendations include improved handling of material states, a more flexible flow control model, and improved documentation. A demonstration mini-application, SimpleODE, was developed at the onset of this project and is offered as a potential supplement to Agile Components documentation.

  12. Quantitative assessment of primary mitral regurgitation using left ventricular volumes obtained with new automated three-dimensional transthoracic echocardiographic software: A comparison with 3-Tesla cardiac magnetic resonance.

    PubMed

    Levy, Franck; Marechaux, Sylvestre; Iacuzio, Laura; Schouver, Elie Dan; Castel, Anne Laure; Toledano, Manuel; Rusek, Stephane; Dor, Vincent; Tribouilloy, Christophe; Dreyfus, Gilles

    2018-03-30

    Quantitative assessment of primary mitral regurgitation (MR) using left ventricular (LV) volumes obtained with three-dimensional transthoracic echocardiography (3D TTE) recently showed encouraging results. Nevertheless, 3D TTE is not incorporated into everyday practice, as current LV chamber quantification software products are time-consuming. To investigate the accuracy and reproducibility of new automated fast 3D TTE software (HeartModel A.I.; Philips Healthcare, Andover, MA, USA) for the quantification of LV volumes and MR severity in patients with isolated degenerative primary MR; and to compare regurgitant volume (RV) obtained with 3D TTE with a cardiac magnetic resonance (CMR) reference. Fifty-three patients (37 men; mean age 64±12 years) with at least mild primary isolated MR, and having comprehensive 3D TTE and CMR studies within 24 h, were eligible for inclusion. MR RV was calculated using the proximal isovelocity surface area (PISA) method and the volumetric method (total LV stroke volume minus aortic stroke volume) with either CMR or 3D TTE. Inter- and intraobserver reproducibility of 3D TTE was excellent (coefficient of variation ≤10%) for LV volumes. MR RV was similar using CMR and 3D TTE (57±23 mL vs 56±28 mL; P=0.22), but was significantly higher using the PISA method (69±30 mL; P<0.05 compared with CMR and 3D TTE). The PISA method consistently overestimated MR RV compared with CMR (bias 12±21 mL), while no significant bias was found between 3D TTE and CMR (bias 2±14 mL). Concordance between echocardiography and CMR was higher using 3D TTE MR grading (intraclass correlation coefficient [ICC]=0.89) than with PISA MR grading (ICC=0.78). Complete agreement with CMR grading was more frequent with 3D TTE than with the PISA method (76% vs 63%). 3D TTE RV assessment using the new generation of automated software correlates well with CMR in patients with isolated degenerative primary MR. Copyright © 2018 Elsevier Masson SAS. All rights reserved.
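The two quantification routes compared above reduce to simple arithmetic: the volumetric method subtracts forward (aortic) stroke volume from total LV stroke volume, while the PISA method derives a regurgitant orifice area from hemispheric flow convergence. The relations below are the standard textbook formulas, not code from the study, and all numbers are illustrative rather than patient data.

```python
import math

def rv_volumetric(lv_total_stroke_volume_ml, aortic_stroke_volume_ml):
    """Volumetric method: MR regurgitant volume = total LV SV - forward SV."""
    return lv_total_stroke_volume_ml - aortic_stroke_volume_ml

def rv_pisa(radius_cm, aliasing_velocity_cms, peak_mr_velocity_cms, mr_vti_cm):
    """PISA method: flow = 2*pi*r^2*Va; EROA = flow / Vmax; RV = EROA * VTI."""
    flow = 2.0 * math.pi * radius_cm ** 2 * aliasing_velocity_cms  # mL/s
    eroa_cm2 = flow / peak_mr_velocity_cms
    return eroa_cm2 * mr_vti_cm  # mL

# Illustrative inputs only (cm, cm/s, mL).
rv_vol = rv_volumetric(110.0, 54.0)
rv_flow = rv_pisa(1.0, 38.5, 500.0, 140.0)
```

With these made-up inputs the PISA estimate exceeds the volumetric one, mirroring the direction of the bias the study reports.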

  13. Titan Science Return Quantification

    NASA Technical Reports Server (NTRS)

    Weisbin, Charles R.; Lincoln, William

    2014-01-01

    Each proposal for a NASA mission concept includes a Science Traceability Matrix (STM), intended to show that what is being proposed would contribute to satisfying one or more of the agency's top-level science goals. But the information traditionally provided cannot be used directly to quantitatively compare anticipated science return. We added numerical elements to NASA's STM and developed a software tool to process the data. We then applied this methodology to evaluate a group of competing concepts for a proposed mission to Saturn's moon, Titan.
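The kind of numerical augmentation described can be illustrated as a weighted aggregation: each concept receives a per-goal contribution score, and goal weights collapse these into one comparable number. The goals, weights and scores below are entirely hypothetical; NASA's actual STM elements and the authors' tool are more elaborate.

```python
# Toy science-return aggregation over a Science Traceability Matrix.
# Goal names, weights and contribution scores are hypothetical.

GOAL_WEIGHTS = {"atmosphere": 0.5, "surface": 0.3, "interior": 0.2}

def science_return(contribution_by_goal, weights=GOAL_WEIGHTS):
    """Weighted sum of per-goal contribution scores (each in 0-1)."""
    return sum(weights[g] * contribution_by_goal.get(g, 0.0) for g in weights)

concept_a = science_return({"atmosphere": 0.9, "surface": 0.4, "interior": 0.1})
concept_b = science_return({"atmosphere": 0.3, "surface": 0.8, "interior": 0.6})
```

The point of such a scheme is that two concepts satisfying different subsets of goals become directly comparable on one scale.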

  14. Rapid development of Proteomic applications with the AIBench framework.

    PubMed

    López-Fernández, Hugo; Reboiro-Jato, Miguel; Glez-Peña, Daniel; Méndez Reboredo, José R; Santos, Hugo M; Carreira, Ricardo J; Capelo-Martínez, José L; Fdez-Riverola, Florentino

    2011-09-15

    In this paper we present two case studies of Proteomics applications development using the AIBench framework, a Java desktop application framework mainly focused on scientific software development. The applications presented in this work are Decision Peptide-Driven, for rapid and accurate protein quantification, and Bacterial Identification, for Tuberculosis biomarker search and diagnosis. Both tools work with mass spectrometry data, specifically with MALDI-TOF spectra, minimizing the time required to process and analyze the experimental data. Copyright 2011 The Author(s). Published by Journal of Integrative Bioinformatics.

  15. Development of a Framework for Model-Based Analysis, Uncertainty Quantification, and Robust Control Design of Nonlinear Smart Composite Systems

    DTIC Science & Technology

    2015-06-04

    control, vibration and noise control, health monitoring, and energy harvesting. However, these advantages come at the cost of rate-dependent hysteresis... configuration used for energy harvesting. Uncertainty Quantification: Uncertainty quantification is pursued in two steps: (i) determination of densities... Crews and R.C. Smith, "Quantification of parameter and model uncertainty for shape memory alloy bending actuators," Journal of Intelligent Material

  16. Design and validation of Segment--freely available software for cardiovascular image analysis.

    PubMed

    Heiberg, Einar; Sjögren, Jane; Ugander, Martin; Carlsson, Marcus; Engblom, Henrik; Arheden, Håkan

    2010-01-11

    Commercially available software for cardiovascular image analysis often has limited functionality and frequently lacks the careful validation that is required for clinical studies. We have already implemented a cardiovascular image analysis software package and released it as freeware for the research community. However, it was distributed as a stand-alone application and other researchers could not extend it by writing their own custom image analysis algorithms. We believe that the work required to make a clinically applicable prototype can be reduced by making the software extensible, so that researchers can develop their own modules or improvements. Such an initiative might then serve as a bridge between image analysis research and cardiovascular research. The aim of this article is therefore to present the design and validation of a cardiovascular image analysis software package (Segment) and to announce its release in a source code format. Segment can be used for image analysis in magnetic resonance imaging (MRI), computed tomography (CT), single photon emission computed tomography (SPECT) and positron emission tomography (PET). Some of its main features include loading of DICOM images from all major scanner vendors, simultaneous display of multiple image stacks and plane intersections, automated segmentation of the left ventricle, quantification of MRI flow, tools for manual and general object segmentation, quantitative regional wall motion analysis, myocardial viability analysis and image fusion tools. Here we present an overview of the validation results and validation procedures for the functionality of the software. We describe a technique to ensure continued accuracy and validity of the software by implementing and using a test script that tests the functionality of the software and validates the output. The software has been made freely available for research purposes in a source code format on the project home page http://segment.heiberg.se. 
Segment is a well-validated comprehensive software package for cardiovascular image analysis. It is freely available for research purposes provided that relevant original research publications related to the software are cited.

  17. Surface smoothness: cartilage biomarkers for knee OA beyond the radiologist

    NASA Astrophysics Data System (ADS)

    Tummala, Sudhakar; Dam, Erik B.

    2010-03-01

    Fully automatic imaging biomarkers may allow quantification of patho-physiological processes that a radiologist would not be able to assess reliably. This can introduce new insight but is problematic to validate due to lack of meaningful ground truth expert measurements. Rather than quantification accuracy, such novel markers must therefore be validated against clinically meaningful end-goals such as the ability to allow correct diagnosis. We present a method for automatic cartilage surface smoothness quantification in the knee joint. The quantification is based on a curvature flow method used on tibial and femoral cartilage compartments resulting from an automatic segmentation scheme. These smoothness estimates are validated for their ability to diagnose osteoarthritis and compared to smoothness estimates based on manual expert segmentations and to conventional cartilage volume quantification. We demonstrate that the fully automatic markers eliminate the time required for radiologist annotations, and in addition provide a diagnostic marker superior to the evaluated semi-manual markers.
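A hedged sketch of the general idea behind a curvature-flow smoothness marker: smooth a sampled surface profile by iterated discrete curvature (Laplacian) flow, then score roughness as the mean deviation of the original profile from its smoothed version. The paper's actual flow on 3-d cartilage compartments and its exact smoothness definition differ; this is a 1-d illustration only.

```python
import numpy as np

def curvature_flow_smooth(profile, iterations=50, step=0.25):
    """Iteratively move each interior sample along the discrete Laplacian."""
    z = np.asarray(profile, dtype=float).copy()
    for _ in range(iterations):
        lap = np.zeros_like(z)
        lap[1:-1] = z[:-2] - 2.0 * z[1:-1] + z[2:]  # discrete curvature
        z += step * lap                              # step <= 0.5 for stability
    return z

def roughness(profile):
    """Mean deviation of a profile from its curvature-flow-smoothed version."""
    smooth = curvature_flow_smooth(profile)
    return float(np.mean(np.abs(np.asarray(profile, dtype=float) - smooth)))

# Synthetic profiles: a smooth reference vs. one with added surface texture.
x = np.linspace(0.0, np.pi, 200)
healthy_like = np.sin(x)
degraded_like = healthy_like + 0.2 * np.sin(15.0 * x)
```

Because the flow damps high-frequency components fastest, textured profiles deviate more from their smoothed versions and score as rougher.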

  18. quanTLC, an online open-source solution for videodensitometric quantification.

    PubMed

    Fichou, Dimitri; Morlock, Gertrud E

    2018-07-27

    The image is the key feature of planar chromatography, and videodensitometry by digital image conversion is the fastest way to evaluate it. Instead of scanning single sample tracks one after the other, only a few clicks are needed to convert all tracks at one go. A minimalistic software tool, termed quanTLC, was newly developed that allows the quantitative evaluation of samples in a few minutes. quanTLC combines important assets, i.e., open-source, online, free of charge, intuitive to use and tailored to planar chromatography, which none of the nine existing software packages for image evaluation covered altogether. quanTLC supports common image file formats for chromatogram upload. All necessary steps are included, i.e., videodensitogram extraction, preprocessing, automatic peak integration, calibration, statistical data analysis, reporting and data export. The default options for each step are suitable for most analyses while still being tunable, if needed. A one-minute video was recorded to serve as user manual. The software capabilities are shown on the example of a lipophilic dye mixture separation. The quantitative results were verified by comparison with those obtained by commercial videodensitometry software and opto-mechanical slit-scanning densitometry. The data can be exported at each step to be processed in further software, if required. The code was released open-source to be exploited even further. The software itself is usable online without installation and directly accessible at http://shinyapps.ernaehrung.uni-giessen.de/quanTLC. Copyright © 2018 Elsevier B.V. All rights reserved.
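The core videodensitometric conversion can be sketched as collapsing an image track into a one-dimensional signal: invert pixel intensities (dark bands become high signal) and average across the track width at each migration position. quanTLC's actual pipeline adds preprocessing, baseline correction and peak integration; this is only the minimal idea.

```python
import numpy as np

def track_to_densitogram(track):
    """track: 2D grayscale array (rows = migration axis, cols = track width)."""
    track = np.asarray(track, dtype=float)
    inverted = track.max() - track      # dark chromatogram bands -> high signal
    return inverted.mean(axis=1)        # collapse across the track width

# Synthetic track: light background (value 200) with one dark band (value 50).
track = np.full((100, 20), 200.0)
track[40:45, :] = 50.0
signal = track_to_densitogram(track)
band_position = int(np.argmax(signal))  # should fall inside rows 40-44
```

Peak integration and calibration would then operate on `signal` exactly as a slit-scanning densitometer's output would be treated.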

  19. A comprehensive evaluation of popular proteomics software workflows for label-free proteome quantification and imputation.

    PubMed

    Välikangas, Tommi; Suomi, Tomi; Elo, Laura L

    2017-05-31

    Label-free mass spectrometry (MS) has developed into an important tool applied in various fields of biological and life sciences. Several software packages exist to process the raw MS data into quantified protein abundances, including open source and commercial solutions. Each includes a set of unique algorithms for different tasks of the MS data processing workflow. While many of these algorithms have been compared separately, a thorough and systematic evaluation of their overall performance is missing. Moreover, systematic information is lacking about the amount of missing values produced by the different proteomics software and the capabilities of different data imputation methods to account for them. In this study, we evaluated the performance of five popular quantitative label-free proteomics software workflows using four different spike-in data sets. Our extensive testing included the number of proteins quantified and the number of missing values produced by each workflow, the accuracy of detecting differential expression and logarithmic fold change, and the effect of different imputation and filtering methods on the differential expression results. We found that the Progenesis software performed consistently well in the differential expression analysis and produced few missing values. The missing values produced by the other software decreased their performance, but this difference could be mitigated using proper data filtering or imputation methods. Among the imputation methods, we found that the local least squares (lls) regression imputation consistently increased the performance of the software in the differential expression analysis, and a combination of both data filtering and local least squares imputation increased performance the most in the tested data sets. © The Author 2017. Published by Oxford University Press.
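The local least squares (lls) idea highlighted above can be sketched minimally: a row (protein/peptide profile) with a missing entry is predicted by regressing its observed entries on the k most correlated complete rows, then applying the fitted weights at the missing column. The R implementation benchmarked in the study differs in details (neighbour selection, handling of multiple missing entries); this is an illustration of the principle only.

```python
import numpy as np

def lls_impute_one(data, row, col, k=2):
    """Impute data[row, col] (NaN) from the k most correlated complete rows."""
    target = data[row]
    obs = ~np.isnan(target)
    obs[col] = False                      # regress only on observed columns
    complete = [i for i in range(data.shape[0])
                if i != row and not np.isnan(data[i]).any()]
    # Rank candidate neighbours by absolute correlation on observed columns.
    sims = sorted(complete,
                  key=lambda i: -abs(np.corrcoef(data[i, obs], target[obs])[0, 1]))
    A = data[sims[:k]][:, obs].T          # observed columns of the neighbours
    w, *_ = np.linalg.lstsq(A, target[obs], rcond=None)
    return float(data[sims[:k], col] @ w)

# Toy abundance matrix: row 0 is exactly twice row 1, with one missing value.
data = np.array([[2.0, 4.0, 6.0, np.nan],
                 [1.0, 2.0, 3.0, 4.0],
                 [4.0, 3.0, 2.0, 1.0]])
value = lls_impute_one(data, 0, 3, k=2)   # true value would be 8.0
```

On real spike-in data the benefit comes from the fact that co-regulated profiles supply the regression with structure that a global mean or minimum-value fill ignores.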

  20. Rhizoslides: paper-based growth system for non-destructive, high throughput phenotyping of root development by means of image analysis.

    PubMed

    Le Marié, Chantal; Kirchgessner, Norbert; Marschall, Daniela; Walter, Achim; Hund, Andreas

    2014-01-01

    A quantitative characterization of root system architecture is currently being attempted for various reasons. Non-destructive, rapid analyses of root system architecture are difficult to perform due to the hidden nature of the root. Hence, improved methods to measure root architecture are necessary to support knowledge-based plant breeding and to analyse root growth responses to environmental changes. Here, we report on the development of a novel method to reveal growth and architecture of maize root systems. The method is based on the cultivation of different root types within several layers of two-dimensional, large (50 × 60 cm) plates (rhizoslides). A central Plexiglas screen stabilizes the system and is covered on both sides with germination paper providing water and nutrients for the developing root, followed by a transparent cover foil to prevent the roots from drying out and to stabilize the system. The embryonic roots grow hidden between the Plexiglas surface and paper, whereas crown roots grow visible between paper and the transparent cover. Cultivation for up to 20 days (four fully developed leaves) with good image quality was achieved by suppressing fungi with a fungicide. Based on hyperspectral microscopy imaging, the quality of different germination papers was tested and three provided sufficient contrast to distinguish between roots and background (segmentation). Illumination, image acquisition and segmentation were optimised to facilitate efficient root image analysis. Several software packages were evaluated with regard to their precision and the time investment needed to measure root system architecture. The software 'Smart Root' allowed precise evaluation of root development but needed substantial user intervention. 'GiaRoots' provided the best segmentation method for batch processing in combination with a good analysis of global root characteristics but overestimated root length due to thinning artefacts. 
'WinRhizo' offered the most rapid and precise evaluation of root lengths in diameter classes, but had weaknesses with respect to image segmentation and analysis of root system architecture. A new technique has been established for non-destructive root growth studies and quantification of architectural traits beyond seedling stages. However, automation of the scanning process and appropriate software remain the bottleneck for high-throughput analysis.

  1. Liquid Chromatography-Mass Spectrometry-based Quantitative Proteomics

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Xie, Fang; Liu, Tao; Qian, Weijun

    2011-07-22

    Liquid chromatography-mass spectrometry (LC-MS)-based quantitative proteomics has become increasingly applied for a broad range of biological applications due to growing capabilities for broad proteome coverage and good accuracy in quantification. Herein, we review the current LC-MS-based quantification methods with respect to their advantages and limitations, and highlight their potential applications.

  2. The leaf angle distribution of natural plant populations: assessing the canopy with a novel software tool.

    PubMed

    Müller-Linow, Mark; Pinto-Espinosa, Francisco; Scharr, Hanno; Rascher, Uwe

    2015-01-01

    Three-dimensional canopies form complex architectures with temporally and spatially changing leaf orientations. Variations in canopy structure are linked to canopy function, and they occur within the scope of genetic variability as well as a reaction to environmental factors like light, water and nutrient supply, and stress. An important key measure to characterize these structural properties is the leaf angle distribution, which in turn requires knowledge of the 3-dimensional single leaf surface. Despite a large number of 3-d sensors and methods, only a few systems are applicable for fast and routine measurements in plants and natural canopies. A suitable approach is stereo imaging, which combines depth and color information and thus allows for easy segmentation of green leaf material and the extraction of plant traits, such as leaf angle distribution. We developed a software package which provides tools for the quantification of leaf surface properties within natural canopies via 3-d reconstruction from stereo images. Our approach includes a semi-automatic selection process for single leaves and different modes of surface characterization via polygon smoothing or surface model fitting. Based on the resulting surface meshes, leaf angle statistics are computed on the whole-leaf level or from local derivations. We include a case study to demonstrate the functionality of our software. 48 images of small sugar beet populations (4 varieties) have been analyzed on the basis of their leaf angle distribution in order to investigate seasonal, genotypic and fertilization effects on leaf angle distributions. We could show that leaf angle distributions change during the course of the season, with all varieties having a comparable development. Additionally, different varieties had different leaf angle orientations that could be separated by principal component analysis. In contrast, nitrogen treatment had no effect on leaf angles. 
We show that a stereo imaging setup together with the appropriate image processing tools is capable of retrieving the geometric leaf surface properties of plants and canopies. Our software package provides whole-leaf statistics but also a local estimation of leaf angles, which may have great potential to better understand and quantify structural canopy traits for guided breeding and optimized crop management.
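The geometric core of deriving a leaf angle from reconstructed surface points can be sketched as a plane fit: the smallest singular vector of the centered point cloud gives the surface normal, and the angle between that normal and the vertical is the leaf inclination. The package's surface-model fitting and local angle maps are more elaborate; the synthetic "leaf" below is a made-up planar patch.

```python
import numpy as np

def leaf_inclination_deg(points):
    """points: (n, 3) array of leaf surface coordinates (x, y, z)."""
    pts = np.asarray(points, dtype=float)
    centered = pts - pts.mean(axis=0)
    _, _, vt = np.linalg.svd(centered)
    normal = vt[-1]                                   # least-variance direction
    cos_tilt = abs(normal @ np.array([0.0, 0.0, 1.0]))
    return float(np.degrees(np.arccos(cos_tilt)))

# Synthetic planar leaf tilted 45 degrees about the x-axis: z = y.
g = np.linspace(0.0, 1.0, 10)
x, y = np.meshgrid(g, g)
leaf = np.column_stack([x.ravel(), y.ravel(), y.ravel()])
angle = leaf_inclination_deg(leaf)
```

Applying this per leaf (or per local mesh patch) and histogramming the angles yields the leaf angle distribution discussed above.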

  3. Recommendations for Improving Identification and Quantification in Non-Targeted, GC-MS-Based Metabolomic Profiling of Human Plasma

    PubMed Central

    Wang, Hanghang; Muehlbauer, Michael J.; O’Neal, Sara K.; Newgard, Christopher B.; Hauser, Elizabeth R.; Shah, Svati H.

    2017-01-01

    The field of metabolomics as applied to human disease and health is rapidly expanding. In recent efforts of metabolomics research, greater emphasis has been placed on quality control and method validation. In this study, we report an experience with quality control and a practical application of method validation. Specifically, we sought to identify and modify steps in gas chromatography-mass spectrometry (GC-MS)-based, non-targeted metabolomic profiling of human plasma that could influence metabolite identification and quantification. Our experimental design included two studies: (1) a limiting-dilution study, which investigated the effects of dilution on analyte identification and quantification; and (2) a concentration-specific study, which compared the optimal plasma extract volume established in the first study with the volume used in the current institutional protocol. We confirmed that contaminants, concentration, repeatability and intermediate precision are major factors influencing metabolite identification and quantification. In addition, we established methods for improved metabolite identification and quantification, which were summarized to provide recommendations for experimental design of GC-MS-based non-targeted profiling of human plasma. PMID:28841195

  4. AVQS: Attack Route-Based Vulnerability Quantification Scheme for Smart Grid

    PubMed Central

    Lim, Hyunwoo; Lee, Seokjun; Shon, Taeshik

    2014-01-01

    A smart grid is a large, consolidated electrical grid system that includes heterogeneous networks and systems. Because of this network connectivity, a smart grid system is exposed to potential security threats. To address this problem, we develop and apply a novel scheme to measure vulnerability in a smart grid domain. Vulnerability quantification can be the first step in security analysis because it helps prioritize security problems. However, existing vulnerability quantification schemes are not suitable for smart grids because they do not consider network vulnerabilities. We propose a novel attack route-based vulnerability quantification scheme using a network vulnerability score and an end-to-end security score, depending on the specific smart grid network environment, to calculate the vulnerability score for a particular attack route. To evaluate the proposed approach, we derive several attack scenarios from the advanced metering infrastructure domain. The experimental results of the proposed approach and the existing common vulnerability scoring system clearly show that we need to consider network connectivity for more optimized vulnerability quantification. PMID:25152923
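A generic, illustrative reading of attack route-based scoring (AVQS's actual network vulnerability and end-to-end security formulas are not reproduced here): treat per-hop vulnerability scores as likelihoods the attacker must realize in sequence, combine them along each route, and prioritize the route with the highest combined score. Hop scores, route names and the topology below are all hypothetical.

```python
# Hypothetical attack route-based prioritization sketch (not AVQS itself).

def route_score(hop_scores):
    """Combine per-hop vulnerability scores (0-1) along one attack route."""
    score = 1.0
    for v in hop_scores:
        score *= v          # attacker must succeed at every hop in sequence
    return score

routes = {
    "meter->gateway->headend": [0.8, 0.5],
    "meter->relay->headend": [0.6, 0.9],
}
worst = max(routes, key=lambda name: route_score(routes[name]))
```

The point the paper makes survives even in this toy form: a host-by-host score would miss that the second route, with no single standout weakness, is the more exposed path end to end.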

  5. In-Gel Stable-Isotope Labeling (ISIL): a strategy for mass spectrometry-based relative quantification.

    PubMed

    Asara, John M; Zhang, Xiang; Zheng, Bin; Christofk, Heather H; Wu, Ning; Cantley, Lewis C

    2006-01-01

    Most proteomics approaches for relative quantification of protein expression use a combination of stable-isotope labeling and mass spectrometry. Traditionally, researchers have used difference gel electrophoresis (DIGE) from stained 1D and 2D gels for relative quantification. While differences in protein staining intensity can often be visualized, abundant proteins can obscure less abundant proteins, and quantification of post-translational modifications is difficult. A method is presented for quantifying changes in the abundance of a specific protein or changes in specific modifications of a protein using In-gel Stable-Isotope Labeling (ISIL). Proteins extracted from any source (tissue, cell line, immunoprecipitate, etc.), treated under two experimental conditions, are resolved in separate lanes by gel electrophoresis. The regions of interest (visualized by staining) are reacted separately with light versus heavy isotope-labeled reagents, and the gel slices are then mixed and digested with proteases. The resulting peptides are then analyzed by LC-MS to determine relative abundance of light/heavy isotope pairs and analyzed by LC-MS/MS for identification of sequence and modifications. The strategy compares well with other relative quantification strategies, and in silico calculations reveal its effectiveness as a global relative quantification strategy. An advantage of ISIL is that visualization of gel differences can be used as a first quantification step followed by accurate and sensitive protein level stable-isotope labeling and mass spectrometry-based relative quantification.
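    The readout step of ISIL-style relative quantification can be sketched as a light/heavy peak-area ratio per peptide. This is a minimal illustration with invented areas, not the authors' analysis code:

```python
def relative_abundance(pairs):
    """pairs: {peptide: (light_area, heavy_area)} -> {peptide: ratio}.

    The ratio of integrated LC-MS peak areas for each isotope pair gives the
    condition-1 / condition-2 relative abundance of that peptide.
    """
    ratios = {}
    for peptide, (light, heavy) in pairs.items():
        if heavy <= 0:
            continue  # skip pairs with no detectable heavy signal
        ratios[peptide] = light / heavy
    return ratios

areas = {"LVNELTEFAK": (2.0e6, 1.0e6), "YLYEIAR": (3.0e5, 6.0e5)}
print(relative_abundance(areas))  # {'LVNELTEFAK': 2.0, 'YLYEIAR': 0.5}
```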

  6. 18O-labeled proteome reference as global internal standards for targeted quantification by selected reaction monitoring-mass spectrometry.

    PubMed

    Kim, Jong-Seo; Fillmore, Thomas L; Liu, Tao; Robinson, Errol; Hossain, Mahmud; Champion, Boyd L; Moore, Ronald J; Camp, David G; Smith, Richard D; Qian, Wei-Jun

    2011-12-01

    Selected reaction monitoring (SRM)-MS is an emerging technology for high throughput targeted protein quantification and verification in biomarker discovery studies; however, the cost associated with the application of stable isotope-labeled synthetic peptides as internal standards can be prohibitive for screening a large number of candidate proteins as often required in the preverification phase of discovery studies. Herein we present a proof of concept study using an (18)O-labeled proteome reference as global internal standards (GIS) for SRM-based relative quantification. The (18)O-labeled proteome reference (or GIS) can be readily prepared and contains a heavy isotope ((18)O)-labeled internal standard for every possible tryptic peptide. Our results showed that the percentage of heavy isotope ((18)O) incorporation applying an improved protocol was >99.5% for most peptides investigated. The accuracy, reproducibility, and linear dynamic range of quantification were further assessed based on known ratios of standard proteins spiked into the labeled mouse plasma reference. Reliable quantification was observed with high reproducibility (i.e. coefficient of variance <10%) for analyte concentrations that were set at 100-fold higher or lower than those of the GIS based on the light ((16)O)/heavy ((18)O) peak area ratios. The utility of (18)O-labeled GIS was further illustrated by accurate relative quantification of 45 major human plasma proteins. Moreover, quantification of the concentrations of C-reactive protein and prostate-specific antigen was illustrated by coupling the GIS with standard additions of purified protein standards. Collectively, our results demonstrated that the use of (18)O-labeled proteome reference as GIS provides a convenient, low cost, and effective strategy for relative quantification of a large number of candidate proteins in biological or clinical samples using SRM.

  7. Computational analysis of PET by AIBL (CapAIBL): a cloud-based processing pipeline for the quantification of PET images

    NASA Astrophysics Data System (ADS)

    Bourgeat, Pierrick; Dore, Vincent; Fripp, Jurgen; Villemagne, Victor L.; Rowe, Chris C.; Salvado, Olivier

    2015-03-01

    With the advances of PET tracers for β-Amyloid (Aβ) detection in neurodegenerative diseases, automated quantification methods are desirable. For clinical use, there is a great need for a PET-only quantification method, as MR images are not always available. In this paper, we validate a previously developed PET-only quantification method against MR-based quantification using 6 tracers: 18F-Florbetaben (N=148), 18F-Florbetapir (N=171), 18F-NAV4694 (N=47), 18F-Flutemetamol (N=180), 11C-PiB (N=381) and 18F-FDG (N=34). The results show an overall mean absolute percentage error of less than 5% for each tracer. The method has been implemented as a remote service called CapAIBL (http://milxcloud.csiro.au/capaibl). PET images are uploaded to a cloud platform where they are spatially normalised to a standard template and quantified. A report containing global as well as local quantification, along with a surface projection of the β-Amyloid deposition, is automatically generated at the end of the pipeline and emailed to the user.
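    The "<5% mean absolute percentage error" figure is a standard agreement metric; a sketch of how it is computed, with invented SUVR-style values standing in for the MR-based and PET-only quantifications:

```python
def mape(reference, estimate):
    """Mean absolute percentage error of estimate vs. reference, in percent."""
    errs = [abs(e - r) / abs(r) * 100.0 for r, e in zip(reference, estimate)]
    return sum(errs) / len(errs)

mr_based  = [1.20, 1.85, 2.40]  # reference quantification (hypothetical)
pet_only  = [1.25, 1.80, 2.35]  # PET-only quantification (hypothetical)
assert mape(mr_based, pet_only) < 5.0  # within the tolerance reported above
```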

  8. Boiler: lossy compression of RNA-seq alignments using coverage vectors.

    PubMed

    Pritt, Jacob; Langmead, Ben

    2016-09-19

    We describe Boiler, a new software tool for compressing and querying large collections of RNA-seq alignments. Boiler discards most per-read data, keeping only a genomic coverage vector plus a few empirical distributions summarizing the alignments. Since most per-read data is discarded, storage footprint is often much smaller than that achieved by other compression tools. Despite this, the most relevant per-read data can be recovered; we show that Boiler compression has only a slight negative impact on results given by downstream tools for isoform assembly and quantification. Boiler also allows the user to pose fast and useful queries without decompressing the entire file. Boiler is free open source software available from github.com/jpritt/boiler. © The Author(s) 2016. Published by Oxford University Press on behalf of Nucleic Acids Research.
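    The intuition behind coverage-vector compression can be shown with plain run-length encoding: per-base read depth is highly repetitive, so runs compress well and decode losslessly. This is an illustration of the idea only, not Boiler's actual file format:

```python
def rle_encode(coverage):
    """Run-length encode a per-base coverage vector as [depth, run] pairs."""
    runs = []
    for depth in coverage:
        if runs and runs[-1][0] == depth:
            runs[-1][1] += 1
        else:
            runs.append([depth, 1])
    return runs

def rle_decode(runs):
    """Expand [depth, run] pairs back into the per-base vector."""
    return [depth for depth, n in runs for _ in range(n)]

cov = [0, 0, 0, 5, 5, 5, 5, 2, 2, 0]
runs = rle_encode(cov)
assert runs == [[0, 3], [5, 4], [2, 2], [0, 1]]
assert rle_decode(runs) == cov  # lossless round trip
```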

  9. The Accuracy of Al and Cu Film Thickness Determinations and the Implications for Electron Probe Microanalysis.

    PubMed

    Matthews, Mike B; Kearns, Stuart L; Buse, Ben

    2018-04-01

    The accuracy with which Cu and Al coating thicknesses can be determined, and the effect this has on quantification of the substrate, is investigated. Cu and Al coatings of nominally 5, 10, 15, and 20 nm were sputter coated onto polished Bi using two configurations of coater: one with the film thickness monitor (FTM) sensor colocated with the samples, and one where the sensor is located to one side. The FTM thicknesses are compared against those calculated from measured Cu Lα and Al Kα k-ratios using PENEPMA, GMRFilm, and DTSA-II. Selected samples were also cross-sectioned using a focused ion beam. Both systems produced repeatable coatings, the thickest coating being approximately four times the thinnest. The side-located FTM sensor indicated thicknesses less than half those of the software-modeled results, propagating to 70% errors in substrate quantification at 5 kV. The colocated FTM sensor produced errors in film thickness and substrate quantification of 10-20%. Over the range of film thicknesses and accelerating voltages modeled, both the substrate and coating k-ratios can be approximated by linear trends as functions of film thickness. The Al films were found to have a reduced density of ~2 g/cm³.
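    The reported linear trends suggest a simple calibration: fit k-ratio versus film thickness from modeled points, then invert the line to estimate thickness from a measured k-ratio. The numbers below are made-up illustrations, not values from the paper:

```python
def fit_line(x, y):
    """Least-squares slope and intercept for y = m*x + c."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    slope = (sum((a - mx) * (b - my) for a, b in zip(x, y))
             / sum((a - mx) ** 2 for a in x))
    return slope, my - slope * mx

thick_nm = [5.0, 10.0, 15.0, 20.0]       # modeled film thicknesses
k_ratio  = [0.011, 0.022, 0.033, 0.044]  # hypothetical coating k-ratios

m, c = fit_line(thick_nm, k_ratio)
# Invert the calibration for a measured k-ratio of 0.0275.
estimated_nm = (0.0275 - c) / m
assert abs(estimated_nm - 12.5) < 1e-6
```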

  10. Real-time polymerase chain reaction-based approach for quantification of the pat gene in the T25 Zea mays event.

    PubMed

    Weighardt, Florian; Barbati, Cristina; Paoletti, Claudia; Querci, Maddalena; Kay, Simon; De Beuckeleer, Marc; Van den Eede, Guy

    2004-01-01

    In Europe, a growing interest for reliable techniques for the quantification of genetically modified component(s) of food matrixes is arising from the need to comply with the European legislative framework on novel food products. Real-time polymerase chain reaction (PCR) is currently the most powerful technique for the quantification of specific nucleic acid sequences. Several real-time PCR methodologies based on different molecular principles have been developed for this purpose. The most frequently used approach in the field of genetically modified organism (GMO) quantification in food or feed samples is based on the 5'-3'-exonuclease activity of Taq DNA polymerase on specific degradation probes (TaqMan principle). A novel approach was developed for the establishment of a TaqMan quantification system assessing GMO contents around the 1% threshold stipulated under European Union (EU) legislation for the labeling of food products. The Zea mays T25 elite event was chosen as a model for the development of the novel GMO quantification approach. The most innovative aspect of the system is represented by the use of sequences cloned in plasmids as reference standards. In the field of GMO quantification, plasmids are an easy to use, cheap, and reliable alternative to Certified Reference Materials (CRMs), which are only available for a few of the GMOs authorized in Europe, have a relatively high production cost, and require further processing to be suitable for analysis. Strengths and weaknesses of the use of novel plasmid-based standards are addressed in detail. In addition, the quantification system was designed to avoid the use of a reference gene (e.g., a single copy, species-specific gene) as normalizer, i.e., to perform a GMO quantification based on an absolute instead of a relative measurement. 
Experimental evidence shows that the use of reference genes adds variability to the measurement system because a second, independent real-time PCR-based measurement must be performed. Moreover, for some reference genes insufficient information on copy number in and among genomes of different lines is available, making adequate quantification difficult. Once developed, the method was validated according to IUPAC and ISO 5725 guidelines. Thirteen laboratories from 8 EU countries participated in the trial. Eleven laboratories provided results complying with the predefined study requirements. Repeatability (RSDr) values ranged from 8.7 to 15.9%, with a mean value of 12%. Reproducibility (RSDR) values ranged from 16.3 to 25.5%, with a mean value of 21%. Following Codex Alimentarius Committee guidelines, both the limits of detection and quantitation were determined to be <0.1%.
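    The plasmid-standard approach can be sketched in outline: Ct values from a plasmid dilution series define a standard curve Ct = m·log10(copies) + b, which is inverted to give absolute target copy numbers for unknowns. The dilution data below are idealized and invented, not the validated protocol's values:

```python
import math

def standard_curve(copies, cts):
    """Least-squares fit of Ct vs log10(copies); returns (slope, intercept)."""
    xs = [math.log10(c) for c in copies]
    n = len(xs)
    mx, my = sum(xs) / n, sum(cts) / n
    m = (sum((x - mx) * (y - my) for x, y in zip(xs, cts))
         / sum((x - mx) ** 2 for x in xs))
    return m, my - m * mx

def quantify(ct, slope, intercept):
    """Invert the standard curve to get absolute copy number."""
    return 10 ** ((ct - intercept) / slope)

# Hypothetical 10-fold plasmid dilution series at near-ideal efficiency
# (slope ~ -3.32 per decade).
dilution_copies = [1e5, 1e4, 1e3, 1e2]
dilution_cts    = [18.0, 21.32, 24.64, 27.96]
m, b = standard_curve(dilution_copies, dilution_cts)
unknown_copies = quantify(23.0, m, b)  # roughly 3.1e3 copies
```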

  11. Translational value of liquid chromatography coupled with tandem mass spectrometry-based quantitative proteomics for in vitro-in vivo extrapolation of drug metabolism and transport and considerations in selecting appropriate techniques.

    PubMed

    Al Feteisi, Hajar; Achour, Brahim; Rostami-Hodjegan, Amin; Barber, Jill

    2015-01-01

    Drug-metabolizing enzymes and transporters play an important role in drug absorption, distribution, metabolism and excretion and, consequently, they influence drug efficacy and toxicity. Quantification of drug-metabolizing enzymes and transporters in various tissues is therefore essential for comprehensive elucidation of drug absorption, distribution, metabolism and excretion. Recent advances in liquid chromatography coupled with tandem mass spectrometry (LC-MS/MS) have improved the quantification of pharmacologically relevant proteins. This report presents an overview of mass spectrometry-based methods currently used for the quantification of drug-metabolizing enzymes and drug transporters, mainly focusing on applications and cost associated with various quantitative strategies based on stable isotope-labeled standards (absolute quantification peptide standards, quantification concatemers, protein standards for absolute quantification) and label-free analysis. In mass spectrometry, there is no simple relationship between signal intensity and analyte concentration. Proteomic strategies are therefore complex and several factors need to be considered when selecting the most appropriate method for an intended application, including the number of proteins and samples. Quantitative strategies require appropriate mass spectrometry platforms, yet choice is often limited by the availability of appropriate instrumentation. Quantitative proteomics research requires specialist practical skills and there is a pressing need to dedicate more effort and investment to training personnel in this area. Large-scale multicenter collaborations are also needed to standardize quantitative strategies in order to improve physiologically based pharmacokinetic models.

  12. Photoacoustic bio-quantification of graphene based nanomaterials at a single cell level (Conference Presentation)

    NASA Astrophysics Data System (ADS)

    Nedosekin, Dmitry A.; Nolan, Jacqueline; Biris, Alexandru S.; Zharov, Vladimir P.

    2017-03-01

    The Arkansas Nanomedicine Center at the University of Arkansas for Medical Sciences, in collaboration with other Arkansas universities and the FDA's National Center for Toxicological Research in Jefferson, AR, is developing novel techniques for rapid quantification of graphene-based nanomaterials (GBNs) in various biological samples. All-carbon GBNs have a wide range of potential applications in industry, agriculture, food processing and medicine; however, quantification of GBNs is difficult in carbon-rich biological tissues. Accurate quantification of GBNs is essential for research on material toxicity and the development of GBN-based drug delivery platforms. We have developed microscopy and cytometry platforms for detection and quantification of GBNs in single cells, tissue and blood samples using the photoacoustic (PA) contrast of GBNs. We demonstrated PA quantification of individual graphene uptake by single cells. High-resolution PA microscopy provided mapping of GBN distribution within live cells to establish correlations with intracellular toxic phenomena using apoptotic and necrotic assays. This new methodology and the corresponding technical platform provide insight into the possible toxicological risks of GBNs at the single-cell level. In addition, in vivo PA image flow cytometry demonstrated the capability to monitor GBN pharmacokinetics in a mouse model and to map the resulting biodistribution of GBNs in mouse tissues. The integrated PA platform provided unprecedented sensitivity toward GBNs and allowed us to enhance conventional toxicology research by providing a direct correlation between GBN uptake at the single-cell level and cell viability status.

  13. Quantifying reproducibility in computational biology: the case of the tuberculosis drugome.

    PubMed

    Garijo, Daniel; Kinnings, Sarah; Xie, Li; Xie, Lei; Zhang, Yinliang; Bourne, Philip E; Gil, Yolanda

    2013-01-01

    How easy is it to reproduce the results found in a typical computational biology paper? Either through experience or intuition the reader will already know that the answer is: with difficulty, or not at all. In this paper we attempt to quantify this difficulty by reproducing a previously published paper for different classes of users (ranging from users with little expertise to domain experts) and suggest ways in which the situation might be improved. Quantification is achieved by estimating the time required to reproduce each of the steps in the method described in the original paper and to make them part of an explicit workflow that reproduces the original results. Reproducing the method took several months of effort, and required using new versions and new software that posed challenges to reconstructing and validating the results. The quantification leads to "reproducibility maps" that reveal that novice researchers would only be able to reproduce a few of the steps in the method, and that only expert researchers with advanced knowledge of the domain would be able to reproduce the method in its entirety. The workflow itself is published as an online resource together with supporting software and data. The paper concludes with a brief discussion of the complexities of requiring reproducibility in terms of cost versus benefit, and a list of desiderata with our observations and guidelines for improving reproducibility. This has implications not only for reproducing the work of others from published papers, but also for reproducing work from one's own laboratory.

  14. An accurate proteomic quantification method: fluorescence labeling absolute quantification (FLAQ) using multidimensional liquid chromatography and tandem mass spectrometry.

    PubMed

    Liu, Junyan; Liu, Yang; Gao, Mingxia; Zhang, Xiangmin

    2012-08-01

    A facile proteomic quantification method, fluorescence labeling absolute quantification (FLAQ), was developed. Instead of using MS for quantification, FLAQ is a chromatography-based quantification combined with MS for identification. Multidimensional liquid chromatography (MDLC) with high-accuracy laser-induced fluorescence (LIF) detection and a tandem MS system were employed for FLAQ. Several requirements must be met for fluorescent labeling in MS identification: labeling completeness, minimal side-reactions, simple MS spectra, and no extra tandem MS fragmentation for structure elucidation. A fluorescent dye, 5-iodoacetamidofluorescein, was chosen to label proteins on all cysteine residues. The dye was compatible with trypsin digestion and MALDI MS identification. Quantitative labeling was achieved by optimizing reaction conditions. A synthesized peptide and model proteins, BSA (35 cysteines) and OVA (five cysteines), were used to verify the completeness of labeling. Proteins were separated through MDLC and quantified based on fluorescence intensities, followed by MS identification. High accuracy (RSD < 1.58%) and wide linearity of quantification (1-10^5) were achieved by LIF detection. The limit of quantitation for the model protein was as low as 0.34 amol. A subset of proteins in the human liver proteome was quantified to demonstrate FLAQ. © 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  15. Automated quantification of renal interstitial fibrosis for computer-aided diagnosis: A comprehensive tissue structure segmentation method.

    PubMed

    Tey, Wei Keat; Kuang, Ye Chow; Ooi, Melanie Po-Leen; Khoo, Joon Joon

    2018-03-01

    Interstitial fibrosis in renal biopsy samples is a scarring tissue structure that may be visually quantified by pathologists as an indicator of the presence and extent of chronic kidney disease. The standard method of quantification by visual evaluation presents reproducibility issues in the diagnoses due to the uncertainties in human judgement. This study proposes an automated quantification system for measuring the amount of interstitial fibrosis in renal biopsy images as a consistent basis of comparison among pathologists. The system identifies the renal tissue structures through knowledge-based rules employing colour space transformations and structural feature extraction from the images; in particular, renal glomerulus identification is based on multiscale textural feature analysis and a support vector machine. The regions in the biopsy representing interstitial fibrosis are deduced through the elimination of non-interstitial-fibrosis structures from the biopsy area and quantified as a percentage of the total area of the biopsy sample. The experiments conducted evaluate the system in terms of quantification accuracy, intra- and inter-observer variability in visual quantification by pathologists, and the effect introduced by the automated quantification system on the pathologists' diagnosis. A 40-image ground truth dataset was manually prepared in consultation with an experienced pathologist for the validation of the segmentation algorithms. Experiments involving experienced pathologists demonstrated an average error of 9 percentage points between the automated system's quantification and the pathologists' visual evaluation. Experiments investigating variability among pathologists, involving samples from 70 kidney patients, also showed the automated quantification error rate to be on par with the average intra-observer variability in pathologists' quantification. The accuracy of the proposed quantification system has been validated with the ground truth dataset and compared against the pathologists' quantification results. It has been shown that the correlation between different pathologists' estimations of interstitial fibrosis area improved significantly, demonstrating the effectiveness of the quantification system as a diagnostic aide. Copyright © 2017 Elsevier B.V. All rights reserved.

  16. Assessing the detectability of antioxidants in two-dimensional high-performance liquid chromatography.

    PubMed

    Bassanese, Danielle N; Conlan, Xavier A; Barnett, Neil W; Stevenson, Paul G

    2015-05-01

    This paper explores the analytical figures of merit of two-dimensional high-performance liquid chromatography for the separation of antioxidant standards. The cumulative two-dimensional peak area was calculated for 11 antioxidants by two different methods: from the areas reported by the control software, and by fitting the data with a Gaussian model; both methods were evaluated for precision and sensitivity. Both methods demonstrated excellent precision with regard to retention time in the second dimension (%RSD below 1.16%) and cumulative second-dimension peak area (%RSD below 3.73% for the instrument software and 5.87% for the Gaussian method). Combining areas reported by the control software displayed superior limits of detection, on the order of 1 × 10^-6 M, almost an order of magnitude lower than the Gaussian method for some analytes. The introduction of the countergradient eliminated the strong-solvent mismatch between dimensions, leading to much improved peak shape and better detection limits for quantification. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
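    The Gaussian-model route to peak area can be sketched as follows: model a chromatographic peak as A·exp(-(t-t0)²/(2σ²)), whose area is A·σ·√(2π) in closed form. This is an illustration of the principle with invented peak parameters, not the paper's fitting code:

```python
import math

def gaussian(t, amp, t0, sigma):
    """Gaussian peak model: amplitude amp, apex t0, width sigma."""
    return amp * math.exp(-((t - t0) ** 2) / (2 * sigma ** 2))

def gaussian_area(amp, sigma):
    """Closed-form area under a Gaussian peak."""
    return amp * sigma * math.sqrt(2 * math.pi)

# Numerical check: a Riemann sum over the modeled peak agrees with the
# analytic area (hypothetical amplitude in mAU, times in minutes).
amp, t0, sigma = 120.0, 4.5, 0.05
dt = 0.001
ts = [3.5 + i * dt for i in range(2001)]  # 3.5 .. 5.5 min
numeric = sum(gaussian(t, amp, t0, sigma) for t in ts) * dt
assert abs(numeric - gaussian_area(amp, sigma)) / gaussian_area(amp, sigma) < 1e-3
```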

  17. Multiplex Droplet Digital PCR Protocols for Quantification of GM Maize Events.

    PubMed

    Dobnik, David; Spilsberg, Bjørn; Bogožalec Košir, Alexandra; Štebih, Dejan; Morisset, Dany; Holst-Jensen, Arne; Žel, Jana

    2018-01-01

    The standard-curve based simplex quantitative polymerase chain reaction (qPCR) has been the gold standard for DNA target quantification for more than a decade. The large and growing number of individual analyses needed to test for genetically modified organisms (GMOs) is reducing the cost-effectiveness of qPCR. Droplet digital PCR (ddPCR) enables absolute quantification without standard curves, avoids the amplification efficiency bias observed with qPCR, allows more accurate estimations at low target copy numbers and, in combination with multiplexing, significantly improves cost efficiency. Here we describe two protocols for multiplex quantification of GM maize events: (1) nondiscriminating, with multiplex quantification of targets as a group (12 GM maize lines) and (2) discriminating, with multiplex quantification of individual targets (events). The first enables the quantification of twelve European Union authorized GM maize events as a group with only two assays, but does not permit determination of the individual events present. The second protocol enables the quantification of four individual targets (three GM events and one endogene) in a single reaction. Both protocols can be modified for quantification of any other DNA target.
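    The reason ddPCR needs no standard curve is Poisson statistics: from the fraction of negative droplets, λ = -ln(negatives/total) gives the mean target copies per droplet, which scales directly to concentration. A minimal sketch; the droplet volume is instrument-specific, and 0.85 nL is an assumed value here:

```python
import math

def ddpcr_copies_per_ul(positive, total, droplet_nl=0.85):
    """Absolute target concentration (copies/uL) from droplet counts.

    Poisson correction accounts for droplets holding more than one copy.
    """
    negative = total - positive
    lam = -math.log(negative / total)  # mean copies per droplet
    return lam * 1000.0 / droplet_nl   # nL per droplet -> copies per uL

# Hypothetical well: 4,000 positive droplets out of 15,000 accepted.
conc = ddpcr_copies_per_ul(positive=4000, total=15000)  # ~365 copies/uL
```

A GM content estimate then follows as the ratio of the event's concentration to the endogene's, with no amplification-efficiency bias entering either number.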

  18. XGlycScan: An Open-source Software For N-linked Glycosite Assignment, Quantification and Quality Assessment of Data from Mass Spectrometry-based Glycoproteomic Analysis.

    PubMed

    Aiyetan, Paul; Zhang, Bai; Zhang, Zhen; Zhang, Hui

    2014-01-01

    Mass spectrometry-based glycoproteomics has become a major means of identifying and characterizing previously N-linked glycan-attached loci (glycosites). In the bottom-up approach, several factors, including but not limited to sample preparation, mass spectrometry analysis, and protein sequence database searching, result in previously N-linked peptide spectrum matches (PSMs) of varying lengths. Given that multiple PSMs can map to a glycosite, we reason that identified PSMs are varying-length peptide species of a unique set of glycosites. Because the associated spectra of these PSMs are typically summed separately, true glycosite-associated spectral counts are lost or complicated. These varying-length peptide species also complicate protein inference, as shorter peptide sequences are more likely to map to multiple proteins than longer peptides or actual glycosite sequences. Here, we present XGlycScan. XGlycScan maps varying-length peptide species to glycosites to facilitate accurate quantification of glycosite-associated spectral counts. We observed that this reduced the variability in reported identifications across mass spectrometry technical replicates of our sample dataset. We also observed that mapping identified peptides to glycosites provided an assessment of search-engine identification. Inherently, XGlycScan-reported glycosites reduce the complexity of protein inference. We implemented XGlycScan in the platform-independent Java programming language and have made it available as open source. XGlycScan's source code is freely available at https://bitbucket.org/paiyetan/xglycscan/src, and its compiled binaries and documentation can be downloaded at https://bitbucket.org/paiyetan/xglycscan/downloads. The graphical user interface version can be found at https://bitbucket.org/paiyetan/xglycscangui/src and https://bitbucket.org/paiyetan/xglycscangui/downloads, respectively.
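    The core bookkeeping described above, collapsing varying-length peptide species onto the glycosites they cover so that spectral counts are summed per site rather than per peptide string, can be sketched as below. This is an illustration of the idea, not XGlycScan's actual (Java) implementation, and the sequence and counts are invented:

```python
def collapse_to_glycosites(psm_counts, glycosites, protein_seq):
    """Sum PSM spectral counts per glycosite.

    psm_counts: {peptide_sequence: spectral_count}
    glycosites: 0-based positions of glycosylated N residues in protein_seq
    """
    site_counts = {site: 0 for site in glycosites}
    for peptide, count in psm_counts.items():
        start = protein_seq.find(peptide)
        if start < 0:
            continue  # peptide not in this protein
        for site in glycosites:
            if start <= site < start + len(peptide):
                site_counts[site] += count  # peptide covers this site
    return site_counts

# Toy protein with two N-X-S/T sequons (N at positions 3 and 10).
seq = "MKTNSSAILGNKTEV"
psms = {"TNSSAILGNK": 4, "NSSAILGNKTEV": 2, "TNSSAILG": 1}
counts = collapse_to_glycosites(psms, glycosites=[3, 10], protein_seq=seq)
print(counts)  # {3: 7, 10: 6}
```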

  19. Optofluidic technology for monitoring rotifer Brachionus calyciflorus responses to regular light pulses

    NASA Astrophysics Data System (ADS)

    Cartlidge, Rhys; Campana, Olivia; Nugegoda, Dayanthi; Wlodkowic, Donald

    2016-12-01

    Behavioural alterations can occur as a result of toxicant exposure at concentrations significantly lower than the lethal effects commonly measured in acute toxicity testing. The use of alternating light and dark photoperiods to test the phototactic responses of aquatic invertebrates in the presence of environmental contaminants provides an attractive analytical avenue. Quantification of phototactic responses represents a sublethal endpoint that can be employed as an early warning signal. Despite the benefits associated with assessing these endpoints, there is currently a lack of automated and miniaturized bioanalytical technologies to support the development of toxicity testing with small aquatic species. In this study we present a proof-of-concept microfluidic Lab-on-a-Chip (LOC) platform for the assessment of rotifer swimming behaviour in the presence of the toxicant copper sulfate. The device was designed to assess the impact of toxicants at sublethal concentrations on the freshwater rotifer Brachionus calyciflorus, testing behavioural endpoints such as animal swimming distance, speed and acceleration. The LOC device presented in this work enabled straightforward caging of these microscopic animals as well as non-invasive analysis of rapidly swimming individuals in the focal plane of a video-microscopy system. The chip-based technology was fabricated using a new photolithography method that enabled the formation of thick photoresist layers with minimal distortion. Photoresist molds were then employed for replica molding of LOC devices in poly(dimethylsiloxane) (PDMS) elastomer. The complete bioanalytical system consisted of: (i) a microfluidic PDMS chip-based device; (ii) a peristaltic microperfusion pumping manifold; (iii) a miniaturized CMOS camera for video data acquisition; and (iv) video analysis software algorithms for quantification of changes in the swimming behaviour of B. calyciflorus in response to reference toxicants.
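    The behavioural endpoints named above (swimming distance, speed, acceleration) derive from the tracked (x, y) position of the animal in each video frame. A minimal sketch with made-up coordinates, not the authors' video analysis software:

```python
import math

def swim_metrics(track, fps):
    """track: [(x, y), ...] positions in mm, one per frame.

    Returns (total_distance_mm, mean_speed_mm_per_s).
    """
    steps = [math.dist(a, b) for a, b in zip(track, track[1:])]
    distance = sum(steps)
    duration_s = (len(track) - 1) / fps
    return distance, distance / duration_s

# Four hypothetical frames of a tracked rotifer at 30 fps.
track = [(0.0, 0.0), (0.3, 0.4), (0.6, 0.8), (0.6, 1.8)]
dist_mm, speed = swim_metrics(track, fps=30)
print(dist_mm, speed)  # 2.0 mm travelled at a mean 20.0 mm/s
```

Per-frame acceleration would follow the same pattern, differencing consecutive speeds and dividing by the frame interval.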

  20. Mass Spectrometric Quantification of N-Linked Glycans by Reference to Exogenous Standards.

    PubMed

    Mehta, Nickita; Porterfield, Mindy; Struwe, Weston B; Heiss, Christian; Azadi, Parastoo; Rudd, Pauline M; Tiemeyer, Michael; Aoki, Kazuhiro

    2016-09-02

    Environmental and metabolic processes shape the profile of glycoprotein glycans expressed by cells, whether in culture, developing tissues, or mature organisms. Quantitative characterization of glycomic changes associated with these conditions has been achieved historically by reductive coupling of oligosaccharides to various fluorophores following release from glycoprotein and subsequent HPLC or capillary electrophoretic separation. Such labeling-based approaches provide a robust means of quantifying glycan amount based on fluorescence yield. Mass spectrometry, on the other hand, has generally been limited to relative quantification in which the contribution of the signal intensity for an individual glycan is expressed as a percent of the signal intensity summed over the total profile. Relative quantification has been valuable for highlighting changes in glycan expression between samples; sensitivity is high, and structural information can be derived by fragmentation. We have investigated whether MS-based glycomics is amenable to absolute quantification by referencing signal intensities to well-characterized oligosaccharide standards. We report the qualification of a set of N-linked oligosaccharide standards by NMR, HPLC, and MS. We also demonstrate the dynamic range, sensitivity, and recovery from complex biological matrices for these standards in their permethylated form. Our results indicate that absolute quantification for MS-based glycomic analysis is reproducible and robust utilizing currently available glycan standards.

  1. CPTAC Evaluates Long-Term Reproducibility of Quantitative Proteomics Using Breast Cancer Xenografts | Office of Cancer Clinical Proteomics Research

    Cancer.gov

    Liquid chromatography tandem mass spectrometry (LC-MS/MS)-based methods such as isobaric tags for relative and absolute quantification (iTRAQ) and tandem mass tags (TMT) have been shown to provide better overall quantification accuracy and reproducibility than other LC-MS/MS techniques. However, large-scale projects like the Clinical Proteomic Tumor Analysis Consortium (CPTAC) require comparisons across many genomically characterized clinical specimens in a single study and often exceed the capacity of traditional iTRAQ-based quantification.

  2. Weather Augmented Risk Determination (WARD) System

    NASA Astrophysics Data System (ADS)

    Niknejad, M.; Mazdiyasni, O.; Momtaz, F.; AghaKouchak, A.

    2017-12-01

    Extreme climatic events have direct and indirect impacts on society, the economy and the environment. Based on United States Bureau of Economic Analysis (BEA) data, over one third of the U.S. GDP can be considered weather-sensitive, involving some degree of weather risk. This extends from local-scale concrete foundation construction to large-scale transportation systems. Extreme and unexpected weather conditions have always been considered one of the probable risks to human health, productivity and activities. The construction industry is a large sector of the economy, and is also greatly influenced by weather-related risks, including work stoppage and low labor productivity. Identification and quantification of these risks, and mitigation of their effects, are constant concerns of construction project managers. In addition to the destructive effects of severe weather conditions, seasonal changes in weather conditions can also have negative impacts on human health. Work stoppage and reduced labor productivity can be caused by precipitation, wind, temperature, relative humidity and other weather conditions. Historical and project-specific weather information can enable better project management and mitigation planning, and ultimately reduce the risk of weather-related conditions. This paper proposes new software for project-specific, user-defined data analysis that offers (a) the probability of work stoppage and the estimated project length considering weather conditions; (b) information on reduced labor productivity and its impacts on project duration; and (c) probabilistic information on the project timeline based on both weather-related work stoppage and labor productivity. The software (WARD System) is designed so that it can be integrated into already available project management tools. While the presented application focuses on the construction industry, the developed software is general and can be used for any application that involves labor productivity (e.g., farming) and work stoppage due to weather conditions (e.g., transportation, the agriculture industry).
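
    The stoppage-probability idea in (a) can be sketched as follows (a minimal illustration with hypothetical user-defined thresholds, not the WARD implementation):

```python
def stoppage_probability(daily_records, rain_mm_limit=10.0, wind_ms_limit=15.0):
    """Fraction of historical days on which user-defined thresholds for
    work stoppage (precipitation or wind) were exceeded."""
    stopped = sum(1 for d in daily_records
                  if d["rain_mm"] > rain_mm_limit or d["wind_ms"] > wind_ms_limit)
    return stopped / len(daily_records)

# Hypothetical historical records for four workdays
days = [{"rain_mm": 0.0, "wind_ms": 5.0},
        {"rain_mm": 22.0, "wind_ms": 4.0},
        {"rain_mm": 1.0, "wind_ms": 18.0},
        {"rain_mm": 3.0, "wind_ms": 6.0}]
p_stop = stoppage_probability(days)   # → 0.5 (two of four days exceed a limit)
```

    Applied per calendar month over a multi-year record, such a fraction yields the season-dependent stoppage probabilities from which an expected project length can be estimated.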

  3. A Comprehensive Validation Approach Using The RAVEN Code

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Alfonsi, Andrea; Rabiti, Cristian; Cogliati, Joshua J

    2015-06-01

    The RAVEN computer code, developed at the Idaho National Laboratory, is a generic software framework for performing parametric and probabilistic analysis based on the response of complex system codes. RAVEN is a multi-purpose probabilistic and uncertainty quantification platform, capable of communicating with any system code. A natural extension of the RAVEN capabilities is the implementation of an integrated validation methodology, involving several different metrics, that represents an evolution of the methods currently used in the field. State-of-the-art validation approaches use neither exploration of the input space through sampling strategies nor a comprehensive variety of metrics needed to interpret the code responses with respect to experimental data. The RAVEN code addresses both of these gaps. In the following sections, the employed methodology and its application to the newly developed thermal-hydraulic code RELAP-7 are reported. The validation approach has been applied to an integral effect experiment representing natural circulation, based on the activities performed by EG&G Idaho. Four different experiment configurations have been considered and nodalized.

  4. Quantitative risk management in gas injection project: a case study from Oman oil and gas industry

    NASA Astrophysics Data System (ADS)

    Khadem, Mohammad Miftaur Rahman Khan; Piya, Sujan; Shamsuzzoha, Ahm

    2017-09-01

    The purpose of this research was to study the recognition, application and quantification of the risks associated with managing projects. In this research, the management of risks in an oil and gas project is studied and implemented within a case company in Oman. In this study, qualitative data related to risks in the project were first identified through field visits and extensive interviews. These data were then translated into numerical values based on expert opinion. Further, the numerical data were used as input to a Monte Carlo simulation. RiskyProject Professional™ software was used to simulate the system based on the identified risks. The simulation predicted a delay of about 2 years in the worst case, with no chance of meeting the project's on-stream date. It also predicted an 8% chance of exceeding the total estimated budget. The result of the numerical analysis from the proposed model is validated by comparing it with the result of the qualitative analysis, which was obtained through discussion with various project managers of the company.
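
    The Monte Carlo step can be illustrated with a minimal sketch (hypothetical risk register and percentiles; RiskyProject uses its own, richer scheduling engine):

```python
import random

def simulate_schedule(base_days, risks, n=10000, seed=7):
    """Monte Carlo project durations under independent risk events.
    Each risk is a pair (probability_of_occurring, delay_in_days)."""
    rng = random.Random(seed)
    totals = []
    for _ in range(n):
        duration = base_days
        for prob, delay in risks:
            if rng.random() < prob:
                duration += delay
        totals.append(duration)
    totals.sort()
    return totals[n // 2], totals[int(0.9 * n)]  # median and P90 durations

# Hypothetical: 365-day baseline, a 30% chance of a 120-day delay,
# and a 10% chance of a 400-day delay
median, p90 = simulate_schedule(365, [(0.3, 120), (0.1, 400)])
```

    Comparing the P90 (or worst-case) duration against the planned on-stream date is how such a simulation exposes schedule risk that a single deterministic estimate hides.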

  5. Clinical applications of MS-based protein quantification.

    PubMed

    Sabbagh, Bassel; Mindt, Sonani; Neumaier, Michael; Findeisen, Peter

    2016-04-01

    Mass spectrometry-based assays are increasingly important in clinical laboratory medicine and are nowadays commonly used in several areas of routine diagnostics. These include therapeutic drug monitoring, toxicology, endocrinology, pediatrics, and microbiology. Accordingly, some of the most common analyses are therapeutic drug monitoring of immunosuppressants, vitamin D, steroids, newborn screening, and bacterial identification. However, MS-based quantification of peptides and proteins for routine diagnostic use remains rare up to now, despite excellent analytical specificity and good sensitivity. Here, we give an overview of current fit-for-purpose assays for MS-based protein quantification. Advantages as well as challenges of this approach are discussed with a focus on feasibility for routine diagnostic use. © 2016 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.

  6. UROKIN: A Software to Enhance Our Understanding of Urogenital Motion.

    PubMed

    Czyrnyj, Catriona S; Labrosse, Michel R; Graham, Ryan B; McLean, Linda

    2018-05-01

    Transperineal ultrasound (TPUS) allows for objective quantification of mid-sagittal urogenital mechanics, yet current practice omits dynamic motion information in favor of analyzing only a rest and a peak motion frame. This work details the development of UROKIN, a semi-automated software which calculates kinematic curves of urogenital landmark motion. A proof-of-concept analysis was performed using UROKIN on TPUS videos recorded from 20 women with and 10 women without stress urinary incontinence (SUI) performing maximum voluntary contraction of the pelvic floor muscles. The anorectal angle and bladder neck were tracked, while the motion of the pubic symphysis was used to compensate for the error incurred by TPUS probe motion during imaging. Kinematic curves of landmark motion were generated for each video, and curves were smoothed, time normalized, and averaged within groups. Kinematic data yielded by the UROKIN software showed statistically significant differences between women with and without SUI in terms of magnitude and timing characteristics of the kinematic curves depicting landmark motion. The results provide insight into the ways in which UROKIN may be useful for studying differences in pelvic floor muscle contraction mechanics between women with and without SUI and other pelvic floor disorders. The UROKIN software improves on methods described in the literature and provides a unique capacity to further our understanding of urogenital biomechanics.
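
    Time normalization of a landmark-motion curve can be sketched generically (linear resampling assumed; the abstract does not specify UROKIN's exact scheme):

```python
def time_normalize(curve, n=101):
    """Linearly resample a kinematic curve to n points (0-100% of the
    contraction) so that curves of different durations can be averaged
    point-by-point within a group."""
    m = len(curve)
    out = []
    for i in range(n):
        t = i * (m - 1) / (n - 1)   # fractional index into the original curve
        lo = int(t)
        hi = min(lo + 1, m - 1)
        frac = t - lo
        out.append(curve[lo] * (1 - frac) + curve[hi] * frac)
    return out

# Curves of different lengths become comparable sample-by-sample
a = time_normalize([0.0, 1.0, 2.0], n=5)   # → [0.0, 0.5, 1.0, 1.5, 2.0]
```

    After normalization, both magnitude (amplitude at each % of the task) and timing (where along 0-100% a peak occurs) can be compared between groups.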

  7. Geological modelling of mineral deposits for prediction in mining

    NASA Astrophysics Data System (ADS)

    Sides, E. J.

    Accurate prediction of the shape, location, size and properties of the solid rock materials to be extracted during mining is essential for reliable technical and financial planning. This is achieved through geological modelling of the three-dimensional (3D) shape and properties of the materials present in mineral deposits, and the presentation of results in a form which is accessible to mine planning engineers. In recent years the application of interactive graphics software, offering 3D database handling, modelling and visualisation, has greatly enhanced the options available for predicting the subsurface limits and characteristics of mineral deposits. A review of conventional 3D geological interpretation methods, and the model structures and modelling methods used in reserve estimation and mine planning software packages, illustrates the importance of such approaches in the modern mining industry. Despite the widespread introduction and acceptance of computer hardware and software in mining applications, there has been little fundamental change in the way in which geology is used in orebody modelling for predictive purposes. Selected areas of current research, aimed at tackling issues such as the use of orientation data, quantification of morphological differences, incorporation of geological age relationships, multi-resolution models and the application of virtual reality hardware and software, are discussed.

  8. Uncertainty quantification metrics for whole product life cycle cost estimates in aerospace innovation

    NASA Astrophysics Data System (ADS)

    Schwabe, O.; Shehab, E.; Erkoyuncu, J.

    2015-08-01

    The lack of defensible methods for quantifying cost estimate uncertainty over the whole product life cycle of aerospace innovations such as propulsion systems or airframes poses a significant challenge to the creation of accurate and defensible cost estimates. Based on the axiomatic definition of uncertainty as the actual prediction error of the cost estimate, this paper provides a comprehensive overview of metrics used for the uncertainty quantification of cost estimates, based on a literature review, an evaluation of publicly funded projects such as those within the CORDIS or Horizon 2020 programs, and an analysis of established approaches used by organizations such as NASA, the U.S. Department of Defense, the ESA, and various commercial companies. The metrics are categorized based on their foundational character (foundations), their use in practice (state-of-practice), their availability for practice (state-of-art) and those suggested for future exploration (state-of-future). Insights gained were that a variety of uncertainty quantification metrics exist whose suitability depends on the volatility of available relevant information, as defined by technical and cost readiness level, and the number of whole product life cycle phases the estimate is intended to be valid for. Information volatility and the number of whole product life cycle phases can hereby be considered as defining multi-dimensional probability fields admitting various uncertainty quantification metric families with identifiable thresholds for transitioning between them. The key research gaps identified were the lack of theoretically grounded guidance for the selection of uncertainty quantification metrics and the lack of practical alternatives to metrics based on the Central Limit Theorem. An innovative uncertainty quantification framework consisting of a set-theory-based typology, a data library, a classification system, and a corresponding input-output model is put forward to address this research gap as the basis for future work in this field.

  9. Automated classification of self-grooming in mice using open-source software.

    PubMed

    van den Boom, Bastijn J G; Pavlidi, Pavlina; Wolf, Casper J H; Mooij, Adriana H; Willuhn, Ingo

    2017-09-01

    Manual analysis of behavior is labor intensive and subject to inter-rater variability. Although considerable progress in automation of analysis has been made, complex behavior such as grooming still lacks satisfactory automated quantification. We trained a freely available, automated classifier, Janelia Automatic Animal Behavior Annotator (JAABA), to quantify self-grooming duration and number of bouts based on video recordings of SAPAP3 knockout mice (a mouse line that self-grooms excessively) and wild-type animals. We compared the JAABA classifier with human expert observers to test its ability to measure self-grooming in three scenarios: mice in an open field, mice on an elevated plus-maze, and tethered mice in an open field. In each scenario, the classifier identified both grooming and non-grooming with great accuracy and correlated highly with results obtained by human observers. Consistently, the JAABA classifier confirmed previous reports of excessive grooming in SAPAP3 knockout mice. Thus far, manual analysis was regarded as the only valid quantification method for self-grooming. We demonstrate that the JAABA classifier is a valid and reliable scoring tool: it is more cost-efficient than manual scoring, easy to use, requires minimal effort, provides high throughput, and prevents inter-rater variability. We introduce the JAABA classifier as an efficient analysis tool for the assessment of rodent self-grooming with expert quality. In our "how-to" instructions, we provide all information necessary to implement behavioral classification with JAABA. Copyright © 2017 Elsevier B.V. All rights reserved.

  10. Long-term room temperature preservation of corpse soft tissue: an approach for tissue sample storage

    PubMed Central

    2011-01-01

    Background Disaster victim identification (DVI) represents one of the most difficult challenges in forensic sciences, and subsequent DNA typing is essential. Collected samples for DNA-based human identification are usually stored at low temperature to halt the degradation processes of human remains. We have developed a simple and reliable procedure for soft tissue storage and preservation for DNA extraction. It ensures high quality DNA suitable for PCR-based DNA typing after at least 1 year of room temperature storage. Methods Fragments of human psoas muscle were exposed to three different environmental conditions for diverse time periods at room temperature. Storage conditions included: (a) a preserving medium consisting of solid sodium chloride (salt), (b) no additional substances and (c) garden soil. DNA was extracted with proteinase K/SDS followed by organic solvent treatment and concentration by centrifugal filter devices. Quantification was carried out by real-time PCR using commercial kits. Short tandem repeat (STR) typing profiles were analysed with 'expert software'. Results DNA quantities recovered from samples stored in salt remained similar over the complete storage period, underscoring the effectiveness of the preservation method. It was possible to reliably and accurately type different genetic systems including autosomal STRs and mitochondrial and Y-chromosome haplogroups. Autosomal STR typing quality was evaluated by expert software, denoting high quality profiles from DNA samples obtained from corpse tissue stored in salt for up to 365 days. Conclusions The procedure proposed herein is a cost-efficient alternative for storage of human remains in challenging environmental areas, such as mass disaster locations, mass graves and exhumations. This technique should be considered as an additional method for sample storage when preservation of DNA integrity is required for PCR-based DNA typing. PMID:21846338

  11. Automated volumetry of hippocampus is useful to confirm unilateral mesial temporal sclerosis in patients with radiologically positive findings.

    PubMed

    Silva, Guilherme; Martins, Cristina; Moreira da Silva, Nádia; Vieira, Duarte; Costa, Dias; Rego, Ricardo; Fonseca, José; Silva Cunha, João Paulo

    2017-08-01

    Background and purpose: We evaluated two methods to identify mesial temporal sclerosis (MTS): visual inspection by experienced epilepsy neuroradiologists based on structural magnetic resonance imaging sequences, and automated hippocampal volumetry provided by a processing pipeline based on the FMRIB Software Library. Methods: This retrospective study included patients from the epilepsy monitoring unit database of our institution. All patients underwent brain magnetic resonance imaging in 1.5T and 3T scanners with protocols that included thin coronal T2, T1 and fluid-attenuated inversion recovery and isometric T1 acquisitions. Two neuroradiologists with experience in epilepsy and blinded to clinical data evaluated magnetic resonance images for the diagnosis of MTS. The automated diagnosis of MTS was based on a volumetric asymmetry index between the two hippocampi of each patient and a threshold value defining the presence of MTS, obtained from a receiver operating characteristic curve. Hippocampi were segmented for volumetric quantification using the FIRST tool and fslstats from the FMRIB Software Library. Results: The final cohort included 19 patients with unilateral MTS (14 left side): 14 women and a mean age of 43.4 ± 10.4 years. Neuroradiologists had a sensitivity of 100% and specificity of 73.3% to detect MTS (gold standard, k = 0.755). Automated hippocampal volumetry had a sensitivity of 84.2% and specificity of 86.7% (k = 0.704). Combined, these methods had a sensitivity of 84.2% and a specificity of 100% (k = 0.825). Conclusions: Automated volumetry of the hippocampus could play an important role in temporal lobe epilepsy evaluation, namely in confirmation of unilateral MTS diagnosis in patients with radiologically suggestive findings.
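
    The asymmetry-index logic can be sketched as follows (the normalized-difference formula and the threshold value are assumptions for illustration; the abstract does not state either):

```python
def asymmetry_index(left_vol, right_vol):
    """Normalized volumetric asymmetry between the two hippocampi
    (a common definition; the paper's exact formula may differ)."""
    return abs(left_vol - right_vol) / (left_vol + right_vol)

def classify_mts(left_vol, right_vol, threshold):
    """Flag unilateral MTS when asymmetry exceeds the ROC-derived
    threshold; the smaller (atrophic) hippocampus gives the side."""
    if asymmetry_index(left_vol, right_vol) <= threshold:
        return None
    return "left" if left_vol < right_vol else "right"

# Hypothetical FIRST/fslstats volumes (mm^3) and an illustrative threshold
side = classify_mts(2400.0, 3100.0, threshold=0.05)   # → "left"
```

    In the pipeline described, the two volumes would come from FIRST segmentations measured with fslstats, and the threshold from the ROC analysis against the neuroradiologists' gold standard.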

  12. Negative dielectrophoresis spectroscopy for rare analyte quantification in biological samples

    NASA Astrophysics Data System (ADS)

    Kirmani, Syed Abdul Mannan; Gudagunti, Fleming Dackson; Velmanickam, Logeeshan; Nawarathna, Dharmakeerthi; Lima, Ivan T., Jr.

    2017-03-01

    We propose the use of negative dielectrophoresis (DEP) spectroscopy as a technique to improve the detection limit of rare analytes in biological samples. We observe a significant dependence of the negative DEP force on functionalized polystyrene beads at the edges of interdigitated electrodes with respect to the frequency of the electric field. We measured the velocity of repulsion for 0% and 0.8% conjugation of avidin with biotin-functionalized polystyrene beads using our automated software, through real-time image processing that monitors the Rayleigh scattering from the beads. A significant difference in the velocity of the beads was observed in the presence of as little as 80 molecules of avidin per biotin-functionalized bead. This technology can be applied in the detection and quantification of rare analytes that can be useful in the diagnosis and the treatment of diseases, such as cancer and myocardial infarction, with the use of polystyrene beads functionalized with antibodies for the target biomarkers.
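
    Extracting a repulsion velocity from per-frame bead centroids can be sketched generically (a minimal illustration; the authors' image-processing software is not described beyond tracking Rayleigh scattering):

```python
def mean_bead_speed(centroids, fps, um_per_px):
    """Mean bead speed (um/s) from centroids tracked frame by frame,
    given the camera frame rate and pixel calibration."""
    speeds = []
    for (x0, y0), (x1, y1) in zip(centroids, centroids[1:]):
        dist_px = ((x1 - x0) ** 2 + (y1 - y0) ** 2) ** 0.5
        speeds.append(dist_px * um_per_px * fps)
    return sum(speeds) / len(speeds)

# A bead moving 1 px/frame at 30 fps with 0.5 um/px calibration → 15 um/s
speed = mean_bead_speed([(0, 0), (1, 0), (2, 0)], fps=30, um_per_px=0.5)
```

    Repeating this measurement across electric-field frequencies yields the DEP spectrum from which conjugated and unconjugated beads are distinguished.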

  13. Current trends in quantitative proteomics - an update.

    PubMed

    Li, H; Han, J; Pan, J; Liu, T; Parker, C E; Borchers, C H

    2017-05-01

    Proteins can provide insights into biological processes at the functional level, so they are very promising biomarker candidates. The quantification of proteins in biological samples has been routinely used for the diagnosis of diseases and monitoring the treatment. Although large-scale protein quantification in complex samples is still a challenging task, a great amount of effort has been made to advance the technologies that enable quantitative proteomics. Seven years ago, in 2009, we wrote an article about the current trends in quantitative proteomics. In writing this current paper, we realized that, today, we have an even wider selection of potential tools for quantitative proteomics. These tools include new derivatization reagents, novel sampling formats, new types of analyzers and scanning techniques, and recently developed software to assist in assay development and data analysis. In this review article, we will discuss these innovative methods, and their current and potential applications in proteomics. Copyright © 2017 John Wiley & Sons, Ltd.

  14. Application of Photoshop and Scion Image analysis to quantification of signals in histochemistry, immunocytochemistry and hybridocytochemistry.

    PubMed

    Tolivia, Jorge; Navarro, Ana; del Valle, Eva; Perez, Cristina; Ordoñez, Cristina; Martínez, Eva

    2006-02-01

    Our aim was to describe a simple method to achieve the differential selection and subsequent quantification of signal strength using only one section. Several methods for performing quantitative histochemistry, immunocytochemistry or hybridocytochemistry without the use of specific commercial image analysis systems rely on pixel-counting algorithms, which do not provide information on the amount of chromogen present in the section. Other techniques use complex algorithms to calculate the cumulative signal strength using two consecutive sections. To separate the chromogen signal, we used the "Color Range" option of the Adobe Photoshop program, which provides a specific file for a particular chromogen selection that can be applied to similar sections. The measurement of the chromogen signal strength of the specific staining is achieved with the Scion Image software program. The method described in this paper can also be applied to the simultaneous detection of different signals on the same section, or to different parameters (area of particles, number of particles, etc.) when the "Analyze Particles" tool of the Scion program is used.

  15. Procedures of assessment on the quantification of thoracic kyphosis and lumbar lordosis by radiography and photogrammetry: A literature review.

    PubMed

    Porto, Alessandra Beggiato; Okazaki, Victor Hugo Alves

    2017-10-01

    The quantification of thoracic kyphosis and lumbar lordosis can be assessed in different ways; among them radiography and photogrammetry. However, the assessment procedures are not consistent in the literature for either method. The objective of this study was to conduct a literature review about postural assessment through radiography and photogrammetry, for delineating the procedures for both methods. In total 38 studies were selected by an online search in the MEDLINE and LILACS databases with the keywords: radiograph and posture, postural alignment, photogrammetry or photometry or biophotogrammetry. For the radiographic method, the results showed divergences in arm positioning and in the calculation of thoracic and lumbar angles. The photogrammetry demonstrated differences in relation to the camera, tripod, plumb line and feet positioning, angle calculation, software utilization, and the use of footwear. Standardization is proposed for both methods to help establish normative values and comparisons between diagnoses. Copyright © 2017 Elsevier Ltd. All rights reserved.

  16. Automated muscle fiber type population analysis with ImageJ of whole rat muscles using rapid myosin heavy chain immunohistochemistry.

    PubMed

    Bergmeister, Konstantin D; Gröger, Marion; Aman, Martin; Willensdorfer, Anna; Manzano-Szalai, Krisztina; Salminger, Stefan; Aszmann, Oskar C

    2016-08-01

    Skeletal muscle consists of different fiber types which adapt to exercise, aging, disease, or trauma. Here we present a protocol for fast staining, automatic acquisition, and quantification of fiber populations with ImageJ. Biceps and lumbrical muscles were harvested from Sprague-Dawley rats. Quadruple immunohistochemical staining was performed on single sections using antibodies against myosin heavy chains and secondary fluorescent antibodies. Slides were scanned automatically with a slide scanner. Manual and automatic analyses were performed and compared statistically. The protocol provided rapid and reliable staining for automated image acquisition. Comparison of manual and automatic data indicated Pearson correlation coefficients of 0.645-0.841 for biceps and 0.564-0.673 for lumbrical muscles. Relative fiber populations were accurate to within ±4%. This protocol provides a reliable tool for quantification of muscle fiber populations. Using freely available software, it decreases the time required to analyze whole muscle sections. Muscle Nerve 54: 292-299, 2016. © 2016 Wiley Periodicals, Inc.

  17. NASA Tech Briefs, March 2012

    NASA Technical Reports Server (NTRS)

    2012-01-01

    The topics include: 1) Spectral Profiler Probe for In Situ Snow Grain Size and Composition Stratigraphy; 2) Portable Fourier Transform Spectroscopy for Analysis of Surface Contamination and Quality Control; 3) In Situ Geochemical Analysis and Age Dating of Rocks Using Laser Ablation-Miniature Mass Spectrometer; 4) Physics Mining of Multi-Source Data Sets; 5) Photogrammetry Tool for Forensic Analysis; 6) Connect Global Positioning System RF Module; 7) Simple Cell Balance Circuit; 8) Miniature EVA Software Defined Radio; 9) Remotely Accessible Testbed for Software Defined Radio Development; 10) System-of-Systems Technology-Portfolio-Analysis Tool; 11) VESGEN Software for Mapping and Quantification of Vascular Regulators; 12) Constructing a Database From Multiple 2D Images for Camera Pose Estimation and Robot Localization; 13) Adaption of G-TAG Software for Validating Touch and Go Asteroid Sample Return Design Methodology; 14) 3D Visualization for Phoenix Mars Lander Science Operations; 15) RxGen General Optical Model Prescription Generator; 16) Carbon Nanotube Bonding Strength Enhancement Using Metal Wicking Process; 17) Multi-Layer Far-Infrared Component Technology; 18) Germanium Lift-Off Masks for Thin Metal Film Patterning; 19) Sealing Materials for Use in Vacuum at High Temperatures; 20) Radiation Shielding System Using a Composite of Carbon Nanotubes Loaded With Electropolymers; 21) Nano Sponges for Drug Delivery and Medicinal Applications; 22) Molecular Technique to Understand Deep Microbial Diversity; 23) Methods and Compositions Based on Culturing Microorganisms in Low Sedimental Fluid Shear Conditions; 24) Secure Peer-to-Peer Networks for Scientific Information Sharing; 25) Multiplexer/Demultiplexer Loading Tool (MDMLT); 26) High-Rate Data-Capture for an Airborne Lidar System; 27) Wavefront Sensing Analysis of Grazing Incidence Optical Systems; 28) Foam-on-Tile Damage Model; 29) Instrument Package Manipulation Through the Generation and Use of an Attenuated-Fluent Gas Fold; 30) Multicolor Detectors for Ultrasensitive Long-Wave Imaging Cameras; 31) Lunar Reconnaissance Orbiter (LRO) Command and Data Handling Flight Electronics Subsystem; and 32) Electro-Optic Segment-Segment Sensors for Radio and Optical Telescopes.

  18. Validation of a free software for unsupervised assessment of abdominal fat in MRI.

    PubMed

    Maddalo, Michele; Zorza, Ivan; Zubani, Stefano; Nocivelli, Giorgio; Calandra, Giulio; Soldini, Pierantonio; Mascaro, Lorella; Maroldi, Roberto

    2017-05-01

    To demonstrate the accuracy of an unsupervised (fully automated) software for fat segmentation in magnetic resonance imaging. The proposed software is a freeware solution developed in ImageJ that enables the quantification of metabolically different adipose tissues in large cohort studies. The lumbar part of the abdomen (19 cm in the craniocaudal direction, centered on L3) of eleven healthy volunteers (age range: 21-46 years, BMI range: 21.7-31.6 kg/m²) was examined in a breath hold on expiration with a GE T1 Dixon sequence. Single-slice and volumetric data were considered for each subject. The results of the visceral and subcutaneous adipose tissue assessments obtained by the unsupervised software were compared to supervised segmentations of reference. The associated statistical analysis included Pearson correlations, Bland-Altman plots and volumetric differences (VD%). Values calculated by the unsupervised software correlated significantly with the corresponding supervised segmentations of reference for both subcutaneous adipose tissue - SAT (R=0.9996, p<0.001) and visceral adipose tissue - VAT (R=0.995, p<0.001). Bland-Altman plots showed the absence of systematic errors and a limited spread of the differences. In the single-slice analysis, VD% were (1.6±2.9)% for SAT and (4.9±6.9)% for VAT. In the volumetric analysis, VD% were (1.3±0.9)% for SAT and (2.9±2.7)% for VAT. The developed software is capable of segmenting the metabolically different adipose tissues with a high degree of accuracy. This free add-on software for ImageJ can easily achieve widespread use and enable large-scale population studies regarding adipose tissue and its related diseases. Copyright © 2017 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
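
    The reported agreement metrics can be sketched with their standard definitions (an illustration, not the authors' code):

```python
def vd_percent(auto_vol, ref_vol):
    """Volumetric difference (%) of the automated segmentation
    relative to the supervised reference segmentation."""
    return 100.0 * (auto_vol - ref_vol) / ref_vol

def bland_altman_point(a, b):
    """One Bland-Altman point for a paired measurement:
    (mean of the two values, their difference)."""
    return ((a + b) / 2.0, a - b)

# Hypothetical SAT volumes (mm^3): automated vs. reference
vd = vd_percent(1030.0, 1000.0)            # → 3.0 (%)
mean_diff = bland_altman_point(1030.0, 1000.0)
```

    Reporting VD% as mean ± SD over subjects, as done above for SAT and VAT, summarizes both bias and spread of the automated method.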

  19. Reproducibility of dynamic contrast-enhanced MRI and dynamic susceptibility contrast MRI in the study of brain gliomas: a comparison of data obtained using different commercial software.

    PubMed

    Conte, Gian Marco; Castellano, Antonella; Altabella, Luisa; Iadanza, Antonella; Cadioli, Marcello; Falini, Andrea; Anzalone, Nicoletta

    2017-04-01

    Dynamic susceptibility contrast MRI (DSC) and dynamic contrast-enhanced MRI (DCE) are useful tools in the diagnosis and follow-up of brain gliomas; nevertheless, both techniques leave open the issue of data reproducibility. We evaluated the reproducibility of data obtained using two different commercial software packages for perfusion map calculation and analysis, as the software itself can be one potential source of variability. DSC and DCE analyses from 20 patients with gliomas were tested for both the intrasoftware (intraobserver and interobserver) and the intersoftware reproducibility, as well as for the impact of different postprocessing choices [vascular input function (VIF) selection and deconvolution algorithms] on the quantification of the perfusion biomarkers plasma volume (Vp), volume transfer constant (Ktrans) and rCBV. Data reproducibility was evaluated with the intraclass correlation coefficient (ICC) and Bland-Altman analysis. For all the biomarkers, the intra- and interobserver reproducibility showed almost perfect agreement within each software package, whereas intersoftware reproducibility ranged from 0.311 to 0.577, suggesting fair to moderate agreement; Bland-Altman analysis showed high dispersion of the data, confirming these findings. Comparison of different VIF estimation methods for DCE biomarkers resulted in ICCs of 0.636 for Ktrans and 0.662 for Vp; comparison of two deconvolution algorithms in DSC resulted in an ICC of 0.999. The use of a single software package ensures very good intraobserver and interobserver reproducibility. Caution should be taken when comparing data obtained using different software, or different postprocessing within the same software, as reproducibility is no longer guaranteed.

  20. A common base method for analysis of qPCR data and the application of simple blocking in qPCR experiments.

    PubMed

    Ganger, Michael T; Dietz, Geoffrey D; Ewing, Sarah J

    2017-12-01

    qPCR has established itself as the technique of choice for the quantification of gene expression. Procedures for conducting qPCR have received significant attention; however, more rigorous approaches to the statistical analysis of qPCR data are needed. Here we develop a mathematical model, termed the Common Base Method, for analysis of qPCR data based on threshold cycle values (Cq) and efficiencies of reactions (E). The Common Base Method keeps all calculations in the log scale as long as possible by working with log10(E)·Cq, which we call the efficiency-weighted Cq value; subsequent statistical analyses are then applied in the log scale. We show how efficiency-weighted Cq values may be analyzed using a simple paired or unpaired experimental design and develop blocking methods to help reduce unexplained variation. The Common Base Method has several advantages. It allows for the incorporation of well-specific efficiencies and multiple reference genes. The method does not necessitate the pairing of samples that must be performed using traditional analysis methods in order to calculate relative expression ratios. Our method is also simple enough to be implemented in any spreadsheet or statistical software without additional scripts or proprietary components.
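
    A minimal sketch of the efficiency-weighted Cq value and the log-scale comparison it enables (the sign convention below is assumed from quantity ∝ E^(-Cq); see the paper for the full model and blocking methods):

```python
import math

def efficiency_weighted_cq(E, Cq):
    """log10(E) * Cq, the Common Base Method's log-scale working value.
    E is the reaction efficiency (2.0 for perfect doubling per cycle)."""
    return math.log10(E) * Cq

def log10_relative_expression(w_target, w_ref):
    """log10 of target expression relative to a reference gene.
    Since quantity is proportional to E**(-Cq), log10(quantity) is
    -log10(E)*Cq, so the log ratio target/reference is w_ref - w_target."""
    return w_ref - w_target

# Hypothetical well-specific efficiencies and threshold cycles
w_t = efficiency_weighted_cq(1.95, 24.0)   # target gene
w_r = efficiency_weighted_cq(2.00, 22.0)   # reference gene
log_ratio = log10_relative_expression(w_t, w_r)
```

    Keeping the values in log scale means standard t-tests or ANOVA can be applied directly to efficiency-weighted Cq differences, which is the method's central convenience.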

  1. Customized Consensus Spectral Library Building for Untargeted Quantitative Metabolomics Analysis with Data Independent Acquisition Mass Spectrometry and MetaboDIA Workflow.

    PubMed

    Chen, Gengbo; Walmsley, Scott; Cheung, Gemmy C M; Chen, Liyan; Cheng, Ching-Yu; Beuerman, Roger W; Wong, Tien Yin; Zhou, Lei; Choi, Hyungwon

    2017-05-02

    Data independent acquisition-mass spectrometry (DIA-MS) coupled with liquid chromatography is a promising approach for rapid, automatic sampling of MS/MS data in untargeted metabolomics. However, wide isolation windows in DIA-MS generate MS/MS spectra containing a mixed population of fragment ions together with their precursor ions. This precursor-fragment ion map in a comprehensive MS/MS spectral library is crucial for relative quantification of fragment ions uniquely representative of each precursor ion. However, existing reference libraries are not sufficient for this purpose since the fragmentation patterns of small molecules can vary across different instrument setups. Here we developed a bioinformatics workflow called MetaboDIA to build customized MS/MS spectral libraries using a user's own data dependent acquisition (DDA) data and to perform MS/MS-based quantification with DIA data, thus complementing conventional MS1-based quantification. MetaboDIA also allows users to build a spectral library directly from DIA data in studies with large sample sizes. Using a marine algae data set, we show that quantification of fragment ions extracted with a customized MS/MS library can provide quantitative data as reliable as direct quantification of precursor ions based on MS1 data. To test its applicability in complex samples, we applied MetaboDIA to a clinical serum metabolomics data set, where we built a DDA-based spectral library containing consensus spectra for 1829 compounds. Using this library, we performed fragment ion quantification on DIA data, yielding sensitive differential expression analysis.

  2. Stormbow: A Cloud-Based Tool for Reads Mapping and Expression Quantification in Large-Scale RNA-Seq Studies

    PubMed Central

    Zhao, Shanrong; Prenger, Kurt; Smith, Lance

    2013-01-01

    RNA-Seq is becoming a promising replacement for microarrays in transcriptome profiling and differential gene expression studies. Technical improvements have decreased sequencing costs and, as a result, the size and number of RNA-Seq datasets have increased rapidly. However, the increasing volume of data from large-scale RNA-Seq studies poses a practical challenge for data analysis in a local environment. To meet this challenge, we developed Stormbow, a cloud-based software package, to process large volumes of RNA-Seq data in parallel. The performance of Stormbow has been tested by applying it to analyse 178 RNA-Seq samples in the cloud. In our test, it took 6 to 8 hours to process an RNA-Seq sample with 100 million reads, and the average cost was $3.50 per sample. Utilizing Amazon Web Services as the infrastructure for Stormbow allows us to easily scale up to handle large datasets with on-demand computational resources. Stormbow is a scalable, cost-effective, open-source tool for large-scale RNA-Seq data analysis. Stormbow can be freely downloaded and used out of the box to process Illumina RNA-Seq datasets. PMID:25937948

  3. Java-based browsing, visualization and processing of heterogeneous medical data from remote repositories.

    PubMed

    Masseroli, M; Bonacina, S; Pinciroli, F

    2004-01-01

    The development of distributed information technologies and Java programming enables their use in the medical arena to support the retrieval, integration and evaluation of heterogeneous data and multimodal images in a web browser environment. With this aim, we used them to implement a client-server architecture based on software agents. The client side is a Java applet running in a web browser that provides a user-friendly medical interface for browsing and visualizing different patient and medical test data, integrating them properly. The server side manages secure connections and queries to heterogeneous remote databases and file systems containing patient personal and clinical data. Based on the Java Advanced Imaging API, processing and analysis tools were developed to support the evaluation of remotely retrieved bioimages through the quantification of their features in different regions of interest. Java platform independence allows centralized management of the implemented prototype and its deployment to any site with an intranet or internet connection. By giving healthcare providers effective support for comprehensively browsing, visualizing and evaluating medical images and records located in different remote repositories, the developed prototype can be an important aid in providing more efficient diagnoses and medical treatments.

  4. Stormbow: A Cloud-Based Tool for Reads Mapping and Expression Quantification in Large-Scale RNA-Seq Studies.

    PubMed

    Zhao, Shanrong; Prenger, Kurt; Smith, Lance

    2013-01-01

    RNA-Seq is becoming a promising replacement for microarrays in transcriptome profiling and differential gene expression studies. Technical improvements have decreased sequencing costs and, as a result, the size and number of RNA-Seq datasets have increased rapidly. However, the increasing volume of data from large-scale RNA-Seq studies poses a practical challenge for data analysis in a local environment. To meet this challenge, we developed Stormbow, a cloud-based software package, to process large volumes of RNA-Seq data in parallel. The performance of Stormbow has been tested by applying it to analyse 178 RNA-Seq samples in the cloud. In our test, it took 6 to 8 hours to process an RNA-Seq sample with 100 million reads, and the average cost was $3.50 per sample. Utilizing Amazon Web Services as the infrastructure for Stormbow allows us to easily scale up to handle large datasets with on-demand computational resources. Stormbow is a scalable, cost-effective, open-source tool for large-scale RNA-Seq data analysis. Stormbow can be freely downloaded and used out of the box to process Illumina RNA-Seq datasets.

  5. Normal values and standardization of parameters in nuclear cardiology: Japanese Society of Nuclear Medicine working group database.

    PubMed

    Nakajima, Kenichi; Matsumoto, Naoya; Kasai, Tokuo; Matsuo, Shinro; Kiso, Keisuke; Okuda, Koichi

    2016-04-01

    As a 2-year project of the Japanese Society of Nuclear Medicine working group, normal myocardial imaging databases were accumulated and summarized. Stress-rest gated and non-gated image sets were accumulated for myocardial perfusion imaging and can be used for perfusion defect scoring and normal left ventricular (LV) function analysis. For single-photon emission computed tomography (SPECT) with a multi-focal collimator design, databases of supine and prone positions and computed tomography (CT)-based attenuation correction were created. The CT-based correction provided similar perfusion patterns between genders. In phase analysis of gated myocardial perfusion SPECT, a new approach for analyzing dyssynchrony, normal ranges of the parameters phase bandwidth, standard deviation and entropy were determined in four software programs. Although the results were not interchangeable, dependence on gender, ejection fraction and volume was a common characteristic of these parameters. Standardization of (123)I-MIBG sympathetic imaging was performed for the heart-to-mediastinum ratio (HMR) using a calibration phantom method. HMRs from any collimator type could be converted to values comparable to those of medium-energy collimators. Appropriate quantification based on common normal databases and standard technology can play a pivotal role in clinical practice and research.
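
    The phase-analysis parameters named above can be illustrated with a simplified sketch. The exact definitions differ between the four software programs (which is why the abstract notes the results are not interchangeable); here bandwidth is approximated as a 95% percentile spread of the phase distribution and entropy as normalized Shannon entropy of the phase histogram, both assumptions for illustration rather than any vendor's implementation:

```python
import numpy as np

def phase_metrics(phases_deg, nbins=360):
    """Simplified dyssynchrony metrics from per-voxel contraction phases.

    bandwidth: 2.5th-97.5th percentile spread (deg) of the phases,
               a stand-in for the band containing 95% of voxels;
    sd:        standard deviation of the phase distribution;
    entropy:   Shannon entropy of the phase histogram, normalised to 0-1."""
    p = np.asarray(phases_deg, float)
    lo, hi = np.percentile(p, [2.5, 97.5])
    bandwidth = hi - lo
    sd = p.std(ddof=0)
    hist, _ = np.histogram(p, bins=nbins, range=(0, 360))
    q = hist / hist.sum()
    q = q[q > 0]                       # 0 * log(0) contributes nothing
    entropy = -(q * np.log(q)).sum() / np.log(nbins)
    return bandwidth, sd, entropy
```

    A perfectly synchronous ventricle (all voxels contracting at one phase) gives bandwidth and entropy near 0, while a dispersed phase distribution pushes the normalized entropy toward 1.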

  6. iPhos: a toolkit to streamline the alkaline phosphatase-assisted comprehensive LC-MS phosphoproteome investigation

    PubMed Central

    2014-01-01

    Background Comprehensive characterization of the phosphoproteome in living cells is critical in signal transduction research, but the low abundance of phosphopeptides in the total proteome remains an obstacle in mass spectrometry-based proteomic analysis. As a solution, an alternative analytic strategy was proposed to confidently identify phosphorylated peptides by combining alkaline phosphatase (AP) treatment with high-resolution mass spectrometry. While the process is applicable, the key integration steps along the pipeline were mostly done by tedious manual work. Results We developed a software toolkit, iPhos, to facilitate and streamline the workflow of AP-assisted phosphoproteome characterization. The iPhos toolkit includes one assister and three modules. The iPhos Peak Extraction Assister automates batch-mode peak extraction for multiple liquid chromatography mass spectrometry (LC-MS) runs. iPhos Module-1 processes the peak lists extracted from the LC-MS analyses of the original and dephosphorylated samples to mine out potential phosphorylated peptide signals based on the mass shift caused by the loss of one or more phosphate groups. iPhos Module-2 provides customized inclusion lists with peak retention time windows for subsequent targeted LC-MS/MS experiments. Finally, iPhos Module-3 links the peptide identifications from protein search engines to the quantification results from pattern-based label-free quantification tools. We further demonstrated the utility of the iPhos toolkit on data from human metastatic lung cancer cells (CL1-5). Conclusions In the comparison study of control CL1-5 cell lysates and dasatinib-treated CL1-5 cell lysates, we demonstrated the applicability of the iPhos toolkit and reported the experimental results of the iPhos-facilitated phosphoproteome investigation. We also compared the strategy with a pure DDA-based LC-MS/MS phosphoproteome investigation: iPhos-facilitated targeted LC-MS/MS analysis yields more thorough and confident phosphopeptide identification than pure DDA-based analysis. PMID:25521246

  7. 3D for the people: multi-camera motion capture in the field with consumer-grade cameras and open source software

    PubMed Central

    Evangelista, Dennis J.; Ray, Dylan D.; Hedrick, Tyson L.

    2016-01-01

    Ecological, behavioral and biomechanical studies often need to quantify animal movement and behavior in three dimensions. In laboratory studies, a common tool to accomplish these measurements is the use of multiple, calibrated high-speed cameras. Until very recently, the complexity, weight and cost of such cameras have made their deployment in field situations risky; furthermore, such cameras are not affordable to many researchers. Here, we show how inexpensive, consumer-grade cameras can adequately accomplish these measurements both within the laboratory and in the field. Combined with our methods and open source software, the availability of inexpensive, portable and rugged cameras will open up new areas of biological study by providing precise 3D tracking and quantification of animal and human movement to researchers in a wide variety of field and laboratory contexts. PMID:27444791

  8. The Rényi divergence enables accurate and precise cluster analysis for localisation microscopy.

    PubMed

    Staszowska, Adela D; Fox-Roberts, Patrick; Hirvonen, Liisa M; Peddie, Christopher J; Collinson, Lucy M; Jones, Gareth E; Cox, Susan

    2018-06-01

    Clustering analysis is a key technique for quantitatively characterising structures in localisation microscopy images. To build up accurate information about biological structures, it is critical that the quantification is both accurate (close to the ground truth) and precise (has small scatter and is reproducible). Here we describe how the Rényi divergence can be used for cluster radius measurements in localisation microscopy data. We demonstrate that the Rényi divergence can operate with high levels of background and provides results which are more accurate than Ripley's functions, Voronoi tessellation or DBSCAN. Data supporting this research will be made accessible via a web link. Software code developed for this work, implemented in C++, can be accessed via http://coxphysics.com/Renyi_divergence_software.zip. Correspondence and requests for materials can also be addressed to the corresponding author: adela.staszowska@gmail.com or susan.cox@kcl.ac.uk. Supplementary data are available at Bioinformatics online.
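
    For reference, the Rényi divergence of order α between discrete distributions P and Q is D_α(P‖Q) = (α−1)⁻¹ log Σᵢ pᵢ^α qᵢ^(1−α), approaching the Kullback-Leibler divergence as α → 1. A minimal sketch of the generic definition only, not the authors' cluster-radius estimator:

```python
import numpy as np

def renyi_divergence(p, q, alpha):
    """Rényi divergence D_alpha(P || Q) between two discrete distributions:
    D_alpha = 1/(alpha - 1) * log( sum_i p_i**alpha * q_i**(1 - alpha) ).

    Assumes alpha != 1 and q_i > 0 wherever p_i > 0."""
    p = np.asarray(p, float)
    q = np.asarray(q, float)
    p = p / p.sum()                    # normalise to probability vectors
    q = q / q.sum()
    mask = p > 0                       # terms with p_i = 0 contribute nothing
    s = np.sum(p[mask] ** alpha * q[mask] ** (1.0 - alpha))
    return np.log(s) / (alpha - 1.0)
```

    The divergence is zero when the two distributions coincide and grows as they separate, which is the property a divergence-based cluster-size measure exploits.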

  9. TASI: A software tool for spatial-temporal quantification of tumor spheroid dynamics.

    PubMed

    Hou, Yue; Konen, Jessica; Brat, Daniel J; Marcus, Adam I; Cooper, Lee A D

    2018-05-08

    Spheroid cultures derived from explanted cancer specimens are an increasingly utilized resource for studying complex biological processes like tumor cell invasion and metastasis, representing an important bridge between the simplicity and practicality of 2-dimensional monolayer cultures and the complexity and realism of in vivo animal models. Temporal imaging of spheroids can capture the dynamics of cell behaviors and microenvironments, and when combined with quantitative image analysis methods, enables deep interrogation of biological mechanisms. This paper presents a comprehensive open-source software framework for Temporal Analysis of Spheroid Imaging (TASI) that allows investigators to objectively characterize spheroid growth and invasion dynamics. TASI performs spatiotemporal segmentation of spheroid cultures, extraction of features describing spheroid morpho-phenotypes, mathematical modeling of spheroid dynamics, and statistical comparisons of experimental conditions. We demonstrate the utility of this tool in an analysis of non-small cell lung cancer spheroids that exhibit variability in metastatic and proliferative behaviors.

  10. Mapping RNA-seq Reads with STAR

    PubMed Central

    Dobin, Alexander; Gingeras, Thomas R.

    2015-01-01

    Mapping of large sets of high-throughput sequencing reads to a reference genome is one of the foundational steps in RNA-seq data analysis. The STAR software package performs this task with high levels of accuracy and speed. In addition to detecting annotated and novel splice junctions, STAR is capable of discovering more complex RNA sequence arrangements, such as chimeric and circular RNA. STAR can align spliced sequences of any length with moderate error rates, providing scalability for emerging sequencing technologies. STAR generates output files that can be used for many downstream analyses such as transcript/gene expression quantification, differential gene expression, novel isoform reconstruction, signal visualization, and so forth. In this unit we describe computational protocols that produce various output files, use different RNA-seq datatypes, and utilize different mapping strategies. STAR is open source software that can be run on Unix, Linux or Mac OS X systems. PMID:26334920

  11. Mapping RNA-seq Reads with STAR.

    PubMed

    Dobin, Alexander; Gingeras, Thomas R

    2015-09-03

    Mapping of large sets of high-throughput sequencing reads to a reference genome is one of the foundational steps in RNA-seq data analysis. The STAR software package performs this task with high levels of accuracy and speed. In addition to detecting annotated and novel splice junctions, STAR is capable of discovering more complex RNA sequence arrangements, such as chimeric and circular RNA. STAR can align spliced sequences of any length with moderate error rates, providing scalability for emerging sequencing technologies. STAR generates output files that can be used for many downstream analyses such as transcript/gene expression quantification, differential gene expression, novel isoform reconstruction, and signal visualization. In this unit, we describe computational protocols that produce various output files, use different RNA-seq datatypes, and utilize different mapping strategies. STAR is open source software that can be run on Unix, Linux, or Mac OS X systems. Copyright © 2015 John Wiley & Sons, Inc.

  12. Development of a Protein Standard Absolute Quantification (PSAQ™) assay for the quantification of Staphylococcus aureus enterotoxin A in serum.

    PubMed

    Adrait, Annie; Lebert, Dorothée; Trauchessec, Mathieu; Dupuis, Alain; Louwagie, Mathilde; Masselon, Christophe; Jaquinod, Michel; Chevalier, Benoît; Vandenesch, François; Garin, Jérôme; Bruley, Christophe; Brun, Virginie

    2012-06-06

    Enterotoxin A (SEA) is a staphylococcal virulence factor which is suspected to worsen septic shock prognosis. However, the presence of SEA in the blood of sepsis patients has never been demonstrated. We have developed a mass spectrometry-based assay for the targeted and absolute quantification of SEA in serum. To enhance sensitivity and specificity, we combined immunoaffinity-based sample preparation with mass spectrometry analysis in the selected reaction monitoring (SRM) mode. Absolute quantification of SEA was performed using the PSAQ™ method (Protein Standard Absolute Quantification), which uses full-length isotope-labeled SEA as an internal standard. The lower limit of detection (LLOD) and lower limit of quantification (LLOQ) were estimated at 352 pg/mL and 1057 pg/mL, respectively. SEA recovery after immunocapture was determined to be 7.8±1.4%. Therefore, we assumed that less than 1 femtomole of each SEA proteotypic peptide was injected on the liquid chromatography column before SRM analysis. From a 6-point titration experiment, quantification accuracy was determined to be 77%, and precision at the LLOQ was lower than 5%. With this sensitive PSAQ-SRM assay, we expect to contribute to deciphering the pathophysiological role of SEA in severe sepsis. This article is part of a Special Issue entitled: Proteomics: The clinical link. Copyright © 2011 Elsevier B.V. All rights reserved.
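
    The quantification step in a PSAQ-style assay is ordinary isotope-dilution arithmetic: the endogenous ("light") amount is the light/heavy peak-area ratio scaled by the known amount of spiked isotope-labeled standard ("heavy"). A minimal sketch with hypothetical SRM peak areas, not the assay's full calibration pipeline:

```python
def psaq_quantify(area_light, area_heavy, standard_amount_fmol):
    """Isotope-dilution quantification: the endogenous ('light') analyte
    amount equals the light/heavy SRM peak-area ratio multiplied by the
    known amount of spiked isotope-labelled standard ('heavy').

    Because standard and analyte co-elute and co-fragment, losses during
    immunocapture and LC affect both equally and cancel in the ratio."""
    return (area_light / area_heavy) * standard_amount_fmol

# Hypothetical SRM peak areas for one SEA proteotypic peptide,
# with 10 fmol of labelled standard spiked into the serum sample
amount = psaq_quantify(area_light=2.4e5, area_heavy=6.0e5,
                       standard_amount_fmol=10.0)
```

    The self-correcting ratio is why a full-length labeled protein standard, spiked before sample preparation, gives absolute quantification despite the low (7.8%) immunocapture recovery.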

  13. Emphysema quantification and lung volumetry in chest X-ray equivalent ultralow dose CT - Intra-individual comparison with standard dose CT.

    PubMed

    Messerli, Michael; Ottilinger, Thorsten; Warschkow, René; Leschka, Sebastian; Alkadhi, Hatem; Wildermuth, Simon; Bauer, Ralf W

    2017-06-01

    To determine whether ultralow dose chest CT with tin filtration can be used for emphysema quantification and lung volumetry, and to assess differences in emphysema measurements and lung volume between standard dose and ultralow dose CT scans using advanced modeled iterative reconstruction (ADMIRE). 84 consecutive patients from a prospective, IRB-approved single-center study were included and underwent clinically indicated standard dose chest CT (1.7±0.6 mSv) and an additional single-energy ultralow dose CT (0.14±0.01 mSv) at 100 kV and fixed tube current of 70 mAs with tin filtration in the same session. Forty of the 84 patients (48%) had no emphysema; 44 (52%) had emphysema. One radiologist performed fully automated software-based pulmonary emphysema quantification and lung volumetry of standard and ultralow dose CT with different levels of ADMIRE. The Friedman test and Wilcoxon rank sum test were used for multiple comparisons of emphysema and lung volume. Lung volumes were compared using the concordance correlation coefficient. The median low-attenuation area (LAA) using filtered back projection (FBP) in standard dose was 4.4% and decreased to 2.6%, 2.1% and 1.8% using ADMIRE 3, 4 and 5, respectively. The median LAA values in ultralow dose CT were 5.7%, 4.1% and 2.4% for ADMIRE 3, 4 and 5, respectively. There was no statistically significant difference between LAA in standard dose CT using FBP and ultralow dose CT using ADMIRE 4 (p=0.358), or between standard dose CT using ADMIRE 3 and ultralow dose CT using ADMIRE 5 (p=0.966). In comparison with standard dose FBP, the concordance correlation coefficients of lung volumetry were 1.000, 0.999 and 0.999 for ADMIRE 3, 4 and 5 in standard dose, and 0.972 for ADMIRE 3, 4 and 5 in ultralow dose CT. Ultralow dose CT at chest X-ray equivalent dose levels allows for lung volumetry as well as detection and quantification of emphysema. However, longitudinal emphysema analyses should be performed with the same scan protocol and reconstruction algorithms to ensure reproducibility. Copyright © 2017 Elsevier B.V. All rights reserved.
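
    Software-based LAA quantification of this kind reduces to voxel counting: the index is the percentage of lung voxels whose attenuation falls below a fixed threshold (commonly −950 HU for emphysema; the abstract does not state the threshold used, so it is an assumption here). A minimal sketch:

```python
import numpy as np

def laa_percent(hu_volume, lung_mask, threshold=-950):
    """Low-attenuation area (LAA) emphysema index.

    Returns the percentage of voxels inside the lung mask whose CT
    attenuation (in Hounsfield units) is below the given threshold."""
    voxels = np.asarray(hu_volume)[np.asarray(lung_mask, bool)]
    return 100.0 * np.mean(voxels < threshold)
```

    The sensitivity of this simple threshold count to image noise is exactly why the reported LAA values shift with the reconstruction algorithm (FBP vs. ADMIRE levels) and why longitudinal comparisons need a fixed protocol.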

  14. 78 FR 52898 - Science-Based Methods for Entity-Scale Quantification of Greenhouse Gas Sources and Sinks From...

    Federal Register 2010, 2011, 2012, 2013, 2014

    2013-08-27

    ... DEPARTMENT OF AGRICULTURE [Docket Number: USDA-2013-0003] Science-Based Methods for Entity-Scale Quantification of Greenhouse Gas Sources and Sinks From Agriculture and Forestry Practices AGENCY: Office of the... of Agriculture (USDA) has prepared a report containing methods for quantifying entity-scale...

  15. Towards machine learned quality control: A benchmark for sharpness quantification in digital pathology.

    PubMed

    Campanella, Gabriele; Rajanna, Arjun R; Corsale, Lorraine; Schüffler, Peter J; Yagi, Yukako; Fuchs, Thomas J

    2018-04-01

    Pathology is on the verge of a profound change from an analog and qualitative to a digital and quantitative discipline. This change is mostly driven by the high-throughput scanning of microscope slides in modern pathology departments, reaching tens of thousands of digital slides per month. The resulting vast digital archives form the basis of clinical use in digital pathology and allow large-scale machine learning in computational pathology. One of the most crucial bottlenecks of high-throughput scanning is quality control (QC). Currently, digital slides are screened manually to detect out-of-focus regions, to compensate for the limitations of scanner software. We present a solution to this problem by introducing a benchmark dataset for blur detection and an in-depth comparison of state-of-the-art sharpness descriptors and their prediction performance within a random forest framework. Furthermore, we show that convolutional neural networks, like residual networks, can be used to train blur detectors from scratch. We thoroughly evaluate the accuracy of feature-based and deep-learning-based approaches for sharpness classification (99.74% accuracy) and regression (MSE 0.004) and additionally compare them to domain experts in a comprehensive human perception study. Our pipeline outputs spatial heatmaps enabling the quantification and localization of blurred areas on a slide. Finally, we tested the proposed framework in the clinical setting and demonstrate superior performance over the state-of-the-art QC pipeline comprising commercial software and human expert inspection, reducing the error rate from 17% to 4.7%. Copyright © 2017. Published by Elsevier Ltd.
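
    A classical baseline among sharpness descriptors is the variance of the Laplacian: in-focus tiles have strong second derivatives, so low variance flags blur. A minimal numpy-only sketch of that idea (an illustrative descriptor, not necessarily one of the paper's benchmarked set):

```python
import numpy as np

def laplacian_variance(img):
    """Variance-of-Laplacian sharpness score for a grayscale image tile.

    Applies the 4-neighbour discrete Laplacian to the interior pixels and
    returns the variance of the response; blurred tiles score low."""
    img = np.asarray(img, float)
    lap = (-4 * img[1:-1, 1:-1]
           + img[:-2, 1:-1] + img[2:, 1:-1]     # vertical neighbours
           + img[1:-1, :-2] + img[1:-1, 2:])    # horizontal neighbours
    return lap.var()
```

    Scoring every tile of a whole-slide image with such a descriptor and arranging the scores on a grid yields exactly the kind of spatial blur heatmap the pipeline outputs.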

  16. SiMA: A simplified migration assay for analyzing neutrophil migration.

    PubMed

    Weckmann, Markus; Becker, Tim; Nissen, Gyde; Pech, Martin; Kopp, Matthias V

    2017-07-01

    In lung inflammation, neutrophils are the first leukocytes migrating to an inflammatory site, eliminating pathogens by multiple mechanisms. The term "migration" describes several stages of neutrophil movement to reach the site of inflammation, of which passage through the interstitium and basal membrane of the airway is necessary to reach the site of bronchial inflammation. Several methods exist (e.g., Boyden chamber, under-agarose assay, or microfluidic systems) to assess neutrophil mobility. However, these methods do not allow for parameterization at the single-cell level; individual neutrophil pathway analysis is still considered challenging. This study sought to develop a simplified yet flexible method to monitor and quantify neutrophil chemotaxis by utilizing commercially available tissue culture hardware, simple video microscopic equipment and highly standardized tracking. A chemotaxis 3D µ-slide (IBIDI) was used with different chemoattractants [interleukin-8 (IL-8), fMLP, and leukotriene B4 (LTB4)] to attract neutrophils in different matrices such as fibronectin (FN) or human placental matrix (HEM). Migration was recorded for 60 min using phase contrast microscopy with an EVOS® FL Cell Imaging System. The images were normalized, and texture-based image segmentation was used to generate neutrophil trajectories. Based on this spatio-temporal information, a comprehensive parameter set is extracted from each time series describing neutrophil motility, including velocity, directness and chemotaxis. To characterize the latter, a sector analysis was employed, enabling quantification of the neutrophil response to the chemoattractant. Using this hardware and software framework, we were able to identify typical migration profiles for the chemoattractants IL-8, fMLP and LTB4, the effect of the matrices FN versus HEM, and the response to different medications: Prednisolone induced a change of direction of migrating neutrophils in FN, but no such effect was observed in human placental matrix. In addition, neutrophils of asthmatic individuals showed an increased proportion of cells migrating toward the vehicle; a comparison of four asthmatic and three non-asthmatic patients gives a first hint of the capability of the SiMA assay in the context of migration-based diagnostics. With SiMA we present a simplified yet flexible platform for cost-effective tracking and quantification of neutrophil migration, based on a simple microscopic video stage, standardized, commercially available µ-fluidic migration chambers, and automated image analysis and track validation software. © 2017 International Society for Advancement of Cytometry.
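
    The velocity and directness parameters extracted from each trajectory have standard definitions in cell tracking: mean frame-to-frame speed, and the ratio of straight-line displacement to accumulated path length. A minimal sketch, assuming positions sampled at a fixed interval (this is the textbook definition, not SiMA's exact implementation):

```python
import numpy as np

def track_metrics(points, dt):
    """Per-track motility metrics from (x, y) positions sampled every dt.

    velocity:   mean frame-to-frame speed (path length / elapsed time);
    directness: straight-line start-to-end distance divided by the
                accumulated path length (1.0 = perfectly straight)."""
    pts = np.asarray(points, float)
    steps = np.linalg.norm(np.diff(pts, axis=0), axis=1)  # per-frame moves
    path = steps.sum()
    straight = np.linalg.norm(pts[-1] - pts[0])
    velocity = path / (dt * len(steps))
    directness = straight / path if path > 0 else 0.0
    return velocity, directness
```

    A direction change of the kind Prednisolone induced would show up not in these two scalars but in the sector analysis, which bins the displacement vectors by angle relative to the chemoattractant gradient.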

  17. Development of a screening method for genetically modified soybean by plasmid-based quantitative competitive polymerase chain reaction.

    PubMed

    Shimizu, Eri; Kato, Hisashi; Nakagawa, Yuki; Kodama, Takashi; Futo, Satoshi; Minegishi, Yasutaka; Watanabe, Takahiro; Akiyama, Hiroshi; Teshima, Reiko; Furui, Satoshi; Hino, Akihiro; Kitta, Kazumi

    2008-07-23

    A novel quantitative competitive polymerase chain reaction (QC-PCR) system for the detection and quantification of Roundup Ready soybean (RRS) was developed. The system was designed to take advantage of a fully validated real-time PCR method used for the quantification of RRS in Japan. A plasmid was constructed as a competitor for the detection and quantification of the genetically modified soy RRS. The plasmid contained the construct-specific sequence of RRS and the taxon-specific sequence of lectin1 (Le1), each carrying a 21 bp oligonucleotide insertion. The plasmid DNA was used as a reference molecule instead of ground seeds, which enabled us to precisely and stably adjust the copy number of targets. The present study demonstrated that the novel plasmid-based QC-PCR method could be a simple and feasible alternative to the real-time PCR method used for the quantification of genetically modified organism content.

  18. Arkas: Rapid reproducible RNAseq analysis

    PubMed Central

    Colombo, Anthony R.; J. Triche Jr, Timothy; Ramsingh, Giridharan

    2017-01-01

    The recently introduced Kallisto pseudoaligner has radically simplified the quantification of transcripts in RNA-sequencing experiments. We offer the cloud-scale RNA-seq pipelines Arkas-Quantification and Arkas-Analysis within Illumina's BaseSpace cloud application platform, which expedite Kallisto preparatory routines, reliably calculate differential expression, and perform gene-set enrichment of REACTOME pathways. To address the inherent inefficiencies of scale, Illumina's BaseSpace computing platform offers a massively parallel distributed environment that improves data management services and data importing. Arkas-Quantification deploys Kallisto for parallel cloud computations and is conveniently integrated downstream from the BaseSpace Sequence Read Archive (SRA) import/conversion application titled SRA Import. Arkas-Analysis annotates the Kallisto results by extracting structured information directly from source FASTA files with per-contig metadata, and calculates differential expression and gene-set enrichment on both coding genes and transcripts. The Arkas cloud pipeline supports ENSEMBL transcriptomes and can be used downstream from SRA Import, facilitating raw sequencing import, SRA FASTQ conversion, RNA quantification and analysis. PMID:28868134

  19. Spot quantification in two dimensional gel electrophoresis image analysis: comparison of different approaches and presentation of a novel compound fitting algorithm

    PubMed Central

    2014-01-01

    Background Various computer-based methods exist for the detection and quantification of protein spots in two dimensional gel electrophoresis images. Area-based methods are commonly used for spot quantification: an area is assigned to each spot and the sum of the pixel intensities in that area, the so-called volume, is used as a measure of spot signal. Other methods use the optical density, i.e. the intensity of the most intense pixel of a spot, or calculate the volume from the parameters of a fitted function. Results In this study we compare the performance of different spot quantification methods using synthetic and real data. We propose a ready-to-use algorithm for spot detection and quantification that uses fitting of two dimensional Gaussian function curves for the extraction of data from two dimensional gel electrophoresis (2-DE) images. The algorithm implements fitting using logical compounds and is computationally efficient. The applicability of the compound fitting algorithm was evaluated on various simulated data and compared with other quantification approaches. We provide evidence that even if an incorrect bell-shaped function is used, the fitting method is superior to other approaches, especially when spots overlap. Finally, we validated the method with experimental data of urea-based 2-DE of Aβ peptides and re-analyzed published data sets. Our method showed higher precision and accuracy than other approaches when applied to exposure time series and standard gels. Conclusion Compound fitting as a quantification method for 2-DE spots shows several advantages over other approaches and can be combined with various spot detection methods. The algorithm was scripted in MATLAB (Mathworks) and is available as a supplemental file. PMID:24915860
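
    The idea of computing spot volume from fitted Gaussian parameters can be sketched as follows. Because the log of an axis-aligned 2D Gaussian is a quadratic in x and y, a background-free spot can be fitted by plain linear least squares, and the volume follows analytically as 2π·A·σx·σy. This toy version ignores background, noise robustness and overlapping spots, which are exactly what the compound fitting algorithm is designed to handle:

```python
import numpy as np

def fit_gaussian_spot(img):
    """Fit an axis-aligned 2D Gaussian A*exp(-(x-x0)^2/(2*sx^2)
    - (y-y0)^2/(2*sy^2)) to a background-free spot image.

    log(I) is quadratic in x and y, so the parameters are recovered by
    linear least squares on log-intensities; the spot volume is then the
    analytic integral 2*pi*A*sx*sy."""
    y, x = np.indices(img.shape)
    mask = img > img.max() * 0.05            # avoid log of near-zero pixels
    X = np.column_stack([np.ones(mask.sum()), x[mask], y[mask],
                         x[mask] ** 2, y[mask] ** 2])
    coef, *_ = np.linalg.lstsq(X, np.log(img[mask]), rcond=None)
    c0, bx, by, ax, ay = coef
    sx, sy = np.sqrt(-0.5 / ax), np.sqrt(-0.5 / ay)
    x0, y0 = bx * sx ** 2, by * sy ** 2
    A = np.exp(c0 + 0.5 * (x0 / sx) ** 2 + 0.5 * (y0 / sy) ** 2)
    return A, x0, y0, sx, sy, 2 * np.pi * A * sx * sy
```

    Computing the volume from fitted parameters rather than by summing pixels inside an assigned area is what makes fitting-based quantification robust when neighbouring spots overlap.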

  20. Quantification of Lignin and Its Structural Features in Plant Biomass Using 13C Lignin as Internal Standard for Pyrolysis-GC-SIM-MS.

    PubMed

    van Erven, Gijs; de Visser, Ries; Merkx, Donny W H; Strolenberg, Willem; de Gijsel, Peter; Gruppen, Harry; Kabel, Mirjam A

    2017-10-17

    Understanding the mechanisms underlying plant biomass recalcitrance at the molecular level can only be achieved by accurate analyses of both the content and structural features of the molecules involved. Current quantification of lignin is, however, largely based on nonspecific gravimetric analysis after sulfuric acid hydrolysis. Hence, our research aimed at specific lignin quantification with concurrent characterization of its structural features. To this end, for the first time, a polymeric 13C lignin was used as internal standard (IS) for lignin quantification via analytical pyrolysis coupled to gas chromatography with mass-spectrometric detection in selected ion monitoring mode (py-GC-SIM-MS). In addition, relative response factors (RRFs) for the various pyrolysis products obtained were determined and applied. First, 12C and 13C lignin were isolated from nonlabeled and uniformly 13C-labeled wheat straw, respectively, and characterized by heteronuclear single quantum coherence (HSQC) nuclear magnetic resonance (NMR) spectroscopy and py-GC/MS. The two lignin isolates were found to have identical structures. Second, 13C-IS based lignin quantification by py-GC-SIM-MS was validated in reconstituted biomass model systems with known contents of the 12C lignin analogue and was shown to be extremely accurate (>99.9%, R2 > 0.999) and precise (RSD < 1.5%). Third, 13C-IS based lignin quantification was applied to four common poaceous biomass sources (wheat straw, barley straw, corn stover, and sugar cane bagasse), and lignin contents were in good agreement with the total gravimetrically determined lignin contents. Our robust method proves to be a promising alternative for the high-throughput quantification of lignin in milled biomass samples and simultaneously provides direct insight into the structural features of lignin.
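
    The internal-standard arithmetic behind 13C-IS quantification can be sketched simply. This assumes, as an illustration rather than the authors' exact calibration, that each pyrolysis product's 12C/13C peak-area ratio, corrected by its relative response factor (RRF), estimates the sample-to-standard lignin ratio, and that the per-product estimates are averaged:

```python
def lignin_content(areas_12c, areas_13c, rrf, is_mg, sample_mg):
    """Internal-standard lignin quantification sketch for py-GC-SIM-MS.

    areas_12c / areas_13c: SIM peak areas per pyrolysis product for the
    sample (12C) and the spiked 13C lignin internal standard;
    rrf: relative response factor per product. Each RRF-corrected area
    ratio estimates sample lignin relative to the known spiked amount;
    estimates are averaged and expressed as % (w/w) of the sample."""
    ratios = [(areas_12c[p] / areas_13c[p]) / rrf[p] for p in areas_12c]
    lignin_mg = is_mg * sum(ratios) / len(ratios)
    return 100.0 * lignin_mg / sample_mg
```

    Because the 13C standard is a structurally identical polymer pyrolyzed alongside the sample, matrix effects and pyrolysis yield variations affect both isotopologues alike and largely cancel in the area ratios.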

  1. Quantification of Lignin and Its Structural Features in Plant Biomass Using 13C Lignin as Internal Standard for Pyrolysis-GC-SIM-MS

    PubMed Central

    2017-01-01

    Understanding the mechanisms underlying plant biomass recalcitrance at the molecular level can only be achieved by accurate analyses of both the content and structural features of the molecules involved. Current quantification of lignin is, however, largely based on nonspecific gravimetric analysis after sulfuric acid hydrolysis. Hence, our research aimed at specific lignin quantification with concurrent characterization of its structural features. Hereto, for the first time, a polymeric 13C lignin was used as internal standard (IS) for lignin quantification via analytical pyrolysis coupled to gas chromatography with mass-spectrometric detection in selected ion monitoring mode (py-GC-SIM-MS). In addition, relative response factors (RRFs) for the various pyrolysis products obtained were determined and applied. First, 12C and 13C lignin were isolated from nonlabeled and uniformly 13C labeled wheat straw, respectively, and characterized by heteronuclear single quantum coherence (HSQC) nuclear magnetic resonance (NMR) and py-GC/MS. The two lignin isolates were found to have identical structures. Second, 13C-IS based lignin quantification by py-GC-SIM-MS was validated in reconstituted biomass model systems with known contents of the 12C lignin analogue and was shown to be extremely accurate (>99.9%, R2 > 0.999) and precise (RSD < 1.5%). Third, 13C-IS based lignin quantification was applied to four common poaceous biomass sources (wheat straw, barley straw, corn stover, and sugar cane bagasse), and lignin contents were in good agreement with the total gravimetrically determined lignin contents. Our robust method proves to be a promising alternative for the high-throughput quantification of lignin in milled biomass samples and simultaneously provides direct insight into the structural features of lignin. PMID:28926698
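
    The arithmetic behind an isotopically labeled internal standard is simple enough to sketch. The peak areas, RRFs and amounts below are invented for illustration and are not taken from the study:

```python
# Hypothetical sketch of 13C internal-standard quantification by
# py-GC-SIM-MS; all peak areas, RRFs and amounts are illustrative.

def lignin_content(peak_areas, rrf, is_amount_mg, sample_mg):
    """Estimate lignin content (% w/w) from 12C/13C SIM peak-area pairs.

    peak_areas: dict product -> (area_12C, area_13C)
    rrf:        dict product -> 12C/13C relative response factor
    """
    # Each pyrolysis product yields an independent estimate of the
    # 12C-lignin : 13C-IS mass ratio; average the estimates.
    ratios = [(a12 / a13) / rrf[p] for p, (a12, a13) in peak_areas.items()]
    mass_ratio = sum(ratios) / len(ratios)
    return 100.0 * mass_ratio * is_amount_mg / sample_mg

areas = {"guaiacol": (1.5e6, 1.0e6), "syringol": (2.1e6, 1.4e6)}
rrfs = {"guaiacol": 1.0, "syringol": 1.0}
content = lignin_content(areas, rrfs, is_amount_mg=0.2, sample_mg=1.0)
```

    In this toy case both products give a 1.5:1 mass ratio against 0.2 mg of IS in a 1 mg sample, i.e. 30% lignin.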

  2. Space Shuttle Main Engine Quantitative Risk Assessment: Illustrating Modeling of a Complex System with a New QRA Software Package

    NASA Technical Reports Server (NTRS)

    Smart, Christian

    1998-01-01

    During 1997, a team from Hernandez Engineering, MSFC, Rocketdyne, Thiokol, Pratt & Whitney, and USBI completed the first phase of a two-year Quantitative Risk Assessment (QRA) of the Space Shuttle. The models for the Shuttle systems were entered and analyzed by a new QRA software package. This system, termed the Quantitative Risk Assessment System (QRAS), was designed by NASA and programmed by the University of Maryland. The software is a groundbreaking PC-based risk assessment package that allows the user to model complex systems in a hierarchical fashion. Features of the software include the ability to easily select quantifications of failure modes, draw Event Sequence Diagrams (ESDs) interactively, perform uncertainty and sensitivity analysis, and document the modeling. This paper illustrates both the approach used in modeling and the particular features of the software package. The software is general and can be used in a QRA of any complex engineered system. The author is the project lead for the modeling of the Space Shuttle Main Engines (SSMEs), and this paper focuses on the modeling completed for the SSMEs during 1997. In particular, the groundrules for the study, the databases used, the way in which ESDs were used to model catastrophic failure of the SSMEs, the methods used to quantify the failure rates, and how QRAS was used in the modeling effort are discussed. Groundrules were necessary to limit the scope of such a complex study, especially with regard to a liquid rocket engine such as the SSME, which can be shut down after ignition either on the pad or in flight. The SSME was divided into its constituent components and subsystems. These were ranked on the basis of the possibility of being upgraded and risk of catastrophic failure. Once this was done, the Shuttle program Hazard Analysis and Failure Modes and Effects Analysis (FMEA) were used to create a list of potential failure modes to be modeled. The groundrules and other criteria were used to screen out the many failure modes that did not contribute significantly to the catastrophic risk. The Hazard Analysis and FMEA for the SSME were also used to build ESDs that show the chain of events leading from the failure mode occurrence to one of the following end states: catastrophic failure, engine shutdown, or successful operation (successful with respect to the failure mode under consideration).
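
    For a single failure mode, an event sequence diagram of the kind described reduces to multiplying branch probabilities along each path to an end state. A toy sketch with made-up probabilities (this is not the QRAS tool):

```python
# Illustrative sketch, not QRAS: propagate a failure-mode probability
# through a simple event sequence diagram to the three end states used
# in the study. All probabilities are hypothetical.

def esd_end_states(p_failure_mode, p_detect, p_safe_shutdown):
    # Undetected failures, and detected ones that cannot be shut down
    # safely, end catastrophically; the rest end in engine shutdown.
    p_cat = (p_failure_mode * (1 - p_detect)
             + p_failure_mode * p_detect * (1 - p_safe_shutdown))
    p_shutdown = p_failure_mode * p_detect * p_safe_shutdown
    p_ok = 1 - p_failure_mode
    return {"catastrophic": p_cat, "shutdown": p_shutdown, "success": p_ok}

states = esd_end_states(p_failure_mode=1e-4, p_detect=0.9,
                        p_safe_shutdown=0.99)
```

    The three end-state probabilities always sum to one, which is a useful sanity check when screening many failure modes.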

  3. ProteoSign: an end-user online differential proteomics statistical analysis platform.

    PubMed

    Efstathiou, Georgios; Antonakis, Andreas N; Pavlopoulos, Georgios A; Theodosiou, Theodosios; Divanach, Peter; Trudgian, David C; Thomas, Benjamin; Papanikolaou, Nikolas; Aivaliotis, Michalis; Acuto, Oreste; Iliopoulos, Ioannis

    2017-07-03

    Profiling of proteome dynamics is crucial for understanding cellular behavior in response to intrinsic and extrinsic stimuli and maintenance of homeostasis. Over the last 20 years, mass spectrometry (MS) has emerged as the most powerful tool for large-scale identification and characterization of proteins. Bottom-up proteomics, the most common MS-based proteomics approach, has always been challenging in terms of data management, processing, analysis and visualization, with modern instruments capable of producing several gigabytes of data from a single experiment. Here, we present ProteoSign, a freely available web application dedicated to allowing users to perform proteomics differential expression/abundance analysis in a user-friendly and self-explanatory way. Although several non-commercial standalone tools have been developed for post-quantification statistical analysis of proteomics data, most of them are not appealing to end users, as they often require the installation of programming environments and third-party software packages, and sometimes further scripting or computer programming. To avoid this bottleneck, we have developed a user-friendly software platform accessible via a web interface in order to enable proteomics laboratories and core facilities to statistically analyse quantitative proteomics data sets in a resource-efficient manner. ProteoSign is available at http://bioinformatics.med.uoc.gr/ProteoSign and the source code at https://github.com/yorgodillo/ProteoSign. © The Author(s) 2017. Published by Oxford University Press on behalf of Nucleic Acids Research.
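
    The core statistical step such platforms automate is a per-protein two-condition test. A minimal sketch, assuming log2-transformed intensities; all values are invented:

```python
# Minimal sketch of a two-condition differential-abundance test of the
# kind a platform like ProteoSign runs per protein; values are invented.
import math

def welch_t(xs, ys):
    """Welch's t statistic and mean difference (log2 fold change)."""
    mx, my = sum(xs) / len(xs), sum(ys) / len(ys)
    vx = sum((x - mx) ** 2 for x in xs) / (len(xs) - 1)
    vy = sum((y - my) ** 2 for y in ys) / (len(ys) - 1)
    t = (mx - my) / math.sqrt(vx / len(xs) + vy / len(ys))
    return t, mx - my

ctrl = [20.1, 20.3, 19.9]      # log2 protein intensities, condition A
treated = [22.0, 22.4, 21.8]   # condition B
t, log2fc = welch_t(treated, ctrl)
```

    A real pipeline would add multiple-testing correction across all proteins; the point here is only the per-protein statistic.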

  4. hEIDI: An Intuitive Application Tool To Organize and Treat Large-Scale Proteomics Data.

    PubMed

    Hesse, Anne-Marie; Dupierris, Véronique; Adam, Claire; Court, Magali; Barthe, Damien; Emadali, Anouk; Masselon, Christophe; Ferro, Myriam; Bruley, Christophe

    2016-10-07

    Advances in high-throughput proteomics have led to a rapid increase in the number, size, and complexity of the associated data sets. Managing and extracting reliable information from such large series of data sets require the use of dedicated software organized in a consistent pipeline to reduce, validate, exploit, and ultimately export data. The compilation of multiple mass-spectrometry-based identification and quantification results obtained in the context of a large-scale project represents a real challenge for developers of bioinformatics solutions. In response to this challenge, we developed a dedicated software suite called hEIDI to manage and combine both identifications and semiquantitative data related to multiple LC-MS/MS analyses. This paper describes how, through a user-friendly interface, hEIDI can be used to compile analyses and retrieve lists of nonredundant protein groups. Moreover, hEIDI allows direct comparison of series of analyses, on the basis of protein groups, while ensuring consistent protein inference and also computing spectral counts. hEIDI ensures that validated results are compliant with MIAPE guidelines as all information related to samples and results is stored in appropriate databases. Thanks to the database structure, validated results generated within hEIDI can be easily exported in the PRIDE XML format for subsequent publication. hEIDI can be downloaded from http://biodev.extra.cea.fr/docs/heidi .

  5. mzStudio: A Dynamic Digital Canvas for User-Driven Interrogation of Mass Spectrometry Data.

    PubMed

    Ficarro, Scott B; Alexander, William M; Marto, Jarrod A

    2017-08-01

    Although not yet truly 'comprehensive', modern mass spectrometry-based experiments can generate quantitative data for a meaningful fraction of the human proteome. Importantly for large-scale protein expression analysis, robust data pipelines are in place for identification of un-modified peptide sequences and aggregation of these data to protein-level quantification. However, interoperable software tools that enable scientists to computationally explore and document novel hypotheses for peptide sequence, modification status, or fragmentation behavior are not well-developed. Here, we introduce mzStudio, an open-source Python module built on our multiplierz project. This desktop application provides a highly-interactive graphical user interface (GUI) through which scientists can examine and annotate spectral features, re-search existing PSMs to test different modifications or new spectral matching algorithms, share results with colleagues, integrate other domain-specific software tools, and finally create publication-quality graphics. mzStudio leverages our common application programming interface (mzAPI) for access to native data files from multiple instrument platforms, including ion trap, quadrupole time-of-flight, Orbitrap, matrix-assisted laser desorption ionization, and triple quadrupole mass spectrometers and is compatible with several popular search engines including Mascot, Proteome Discoverer, X!Tandem, and Comet. The mzStudio toolkit enables researchers to create a digital provenance of data analytics and other evidence that support specific peptide sequence assignments.

  6. Single Color Multiplexed ddPCR Copy Number Measurements and Single Nucleotide Variant Genotyping.

    PubMed

    Wood-Bouwens, Christina M; Ji, Hanlee P

    2018-01-01

    Droplet digital PCR (ddPCR) allows for accurate quantification of genetic events such as copy number variation and single nucleotide variants. Probe-based assays represent the current "gold-standard" for detection and quantification of these genetic events. Here, we introduce a cost-effective single color ddPCR assay that allows for single genome resolution quantification of copy number and single nucleotide variation.
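
    Quantification in ddPCR rests on Poisson statistics: the fraction of negative droplets fixes the mean number of copies per droplet. A sketch using the textbook formula; the droplet counts and volume are illustrative, not from this assay:

```python
# Textbook ddPCR Poisson correction (not the authors' specific assay):
# the negative-droplet fraction fixes the mean copies per droplet.
import math

def ddpcr_copies_per_ul(n_total, n_positive, droplet_nl=0.85):
    """Copies per microlitre; 0.85 nL is a typical droplet volume."""
    frac_negative = (n_total - n_positive) / n_total
    lam = -math.log(frac_negative)     # mean copies per droplet
    return lam / (droplet_nl * 1e-3)   # nL -> uL

conc = ddpcr_copies_per_ul(n_total=20000, n_positive=5000)
```

    The logarithm corrects for droplets that received more than one copy, which is why simple positive-droplet counting underestimates concentration.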

  7. Agreement between clinical estimation and a new quantitative analysis by Photoshop software in fundus and angiographic image variables.

    PubMed

    Ramezani, Alireza; Ahmadieh, Hamid; Azarmina, Mohsen; Soheilian, Masoud; Dehghan, Mohammad H; Mohebbi, Mohammad R

    2009-12-01

    To evaluate the validity of a new method for the quantitative analysis of fundus or angiographic images using Photoshop 7.0 (Adobe, USA) software by comparing it with clinical evaluation. Four hundred and eighteen fundus and angiographic images of diabetic patients were evaluated by three retina specialists and then by computing using Photoshop 7.0 software. Four variables were selected for comparison: amount of hard exudates (HE) on color pictures, amount of HE on red-free pictures, severity of leakage, and the size of the foveal avascular zone (FAZ). The coefficients of agreement (kappa) between the two methods for the amount of HE on color and red-free photographs were 85% (0.69) and 79% (0.59), respectively. The agreement for severity of leakage was 72% (0.46). For the evaluation of the FAZ size using the magic wand and lasso software tools, the agreement was 54% (0.09) and 89% (0.77), respectively. Agreement in the estimation of the FAZ size by the magnetic lasso tool was excellent and was almost as good in the quantification of HE on color and on red-free images. Considering the agreement of this new technique for the measurement of variables in fundus images using Photoshop software with the clinical evaluation, this method seems to have sufficient validity to be used for the quantitative analysis of HE, leakage, and FAZ size on the angiograms of diabetic patients.
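
    The agreement figures above pair a raw percentage with Cohen's kappa, which discounts chance agreement. A sketch of the standard formula, with an invented 2x2 confusion matrix:

```python
# Cohen's kappa, the statistic behind the agreement figures quoted
# above; the 2x2 confusion matrix here is invented for illustration.

def cohens_kappa(matrix):
    """matrix[i][j]: cases rated category i by one method, j by the other."""
    k = len(matrix)
    n = sum(sum(row) for row in matrix)
    p_obs = sum(matrix[i][i] for i in range(k)) / n
    row_tot = [sum(row) for row in matrix]
    col_tot = [sum(matrix[i][j] for i in range(k)) for j in range(k)]
    p_exp = sum(row_tot[i] * col_tot[i] for i in range(k)) / n ** 2
    return (p_obs - p_exp) / (1 - p_exp)

kappa = cohens_kappa([[40, 5], [10, 45]])   # 85% observed agreement
```

    With 85% observed agreement and 50% expected by chance, this toy matrix gives kappa = 0.7, close to the 0.69 reported for HE on color photographs.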

  8. Follow-up segmentation of lung tumors in PET and CT data

    NASA Astrophysics Data System (ADS)

    Opfer, Roland; Kabus, Sven; Schneider, Torben; Carlsen, Ingwer C.; Renisch, Steffen; Sabczynski, Jörg

    2009-02-01

    Early response assessment of cancer therapy is a crucial component towards a more effective and patient-individualized cancer therapy. Integrated PET/CT systems provide the opportunity to combine morphologic with functional information. We have developed algorithms which allow the user to track both tumor volume and standardized uptake value (SUV) measurements during the therapy from series of CT and PET images, respectively. To prepare for tumor volume estimation we have developed a new technique for a fast, flexible, and intuitive 3D definition of meshes. This initial surface is then automatically adapted by means of a model-based segmentation algorithm and propagated to each follow-up scan. If necessary, manual corrections can be added by the user. To determine SUV measurements a prioritized region growing algorithm is employed. For an improved workflow all algorithms are embedded in a PET/CT therapy monitoring software suite giving the clinician unified and immediate access to all data sets. Whenever the user clicks on a tumor in a baseline scan, the courses of segmented tumor volumes and SUV measurements are automatically identified and displayed to the user as a graph plot. According to each course, the therapy progress can be classified as complete or partial response or as progressive or stable disease. We have tested our methods with series of PET/CT data from 9 lung cancer patients acquired at Princess Margaret Hospital in Toronto. Each patient underwent three PET/CT scans during a radiation therapy. Our results indicate that a combination of mean metabolic activity in the tumor with the PET-based tumor volume can lead to earlier response detection than purely volume-based (CT diameter) or purely function-based (e.g., SUVmax or SUVmean) response measures. The new software seems applicable for easy, fast, and reproducible quantification to routinely monitor tumor therapy.
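
    The SUV measurements tracked by such software follow the standard body-weight PET definition: tissue activity concentration normalized by injected dose per body weight. A sketch with illustrative numbers (not patient data from this study):

```python
# Standard body-weight SUV definition (general PET formula, not
# specific to this software); all numbers are illustrative.

def suv(activity_bq_per_ml, injected_dose_bq, body_weight_g):
    # Assumes 1 g of tissue ~ 1 mL, as in the usual body-weight SUV.
    return activity_bq_per_ml / (injected_dose_bq / body_weight_g)

# e.g. 25 kBq/mL lesion uptake, 370 MBq injected, 70 kg patient
val = suv(activity_bq_per_ml=25e3, injected_dose_bq=370e6,
          body_weight_g=70e3)
```

    A value near 1 would indicate uptake similar to a uniform whole-body distribution; lesions typically sit well above that.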

  9. Materials integrity in microsystems: a framework for a petascale predictive-science-based multiscale modeling and simulation system

    NASA Astrophysics Data System (ADS)

    To, Albert C.; Liu, Wing Kam; Olson, Gregory B.; Belytschko, Ted; Chen, Wei; Shephard, Mark S.; Chung, Yip-Wah; Ghanem, Roger; Voorhees, Peter W.; Seidman, David N.; Wolverton, Chris; Chen, J. S.; Moran, Brian; Freeman, Arthur J.; Tian, Rong; Luo, Xiaojuan; Lautenschlager, Eric; Challoner, A. Dorian

    2008-09-01

    Microsystems have become an integral part of our lives and can be found in homeland security, medical science, aerospace applications and beyond. Many critical microsystem applications are in harsh environments, in which long-term reliability needs to be guaranteed and repair is not feasible. For example, gyroscope microsystems on satellites need to function for over 20 years under severe radiation, thermal cycling, and shock loading. Hence, predictive-science-based, verified and validated computational models and algorithms to predict the performance and materials integrity of microsystems in these situations are needed. Confidence in these predictions is improved by quantifying uncertainties and approximation errors. With no full-system testing and only limited subsystem testing, petascale computing is certainly necessary to span both time and space scales and to reduce the uncertainty in the prediction of long-term reliability. This paper presents the necessary steps to develop a predictive-science-based multiscale modeling and simulation system. The development of this system will be focused on the prediction of the long-term performance of a gyroscope microsystem. The environmental effects to be considered include radiation, thermo-mechanical cycling and shock. Since there will be many material performance issues, attention is restricted to creep resulting from thermal aging and radiation-enhanced mass diffusion, material instability due to radiation and thermo-mechanical cycling, and damage and fracture due to shock. To meet these challenges, we aim to develop an integrated multiscale software analysis system that spans the length scales from the atomistic scale to the scale of the device. The proposed software system will include molecular mechanics, phase field evolution, micromechanics and continuum mechanics software, and state-of-the-art model identification strategies where atomistic properties are calibrated by quantum calculations.
We aim to predict the long-term (in excess of 20 years) integrity of the resonator, electrode base, multilayer metallic bonding pads, and vacuum seals in a prescribed mission. Although multiscale simulations are efficient in the sense that they focus the most computationally intensive models and methods on only the portions of the space-time domain needed, the execution of the multiscale simulations associated with evaluating materials and device integrity for aerospace microsystems will require the application of petascale computing. A component-based software strategy will be used in the development of our massively parallel multiscale simulation system. This approach will allow us to take full advantage of existing single-scale modeling components. An extensive, pervasive thrust in the software system development is verification, validation, and uncertainty quantification (UQ). Each component and the integrated software system need to be carefully verified. A UQ methodology that determines the quality of predictive information available from experimental measurements and packages the information in a form suitable for UQ at various scales needs to be developed. Experiments to validate the model at the nanoscale, microscale, and macroscale are proposed. The development of a petascale predictive-science-based multiscale modeling and simulation system will advance the field of predictive multiscale science so that it can be used to reliably analyze problems of unprecedented complexity, where limited testing resources can be adequately replaced by petascale computational power and advanced verification, validation, and UQ methodologies.

  10. PyQuant: A Versatile Framework for Analysis of Quantitative Mass Spectrometry Data*

    PubMed Central

    Mitchell, Christopher J.; Kim, Min-Sik; Na, Chan Hyun; Pandey, Akhilesh

    2016-01-01

    Quantitative mass spectrometry data necessitates an analytical pipeline that captures the accuracy and comprehensiveness of the experiments. Currently, data analysis is often coupled to specific software packages, which restricts the analysis to a given workflow and precludes a more thorough characterization of the data by other complementary tools. To address this, we have developed PyQuant, a cross-platform mass spectrometry data quantification application that is compatible with existing frameworks and can be used as a stand-alone quantification tool. PyQuant supports most types of quantitative mass spectrometry data including SILAC, NeuCode, 15N, 13C, or 18O and chemical methods such as iTRAQ or TMT and provides the option of adding custom labeling strategies. In addition, PyQuant can perform specialized analyses such as quantifying isotopically labeled samples where the label has been metabolized into other amino acids and targeted quantification of selected ions independent of spectral assignment. PyQuant is capable of quantifying search results from popular proteomic frameworks such as MaxQuant, Proteome Discoverer, and the Trans-Proteomic Pipeline in addition to several standalone search engines. We have found that PyQuant routinely quantifies a greater proportion of spectral assignments, with increases ranging from 25–45% in this study. Finally, PyQuant is capable of complementing spectral assignments between replicates to quantify ions missed because of lack of MS/MS fragmentation or that were omitted because of issues such as spectra quality or false discovery rates. This results in an increase of biologically useful data available for interpretation. In summary, PyQuant is a flexible mass spectrometry data quantification platform that is capable of interfacing with a variety of existing formats and is highly customizable, which permits easy configuration for custom analysis. PMID:27231314

  11. Biofilm Quantification on Nasolacrimal Silastic Stents After Dacryocystorhinostomy.

    PubMed

    Murphy, Jae; Ali, Mohammed Javed; Psaltis, Alkis James

    2015-01-01

    Biofilms are now recognized as potential factors in the pathogenesis of chronic inflammatory and infective diseases. The aim of this study was to examine the presence of biofilms and quantify their biomass on silastic nasolacrimal duct stents inserted after dacryocystorhinostomy (DCR). A prospective study was performed on a series of patients undergoing DCR with O'Donoghue stent insertion. After removal, the stents were subjected to biofilm analysis using standard protocols of confocal laser scanning microscopy (CLSM) and scanning electron microscopy. These stents were compared against negative controls and positive in vitro ones established using Staphylococcus aureus strain ATCC 25923. Biofilm quantification was performed using the COMSTAT2 software and the total biofilm biomass was calculated. A total of nine consecutive patient samples were included in this prospective study. None of the patients had any evidence of postoperative infection. All the stents demonstrated evidence of biofilm formation using both imaging modalities. The presence of various different sized organisms within a common exopolysaccharide matrix on CLSM suggested the existence of polymicrobial communities. The mean biomass of patient samples was 0.9385 μm³/μm² (range: 0.3901-1.9511 μm³/μm²). This is the first study to report the quantification of biomass on lacrimal stents. The presence of biofilms on lacrimal stents after DCR is a common finding but this need not necessarily translate to postoperative clinical infection.
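
    COMSTAT-style biomass is biovolume per substratum area, computed from a thresholded confocal stack. A toy sketch in which a nested list stands in for a real CLSM mask (not the COMSTAT2 implementation):

```python
# COMSTAT-style biomass sketch: biovolume per substratum area from a
# binary confocal stack. The toy array stands in for real CLSM data.

def biomass_um3_per_um2(stack, voxel_xyz_um):
    """stack: [z][y][x] 0/1 mask; voxel_xyz_um: (dx, dy, dz) in um."""
    dx, dy, dz = voxel_xyz_um
    n_voxels = sum(v for plane in stack for row in plane for v in row)
    ny, nx = len(stack[0]), len(stack[0][0])
    substratum_area = nx * dx * ny * dy
    return n_voxels * dx * dy * dz / substratum_area

stack = [
    [[1, 1], [1, 1]],   # z = 0: confluent layer on the substratum
    [[1, 0], [0, 1]],   # z = 1: partial coverage
]
bm = biomass_um3_per_um2(stack, voxel_xyz_um=(0.5, 0.5, 1.0))
```

    The resulting unit, um3/um2, is the same one used for the patient-sample biomass values reported above.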

  12. A Probabilistic Framework for Peptide and Protein Quantification from Data-Dependent and Data-Independent LC-MS Proteomics Experiments

    PubMed Central

    Richardson, Keith; Denny, Richard; Hughes, Chris; Skilling, John; Sikora, Jacek; Dadlez, Michał; Manteca, Angel; Jung, Hye Ryung; Jensen, Ole Nørregaard; Redeker, Virginie; Melki, Ronald; Langridge, James I.; Vissers, Johannes P.C.

    2013-01-01

    A probability-based quantification framework is presented for the calculation of relative peptide and protein abundance in label-free and label-dependent LC-MS proteomics data. The results are accompanied by credible intervals and regulation probabilities. The algorithm takes into account data uncertainties via Poisson statistics modified by a noise contribution that is determined automatically during an initial normalization stage. Protein quantification relies on assignments of component peptides to the acquired data. These assignments are generally of variable reliability and may not be present across all of the experiments comprising an analysis. It is also possible for a peptide to be identified to more than one protein in a given mixture. For these reasons the algorithm accepts a prior probability of peptide assignment for each intensity measurement. The model is constructed in such a way that outliers of any type can be automatically reweighted. Two discrete normalization methods can be employed. The first method is based on a user-defined subset of peptides, while the second method relies on the presence of a dominant background of endogenous peptides for which the concentration is assumed to be unaffected. Normalization is performed using the same computational and statistical procedures employed by the main quantification algorithm. The performance of the algorithm will be illustrated on example data sets, and its utility demonstrated for typical proteomics applications. The quantification algorithm supports relative protein quantification based on precursor and product ion intensities acquired by means of data-dependent methods, originating from all common isotopically-labeled approaches, as well as label-free ion intensity-based data-independent methods. PMID:22871168
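
    A much-simplified sketch in the same spirit: each peptide log-ratio is weighted by its prior assignment probability divided by a Poisson-plus-noise variance. The structure and numbers are illustrative only and do not reproduce the authors' algorithm:

```python
# Toy assignment-weighted protein quantification (illustrative only,
# not the published algorithm): weight = prior assignment probability
# over a Poisson-plus-noise variance of the log intensity ratio.
import math

def protein_log_ratio(peptides, noise=0.05):
    """peptides: list of (intensity_A, intensity_B, p_assignment)."""
    num = den = 0.0
    for a, b, p in peptides:
        r = math.log(a / b)
        var = 1.0 / a + 1.0 / b + noise ** 2   # Poisson terms + noise floor
        w = p / var
        num += w * r
        den += w
    return num / den

ratio = protein_log_ratio([(1000.0, 500.0, 0.9), (800.0, 400.0, 0.8)])
```

    Down-weighting by assignment probability lets unreliable or shared peptides contribute without dominating, which mirrors the reweighting idea in the abstract.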

  13. Quantitative proteome analysis using isobaric peptide termini labeling (IPTL).

    PubMed

    Arntzen, Magnus O; Koehler, Christian J; Treumann, Achim; Thiede, Bernd

    2011-01-01

    The quantitative comparison of proteome level changes across biological samples has become an essential feature in proteomics that remains challenging. We have recently introduced isobaric peptide termini labeling (IPTL), a novel strategy for isobaric quantification based on the derivatization of peptide termini with complementary isotopically labeled reagents. Unlike non-isobaric quantification methods, sample complexity at the MS level is not increased, providing improved sensitivity and protein coverage. The distinguishing feature of IPTL when comparing it to more established isobaric labeling methods (iTRAQ and TMT) is the presence of quantification signatures in all sequence-determining ions in MS/MS spectra, not only in the low mass reporter ion region. This makes IPTL a quantification method that is accessible to mass spectrometers with limited capabilities in the low mass range. Also, the presence of several quantification points in each MS/MS spectrum increases the robustness of the quantification procedure.

  14. Quantification of uncertainties for application in detonation simulation

    NASA Astrophysics Data System (ADS)

    Zheng, Miao; Ma, Zhibo

    2016-06-01

    Numerical simulation has become an important means of designing detonation systems, and quantification of its uncertainty is necessary for reliability certification. In quantifying this uncertainty, it is most important to analyze how uncertainties arise and propagate, and how simulations progress from benchmark models to new models. Based on the practical needs of engineering and the technology of verification and validation, a framework for QU (quantification of uncertainty) is put forward for the case in which simulation is used on a detonation system for scientific prediction. An example is offered to describe the general idea of quantifying simulation uncertainties.

  15. Quantification of the xenoestrogens 4-tert.-octylphenol and bisphenol A in water and in fish tissue based on microwave assisted extraction, solid-phase extraction and liquid chromatography-mass spectrometry.

    PubMed

    Pedersen, S N; Lindholst, C

    1999-12-09

    Extraction methods were developed for quantification of the xenoestrogens 4-tert.-octylphenol (tOP) and bisphenol A (BPA) in water and in liver and muscle tissue from the rainbow trout (Oncorhynchus mykiss). The extraction of tOP and BPA from tissue samples was carried out using microwave-assisted solvent extraction (MASE) followed by solid-phase extraction (SPE). Water samples were extracted using only SPE. For the quantification of tOP and BPA, liquid chromatography mass spectrometry (LC-MS) equipped with an atmospheric pressure chemical ionisation interface (APCI) was applied. The combined methods for tissue extraction allow the use of small sample amounts of liver or muscle (typically 1 g), low volumes of solvent (20 ml), and short extraction times (25 min). Limits of quantification of tOP in tissue samples were found to be approximately 10 ng/g in muscle and 50 ng/g in liver (both based on 1 g of fresh tissue). The corresponding values for BPA were approximately 50 ng/g in both muscle and liver tissue. In water, the limit of quantification for tOP and BPA was approximately 0.1 microg/l (based on 100 ml sample size).
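
    Limits of quantification like those above are commonly estimated from a calibration slope and the standard deviation of the blank (the 10-sigma criterion). A sketch with invented calibration data, not values from the study:

```python
# Common 10-sigma LOQ estimate from a calibration line; the standards,
# signals and blank s.d. below are invented, not from the study.

def slope(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    return (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
            / sum((x - mx) ** 2 for x in xs))

conc_ug_l = [0.1, 0.5, 1.0, 5.0]        # calibration standards (ug/L)
peak_area = [12.0, 60.0, 118.0, 600.0]  # LC-MS responses
sd_blank = 1.2                          # s.d. of blank signal (assumed)
loq_ug_l = 10 * sd_blank / slope(conc_ug_l, peak_area)
```

    With these made-up numbers the estimate lands near 0.1 ug/L, the same order as the water LOQ reported above.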

  16. Polarization sensitive camera for the in vitro diagnostic and monitoring of dental erosion

    NASA Astrophysics Data System (ADS)

    Bossen, Anke; Rakhmatullina, Ekaterina; Lussi, Adrian; Meier, Christoph

    Due to a frequent consumption of acidic food and beverages, the prevalence of dental erosion increases worldwide. In an initial erosion stage, the hard dental tissue is softened due to acidic demineralization. As erosion progresses, a gradual tissue wear occurs, resulting in thinning of the enamel. Complete loss of the enamel tissue can be observed in severe clinical cases. Therefore, it is essential to provide a diagnostic tool for accurate detection and monitoring of dental erosion already at early stages. In this manuscript, we present the development of a polarization sensitive imaging camera for the visualization and quantification of dental erosion. The system consists of two CMOS cameras mounted on two sides of a polarizing beamsplitter. A horizontal linearly polarized light source is positioned orthogonal to the camera to ensure incidence illumination and detection angles of 45°. The specular reflected light from the enamel surface is collected with an objective lens mounted on the beam splitter and divided into horizontal (H) and vertical (V) components on each associate camera. Images of non-eroded and eroded enamel surfaces at different erosion degrees were recorded and assessed with diagnostic software. The software was designed to generate and display two types of images: distribution of the reflection intensity (V) and a polarization ratio (H-V)/(H+V) throughout the analyzed tissue area. The measurements and visualization of these two optical parameters, i.e. specular reflection intensity and the polarization ratio, allowed detection and quantification of enamel erosion at early stages in vitro.
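
    The per-pixel polarization ratio (H-V)/(H+V) described above is direct to compute. A sketch with toy 2x2 "frames" in place of the two CMOS images:

```python
# Per-pixel polarization ratio (H - V)/(H + V) from the two detector
# frames; toy 2x2 lists stand in for the CMOS images.

def polarization_ratio(h_img, v_img):
    return [[(h - v) / (h + v) if (h + v) else 0.0
             for h, v in zip(h_row, v_row)]
            for h_row, v_row in zip(h_img, v_img)]

H = [[200, 180], [150, 90]]   # horizontal component
V = [[100, 60], [50, 90]]     # vertical component
R = polarization_ratio(H, V)
```

    The ratio is bounded in [-1, 1] and is zero where the two components are equal, which is what makes it a convenient contrast measure across the tissue area.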

  17. Inductively Coupled Plasma Mass Spectrometry (ICP-MS) Applications in Quantitative Proteomics.

    PubMed

    Chahrour, Osama; Malone, John

    2017-01-01

    Recent advances in inductively coupled plasma mass spectrometry (ICP-MS) hyphenated to different separation techniques have promoted it as a valuable tool in protein/peptide quantification. These emerging ICP-MS applications allow absolute quantification by measuring specific elemental responses. One approach quantifies elements already present in the structure of the target peptide (e.g. phosphorus and sulphur) as natural tags. Quantification of these natural tags allows the elucidation of the degree of protein phosphorylation in addition to absolute protein quantification. A separate approach is based on utilising bi-functional labelling substances (those containing ICP-MS detectable elements), that form a covalent chemical bond with the protein thus creating analogs which are detectable by ICP-MS. Based on the previously established stoichiometries of the labelling reagents, quantification can be achieved. This technique is very useful for the design of precise multiplexed quantitation schemes to address the challenges of biomarker screening and discovery. This review discusses the capabilities and different strategies to implement ICP-MS in the field of quantitative proteomics. Copyright© Bentham Science Publishers; For any queries, please email at epub@benthamscience.org.
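
    Element-based absolute quantification reduces to stoichiometry: measured moles of sulphur divided by the number of S atoms (Cys + Met) per molecule gives moles of protein, and the P-to-protein molar ratio gives the phosphorylation degree. A sketch with invented amounts and atom counts:

```python
# Stoichiometric core of element-based (ICP-MS) protein quantification;
# the amounts and atom counts are invented for illustration.

def protein_pmol(sulfur_pmol, n_cys, n_met):
    # Every Cys and Met residue contributes exactly one sulphur atom.
    return sulfur_pmol / (n_cys + n_met)

def phospho_per_molecule(phosphorus_pmol, protein_amount_pmol):
    # Average number of phosphate groups per protein molecule.
    return phosphorus_pmol / protein_amount_pmol

prot = protein_pmol(sulfur_pmol=120.0, n_cys=4, n_met=2)
degree = phospho_per_molecule(phosphorus_pmol=10.0,
                              protein_amount_pmol=prot)
```

    The same division-by-stoichiometry logic applies to bi-functional labels, using the label's established element-per-protein ratio instead of Cys + Met counts.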

  18. Generic method for the absolute quantification of glutathione S-conjugates: Application to the conjugates of acetaminophen, clozapine and diclofenac.

    PubMed

    den Braver, Michiel W; Vermeulen, Nico P E; Commandeur, Jan N M

    2017-03-01

    Modification of cellular macromolecules by reactive drug metabolites is considered to play an important role in the initiation of tissue injury by many drugs. Detection and identification of reactive intermediates is often performed by analyzing the conjugates formed after trapping by glutathione (GSH). Although the sensitivity of modern mass spectrometric methods is extremely high, absolute quantification of GSH-conjugates is critically dependent on the availability of authentic references. Although 1H NMR is currently the method of choice for quantification of metabolites formed biosynthetically, its intrinsically low sensitivity can be a limiting factor in quantification of GSH-conjugates, which generally are formed at low levels. In the present study, a simple but sensitive and generic method for absolute quantification of GSH-conjugates is presented. The method is based on quantitative alkaline hydrolysis of GSH-conjugates and subsequent quantification of glutamic acid and glycine by HPLC after precolumn derivatization with o-phthaldialdehyde/N-acetylcysteine (OPA/NAC). Because of the lower stability of the glycine OPA/NAC derivative, quantification of the glutamic acid OPA/NAC derivative appeared most suitable for quantification of GSH-conjugates. The novel method was used to quantify the concentrations of GSH-conjugates of diclofenac, clozapine and acetaminophen, and quantification was consistent with 1H NMR but with a more than 100-fold lower detection limit for absolute quantification. Copyright © 2017. Published by Elsevier B.V.
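
    The assay's 1:1 stoichiometry (one glutamic acid released per GSH conjugate on alkaline hydrolysis) makes the final calculation a one-liner once a Glu calibration line is in hand. The slope, intercept, areas and dilution below are invented:

```python
# The 1:1 logic behind the assay: one glutamic acid per GSH conjugate,
# so the Glu concentration read off an OPA/NAC calibration line equals
# the conjugate concentration. All numbers are invented.

def conjugate_uM(glu_peak_area, slope, intercept, dilution=1.0):
    glu_uM = (glu_peak_area - intercept) / slope
    return glu_uM * dilution           # one Glu per conjugate

c = conjugate_uM(glu_peak_area=5200.0, slope=250.0, intercept=200.0,
                 dilution=2.0)
```

    Choosing the glutamic acid derivative rather than the glycine one (as the abstract notes) changes only which calibration line is used, not this stoichiometric step.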

  19. Albany v. 3.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Salinger, Andrew; Phipps, Eric; Ostien, Jakob

    2016-01-13

    The Albany code is a general-purpose finite element code for solving partial differential equations (PDEs). Albany is a research code that demonstrates how a PDE code can be built by interfacing many of the open-source software libraries that are released under Sandia's Trilinos project. Part of the mission of Albany is to be a testbed for new Trilinos libraries, to refine their methods, usability, and interfaces. Albany includes hooks to optimization and uncertainty quantification algorithms, including those in Trilinos as well as those in the Dakota toolkit. Because of this, Albany is a desirable starting point for new code development efforts that wish to make heavy use of Trilinos. Albany is both a framework and the host for specific finite element applications. These applications have project names and can be controlled by configuration options when the code is compiled, but all are developed and released as part of the single Albany code base. These include the LCM, QCAD, FELIX, Aeras, and ATO applications.

  20. Lipidomics informatics for life-science.

    PubMed

    Schwudke, D; Shevchenko, A; Hoffmann, N; Ahrends, R

    2017-11-10

    Lipidomics encompasses analytical approaches that aim to identify and quantify the complete set of lipids (the lipidome) in a given cell, tissue or organism, as well as their interactions with other molecules. Most lipidomics workflows are based on mass spectrometry, which has proven a powerful tool in systems biology in concert with other omics disciplines. Unfortunately, bioinformatics infrastructures for this relatively young discipline remain accessible only to a few specialists. Search engines, quantification algorithms, visualization tools and databases developed by the 'Lipidomics Informatics for Life-Science' (LIFS) partners will be restructured and standardized to provide broad access to these specialized bioinformatics pipelines. The capacity building proposed by LIFS will also foster work on the many medical challenges related to lipid metabolic alterations. LIFS is a member of the 'German Network for Bioinformatics' (de.NBI) node for 'Bioinformatics for Proteomics' (BioInfra.Prot) and will provide access to the described software, as well as to tutorials and consulting services, via a unified web portal. Copyright © 2017 Elsevier B.V. All rights reserved.

  1. Portable Multispectral Colorimeter for Metallic Ion Detection and Classification

    PubMed Central

    Jaimes, Ruth F. V. V.; Borysow, Walter; Gomes, Osmar F.; Salcedo, Walter J.

    2017-01-01

    This work describes a portable device system, proposed and developed to detect and classify different metallic ions, aimed at hydrological monitoring systems such as rivers, lakes and groundwater. Given the system requirements, a portable colorimetric system was developed using a multispectral optoelectronic sensor. All of the technology for quantification and classification of metallic ions using optoelectronic multispectral sensors was fully integrated in embedded FPGA (Field Programmable Gate Array) hardware and in software based on virtual instrumentation (NI LabVIEW®). The system draws on an indicative colorimeter using the chromogenic reagent 1-(2-pyridylazo)-2-naphthol (PAN). Signal processing and pattern analysis based on linear discriminant analysis yielded excellent results in the detection and classification of Pb(II), Cd(II), Zn(II), Cu(II), Fe(III) and Ni(II) ions, with almost the same level of performance as ultraviolet-visible (UV-VIS) spectrophotometers of high spectral resolution. PMID:28788082

  2. Quantification of N-Acetyl Aspartyl Glutamate in Human Brain using Proton Magnetic Resonance Spectroscopy at 7 T

    NASA Astrophysics Data System (ADS)

    Elywa, M.

    2015-07-01

    The separation of N-acetyl aspartyl glutamate (NAAG) from N-acetyl aspartate (NAA) and other metabolites, such as glutamate, by in vivo proton magnetic resonance spectroscopy at 7 T is described. The method is based on the stimulated echo acquisition mode (STEAM) with short and long echo times (TE), and allows quantitative measurement of NAAG in the parietal cortex and pregenual anterior cingulate cortex (pgACC) of the human brain. Two basis sets for LCModel were established using nuclear magnetic resonance simulation software (NMR-SIM). Six healthy volunteers (age 25-35 years) were examined at 7 T. NAAG could be separated and quantified in the parietal location, but not in the pgACC, when using a short echo time, TE = 20 ms. By using a long echo time, TE = 74 ms, NAAG could also be quantified in pgACC structures.

  3. Development of a versatile user-friendly IBA experimental chamber

    NASA Astrophysics Data System (ADS)

    Kakuee, Omidreza; Fathollahi, Vahid; Lamehi-Rachti, Mohammad

    2016-03-01

    Reliable performance of Ion Beam Analysis (IBA) techniques rests on an accurate geometry of the experimental setup, the use of reliable nuclear data, and the implementation of dedicated analysis software for each IBA technique. It has already been shown that geometrical imperfections lead to significant uncertainties in the quantification of IBA measurements. To minimize these uncertainties, a user-friendly experimental chamber with a heuristic sample positioning system for IBA analysis was recently developed in the Van de Graaff laboratory in Tehran. This system enhances IBA capabilities, in particular the Nuclear Reaction Analysis (NRA) and Elastic Recoil Detection Analysis (ERDA) techniques. The newly developed sample manipulator makes it possible both to control the tilt angle of the sample and to analyze samples of different thicknesses. Moreover, a reasonable number of samples can be loaded in the sample wheel. A comparison of the measured cross-section data of the 16O(d,p1)17O reaction with data reported in the literature confirms the performance and capability of the newly developed experimental chamber.

  4. Portable Multispectral Colorimeter for Metallic Ion Detection and Classification.

    PubMed

    Braga, Mauro S; Jaimes, Ruth F V V; Borysow, Walter; Gomes, Osmar F; Salcedo, Walter J

    2017-07-28

    This work describes a portable device system, proposed and developed to detect and classify different metallic ions, aimed at hydrological monitoring systems such as rivers, lakes and groundwater. Given the system requirements, a portable colorimetric system was developed using a multispectral optoelectronic sensor. All of the technology for quantification and classification of metallic ions using optoelectronic multispectral sensors was fully integrated in embedded FPGA (Field Programmable Gate Array) hardware and in software based on virtual instrumentation (NI LabVIEW®). The system draws on an indicative colorimeter using the chromogenic reagent 1-(2-pyridylazo)-2-naphthol (PAN). Signal processing and pattern analysis based on linear discriminant analysis yielded excellent results in the detection and classification of Pb(II), Cd(II), Zn(II), Cu(II), Fe(III) and Ni(II) ions, with almost the same level of performance as ultraviolet-visible (UV-VIS) spectrophotometers of high spectral resolution.
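    The classification step described above can be sketched with a minimal linear discriminant: fit per-class means and a pooled within-class covariance, then assign a reading to the class with the highest discriminant score. The two-channel synthetic readings and two-class setup below are assumptions for illustration, not the device's data:

    ```python
    import numpy as np

    def fit_lda(X, y):
        """Fit class means and pooled within-class covariance for LDA."""
        classes = np.unique(y)
        means = {c: X[y == c].mean(axis=0) for c in classes}
        n, k = len(y), len(classes)
        Sw = sum(np.cov(X[y == c].T) * (np.sum(y == c) - 1) for c in classes) / (n - k)
        Sw += 1e-8 * np.eye(X.shape[1])  # small ridge for numerical stability
        return classes, means, np.linalg.inv(Sw)

    def predict(x, classes, means, Sw_inv):
        # assign to the class with the highest linear discriminant score
        scores = [x @ Sw_inv @ means[c] - 0.5 * means[c] @ Sw_inv @ means[c]
                  for c in classes]
        return classes[int(np.argmax(scores))]

    # synthetic two-channel absorbance readings for two ion classes
    X = np.array([[1.0, 0.1], [1.2, 0.0], [0.9, 0.2],
                  [3.0, 1.0], [3.2, 1.1], [2.9, 0.9]])
    y = np.array([0, 0, 0, 1, 1, 1])
    model = fit_lda(X, y)
    print(predict(np.array([1.1, 0.1]), *model))  # → 0
    ```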

  5. Geographic information system (GIS)-based image analysis for assessing growth of Physarum polycephalum on a solid medium.

    PubMed

    Tran, Hanh T M; Stephenson, Steven L; Tullis, Jason A

    2015-01-01

    The conventional method used to assess growth of the plasmodium of the slime mold Physarum polycephalum in solid culture is to measure the extent of plasmodial expansion from the point of inoculation with a ruler. However, plasmodial growth is usually rather irregular, so the values obtained are not especially accurate. Similar challenges exist in quantifying the growth of a fungal mycelium. In this paper, we describe a method that uses geographic information system software to obtain highly accurate estimates of plasmodial growth over time. This approach calculates plasmodial area from images obtained at set intervals following inoculation. In addition, the correlation between plasmodial area and dry cell weight was determined. This correlation can be used for biomass estimation without having to terminate the cultures in question. The method described herein is simple but effective and could also be used for growth measurements of other microorganisms, such as fungi, on solid media.
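    The area-based growth estimate above comes down to counting classified plasmodium pixels, converting to physical area, and mapping area to biomass through the measured area-weight correlation. A minimal sketch, assuming a binary classification mask and a linear calibration; all numbers are illustrative, not the paper's values:

    ```python
    # Sketch of area-from-mask growth quantification with a linear
    # area-to-dry-weight calibration (illustrative assumptions throughout).

    def plasmodial_area_cm2(mask, pixel_size_cm=0.01):
        """Area covered by the plasmodium from a binary mask (1 = plasmodium)."""
        n_pixels = sum(row.count(1) for row in mask)
        return n_pixels * pixel_size_cm ** 2

    def dry_weight_mg(area_cm2, slope=1.8, intercept=0.2):
        """Biomass estimate from area via an assumed linear calibration."""
        return slope * area_cm2 + intercept

    mask = [[0, 1, 1, 0],
            [1, 1, 1, 0],
            [0, 1, 0, 0]]
    area = plasmodial_area_cm2(mask)  # 6 pixels × (0.01 cm)² = 0.0006 cm²
    print(dry_weight_mg(area))
    ```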

  6. Automated Video-Based Analysis of Contractility and Calcium Flux in Human-Induced Pluripotent Stem Cell-Derived Cardiomyocytes Cultured over Different Spatial Scales.

    PubMed

    Huebsch, Nathaniel; Loskill, Peter; Mandegar, Mohammad A; Marks, Natalie C; Sheehan, Alice S; Ma, Zhen; Mathur, Anurag; Nguyen, Trieu N; Yoo, Jennie C; Judge, Luke M; Spencer, C Ian; Chukka, Anand C; Russell, Caitlin R; So, Po-Lin; Conklin, Bruce R; Healy, Kevin E

    2015-05-01

    Contractile motion is the simplest metric of cardiomyocyte health in vitro, but unbiased quantification is challenging. We describe a rapid automated method, requiring only standard video microscopy, to analyze the contractility of human-induced pluripotent stem cell-derived cardiomyocytes (iPS-CM). New algorithms for generating and filtering motion vectors, combined with a newly developed isogenic iPSC line harboring the genetically encoded calcium indicator GCaMP6f, allow simultaneous user-independent measurement and analysis of the coupling between calcium flux and contractility. The relative performance of these algorithms, in terms of improving the signal-to-noise ratio, was tested. Applying these algorithms allowed analysis of contractility in iPS-CM cultured over multiple spatial scales, from single cells to three-dimensional constructs. This open-source software was validated by analysis of the isoproterenol response in these cells, and can be applied in future studies comparing the drug responsiveness of iPS-CM cultured in different microenvironments in the context of tissue engineering.

  7. An Uncertainty Quantification Framework for Prognostics and Condition-Based Monitoring

    NASA Technical Reports Server (NTRS)

    Sankararaman, Shankar; Goebel, Kai

    2014-01-01

    This paper presents a computational framework for uncertainty quantification in prognostics in the context of condition-based monitoring of aerospace systems. The different sources of uncertainty and the various uncertainty quantification activities in condition-based prognostics are outlined in detail, and it is demonstrated that the Bayesian subjective approach is suitable for interpreting uncertainty in online monitoring. A state-space model-based framework for prognostics, that can rigorously account for the various sources of uncertainty, is presented. Prognostics consists of two important steps. First, the state of the system is estimated using Bayesian tracking, and then, the future states of the system are predicted until failure, thereby computing the remaining useful life of the system. The proposed framework is illustrated using the power system of a planetary rover test-bed, which is being developed and studied at NASA Ames Research Center.
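    The two prognostic steps described above (estimate the current state, then propagate it to failure to obtain the remaining useful life) can be caricatured in a few lines. This is a deliberately simplified deterministic sketch: a single filter-style correction stands in for Bayesian tracking, and a linear fade model with illustrative numbers stands in for the degradation model; none of it is the paper's actual framework:

    ```python
    # Simplified two-step prognostic sketch: (1) state estimation via one
    # filter-style correction, (2) forward propagation to a failure threshold.

    def update_state(estimate, measurement, gain=0.5):
        """One correction step (a crude stand-in for Bayesian tracking)."""
        return estimate + gain * (measurement - estimate)

    def remaining_useful_life(state, fade_per_cycle, failure_threshold):
        """Cycles until the propagated state crosses the failure threshold."""
        cycles = 0
        while state > failure_threshold:
            state -= fade_per_cycle
            cycles += 1
        return cycles

    capacity = update_state(estimate=0.92, measurement=0.90)  # ≈ 0.91
    print(remaining_useful_life(capacity, fade_per_cycle=0.001,
                                failure_threshold=0.80))
    ```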

  8. Reproducibility of a novel echocardiographic 3D automated software for the assessment of mitral valve anatomy.

    PubMed

    Aquila, Iolanda; González, Ariana; Fernández-Golfín, Covadonga; Rincón, Luis Miguel; Casas, Eduardo; García, Ana; Hinojar, Rocio; Jiménez-Nacher, José Julio; Zamorano, José Luis

    2016-05-17

    3D transesophageal echocardiography (TEE) is superior to 2D TEE in quantitative anatomic evaluation of the mitral valve (MV) but shows limitations regarding automatic quantification. Here, we tested the inter- and intra-observer reproducibility of novel fully automated software for the evaluation of MV anatomy compared to manual 3D assessment. Thirty-six out of 61 screened patients referred to our Cardiac Imaging Unit for TEE were retrospectively included. 3D TEE analysis was performed both manually and with the automated software by two independent operators. Mitral annular area, intercommissural distance, anterior leaflet length and posterior leaflet length were assessed. A significant correlation between both methods was found for all variables: intercommissural diameter (r = 0.84, p < 0.01), mitral annular area (r = 0.94, p < 0.01), anterior leaflet length (r = 0.83, p < 0.01) and posterior leaflet length (r = 0.67, p < 0.01). Interobserver variability assessed by the intraclass correlation coefficient was superior for the automatic software: intercommissural distance 0.997 vs. 0.76; mitral annular area 0.957 vs. 0.858; anterior leaflet length 0.963 vs. 0.734 and posterior leaflet length 0.936 vs. 0.838. Intraobserver variability was good for both methods, with a better level of agreement for the automatic software. The novel 3D automated software is reproducible in MV anatomy assessment. The incorporation of this new tool into clinical MV assessment may improve patient selection and outcomes for MV interventions, as well as patient diagnosis and prognostic stratification. Yet, high-quality 3D images are indispensable.

  9. 3D echocardiographic analysis of aortic annulus for transcatheter aortic valve replacement using novel aortic valve quantification software: Comparison with computed tomography.

    PubMed

    Mediratta, Anuj; Addetia, Karima; Medvedofsky, Diego; Schneider, Robert J; Kruse, Eric; Shah, Atman P; Nathan, Sandeep; Paul, Jonathan D; Blair, John E; Ota, Takeyoshi; Balkhy, Husam H; Patel, Amit R; Mor-Avi, Victor; Lang, Roberto M

    2017-05-01

    With the increasing use of transcatheter aortic valve replacement (TAVR) in patients with aortic stenosis (AS), computed tomography (CT) remains the standard for annulus sizing. However, 3D transesophageal echocardiography (TEE) has been an alternative in patients with contraindications to CT. We sought to (1) test the feasibility, accuracy, and reproducibility of prototype 3DTEE analysis software (Philips) for aortic annular measurements and (2) compare the new approach to the existing echocardiographic techniques. We prospectively studied 52 patients who underwent gated contrast CT, procedural 3DTEE, and TAVR. 3DTEE images were analyzed using novel semi-automated software designed for 3D measurements of the aortic root, which uses multiplanar reconstruction, similar to CT analysis. Aortic annulus measurements included area, perimeter, and diameters calculated from these measurements. The results were compared to CT-derived values. Additionally, existing 3D echocardiographic measurements (3D planimetry and mitral valve analysis software adapted for the aortic valve) were also compared to the CT reference values. 3DTEE image quality was sufficient in 90% of patients for aortic annulus measurements using the new software, which were in good agreement with CT (r-values: .89-.91), with small (<4%), nonsignificant inter-modality biases. Repeated measurements showed <10% measurement variability. The new 3D analysis was more accurate and reproducible than the existing echocardiographic techniques. Novel semi-automated 3DTEE analysis software can accurately measure the aortic annulus in patients with severe AS undergoing TAVR, in better agreement with CT than the existing methodology. Accordingly, intra-procedural TEE could potentially replace CT in patients for whom CT carries significant risk. © 2017, Wiley Periodicals, Inc.

  10. Optically transmitted and inductively coupled electric reference to access in vivo concentrations for quantitative proton-decoupled ¹³C magnetic resonance spectroscopy.

    PubMed

    Chen, Xing; Pavan, Matteo; Heinzer-Schweizer, Susanne; Boesiger, Peter; Henning, Anke

    2012-01-01

    This report describes our efforts on quantification of tissue metabolite concentrations in mM by nuclear Overhauser enhanced and proton-decoupled ¹³C magnetic resonance spectroscopy and the Electric Reference To access In vivo Concentrations (ERETIC) method. Previous work showed that a calibrated synthetic magnetic resonance spectroscopy-like signal transmitted through an optical fiber and inductively coupled into a transmit/receive coil represents a reliable reference standard for in vivo ¹H magnetic resonance spectroscopy quantification on a clinical platform. In this work, we introduce a related implementation that enables simultaneous proton decoupling and ERETIC-based metabolite quantification and hence extends the applicability of the ERETIC method to nuclear Overhauser enhanced and proton-decoupled in vivo ¹³C magnetic resonance spectroscopy. In addition, ERETIC signal stability under the influence of simultaneous proton decoupling is investigated. The proposed quantification method was cross-validated against internal and external reference standards on human skeletal muscle. The ERETIC signal intensity stability was 100.65 ± 4.18% over 3 months, including measurements with and without proton decoupling. Glycogen and unsaturated fatty acid concentrations measured with the ERETIC method were in excellent agreement with internal creatine and external phantom reference methods, showing a difference of 1.85 ± 1.21% for glycogen and 1.84 ± 1.00% for unsaturated fatty acid between ERETIC and creatine-based quantification, whereas the deviations between external-reference and creatine-based quantification are 6.95 ± 9.52% and 3.19 ± 2.60%, respectively. Copyright © 2011 Wiley Periodicals, Inc.
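    The quantification principle behind a synthetic reference such as ERETIC is ratio-based: the reference signal is calibrated once against a phantom of known concentration, after which an in vivo metabolite concentration follows from its signal ratio to the reference. A minimal sketch in which relaxation and enhancement corrections are folded into a single factor; all values are illustrative assumptions:

    ```python
    # Ratio-based quantification against a calibrated synthetic reference
    # signal (illustrative sketch; numbers are not from the paper).

    def calibrate_eretic(phantom_conc_mM, phantom_signal, eretic_signal):
        """Concentration equivalent of the reference signal (mM per unit ratio)."""
        return phantom_conc_mM * eretic_signal / phantom_signal

    def quantify(metabolite_signal, eretic_signal, eretic_equiv_mM, correction=1.0):
        """Metabolite concentration from its signal ratio to the reference."""
        return eretic_equiv_mM * metabolite_signal / eretic_signal * correction

    equiv = calibrate_eretic(phantom_conc_mM=10.0, phantom_signal=200.0,
                             eretic_signal=100.0)  # → 5.0 mM per unit ratio
    print(quantify(metabolite_signal=90.0, eretic_signal=100.0,
                   eretic_equiv_mM=equiv))  # → 4.5 mM
    ```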

  11. Mass spectrometry–based relative quantification of proteins in precatalytic and catalytically active spliceosomes by metabolic labeling (SILAC), chemical labeling (iTRAQ), and label-free spectral count

    PubMed Central

    Schmidt, Carla; Grønborg, Mads; Deckert, Jochen; Bessonov, Sergey; Conrad, Thomas; Lührmann, Reinhard; Urlaub, Henning

    2014-01-01

    The spliceosome undergoes major changes in protein and RNA composition during pre-mRNA splicing. Knowing the proteins—and their respective quantities—at each spliceosomal assembly stage is critical for understanding the molecular mechanisms and regulation of splicing. Here, we applied three independent mass spectrometry (MS)–based approaches for quantification of these proteins: (1) metabolic labeling by SILAC, (2) chemical labeling by iTRAQ, and (3) label-free spectral count for quantification of the protein composition of the human spliceosomal precatalytic B and catalytic C complexes. In total we were able to quantify 157 proteins by at least two of the three approaches. Our quantification shows that only a very small subset of spliceosomal proteins (the U5 and U2 Sm proteins, a subset of U5 snRNP-specific proteins, and the U2 snRNP-specific proteins U2A′ and U2B′′) remains unaltered upon transition from the B to the C complex. The MS-based quantification approaches classify the majority of proteins as dynamically associated specifically with the B or the C complex. In terms of experimental procedure and the methodical aspect of this work, we show that metabolically labeled spliceosomes are functionally active in terms of their assembly and splicing kinetics and can be utilized for quantitative studies. Moreover, we obtain consistent quantification results from all three methods, including the relatively straightforward and inexpensive label-free spectral count technique. PMID:24448447

  12. Ensemble-based uncertainty quantification for coordination and control of thermostatically controlled loads

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Li, Weixuan; Lian, Jianming; Engel, Dave

    2017-07-27

    This paper presents a general uncertainty quantification (UQ) framework that provides a systematic analysis of the uncertainty involved in the modeling of a control system, and helps to improve the performance of a control strategy.

  13. Lamb Wave Damage Quantification Using GA-Based LS-SVM.

    PubMed

    Sun, Fuqiang; Wang, Ning; He, Jingjing; Guan, Xuefei; Yang, Jinsong

    2017-06-12

    Lamb waves have been reported to be an efficient tool for non-destructive evaluation (NDE) in various application scenarios. However, accurate and reliable damage quantification using the Lamb wave method remains a practical challenge, due to the complex underlying mechanisms of Lamb wave propagation and damage detection. This paper presents a Lamb wave damage quantification method using a least squares support vector machine (LS-SVM) and a genetic algorithm (GA). Three damage-sensitive features, namely normalized amplitude, phase change, and correlation coefficient, were proposed to describe changes in Lamb wave characteristics caused by damage. Following commonly used data-driven approaches, a GA-based LS-SVM model using the three proposed damage-sensitive features was implemented to evaluate crack size. The GA was adopted to optimize the model parameters. The results of the GA-based LS-SVM were validated using coupon test data and lap joint component test data with naturally developed fatigue cracks. Cases with different loading conditions and manufacturers were also included to further verify the robustness of the proposed method for crack quantification.
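    An LS-SVM regressor of the kind described above is trained by solving one linear system rather than a quadratic program. The sketch below implements that standard dual formulation with an RBF kernel; the GA-tuned hyperparameters are replaced here by fixed assumed values, and the (normalized amplitude, phase change, correlation coefficient) → crack-size data are synthetic, not the paper's:

    ```python
    import numpy as np

    def rbf_kernel(A, B, sigma):
        """Gaussian (RBF) kernel matrix between row-sample arrays A and B."""
        d2 = ((A[:, None, :] - B[None, :, :]) ** 2).sum(-1)
        return np.exp(-d2 / (2.0 * sigma ** 2))

    def lssvm_fit(X, y, gamma, sigma):
        """Solve the LS-SVM dual system [[0, 1^T], [1, K + I/gamma]] [b; a] = [0; y]."""
        n = len(y)
        A = np.zeros((n + 1, n + 1))
        A[0, 1:] = 1.0
        A[1:, 0] = 1.0
        A[1:, 1:] = rbf_kernel(X, X, sigma) + np.eye(n) / gamma
        sol = np.linalg.solve(A, np.concatenate(([0.0], y)))
        return sol[1:], sol[0]  # alpha, b

    def lssvm_predict(Xnew, X, alpha, b, sigma):
        return rbf_kernel(Xnew, X, sigma) @ alpha + b

    # synthetic (normalized amplitude, phase change, correlation) → crack size (mm)
    X = np.array([[0.95, 0.02, 0.99], [0.80, 0.10, 0.95],
                  [0.60, 0.25, 0.85], [0.40, 0.40, 0.70]])
    y = np.array([0.5, 1.5, 3.0, 5.0])
    alpha, b = lssvm_fit(X, y, gamma=1e6, sigma=0.5)
    print(lssvm_predict(X, X, alpha, b, sigma=0.5))  # close to the training targets
    ```

    In the paper the GA searches over (gamma, sigma); here they are simply fixed for the sketch.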

  14. Lamb Wave Damage Quantification Using GA-Based LS-SVM

    PubMed Central

    Sun, Fuqiang; Wang, Ning; He, Jingjing; Guan, Xuefei; Yang, Jinsong

    2017-01-01

    Lamb waves have been reported to be an efficient tool for non-destructive evaluation (NDE) in various application scenarios. However, accurate and reliable damage quantification using the Lamb wave method remains a practical challenge, due to the complex underlying mechanisms of Lamb wave propagation and damage detection. This paper presents a Lamb wave damage quantification method using a least squares support vector machine (LS-SVM) and a genetic algorithm (GA). Three damage-sensitive features, namely normalized amplitude, phase change, and correlation coefficient, were proposed to describe changes in Lamb wave characteristics caused by damage. Following commonly used data-driven approaches, a GA-based LS-SVM model using the three proposed damage-sensitive features was implemented to evaluate crack size. The GA was adopted to optimize the model parameters. The results of the GA-based LS-SVM were validated using coupon test data and lap joint component test data with naturally developed fatigue cracks. Cases with different loading conditions and manufacturers were also included to further verify the robustness of the proposed method for crack quantification. PMID:28773003

  15. Demand Response Resource Quantification with Detailed Building Energy Models

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Hale, Elaine; Horsey, Henry; Merket, Noel

    Demand response is a broad suite of technologies that enables changes in electrical load operations in support of power system reliability and efficiency. Although demand response is not a new concept, there is new appetite for comprehensively evaluating its technical potential in the context of renewable energy integration. The complexity of demand response makes this task difficult -- we present new methods for capturing the heterogeneity of potential responses from buildings, their time-varying nature, and metrics such as thermal comfort that help quantify likely acceptability of specific demand response actions. Computed with an automated software framework, the methods are scalable.

  16. Automated selected reaction monitoring data analysis workflow for large-scale targeted proteomic studies.

    PubMed

    Surinova, Silvia; Hüttenhain, Ruth; Chang, Ching-Yun; Espona, Lucia; Vitek, Olga; Aebersold, Ruedi

    2013-08-01

    Targeted proteomics based on selected reaction monitoring (SRM) mass spectrometry is commonly used for accurate and reproducible quantification of protein analytes in complex biological mixtures. Strictly hypothesis-driven, SRM assays quantify each targeted protein by collecting measurements on its peptide fragment ions, called transitions. To achieve sensitive and accurate quantitative results, experimental design and data analysis must consistently account for the variability of the quantified transitions. This consistency is especially important in large experiments, which increasingly require profiling up to hundreds of proteins over hundreds of samples. Here we describe a robust and automated workflow for the analysis of large quantitative SRM data sets that integrates data processing, statistical protein identification and quantification, and dissemination of the results. The integrated workflow combines three software tools: mProphet for peptide identification via probabilistic scoring; SRMstats for protein significance analysis with linear mixed-effect models; and PASSEL, a public repository for storage, retrieval and query of SRM data. The input requirements for the protocol are files with SRM traces in mzXML format, and a file with a list of transitions in a text tab-separated format. The protocol is especially suited for data with heavy isotope-labeled peptide internal standards. We demonstrate the protocol on a clinical data set in which the abundances of 35 biomarker candidates were profiled in 83 blood plasma samples of subjects with ovarian cancer or benign ovarian tumors. The time frame to realize the protocol is 1-2 weeks, depending on the number of replicates used in the experiment.

  17. A novel real time imaging platform to quantify macrophage phagocytosis.

    PubMed

    Kapellos, Theodore S; Taylor, Lewis; Lee, Heyne; Cowley, Sally A; James, William S; Iqbal, Asif J; Greaves, David R

    2016-09-15

    Phagocytosis of pathogens, apoptotic cells and debris is a key feature of macrophage function in host defense and tissue homeostasis. Quantification of macrophage phagocytosis in vitro has traditionally been technically challenging. Here we report the optimization and validation of the IncuCyte ZOOM® real time imaging platform for macrophage phagocytosis based on pHrodo® pathogen bioparticles, which only fluoresce when localized in the acidic environment of the phagolysosome. Image analysis and fluorescence quantification were performed with the automated IncuCyte™ Basic Software. Titration of the bioparticle number showed that the system is more sensitive than a spectrofluorometer, as it can detect phagocytosis when using 20× fewer E. coli bioparticles. We exemplified the power of this real time imaging platform by studying phagocytosis of murine alveolar, bone marrow and peritoneal macrophages. We further demonstrate the ability of this platform to study modulation of the phagocytic process, as pharmacological inhibitors of phagocytosis suppressed bioparticle uptake in a concentration-dependent manner, whereas opsonins augmented phagocytosis. We also investigated the effects of macrophage polarization on E. coli phagocytosis. Bone marrow-derived macrophage (BMDM) priming with M2 stimuli, such as IL-4 and IL-10, resulted in higher engulfment of bioparticles in comparison with M1 polarization. Moreover, we demonstrated that tolerization of BMDMs with lipopolysaccharide (LPS) results in impaired E. coli bioparticle phagocytosis. This novel real time assay will enable researchers to quantify macrophage phagocytosis with a higher degree of accuracy and sensitivity and will allow investigation of limited populations of primary phagocytes in vitro. Copyright © 2016 The Author(s). Published by Elsevier Inc. All rights reserved.

  18. MRI-based methods for quantification of the cerebral metabolic rate of oxygen

    PubMed Central

    Rodgers, Zachary B; Detre, John A

    2016-01-01

    The brain depends almost entirely on oxidative metabolism to meet its significant energy requirements. As such, the cerebral metabolic rate of oxygen (CMRO2) represents a key measure of brain function. Quantification of CMRO2 has helped elucidate brain functional physiology and holds potential as a clinical tool for evaluating neurological disorders including stroke, brain tumors, Alzheimer’s disease, and obstructive sleep apnea. In recent years, a variety of magnetic resonance imaging (MRI)-based CMRO2 quantification methods have emerged. Unlike positron emission tomography – the current “gold standard” for measurement and mapping of CMRO2 – MRI is non-invasive, relatively inexpensive, and ubiquitously available in modern medical centers. All MRI-based CMRO2 methods are based on modeling the effect of paramagnetic deoxyhemoglobin on the magnetic resonance signal. The various methods can be classified in terms of the MRI contrast mechanism used to quantify CMRO2: T2*, T2′, T2, or magnetic susceptibility. This review article provides an overview of MRI-based CMRO2 quantification techniques. After a brief historical discussion motivating the need for improved CMRO2 methodology, current state-of-the-art MRI-based methods are critically appraised in terms of their respective tradeoffs between spatial resolution, temporal resolution, and robustness, all of critical importance given the spatially heterogeneous and temporally dynamic nature of brain energy requirements. PMID:27089912
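    All of the CMRO2 methods surveyed above ultimately rest on the Fick principle: oxygen consumption equals blood flow times the arteriovenous oxygen difference. A worked numeric sketch with typical, purely illustrative physiological values (hemoglobin content, saturations, and CBF are assumptions, not measurements from this review):

    ```python
    # Fick-principle CMRO2 calculation (illustrative values throughout).
    ML_O2_PER_G_HB = 1.34   # Huefner constant: mL O2 bound per g hemoglobin
    UMOL_PER_ML_O2 = 44.6   # ideal-gas conversion: umol O2 per mL O2 at STP

    def cmro2(cbf_ml_100g_min, hb_g_dl, sao2, svo2):
        """CMRO2 in umol O2 / 100 g / min from CBF, hemoglobin, and
        arterial/venous O2 saturation fractions."""
        cao2 = ML_O2_PER_G_HB * hb_g_dl * sao2 / 100.0  # mL O2 per mL blood (dL → mL)
        oef = (sao2 - svo2) / sao2                       # oxygen extraction fraction
        return cbf_ml_100g_min * oef * cao2 * UMOL_PER_ML_O2

    # CBF 50 mL/100g/min, Hb 15 g/dL, SaO2 98%, SvO2 60%
    print(cmro2(50.0, 15.0, 0.98, 0.60))  # ≈ 170 umol/100 g/min
    ```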

  19. Easy, Fast, and Reproducible Quantification of Cholesterol and Other Lipids in Human Plasma by Combined High Resolution MSX and FTMS Analysis

    NASA Astrophysics Data System (ADS)

    Gallego, Sandra F.; Højlund, Kurt; Ejsing, Christer S.

    2018-01-01

    Reliable, cost-effective, and gold-standard absolute quantification of non-esterified cholesterol in human plasma is of paramount importance in clinical lipidomics and for the monitoring of metabolic health. Here, we compared the performance of three mass spectrometric approaches available for direct detection and quantification of cholesterol in extracts of human plasma. These approaches are high resolution full scan Fourier transform mass spectrometry (FTMS) analysis, parallel reaction monitoring (PRM), and novel multiplexed MS/MS (MSX) technology, where fragments from selected precursor ions are detected simultaneously. Evaluating the performance of these approaches in terms of dynamic quantification range, linearity, and analytical precision showed that the MSX-based approach is superior to that of the FTMS and PRM-based approaches. To further show the efficacy of this approach, we devised a simple routine for extensive plasma lipidome characterization using only 8 μL of plasma, using a new commercially available ready-to-spike-in mixture with 14 synthetic lipid standards, and executing a single 6 min sample injection with combined MSX analysis for cholesterol quantification and FTMS analysis for quantification of sterol esters, glycerolipids, glycerophospholipids, and sphingolipids. Using this simple routine afforded reproducible and absolute quantification of 200 lipid species encompassing 13 lipid classes in human plasma samples. Notably, the analysis time of this procedure can be shortened for high throughput-oriented clinical lipidomics studies or extended with more advanced MSALL technology (Almeida R. et al., J. Am. Soc. Mass Spectrom. 26, 133-148 [1]) to support in-depth structural elucidation of lipid molecules.

  20. Easy, Fast, and Reproducible Quantification of Cholesterol and Other Lipids in Human Plasma by Combined High Resolution MSX and FTMS Analysis.

    PubMed

    Gallego, Sandra F; Højlund, Kurt; Ejsing, Christer S

    2018-01-01

    Reliable, cost-effective, and gold-standard absolute quantification of non-esterified cholesterol in human plasma is of paramount importance in clinical lipidomics and for the monitoring of metabolic health. Here, we compared the performance of three mass spectrometric approaches available for direct detection and quantification of cholesterol in extracts of human plasma. These approaches are high resolution full scan Fourier transform mass spectrometry (FTMS) analysis, parallel reaction monitoring (PRM), and novel multiplexed MS/MS (MSX) technology, where fragments from selected precursor ions are detected simultaneously. Evaluating the performance of these approaches in terms of dynamic quantification range, linearity, and analytical precision showed that the MSX-based approach is superior to that of the FTMS and PRM-based approaches. To further show the efficacy of this approach, we devised a simple routine for extensive plasma lipidome characterization using only 8 μL of plasma, using a new commercially available ready-to-spike-in mixture with 14 synthetic lipid standards, and executing a single 6 min sample injection with combined MSX analysis for cholesterol quantification and FTMS analysis for quantification of sterol esters, glycerolipids, glycerophospholipids, and sphingolipids. Using this simple routine afforded reproducible and absolute quantification of 200 lipid species encompassing 13 lipid classes in human plasma samples. Notably, the analysis time of this procedure can be shortened for high throughput-oriented clinical lipidomics studies or extended with more advanced MSALL technology (Almeida R. et al., J. Am. Soc. Mass Spectrom. 26, 133-148 [1]) to support in-depth structural elucidation of lipid molecules.
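    The spike-in internal-standard principle used for absolute quantification here is a simple intensity-ratio calculation: the analyte amount follows from its intensity ratio to a co-analyzed synthetic standard of known amount, normalized to the plasma volume used. A minimal sketch with illustrative values, not the paper's data:

    ```python
    # Internal-standard absolute quantification sketch (illustrative values).

    def lipid_conc_uM(analyte_intensity, standard_intensity,
                      standard_amount_pmol, plasma_volume_uL):
        """Analyte concentration from its intensity ratio to a spiked
        internal standard of known amount, per volume of plasma extracted."""
        amount_pmol = standard_amount_pmol * analyte_intensity / standard_intensity
        return amount_pmol / plasma_volume_uL  # pmol/uL == uM

    # analyte 4x as intense as 100 pmol of standard, from 8 uL plasma
    print(lipid_conc_uM(4.0e6, 1.0e6, 100.0, 8.0))  # → 50.0 uM
    ```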

  1. Fully automated system for the quantification of human osteoarthritic knee joint effusion volume using magnetic resonance imaging.

    PubMed

    Li, Wei; Abram, François; Pelletier, Jean-Pierre; Raynauld, Jean-Pierre; Dorais, Marc; d'Anjou, Marc-André; Martel-Pelletier, Johanne

    2010-01-01

    Joint effusion is frequently associated with osteoarthritis (OA) flare-up and is an important marker of therapeutic response. This study aimed at developing and validating a fully automated system based on magnetic resonance imaging (MRI) for the quantification of joint effusion volume in knee OA patients. MRI examinations consisted of two axial sequences: a T2-weighted true fast imaging with steady-state precession and a T1-weighted gradient echo. An automated joint effusion volume quantification system using MRI was developed and validated (a) with calibrated phantoms (cylinder and sphere) and effusion from knee OA patients; (b) with assessment by manual quantification; and (c) by direct aspiration. Twenty-five knee OA patients with joint effusion were included in the study. The automated joint effusion volume quantification was developed as a four stage sequencing process: bone segmentation, filtering of unrelated structures, segmentation of joint effusion, and subvoxel volume calculation. Validation experiments revealed excellent coefficients of variation with the calibrated cylinder (1.4%) and sphere (0.8%) phantoms. Comparison of the OA knee joint effusion volume assessed by the developed automated system and by manual quantification was also excellent (r = 0.98; P < 0.0001), as was the comparison with direct aspiration (r = 0.88; P = 0.0008). The newly developed fully automated MRI-based system provided precise quantification of OA knee joint effusion volume with excellent correlation with data from phantoms, a manual system, and joint aspiration. Such an automated system will be instrumental in improving the reproducibility/reliability of the evaluation of this marker in clinical application.
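    The validation metrics reported above (coefficients of variation for the calibrated phantoms, Pearson correlation against manual quantification and direct aspiration) are straightforward to compute. A minimal Python/NumPy sketch, using made-up volumes rather than the study's data:

```python
import numpy as np

def coefficient_of_variation(volumes):
    """CV (%) of repeated volume measurements of a calibrated phantom."""
    v = np.asarray(volumes, dtype=float)
    return 100.0 * v.std(ddof=1) / v.mean()

def pearson_r(automated, manual):
    """Pearson correlation between automated and manual volume estimates."""
    return float(np.corrcoef(automated, manual)[0, 1])

# Illustrative values only (not the study's measurements):
cylinder_runs = [10.05, 10.10, 9.95, 10.20, 9.90]   # mL, repeated phantom scans
auto_volumes = [12.1, 30.4, 8.7, 22.9, 15.3]        # mL, automated system
manual_volumes = [12.5, 29.8, 9.1, 23.5, 14.9]      # mL, manual quantification
print(coefficient_of_variation(cylinder_runs))
print(pearson_r(auto_volumes, manual_volumes))
```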

  2. Challenges in small screening laboratories: implementing an on-demand laboratory information management system.

    PubMed

    Lemmon, Vance P; Jia, Yuanyuan; Shi, Yan; Holbrook, S Douglas; Bixby, John L; Buchser, William

    2011-11-01

    The Miami Project to Cure Paralysis, part of the University of Miami Miller School of Medicine, includes a laboratory devoted to High Content Analysis (HCA) of neurons. The goal of the laboratory is to uncover signaling pathways, genes, compounds, or drugs that can be used to promote nerve growth. HCA permits the quantification of neuronal morphology, including the lengths and numbers of axons. HCA of various libraries on primary neurons requires a team-based approach, a variety of process steps and complex manipulations of cells and libraries to obtain meaningful results. HCA itself produces vast amounts of information including images, well-based data and cell-based phenotypic measures. Documenting and integrating the experimental workflows, library data and extensive experimental results is challenging. For academic laboratories generating large data sets from experiments involving thousands of perturbagens, a Laboratory Information Management System (LIMS) is the data tracking solution of choice. With both productivity and efficiency as driving rationales, the Miami Project has equipped its HCA laboratory with an On Demand or Software as a Service (SaaS) LIMS to ensure the quality of its experiments and workflows. The article discusses how the system was selected and integrated into the laboratory. The advantages of a SaaS-based LIMS over a client-server-based system are described. © 2011 Bentham Science Publishers

  3. 1H NMR quantification in very dilute toxin solutions: application to anatoxin-a analysis.

    PubMed

    Dagnino, Denise; Schripsema, Jan

    2005-08-01

    A complete procedure is described for the extraction, detection and quantification of anatoxin-a in biological samples. Anatoxin-a is extracted from biomass by a routine acid-base extraction. The extract is analysed by GC-MS, without the need for derivatization, with a detection limit of 0.5 ng. A method was developed for the accurate quantification of anatoxin-a in the standard solution to be used for the calibration of the GC analysis. 1H NMR allowed the accurate quantification of microgram quantities of anatoxin-a. The accurate quantification of compounds in standard solutions is rarely discussed, but for compounds like anatoxin-a (toxins with prices in the range of a million dollars a gram), of which generally only milligram quantities or less are available, this factor in the quantitative analysis is certainly not trivial. The method that was developed can easily be adapted for the accurate quantification of other toxins in very dilute solutions.
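    Quantitative 1H NMR of a dilute standard typically ratios the analyte's signal integral against that of an internal standard of known concentration, normalized by the number of protons each signal represents. A generic sketch (the internal-standard setup is an assumption for illustration; the paper's exact protocol may differ):

```python
def nmr_concentration(analyte_integral, standard_integral,
                      analyte_protons, standard_protons,
                      standard_conc_mM):
    """qNMR relation: c_a = c_std * (I_a / I_std) * (N_std / N_a),
    where I is the signal integral and N the proton count per signal."""
    return (standard_conc_mM
            * (analyte_integral / standard_integral)
            * (standard_protons / analyte_protons))

# Hypothetical example: analyte integral twice the standard's,
# but its signal represents twice as many protons.
print(nmr_concentration(2.0, 1.0, 2, 1, 5.0))
```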

  4. Good quantification practices of flavours and fragrances by mass spectrometry.

    PubMed

    Begnaud, Frédéric; Chaintreau, Alain

    2016-10-28

    Over the past 15 years, chromatographic techniques with mass spectrometric detection have been increasingly used to monitor the rapidly expanded list of regulated flavour and fragrance ingredients. This trend entails a need for good quantification practices suitable for complex media, especially for multi-analytes. In this article, we present experimental precautions needed to perform the analyses and ways to process the data according to the most recent approaches. This notably includes the identification of analytes during their quantification and method validation, when applied to real matrices, based on accuracy profiles. A brief survey of application studies based on such practices is given. This article is part of the themed issue 'Quantitative mass spectrometry'. © 2016 The Authors.

  5. Quantification of moving target cyber defenses

    NASA Astrophysics Data System (ADS)

    Farris, Katheryn A.; Cybenko, George

    2015-05-01

    Current network and information systems are static, making it simple for attackers to maintain an advantage. Adaptive defenses, such as Moving Target Defenses (MTD), have been developed as potential "game-changers" in an effort to increase the attacker's workload. With many new methods being developed, it is difficult to accurately quantify and compare their overall costs and effectiveness. This paper compares the tradeoffs between current approaches to the quantification of MTDs. We present results from an expert opinion survey on quantifying the overall effectiveness, upfront and operating costs of a select set of MTD techniques. We find that gathering informed scientific opinions can be advantageous for evaluating such new technologies as it offers a more comprehensive assessment. We end by presenting a coarse ordering of a set of MTD techniques from most to least dominant. We found that seven out of 23 methods rank as the more dominant techniques, five of which are variants of either address space layout randomization or instruction set randomization. The remaining two techniques are applicable to software and computer platforms. Among the techniques that performed the worst are those primarily aimed at network randomization.

  6. An automated method of quantifying ferrite microstructures using electron backscatter diffraction (EBSD) data.

    PubMed

    Shrestha, Sachin L; Breen, Andrew J; Trimby, Patrick; Proust, Gwénaëlle; Ringer, Simon P; Cairney, Julie M

    2014-02-01

    The identification and quantification of the different ferrite microconstituents in steels has long been a major challenge for metallurgists. Manual point counting from images obtained by optical and scanning electron microscopy (SEM) is commonly used for this purpose. While classification systems exist, the complexity of steel microstructures means that identifying and quantifying these phases is still a great challenge. Moreover, point counting is extremely tedious, time consuming, and subject to operator bias. This paper presents a new automated identification and quantification technique for the characterisation of complex ferrite microstructures by electron backscatter diffraction (EBSD). This technique takes advantage of the fact that different classes of ferrite exhibit preferential grain boundary misorientations, aspect ratios and mean misorientation, all of which can be detected using current EBSD software. These characteristics are set as criteria for identification and linked to grain size to determine the area fractions. The results of this method were evaluated by comparing the new automated technique with point counting results. The technique could easily be applied to a range of other steel microstructures. © 2013 Published by Elsevier B.V.
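    The classification logic described (criteria on grain aspect ratio and mean misorientation, linked to grain size to obtain area fractions) can be sketched as a simple decision rule. The thresholds and class labels below are illustrative assumptions, not the paper's calibrated values:

```python
def classify_grain(aspect_ratio, mean_misorientation_deg):
    """Toy two-criteria decision rule: elongated, substructured grains are
    labelled bainitic ferrite; equiaxed, low-misorientation grains are
    labelled polygonal ferrite. Thresholds are illustrative only."""
    if aspect_ratio > 2.0 and mean_misorientation_deg > 1.0:
        return "bainitic ferrite"
    if mean_misorientation_deg <= 1.0:
        return "polygonal ferrite"
    return "unclassified"

def area_fractions(grains):
    """grains: list of (aspect_ratio, mean_misorientation_deg, area) tuples,
    e.g. from segmented EBSD grain data. Returns each class's share of the
    total mapped area."""
    total = sum(g[2] for g in grains)
    fractions = {}
    for ar, mis, area in grains:
        label = classify_grain(ar, mis)
        fractions[label] = fractions.get(label, 0.0) + area / total
    return fractions
```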

  7. FIB preparation of a NiO Wedge-Lamella and STEM X-ray microanalysis for the determination of the experimental k(O-Ni) Cliff-Lorimer coefficient.

    PubMed

    Armigliato, Aldo; Frabboni, Stefano; Gazzadi, Gian Carlo; Rosa, Rodolfo

    2013-02-01

    A method for the fabrication of a wedge-shaped thin NiO lamella by focused ion beam is reported. The starting sample is an oxidized bulk single crystalline, <100> oriented, Ni commercial standard. The lamella is employed for the determination, by analytical electron microscopy at 200 kV of the experimental k(O-Ni) Cliff-Lorimer (G. Cliff & G.W. Lorimer, J Microsc 103, 203-207, 1975) coefficient, according to the extrapolation method by Van Cappellen (E. Van Cappellen, Microsc Microstruct Microanal 1, 1-22, 1990). The result thus obtained is compared to the theoretical k(O-Ni) values either implemented into the commercial software for X-ray microanalysis quantification of the scanning transmission electron microscopy/energy dispersive spectrometry equipment or calculated by the Monte Carlo method. Significant differences among the three values are found. This confirms that for a reliable quantification of binary alloys containing light elements, the choice of the Cliff-Lorimer coefficients is crucial and experimental values are recommended.
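    The Cliff-Lorimer relation underlying this quantification is C_O/C_Ni = k(O-Ni) · I_O/I_Ni, and Van Cappellen's extrapolation method estimates k from apparent values measured at several specimen thicknesses, extrapolated to zero thickness where absorption vanishes. A minimal sketch (the straight-line fit against thickness is a simplifying assumption for illustration):

```python
import numpy as np

def apparent_k(conc_ratio, intensity_ratio):
    """Cliff-Lorimer: C_O/C_Ni = k * (I_O/I_Ni), so for a standard of known
    composition, k = (C_O/C_Ni) / (I_O/I_Ni)."""
    return conc_ratio / intensity_ratio

def extrapolated_k(thicknesses, apparent_k_values):
    """Van Cappellen-style extrapolation: fit apparent k versus a thickness
    proxy and take the zero-thickness intercept (absorption-free limit)."""
    slope, intercept = np.polyfit(thicknesses, apparent_k_values, 1)
    return float(intercept)

# Hypothetical wedge-lamella measurements: k drifts with thickness
# because of X-ray absorption.
ks = [apparent_k(0.5, r) for r in (0.45, 0.42, 0.40)]
print(extrapolated_k([50.0, 100.0, 150.0], ks))  # nm, apparent k
```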

  8. Quantitative CT: technique dependence of volume estimation on pulmonary nodules

    NASA Astrophysics Data System (ADS)

    Chen, Baiyu; Barnhart, Huiman; Richard, Samuel; Colsher, James; Amurao, Maxwell; Samei, Ehsan

    2012-03-01

    Current estimation of lung nodule size typically relies on uni- or bi-dimensional techniques. While new three-dimensional volume estimation techniques using MDCT have improved size estimation of nodules with irregular shapes, the effect of acquisition and reconstruction parameters on accuracy (bias) and precision (variance) of the new techniques has not been fully investigated. To characterize the volume estimation performance dependence on these parameters, an anthropomorphic chest phantom containing synthetic nodules was scanned and reconstructed with protocols across various acquisition and reconstruction parameters. Nodule volumes were estimated by a clinical lung analysis software package, LungVCAR. Precision and accuracy of the volume assessment were calculated across the nodules and compared between protocols via a generalized estimating equation analysis. Results showed that the precision and accuracy of nodule volume quantifications were dependent on slice thickness, with different dependences for different nodule characteristics. Other parameters including kVp, pitch, and reconstruction kernel had lower impact. Determining these technique dependences enables better volume quantification via protocol optimization and highlights the importance of consistent imaging parameters in sequential examinations.
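    Accuracy (bias) and precision (variance) of volume estimation, as used in this study, reduce to two simple per-protocol statistics when the true phantom nodule volume is known. A sketch with hypothetical numbers, not the study's data:

```python
import numpy as np

def percent_bias(estimated_volumes, true_volume):
    """Accuracy: mean deviation of estimates from known phantom volume, in %."""
    est = np.asarray(estimated_volumes, dtype=float)
    return 100.0 * (est.mean() - true_volume) / true_volume

def percent_precision(estimated_volumes):
    """Precision: coefficient of variation of repeated estimates, in %."""
    est = np.asarray(estimated_volumes, dtype=float)
    return 100.0 * est.std(ddof=1) / est.mean()

# Hypothetical repeated estimates of a 500 mm^3 synthetic nodule
# under one acquisition/reconstruction protocol:
estimates = [510.0, 495.0, 505.0, 520.0]
print(percent_bias(estimates, 500.0))
print(percent_precision(estimates))
```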

  9. Quantification of Cannabinoid Content in Cannabis

    NASA Astrophysics Data System (ADS)

    Tian, Y.; Zhang, F.; Jia, K.; Wen, M.; Yuan, Ch.

    2015-09-01

    Cannabis is an economically important plant that is used in many fields, in addition to being the most commonly consumed illicit drug worldwide. Monitoring the spatial distribution of cannabis cultivation and judging whether it is drug- or fiber-type cannabis is critical for governments and international communities to understand the scale of the illegal drug trade. The aim of this study was to investigate whether the cannabinoid content in cannabis could be spectrally quantified using a spectrometer and to identify the optimal wavebands for quantifying the cannabinoid content. Spectral reflectance data of dried cannabis leaf samples and the cannabis canopy were measured in the laboratory and in the field, respectively. Correlation analysis and the stepwise multivariate regression method were used to select the optimal wavebands for cannabinoid content quantification based on the laboratory-measured spectral data. The results indicated that the delta-9-tetrahydrocannabinol (THC) content in cannabis leaves could be quantified using laboratory-measured spectral reflectance data and that the 695 nm band is the optimal band for THC content quantification. This study provides prerequisite information for designing spectral equipment to enable immediate quantification of THC content in cannabis and to discriminate drug- from fiber-type cannabis based on THC content quantification in the field.
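    The band-selection step (correlating reflectance at each wavelength with measured THC content and keeping the strongest correlate) can be sketched as follows; the data here are synthetic, not the study's measurements:

```python
import numpy as np

def best_band(reflectance, thc_content, wavelengths):
    """Pick the wavelength whose reflectance correlates most strongly
    (in absolute value) with the measured THC content.
    reflectance: samples x bands matrix; thc_content: per-sample values."""
    correlations = np.array([
        abs(np.corrcoef(reflectance[:, j], thc_content)[0, 1])
        for j in range(reflectance.shape[1])
    ])
    return wavelengths[int(np.argmax(correlations))]

# Synthetic example: the second band tracks THC perfectly, the first is noise.
thc = np.array([1.0, 2.0, 3.0, 4.0])
refl = np.column_stack([np.array([1.0, -1.0, 1.0, -1.0]), 2.0 * thc])
print(best_band(refl, thc, [680, 695]))
```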

  10. A Statistics-based Platform for Quantitative N-terminome Analysis and Identification of Protease Cleavage Products*

    PubMed Central

    auf dem Keller, Ulrich; Prudova, Anna; Gioia, Magda; Butler, Georgina S.; Overall, Christopher M.

    2010-01-01

    Terminal amine isotopic labeling of substrates (TAILS), our recently introduced platform for quantitative N-terminome analysis, enables wide dynamic range identification of original mature protein N-termini and protease cleavage products. Modifying TAILS by use of isobaric tag for relative and absolute quantification (iTRAQ)-like labels for quantification together with a robust statistical classifier derived from experimental protease cleavage data, we report reliable and statistically valid identification of proteolytic events in complex biological systems in MS2 mode. The statistical classifier is supported by a novel parameter evaluating ion intensity-dependent quantification confidences of single peptide quantifications, the quantification confidence factor (QCF). Furthermore, the isoform assignment score (IAS) is introduced, a new scoring system for the evaluation of single peptide-to-protein assignments based on high confidence protein identifications in the same sample prior to negative selection enrichment of N-terminal peptides. By these approaches, we identified and validated, in addition to known substrates, low abundance novel bioactive MMP-2 targets including the plasminogen receptor S100A10 (p11) and the proinflammatory cytokine proEMAP/p43 that were previously undescribed. PMID:20305283

  11. Nuclear Physicists in Finance

    NASA Astrophysics Data System (ADS)

    Mattoni, Carlo

    2017-01-01

    The financial services industry presents an interesting alternative career path for nuclear physicists. Careers in finance typically offer intellectual challenge, a fast pace, high caliber colleagues, merit-based compensation with substantial upside, and an opportunity to deploy skills learned as a physicist. Physicists are employed at a wide range of financial institutions on both the "buy side" (hedge fund managers, private equity managers, mutual fund managers, etc.) and the "sell side" (investment banks and brokerages). Historically, physicists in finance were primarily "quants" tasked with applying stochastic calculus to determine the price of financial derivatives. With the maturation of the field of derivative pricing, physicists in finance today find work in a variety of roles ranging from quantification and management of risk to investment analysis to development of sophisticated software used to price, trade, and risk manage securities. Only a small subset of today's finance careers for physicists require the use of advanced math and practically none provide an opportunity to tinker with an apparatus, yet most nevertheless draw on important skills honed during the training of a nuclear physicist. Intellectually rigorous critical thinking, sophisticated problem solving, an attention to minute detail and an ability to create and test hypotheses based on incomplete information are key to both disciplines.

  12. Preliminary results of real-time in-vitro electronic speckle pattern interferometry (ESPI) measurements in otolaryngology

    NASA Astrophysics Data System (ADS)

    Conerty, Michelle D.; Castracane, James; Cacace, Anthony T.; Parnes, Steven M.; Gardner, Glendon M.; Miller, Mitchell B.

    1995-05-01

    Electronic Speckle Pattern Interferometry (ESPI) is a nondestructive optical evaluation technique that is capable of determining surface and subsurface integrity through the quantitative evaluation of static or vibratory motion. By utilizing state of the art developments in the areas of lasers, fiber optics and solid state detector technology, this technique has become applicable in medical research and diagnostics. Based on initial support from NIDCD and continued support from InterScience, Inc., we have been developing a range of instruments for improved diagnostic evaluation in otolaryngological applications based on the technique of ESPI. These compact fiber optic instruments are capable of making real time interferometric measurements of the target tissue. Ongoing development of image post-processing software is currently capable of extracting the desired quantitative results from the acquired interferometric images. The goal of the research is to develop a fully automated system in which the image processing and quantification will be performed in hardware in near real-time. Subsurface details of both the tympanic membrane and vocal cord dynamics could speed the diagnosis of otosclerosis, laryngeal tumors, and aid in the evaluation of surgical procedures.

  13. A Guideline to Univariate Statistical Analysis for LC/MS-Based Untargeted Metabolomics-Derived Data

    PubMed Central

    Vinaixa, Maria; Samino, Sara; Saez, Isabel; Duran, Jordi; Guinovart, Joan J.; Yanes, Oscar

    2012-01-01

    Several metabolomic software programs provide methods for peak picking, retention time alignment and quantification of metabolite features in LC/MS-based metabolomics. Statistical analysis, however, is needed in order to discover those features significantly altered between samples. By comparing the retention time and MS/MS data of a model compound to that from the altered feature of interest in the research sample, metabolites can then be unequivocally identified. This paper provides a comprehensive overview of a workflow for statistical analysis to rank relevant metabolite features that will be selected for further MS/MS experiments. We focus on univariate data analysis applied in parallel on all detected features. Characteristics and challenges of this analysis are discussed and illustrated using four different real LC/MS untargeted metabolomic datasets. We demonstrate the influence of considering or violating mathematical assumptions on which univariate statistical tests rely, using high-dimensional LC/MS datasets. Issues in data analysis such as determination of sample size, analytical variation, assumption of normality and homoscedasticity, or correction for multiple testing are discussed and illustrated in the context of our four untargeted LC/MS working examples. PMID:24957762

  14. A Guideline to Univariate Statistical Analysis for LC/MS-Based Untargeted Metabolomics-Derived Data.

    PubMed

    Vinaixa, Maria; Samino, Sara; Saez, Isabel; Duran, Jordi; Guinovart, Joan J; Yanes, Oscar

    2012-10-18

    Several metabolomic software programs provide methods for peak picking, retention time alignment and quantification of metabolite features in LC/MS-based metabolomics. Statistical analysis, however, is needed in order to discover those features significantly altered between samples. By comparing the retention time and MS/MS data of a model compound to that from the altered feature of interest in the research sample, metabolites can then be unequivocally identified. This paper provides a comprehensive overview of a workflow for statistical analysis to rank relevant metabolite features that will be selected for further MS/MS experiments. We focus on univariate data analysis applied in parallel on all detected features. Characteristics and challenges of this analysis are discussed and illustrated using four different real LC/MS untargeted metabolomic datasets. We demonstrate the influence of considering or violating mathematical assumptions on which univariate statistical tests rely, using high-dimensional LC/MS datasets. Issues in data analysis such as determination of sample size, analytical variation, assumption of normality and homoscedasticity, or correction for multiple testing are discussed and illustrated in the context of our four untargeted LC/MS working examples.
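    The core of this workflow (a univariate test applied in parallel to all detected features, followed by correction for multiple testing) can be sketched as follows. This uses a Welch t-test per feature and a hand-rolled Benjamini-Hochberg FDR procedure; it is an illustrative reduction of the guideline, not the authors' exact pipeline:

```python
import numpy as np
from scipy import stats

def univariate_screen(group_a, group_b, alpha=0.05):
    """Per-feature Welch t-test with Benjamini-Hochberg FDR control.
    group_a, group_b: samples x features matrices (one column per metabolite
    feature). Returns a boolean mask of features significant at FDR alpha."""
    _, pvals = stats.ttest_ind(group_a, group_b, axis=0, equal_var=False)
    m = len(pvals)
    order = np.argsort(pvals)
    ranked = pvals[order]
    # BH: find the largest rank k with p_(k) <= (k/m) * alpha;
    # all features with smaller or equal rank are declared significant.
    passed = ranked <= (np.arange(1, m + 1) / m) * alpha
    k = int(np.max(np.nonzero(passed)[0]) + 1) if passed.any() else 0
    mask = np.zeros(m, dtype=bool)
    mask[order[:k]] = True
    return mask

# Synthetic example: 5 features, only the first is truly shifted.
rng = np.random.default_rng(42)
a = rng.normal(0.0, 1.0, (20, 5))
b = rng.normal(0.0, 1.0, (20, 5))
b[:, 0] += 5.0
print(univariate_screen(a, b))
```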

  15. MASPECTRAS: a platform for management and analysis of proteomics LC-MS/MS data

    PubMed Central

    Hartler, Jürgen; Thallinger, Gerhard G; Stocker, Gernot; Sturn, Alexander; Burkard, Thomas R; Körner, Erik; Rader, Robert; Schmidt, Andreas; Mechtler, Karl; Trajanoski, Zlatko

    2007-01-01

    Background The advancements of proteomics technologies have led to a rapid increase in the number, size and rate at which datasets are generated. Managing and extracting valuable information from such datasets requires the use of data management platforms and computational approaches. Results We have developed the MAss SPECTRometry Analysis System (MASPECTRAS), a platform for management and analysis of proteomics LC-MS/MS data. MASPECTRAS is based on the Proteome Experimental Data Repository (PEDRo) relational database schema and follows the guidelines of the Proteomics Standards Initiative (PSI). Analysis modules include: 1) import and parsing of the results from the search engines SEQUEST, Mascot, Spectrum Mill, X! Tandem, and OMSSA; 2) peptide validation; 3) clustering of proteins based on Markov Clustering and multiple alignments; and 4) quantification using the Automated Statistical Analysis of Protein Abundance Ratios algorithm (ASAPRatio). The system provides customizable data retrieval and visualization tools, as well as export to the PRoteomics IDEntifications public repository (PRIDE). MASPECTRAS is freely available. Conclusion Given the unique features and the flexibility due to the use of standard software technology, our platform represents a significant advance and could be of great interest to the proteomics community. PMID:17567892

  16. Uncertainties in the Antarctic Ice Sheet Contribution to Sea Level Rise: Exploration of Model Response to Errors in Climate Forcing, Boundary Conditions, and Internal Parameters

    NASA Astrophysics Data System (ADS)

    Schlegel, N.; Seroussi, H. L.; Boening, C.; Larour, E. Y.; Limonadi, D.; Schodlok, M.; Watkins, M. M.

    2017-12-01

    The Jet Propulsion Laboratory-University of California at Irvine Ice Sheet System Model (ISSM) is a thermo-mechanical 2D/3D parallelized finite element software used to physically model the continental-scale flow of ice at high resolutions. Embedded into ISSM are uncertainty quantification (UQ) tools, based on the Design Analysis Kit for Optimization and Terascale Applications (DAKOTA) software. ISSM-DAKOTA offers various UQ methods for the investigation of how errors in model input impact uncertainty in simulation results. We utilize these tools to regionally sample model input and key parameters, based on specified bounds of uncertainty, and run a suite of continental-scale 100-year ISSM forward simulations of the Antarctic Ice Sheet. Resulting diagnostics (e.g., spread in local mass flux and regional mass balance) inform our conclusion about which parameters and/or forcings have the greatest impact on century-scale model simulations of ice sheet evolution. The results allow us to prioritize the key datasets and measurements that are critical for the minimization of ice sheet model uncertainty. Overall, we find that Antarctica's total sea level contribution is strongly affected by grounding line retreat, which is driven by the magnitude of ice shelf basal melt rates and by errors in bedrock topography. In addition, results suggest that after 100 years of simulation, Thwaites glacier is the most significant source of model uncertainty, and its drainage basin has the largest potential for future sea level contribution. This work is performed at and supported by the California Institute of Technology's Jet Propulsion Laboratory. Supercomputing time is also supported through a contract with the National Aeronautics and Space Administration's Cryosphere program.
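    The sampling strategy described (draw model inputs within specified uncertainty bounds, propagate each draw through a forward run, and examine the spread of the resulting diagnostics) can be reduced to a generic Monte Carlo sketch. `model` here is any stand-in forward simulation returning a scalar diagnostic, not ISSM itself:

```python
import numpy as np

def sample_uncertainty(model, bounds, n_samples=1000, seed=0):
    """Monte Carlo uncertainty propagation: draw parameter vectors uniformly
    within stated bounds, run the forward model on each, and report the
    mean and spread of the output diagnostic.
    model: callable mapping a parameter vector to a scalar
           (e.g., a regional mass balance diagnostic).
    bounds: list of (lo, hi) pairs, one per uncertain parameter."""
    rng = np.random.default_rng(seed)
    lo, hi = np.array(bounds, dtype=float).T
    samples = rng.uniform(lo, hi, size=(n_samples, len(bounds)))
    outputs = np.array([model(p) for p in samples])
    return outputs.mean(), outputs.std(ddof=1)

# Toy stand-in forward model: diagnostic is the sum of two uncertain inputs.
mean, spread = sample_uncertainty(lambda p: float(p.sum()),
                                  [(0.0, 1.0), (0.0, 1.0)],
                                  n_samples=2000)
print(mean, spread)
```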

  17. Simulating patient-specific heart shape and motion using SPECT perfusion images with the MCAT phantom

    NASA Astrophysics Data System (ADS)

    Faber, Tracy L.; Garcia, Ernest V.; Lalush, David S.; Segars, W. Paul; Tsui, Benjamin M.

    2001-05-01

    The spline-based Mathematical Cardiac Torso (MCAT) phantom is a realistic software simulation designed to simulate single photon emission computed tomographic (SPECT) data. It incorporates a heart model of known size and shape; thus, it is invaluable for measuring the accuracy of acquisition, reconstruction, and post-processing routines. New functionality has been added by replacing the standard heart model with left ventricular (LV) epicardial and endocardial surface points detected from actual patient SPECT perfusion studies. LV surfaces detected from standard post-processing quantitation programs are converted through interpolation in space and time into new B-spline models. Perfusion abnormalities are added to the model based on results of standard perfusion quantification. The new LV is translated and rotated to fit within existing atrial and right ventricular models, which are scaled based on the size of the LV. Simulations were created for five different patients with myocardial infarctions who had undergone SPECT perfusion imaging. Shape, size, and motion of the resulting activity map were compared visually to the original SPECT images. In all cases, size, shape and motion of simulated LVs matched well with the original images. Thus, realistic simulations with known physiologic and functional parameters can be created for evaluating the efficacy of processing algorithms.

  18. Design and finite element modeling of a novel optical microsystems-based tactile sensor for minimal invasive robotic surgery

    NASA Astrophysics Data System (ADS)

    Ghanbari Mardasi, Amir; Ghanbari, Mahmood; Salmani Tehrani, Mehdi

    2014-09-01

    Although Minimally Invasive Robotic Surgery (MIRS) has recently attracted increasing attention because of its wide range of benefits, it still has some limitations. To address the shortcomings of MIRS systems, various types of tactile sensors with different sensing principles have been presented in the last few years. In the present paper, a MEMS-based optical sensor recently proposed in the literature is investigated using numerical simulation. This type of sensor enables real-time quantification of both dynamic and static contact forces between the tissue and the surgical instrument. The presented sensor has one moving part and works based on the intensity modulation principle of optical fibers. It is electrically passive, MRI-compatible, and can be fabricated using available standard microfabrication techniques. The behavior of the sensor has been simulated using COMSOL MULTIPHYSICS 3.5 software. Stress analysis is conducted on the sensor to assess the deflection of the moving part due to the applied force. An optical simulation is then conducted to estimate the power loss due to the deflection of the moving part. Using FEM modeling, the relation between force and deflection is derived, which is necessary for the calibration of the sensor.

  19. Molecular beacon probe-based multiplex real-time NASBA for detection of HIV-1 and HCV.

    PubMed

    Mohammadi-Yeganeh, S; Paryan, M; Mirab Samiee, S; Kia, V; Rezvan, H

    2012-06-01

    Nucleic acid sequence-based amplification (NASBA), introduced in 1991, is a rapid molecular diagnostic technique that has been shown to give quicker results than PCR and can also be more sensitive. This paper describes the development of a molecular beacon-based multiplex NASBA assay for simultaneous detection of HIV-1 and HCV in plasma samples. A well-conserved region in the HIV-1 pol gene and the 5'-NCR of the HCV genome were used for primer and molecular beacon design. The performance features of the HCV/HIV-1 multiplex NASBA assay, including analytical sensitivity and specificity as well as clinical sensitivity and clinical specificity, were evaluated. The analysis of scalar concentrations of the samples indicated that the limit of quantification of the assay was <1000 copies/ml for HIV-1 and <500 copies/ml for HCV with a 95% confidence interval. The multiplex NASBA assay showed 98% sensitivity and 100% specificity. The analytical specificity study with BLAST software demonstrated that the primers do not attach to any sequences other than those of HIV-1 or HCV. The primers and molecular beacon probes detected all HCV genotypes and all major variants of HIV-1. This method may represent a relatively inexpensive isothermal method for detection of HIV-1/HCV co-infection in the monitoring of patients.

  20. Challenges in Small Screening Laboratories: SaaS to the rescue

    PubMed Central

    Lemmon, Vance P.; Jia, Yuanyuan; Shi, Yan; Holbrook, S. Douglas; Bixby, John L; Buchser, William

    2012-01-01

    The Miami Project to Cure Paralysis, part of the University of Miami Miller School of Medicine, includes a laboratory devoted to High Content Analysis (HCA) of neurons. The goal of the laboratory is to uncover signalling pathways, genes, compounds, or drugs that can be used to promote nerve growth. HCA permits the quantification of neuronal morphology, including the lengths and numbers of axons. HCA screening of various libraries on primary neurons requires a team-based approach, a variety of process steps and complex manipulations of cells and libraries to obtain meaningful results. HCA itself produces vast amounts of information including images, well-based data and cell-based phenotypic measures. Managing experimental workflow and library data, along with the extensive amount of experimental results, is challenging. For academic laboratories generating large data sets from experiments using thousands of perturbagens, a laboratory information management system (LIMS) is the data tracking solution of choice. With both productivity and efficiency as driving rationales, the Miami Project has equipped its HCA laboratory with a Software as a Service (SaaS) LIMS to ensure the quality of its experiments and workflows. The article discusses this application in detail, and how the system was selected and integrated into the laboratory. The advantages of SaaS are described. PMID:21631415

  1. Metabolomics by Gas Chromatography-Mass Spectrometry: the combination of targeted and untargeted profiling

    PubMed Central

    Fiehn, Oliver

    2016-01-01

    Gas chromatography-mass spectrometry (GC-MS)-based metabolomics is ideal for identifying and quantitating small molecular metabolites (<650 daltons), including small acids, alcohols, hydroxyl acids, amino acids, sugars, fatty acids, sterols, catecholamines, drugs, and toxins, often using chemical derivatization to make these compounds volatile enough for gas chromatography. This unit shows how GC-MS-based metabolomics allows integrating targeted assays for absolute quantification of specific metabolites with untargeted metabolomics to discover novel compounds. Complemented by database annotations using large spectral libraries and validated, standardized operating procedures, GC-MS can identify and semi-quantify over 200 compounds per study in human body fluid samples (e.g., plasma, urine, or stool). Deconvolution software enables detection of more than 300 additional unidentified signals that can be annotated through accurate mass instruments with appropriate data processing workflows, similar to untargeted liquid chromatography-MS (LC-MS) profiling. Hence, GC-MS is a mature technology that uses not only classic detectors ('quadrupole') but also targeted mass spectrometers ('triple quadrupole') and accurate mass instruments ('quadrupole-time of flight'). This unit covers the following aspects of GC-MS-based metabolomics: (i) sample preparation from mammalian samples, (ii) acquisition of data, (iii) quality control, and (iv) data processing. PMID:27038389

  2. A simple and efficient method for poly-3-hydroxybutyrate quantification in diazotrophic bacteria within 5 minutes using flow cytometry.

    PubMed

    Alves, L P S; Almeida, A T; Cruz, L M; Pedrosa, F O; de Souza, E M; Chubatsu, L S; Müller-Santos, M; Valdameri, G

    2017-01-16

    The conventional method for quantification of polyhydroxyalkanoates based on whole-cell methanolysis and gas chromatography (GC) is laborious and time-consuming. In this work, a method based on flow cytometry of Nile red-stained bacterial cells was established to quantify poly-3-hydroxybutyrate (PHB) production by the diazotrophic and plant-associated bacteria Herbaspirillum seropedicae and Azospirillum brasilense. The method consists of three steps: i) cell permeabilization, ii) Nile red staining, and iii) analysis by flow cytometry. The method was optimized step-by-step and can be carried out in less than 5 min. The final results correlated strongly (R² = 0.99) with those of the standard method based on methanolysis and GC. This method was successfully applied to the quantification of PHB in epiphytic bacteria isolated from rice roots.

  3. Parallel Reaction Monitoring: A Targeted Experiment Performed Using High Resolution and High Mass Accuracy Mass Spectrometry

    PubMed Central

    Rauniyar, Navin

    2015-01-01

    The parallel reaction monitoring (PRM) assay has emerged as an alternative method of targeted quantification. The PRM assay is performed in a high resolution and high mass accuracy mode on a mass spectrometer. This review presents the features that make PRM a highly specific and selective method for targeted quantification using quadrupole-Orbitrap hybrid instruments. In addition, this review discusses the label-based and label-free methods of quantification that can be performed with the targeted approach. PMID:26633379

  4. Identification and Affinity-Quantification of β-Amyloid and α-Synuclein Polypeptides Using On-Line SAW-Biosensor-Mass Spectrometry

    NASA Astrophysics Data System (ADS)

    Slamnoiu, Stefan; Vlad, Camelia; Stumbaum, Mihaela; Moise, Adrian; Lindner, Kathrin; Engel, Nicole; Vilanova, Mar; Diaz, Mireia; Karreman, Christiaan; Leist, Marcel; Ciossek, Thomas; Hengerer, Bastian; Vilaseca, Marta; Przybylski, Michael

    2014-08-01

    Bioaffinity analysis using a variety of biosensors has become an established tool for detection and quantification of biomolecular interactions. Biosensors, however, are generally limited by the lack of chemical structure information on affinity-bound ligands. On-line bioaffinity-mass spectrometry using a surface-acoustic wave biosensor (SAW-MS) is a new combination providing simultaneous affinity detection, quantification, and mass spectrometric structural characterization of ligands. We describe here an on-line SAW-MS combination for direct identification and affinity determination, using a new interface for MS of the affinity-isolated ligand eluate. The key element of the SAW-MS combination is a microfluidic interface that integrates affinity-isolation on a gold chip, in-situ sample concentration, and desalting with a microcolumn for MS of the ligand eluate from the biosensor. Suitable MS-acquisition software has been developed that provides coupling of the SAW-MS interface to Bruker Daltonics ion trap-MS, FTICR-MS, and Waters Synapt QTOF-MS systems. Applications are presented for mass spectrometric identifications and affinity (KD) determinations of the neurodegenerative polypeptides β-amyloid (Aβ) and the pathophysiological and physiological synucleins (α- and β-synucleins), two key polypeptide systems for Alzheimer's disease and Parkinson's disease, respectively. Moreover, first in vivo applications to αSyn polypeptides from brain homogenate show the feasibility of applying on-line affinity-MS to the direct analysis of biological material. These results demonstrate on-line SAW-bioaffinity-MS as a powerful tool for structural and quantitative analysis of biopolymer interactions.

  5. Installation Restoration Program. Phase II: Stage 1 Problem Confirmation Study, Duluth International Airport, Duluth, Minnesota.

    DTIC Science & Technology

    1984-10-01

    Table of Contents (excerpt): Appendix A, Acronyms, Definitions, Nomenclature and Units of Measure; Appendix B, Scope of Work. Program phases: Phase I, Problem Identification/Records Search; Phase II, Problem Confirmation and Quantification; Phase III, Technology Base Development; Phase IV, Corrective Action.

  6. Synthesis and application of molecularly imprinted nanoparticles combined ultrasonic assisted for highly selective solid phase extraction trace amount of celecoxib from human plasma samples using design expert (DXB) software.

    PubMed

    Arabi, Maryam; Ghaedi, Mehrorang; Ostovan, Abbas; Tashkhourian, Javad; Asadallahzadeh, Hamideh

    2016-11-01

    In this work, molecularly imprinted nanoparticles (MINPs) were synthesized and applied for ultrasonic-assisted solid phase extraction of celecoxib (CEL) from human plasma samples, followed by HPLC-UV determination. The MINPs were prepared in a non-covalent approach using methacrylic acid as monomer, CEL as template, ethylene glycol dimethacrylate as cross-linker, and 2,2-azobisisobutyronitrile (AIBN) as the initiator of polymerization. The influence of pH, rinsing and eluent solvent volumes, and sorbent amount on the response was investigated using a factorial experimental design; analysis of the results with the Design Expert (DX) software identified the optimum as 250 mg sorbent, pH 7.0, 1.5 mL washing solvent, and 2 mL eluent. Under these conditions, CEL was quantified in human plasma, despite its complicated matrix, with high recovery (96%) and RSD below 10%. The proposed MISPE-HPLC-UV method gave a linear response between peak area and CEL concentration in the range of 0.2-2000 μg L(-1), with a regression coefficient of 0.98. The limit of detection (LOD) and limit of quantification (LOQ), based on three and ten times the noise of HPLC peaks corresponding to a blank solution, were 0.08 and 0.18 μg L(-1), respectively. Copyright © 2016 Elsevier B.V. All rights reserved.
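The 3x/10x signal-to-noise convention for LOD and LOQ used above can be sketched numerically. This is an illustration only; the noise and calibration-slope values below are hypothetical placeholders, not data from the study.

```python
# Illustrative LOD/LOQ calculation from the 3x/10x noise convention.
# Both values below are hypothetical, not the study's data:
#   noise: standard deviation of the blank HPLC baseline (peak-area units)
#   slope: calibration slope of the linear fit peak_area = slope * concentration
noise = 0.9
slope = 33.0

lod = 3 * noise / slope    # lowest concentration distinguishable from the blank
loq = 10 * noise / slope   # lowest concentration quantifiable with acceptable precision

print(f"LOD = {lod:.3f} ug/L, LOQ = {loq:.3f} ug/L")
```

Dividing by the calibration slope converts the signal-domain thresholds into concentration units, which is why LOD and LOQ are reported in μg L(-1).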

  7. E-learning for textile enterprises innovation improvement

    NASA Astrophysics Data System (ADS)

    Blaga, M.; Harpa, R.; Radulescu, I. R.; Stepjanovic, Z.

    2017-10-01

    The Erasmus+ project TEXMatrix: “Matrix of knowledge for innovation and competitiveness in textile enterprises”, financed through the Erasmus+ Programme, Strategic Partnerships - KA2 for Vocational Education and Training, aims at spreading a creative and innovative organizational culture inside textile enterprises by transferring and implementing methodologies, tools and concepts for improved training. Five European partners form the project consortium: INCDTP - Bucharest, Romania (coordinator), TecMinho - Portugal, Centrocot - Italy, University of Maribor, Slovenia, and “Gheorghe Asachi” Technical University of Iasi, Romania. These will help the textile enterprises involved in the project to learn how to apply creative thinking in their organizations and how to develop the capacity for innovation and change. The project aims to bridge the gap between textile enterprises' need for qualified personnel and the young workforce. It develops an innovative knowledge matrix for the tangible and intangible assets of an enterprise and a benchmarking study, based on which a dedicated software tool will be created. This software tool will aid decision-making enterprise staff (managers, HR specialists, professionals) as well as trainees (young employees, students, and scholars) in coping with the new challenges of innovation and competitiveness in the textile field. The purpose of this paper is to present the main objectives and achievements of the project, according to its declared goals, with a focus on the knowledge matrix of innovation, a powerful instrument for the quantification of the intangible assets of textile enterprises.

  8. Faults and foibles of quantitative scanning electron microscopy/energy dispersive x-ray spectrometry (SEM/EDS)

    NASA Astrophysics Data System (ADS)

    Newbury, Dale E.; Ritchie, Nicholas W. M.

    2012-06-01

    Scanning electron microscopy with energy dispersive x-ray spectrometry (SEM/EDS) is a powerful and flexible elemental analysis method that can identify and quantify elements with atomic numbers >= 4 (Be) present as major constituents (concentration C > 0.1 mass fraction, or 10 weight percent), minor constituents (0.01 <= C <= 0.1), and trace constituents (C < 0.01, with a minimum detectable limit of ~0.0005-0.001 under routine measurement conditions, a level that is analyte and matrix dependent). SEM/EDS can select specimen volumes with linear dimensions from ~500 nm to 5 μm depending on composition (masses ranging from ~10 pg to 100 pg) and can provide compositional maps that depict lateral elemental distributions. Despite the maturity of SEM/EDS, which has a history of more than 40 years, and the sophistication of modern analytical software, the method is vulnerable to serious shortcomings that can lead to incorrect elemental identifications and quantification errors that significantly exceed reasonable expectations. This paper describes shortcomings in peak identification procedures, limitations on the accuracy of quantitative analysis due to specimen topography or failures in physical models for matrix corrections, and quantitative artifacts encountered in x-ray elemental mapping. Effective solutions to these problems are based on understanding the causes and then establishing appropriate measurement science protocols. NIST DTSA II and Lispix are open source analytical software, available free at www.nist.gov, that can aid the analyst in overcoming significant limitations of SEM/EDS.
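The constituent classes quoted above (major, minor, trace, by mass fraction C) reduce to simple thresholds. A minimal sketch; the function name is illustrative, not from the paper:

```python
def constituent_class(c):
    """Classify a mass fraction c using the thresholds in the abstract:
    major (C > 0.1), minor (0.01 <= C <= 0.1), trace (C < 0.01)."""
    if not 0.0 <= c <= 1.0:
        raise ValueError("mass fraction must lie in [0, 1]")
    if c > 0.1:
        return "major"
    if c >= 0.01:
        return "minor"
    return "trace"

# 25 wt% -> major; 5 wt% -> minor; 0.05 wt% -> trace (near the detection limit)
print(constituent_class(0.25), constituent_class(0.05), constituent_class(0.0005))
```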

  9. GobyWeb: Simplified Management and Analysis of Gene Expression and DNA Methylation Sequencing Data

    PubMed Central

    Dorff, Kevin C.; Chambwe, Nyasha; Zeno, Zachary; Simi, Manuele; Shaknovich, Rita; Campagne, Fabien

    2013-01-01

    We present GobyWeb, a web-based system that facilitates the management and analysis of high-throughput sequencing (HTS) projects. The software provides integrated support for a broad set of HTS analyses and offers a simple plugin extension mechanism. Analyses currently supported include quantification of gene expression for messenger and small RNA sequencing, estimation of DNA methylation (i.e., reduced representation bisulfite sequencing and whole-genome methyl-seq), and the detection of pathogens in sequenced data. In contrast to previous analysis pipelines developed for HTS data, GobyWeb requires significantly less storage space, runs analyses efficiently on a parallel grid, and scales gracefully to process tens or hundreds of multi-gigabyte samples, yet can be used effectively by researchers who are comfortable using a web browser. We conducted performance evaluations of the software and found it to either outperform or match analysis programs developed for specialized analyses of HTS data. We found that most biologists who took a one-hour GobyWeb training session were readily able to analyze RNA-Seq data with state-of-the-art analysis tools. GobyWeb can be obtained at http://gobyweb.campagnelab.org and is freely available for non-commercial use. GobyWeb plugins are distributed in source code at http://github.com/CampagneLaboratory/gobyweb2-plugins and licensed under the open source LGPL3 license to facilitate code inspection, reuse, and independent extensions. PMID:23936070

  10. Optimized methods for total nucleic acid extraction and quantification of the bat white-nose syndrome fungus, Pseudogymnoascus destructans, from swab and environmental samples.

    PubMed

    Verant, Michelle L; Bohuski, Elizabeth A; Lorch, Jeffery M; Blehert, David S

    2016-03-01

    The continued spread of white-nose syndrome and its impacts on hibernating bat populations across North America has prompted nationwide surveillance efforts and the need for high-throughput, noninvasive diagnostic tools. Quantitative real-time polymerase chain reaction (qPCR) analysis has been increasingly used for detection of the causative fungus, Pseudogymnoascus destructans, in both bat- and environment-associated samples and provides a tool for quantification of fungal DNA useful for research and monitoring purposes. However, precise quantification of nucleic acid from P. destructans is dependent on effective and standardized methods for extracting nucleic acid from various relevant sample types. We describe optimized methodologies for extracting fungal nucleic acids from sediment, guano, and swab-based samples using commercial kits together with a combination of chemical, enzymatic, and mechanical modifications. Additionally, we define modifications to a previously published intergenic spacer-based qPCR test for P. destructans to refine quantification capabilities of this assay. © 2016 The Author(s).
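Quantification of fungal DNA by qPCR, as described above, typically proceeds through a standard curve relating the quantification cycle (Cq) to the log10 of starting quantity. A minimal sketch; the slope and intercept below are invented for illustration, not values from this assay:

```python
# Hypothetical qPCR standard curve: Cq = slope * log10(quantity) + intercept,
# as would be fitted from serial dilutions of a P. destructans DNA standard.
slope = -3.32     # a slope near -3.32 implies ~100% amplification efficiency
intercept = 38.0  # Cq at one unit of template (hypothetical)

def quantity_from_cq(cq):
    """Invert the standard curve to estimate starting template quantity."""
    return 10 ** ((cq - intercept) / slope)

def efficiency(s):
    """Amplification efficiency implied by a standard-curve slope s."""
    return 10 ** (-1.0 / s) - 1.0

print(f"Cq 28 -> {quantity_from_cq(28.0):.0f} units; "
      f"efficiency {efficiency(slope):.1%}")
```

Lower Cq values correspond to more starting template, which is why effective nucleic acid extraction (the subject of this record) directly affects quantification precision.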

  11. Optimized methods for total nucleic acid extraction and quantification of the bat white-nose syndrome fungus, Pseudogymnoascus destructans, from swab and environmental samples

    USGS Publications Warehouse

    Verant, Michelle; Bohuski, Elizabeth A.; Lorch, Jeffrey M.; Blehert, David

    2016-01-01

    The continued spread of white-nose syndrome and its impacts on hibernating bat populations across North America has prompted nationwide surveillance efforts and the need for high-throughput, noninvasive diagnostic tools. Quantitative real-time polymerase chain reaction (qPCR) analysis has been increasingly used for detection of the causative fungus, Pseudogymnoascus destructans, in both bat- and environment-associated samples and provides a tool for quantification of fungal DNA useful for research and monitoring purposes. However, precise quantification of nucleic acid from P. destructans is dependent on effective and standardized methods for extracting nucleic acid from various relevant sample types. We describe optimized methodologies for extracting fungal nucleic acids from sediment, guano, and swab-based samples using commercial kits together with a combination of chemical, enzymatic, and mechanical modifications. Additionally, we define modifications to a previously published intergenic spacer–based qPCR test for P. destructans to refine quantification capabilities of this assay.

  12. Quaternary ammonium isobaric tag for a relative and absolute quantification of peptides.

    PubMed

    Setner, Bartosz; Stefanowicz, Piotr; Szewczuk, Zbigniew

    2018-02-01

    Isobaric labeling quantification of peptides has become a method of choice for mass spectrometry-based proteomics studies. However, despite the wide variety of commercially available isobaric tags, none of the currently available methods offers a significant improvement in detection sensitivity during the MS experiment. Recently, many strategies have been applied to increase the ionization efficiency of peptides, involving chemical modifications that introduce a quaternary ammonium fixed charge. Here, we present a novel quaternary ammonium-based isobaric tag for relative and absolute quantification of peptides (QAS-iTRAQ 2-plex). Upon collisional activation, a new stable benzylic-type cationic reporter ion is liberated from the tag. Deuterium atoms were used to offset the differential masses of the reporter group. We tested the applicability of the QAS-iTRAQ 2-plex reagent on a series of model peptides as well as a bovine serum albumin tryptic digest. The obtained results suggest the usefulness of this isobaric ionization tag for relative and absolute quantification of peptides. Copyright © 2017 John Wiley & Sons, Ltd.
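The isobaric design described above rests on a mass balance: both 2-plex variants share the same intact tag mass, but deuterium atoms are distributed between the reporter and the balancer so that the liberated reporter ions differ. A sketch with purely illustrative masses (not the actual QAS-iTRAQ reagent masses):

```python
# Isobaric 2-plex mass balance, with illustrative (not actual) masses in Da.
D_SHIFT = 1.00628  # mass added by substituting one hydrogen with deuterium

reporter_light = 120.0                # channel 1 reporter (hypothetical mass)
balancer_light = 80.0 + 2 * D_SHIFT   # channel 1 balancer carries 2 deuteriums

reporter_heavy = 120.0 + 2 * D_SHIFT  # channel 2: deuteriums moved to reporter
balancer_heavy = 80.0

total_light = reporter_light + balancer_light
total_heavy = reporter_heavy + balancer_heavy
# Intact tags are isobaric; fragmentation yields reporters 2*D_SHIFT apart.
print(f"intact: {total_light:.5f} vs {total_heavy:.5f}; "
      f"reporter spacing: {reporter_heavy - reporter_light:.5f}")
```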

  13. Assessing morphology and function of the semicircular duct system: introducing new in-situ visualization and software toolbox

    PubMed Central

    David, R.; Stoessel, A.; Berthoz, A.; Spoor, F.; Bennequin, D.

    2016-01-01

    The semicircular duct system is part of the sensory organ of balance and essential for navigation and spatial awareness in vertebrates. Its function in detecting head rotations has been modelled with increasing sophistication, but the biomechanics of actual semicircular duct systems has rarely been analyzed, foremost because the fragile membranous structures in the inner ear are hard to visualize undistorted and in full. Here we present a new, easy-to-apply and non-invasive method for three-dimensional in-situ visualization and quantification of the semicircular duct system, using X-ray micro tomography and tissue staining with phosphotungstic acid. Moreover, we introduce Ariadne, a software toolbox which provides comprehensive and improved morphological and functional analysis of any visualized duct system. We demonstrate the potential of these methods by presenting results for the duct system of humans, the squirrel monkey and the rhesus macaque, making comparisons with past results from neurophysiological, oculometric and biomechanical studies. Ariadne is freely available at http://www.earbank.org. PMID:27604473

  14. A statistical method for determining the dimensions, tolerances and specification of optics for the Laser Megajoule facility (LMJ)

    NASA Astrophysics Data System (ADS)

    Denis, Vincent

    2008-09-01

    This paper presents a statistical method for determining the dimensions, tolerances and specifications of components for the Laser MegaJoule (LMJ). Numerous constraints inherent to a large facility require specific tolerances: the huge number of optical components; the interdependence of these components between the beams of the same bundle; angular multiplexing in the amplifier section; distinct operating modes between the alignment and firing phases; and the definition and use of alignment software in place of classic optimization. This method provides greater flexibility in determining the positioning and manufacturing specifications of the optical components. Given the enormous power of the Laser MegaJoule (over 18 kJ in the infrared and 9 kJ in the ultraviolet), one of the major risks is damage to the optical mounts and pollution of the installation by mechanical ablation. The method enables estimation of beam occultation probabilities and quantification of the risks for the facility. All the simulations were run using the ZEMAX-EE optical design software.
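A statistical tolerancing study of this kind can be sketched as a Monte Carlo draw over component decenterings, scoring the fraction of trials in which the beam is clipped by an aperture. All geometry and tolerance numbers below are invented for illustration and bear no relation to actual LMJ specifications:

```python
import random

random.seed(1)  # reproducible illustration

# Invented geometry: beam of radius r_beam through an aperture of radius r_aperture.
r_beam, r_aperture = 20.0, 22.0   # mm
# Invented per-component lateral positioning tolerances (1-sigma, mm); the
# total decentering is modeled as a sum of independent normal contributions.
sigmas = [0.3, 0.5, 0.4]

def occulted():
    offset = sum(random.gauss(0.0, s) for s in sigmas)
    return abs(offset) + r_beam > r_aperture  # beam edge clipped by aperture

n = 100_000
p = sum(occulted() for _ in range(n)) / n
print(f"estimated occultation probability: {p:.4f}")
```

The same draw-and-score loop extends naturally to tilt, manufacturing, and alignment tolerances, which is what makes the statistical approach more flexible than worst-case stacking.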

  15. Development of a competency mapping tool for undergraduate professional degree programmes, using mechanical engineering as a case study

    NASA Astrophysics Data System (ADS)

    Holmes, David W.; Sheehan, Madoc; Birks, Melanie; Smithson, John

    2018-01-01

    Mapping the curriculum of a professional degree to the associated competency standard ensures graduates have the competence to perform as professionals. Existing approaches to competence mapping vary greatly in depth, complexity, and effectiveness, and a standardised approach remains elusive. This paper describes a new mapping software tool that streamlines and standardises the competency mapping process. The available analytics facilitate ongoing programme review, management, and accreditation. The complete mapping and analysis of an Australian mechanical engineering degree programme is described as a case study. Each subject is mapped by evaluating the amount and depth of competence development present. Combining subject results then enables highly detailed programme level analysis. The mapping process is designed to be administratively light, with aspects of professional development embedded in the software. The effective competence mapping described in this paper enables quantification of learning within a professional degree programme, and provides a mechanism for holistic programme improvement.

  16. JAva GUi for Applied Research (JAGUAR) v 3.0

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    JAGUAR is a Java software tool for automatically rendering a graphical user interface (GUI) from a structured input specification. It is designed as a plug-in to the Eclipse workbench to enable users to create, edit, and externally execute analysis application input decks and then view the results. JAGUAR serves as a GUI for Sandia's DAKOTA software toolkit for optimization and uncertainty quantification. It includes problem (input deck) set-up, option specification, analysis execution, and results visualization. Through the use of wizards, templates, and views, JAGUAR helps users navigate the complexity of DAKOTA's complete input specification. JAGUAR is implemented in Java, leveraging Eclipse extension points and the Eclipse user interface. JAGUAR parses a DAKOTA NIDR input specification and presents the user with linked graphical and plain-text representations of problem set-up and option specification for DAKOTA studies. After the data has been input by the user, JAGUAR generates one or more input files for DAKOTA, executes DAKOTA, and captures and interprets the results.

  17. MicroFilament Analyzer identifies actin network organizations in epidermal cells of Arabidopsis thaliana roots

    PubMed Central

    Jacques, Eveline; Lewandowski, Michal; Buytaert, Jan; Fierens, Yves; Verbelen, Jean-Pierre; Vissenberg, Kris

    2013-01-01

    The plant cytoskeleton plays a crucial role in cell growth and development, and it undergoes many rearrangements across developmental stages. To describe the arrangements of the F-actin cytoskeleton in root epidermal cells of Arabidopsis thaliana, the recently developed software MicroFilament Analyzer (MFA) was exploited. This software enables high-throughput identification and quantification of the orientation of filamentous structures in digital images in a highly standardized and fast way. Using confocal microscopy and transgenic GFP-FABD2-GFP plants, the actin cytoskeleton was visualized in the root epidermis. MFA analysis revealed that during the early stages of cell development F-actin is organized in a mainly random pattern. As the cells grow, they preferentially adopt a longitudinal organization, a pattern that is also preserved in the largest cells. In the evolution from young to old cells, an approximately even distribution of transverse, oblique or combined orientations is always present alongside the switch from a random to a longitudinally oriented actin cytoskeleton. PMID:23656865
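The orientation categories used in this analysis (longitudinal, transverse, oblique, relative to the cell's long axis) can be sketched as a simple angle-binning helper. The 22.5° tolerance is an arbitrary illustrative choice, not the MFA software's actual binning:

```python
def orientation_class(angle_deg, tol=22.5):
    """Classify a filament angle (degrees, measured from the cell's long
    axis) as longitudinal (near 0), transverse (near 90) or oblique.
    The tolerance is illustrative, not the MFA software's setting."""
    a = abs(angle_deg) % 180.0
    if a > 90.0:                 # fold 90-180 back onto 0-90
        a = 180.0 - a
    if a <= tol:
        return "longitudinal"
    if a >= 90.0 - tol:
        return "transverse"
    return "oblique"

print([orientation_class(a) for a in (5, 45, 85, 175)])
```

Folding angles onto 0-90° reflects that a filament has no intrinsic direction, only an orientation.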

  18. Automated systematic random sampling and Cavalieri stereology of histologic sections demonstrating acute tubular necrosis after cardiac arrest and cardiopulmonary resuscitation in the mouse.

    PubMed

    Wakasaki, Rumie; Eiwaz, Mahaba; McClellan, Nicholas; Matsushita, Katsuyuki; Golgotiu, Kirsti; Hutchens, Michael P

    2018-06-14

    A technical challenge in translational models of kidney injury is determination of the extent of cell death. Histologic sections are commonly analyzed by area morphometry or unbiased stereology, but stereology requires specialized equipment. Therefore, a challenge to rigorous quantification would be addressed by an unbiased stereology tool with reduced equipment dependence. We hypothesized that it would be feasible to build a novel software component to facilitate unbiased stereologic quantification on scanned slides, and that unbiased stereology would demonstrate greater precision and decreased bias compared with 2D morphometry. We developed a macro for the widely used image analysis program ImageJ and performed cardiac arrest with cardiopulmonary resuscitation (CA/CPR, a model of acute cardiorenal syndrome) in mice. Fluoro-Jade B-stained kidney sections were analyzed using three methods to quantify cell death: gold-standard stereology using a controlled stage and commercially available software, unbiased stereology using the novel ImageJ macro, and quantitative 2D morphometry, also using the novel macro. There was strong agreement between the two methods of unbiased stereology (bias -0.004±0.006 with 95% limits of agreement -0.015 to 0.007). 2D morphometry demonstrated poor agreement and significant bias compared with either method of unbiased stereology. Unbiased stereology is facilitated by a novel macro for ImageJ, and its results agree with those obtained using gold-standard methods. Automated 2D morphometry overestimated tubular epithelial cell death and correlated only modestly with values obtained from unbiased stereology. These results support widespread use of unbiased stereology for analysis of histologic outcomes in injury models.
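The agreement statistics reported above are Bland-Altman quantities: the bias is the mean of the paired differences, and the 95% limits of agreement are bias ± 1.96 SD of those differences. A standard-library sketch with invented paired measurements (not the study's data):

```python
import statistics

# Invented paired cell-death fractions from two methods on the same sections.
method_a = [0.12, 0.30, 0.25, 0.41, 0.18, 0.33, 0.27, 0.22]
method_b = [0.13, 0.29, 0.26, 0.43, 0.17, 0.35, 0.28, 0.21]

diffs = [a - b for a, b in zip(method_a, method_b)]
bias = statistics.mean(diffs)               # mean paired difference
sd = statistics.stdev(diffs)                # SD of the differences
loa = (bias - 1.96 * sd, bias + 1.96 * sd)  # 95% limits of agreement

print(f"bias {bias:+.4f}, limits of agreement {loa[0]:+.4f} to {loa[1]:+.4f}")
```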

  19. Validation of an integrated software for the detection of rapid eye movement sleep behavior disorder.

    PubMed

    Frauscher, Birgit; Gabelia, David; Biermayr, Marlene; Stefani, Ambra; Hackner, Heinz; Mitterling, Thomas; Poewe, Werner; Högl, Birgit

    2014-10-01

    Rapid eye movement sleep without atonia (RWA) is the polysomnographic hallmark of REM sleep behavior disorder (RBD). To partially overcome the disadvantages of manual RWA scoring, which is time consuming but essential for the accurate diagnosis of RBD, we aimed to validate software specifically developed and integrated with polysomnography for RWA detection against the gold standard of manual RWA quantification. The setting was an academic referral center sleep laboratory. Polysomnographic recordings of 20 patients with RBD and 60 healthy volunteers were analyzed. Motor activity during REM sleep was quantified manually and computer-assisted (with and without artifact detection) according to the Sleep Innsbruck Barcelona (SINBAR) criteria for the mentalis ("any," phasic, and tonic electromyographic [EMG] activity) and the flexor digitorum superficialis (FDS) muscle (phasic EMG activity). Computer-derived indices (with and without artifact correction) for "any," phasic, and tonic mentalis EMG activity, phasic FDS EMG activity, and the SINBAR index ("any" mentalis + phasic FDS) correlated well with the manually derived indices (all Spearman rhos 0.66-0.98). In contrast with computerized scoring alone, computerized scoring plus manual artifact correction (median duration 5.4 min) led to a significant reduction of false positives for "any" mentalis (40%), phasic mentalis (40.6%), and the SINBAR index (41.2%). Quantification of tonic mentalis and phasic FDS EMG activity was not influenced by artifact correction. The computer algorithm used here appears to be a promising tool for REM sleep behavior disorder detection in both research and clinical routine. A short plausibility check of the automatic detection should be a basic prerequisite for this and all other available computer algorithms. © 2014 Associated Professional Sleep Societies, LLC.
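The Spearman correlations quoted above compare ranked indices: rank-transform both series, then take the Pearson correlation of the ranks. A standard-library sketch with invented index values (not the study's data):

```python
def ranks(xs):
    """Average ranks (ties receive the mean rank of their block)."""
    order = sorted(range(len(xs)), key=lambda i: xs[i])
    r = [0.0] * len(xs)
    i = 0
    while i < len(order):
        j = i
        while j + 1 < len(order) and xs[order[j + 1]] == xs[order[i]]:
            j += 1
        avg = (i + j) / 2 + 1  # 1-based average rank of the tied block
        for k in range(i, j + 1):
            r[order[k]] = avg
        i = j + 1
    return r

def spearman(x, y):
    """Spearman rho = Pearson correlation of the rank-transformed series."""
    rx, ry = ranks(x), ranks(y)
    mx, my = sum(rx) / len(rx), sum(ry) / len(ry)
    num = sum((a - mx) * (b - my) for a, b in zip(rx, ry))
    den = (sum((a - mx) ** 2 for a in rx) * sum((b - my) ** 2 for b in ry)) ** 0.5
    return num / den

# Hypothetical per-patient RWA indices: manual vs computer-assisted scoring
manual = [12.1, 30.5, 8.2, 45.0, 22.3, 15.7]
auto_scored = [13.0, 28.9, 9.5, 47.2, 20.1, 16.0]
print(f"Spearman rho = {spearman(manual, auto_scored):.2f}")
```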

  20. User-initialized active contour segmentation and golden-angle real-time cardiovascular magnetic resonance enable accurate assessment of LV function in patients with sinus rhythm and arrhythmias.

    PubMed

    Contijoch, Francisco; Witschey, Walter R T; Rogers, Kelly; Rears, Hannah; Hansen, Michael; Yushkevich, Paul; Gorman, Joseph; Gorman, Robert C; Han, Yuchi

    2015-05-21

    Data obtained during arrhythmia are retained in real-time cardiovascular magnetic resonance (rt-CMR), but there is limited and inconsistent evidence that rt-CMR can accurately assess beat-to-beat variation in left ventricular (LV) function during an arrhythmia. Multi-slice, short-axis cine and real-time golden-angle radial CMR data were collected in 22 clinical patients (18 in sinus rhythm and 4 patients with arrhythmia). A user-initialized active contour segmentation (ACS) software was validated via comparison to manual segmentation on clinically accepted software. For each image in the 2D acquisitions, slice volume was calculated, and global LV volumes were estimated via summation across the LV using multiple slices. Real-time imaging data were reconstructed using different image exposure times and frame rates to evaluate the effect of temporal resolution on measured function in each slice via ACS. Finally, global volumetric function of ectopic and non-ectopic beats was measured using ACS in patients with arrhythmias. ACS provides global LV volume measurements that are not significantly different from manual quantification of retrospectively gated cine images in sinus rhythm patients. With an exposure time of 95.2 ms and a frame rate of >89 frames per second, golden-angle real-time imaging accurately captures hemodynamic function over a range of patient heart rates. In four patients with frequent ectopic contractions, initial quantification of the impact of ectopic beats on hemodynamic function was demonstrated. User-initialized active contours and golden-angle real-time radial CMR can be used to determine time-varying LV function in patients. These methods will be very useful for the assessment of LV function in patients with frequent arrhythmias.
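The "summation across the LV using multiple slices" step above reduces to adding segmented slice areas times slice thickness (a summation-of-disks estimate). The areas and thickness below are hypothetical, invented only to show the arithmetic:

```python
# Summation-of-slices LV volume estimate: each short-axis slice contributes
# (segmented endocardial area) * (slice thickness). All numbers hypothetical.
slice_thickness = 0.8  # cm
areas_ed = [18.2, 21.5, 22.8, 21.0, 17.4, 12.1, 6.3]  # cm^2, end-diastole
areas_es = [10.1, 12.9, 13.6, 12.2, 9.8, 6.0, 2.9]    # cm^2, end-systole

edv = sum(areas_ed) * slice_thickness  # end-diastolic volume, mL
esv = sum(areas_es) * slice_thickness  # end-systolic volume, mL
ef = (edv - esv) / edv                 # ejection fraction

print(f"EDV {edv:.1f} mL, ESV {esv:.1f} mL, EF {ef:.0%}")
```

In the beat-to-beat setting described above, the same summation is repeated per reconstructed cardiac phase, so ectopic and sinus beats can be volumetrically compared.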

  1. MIQuant – Semi-Automation of Infarct Size Assessment in Models of Cardiac Ischemic Injury

    PubMed Central

    Esteves, Tiago; de Pina, Maria de Fátima; Guedes, Joana G.; Freire, Ana; Quelhas, Pedro; Pinto-do-Ó, Perpétua

    2011-01-01

    Background: The cardiac regenerative potential of newly developed therapies is traditionally evaluated in rodent models of surgically induced myocardial ischemia. A generally accepted key parameter for determining the success of the applied therapy is the infarct size. Although regarded as the gold-standard method for infarct size estimation in heart ischemia, histological planimetry is time-consuming and highly variable amongst studies. The purpose of this work is to contribute towards the standardization and simplification of infarct size assessment by providing free access to a novel semi-automated software tool, given the acronym MIQuant. Methodology/Principal Findings: Mice were subjected to permanent coronary artery ligation and the size of chronic infarcts was estimated by area and midline-length methods using manual planimetry and with MIQuant. Repeatability and reproducibility of MIQuant scores were verified. The validation showed high correlation (r = 0.981 for midline length; r = 0.970 for area) and agreement (Bland-Altman analysis), free from bias for midline length and with negligible bias of 1.21% to 3.72% for area quantification. Further analysis demonstrated that MIQuant reduced the time spent on the analysis 4.5-fold and, importantly, that MIQuant effectiveness is independent of user proficiency. The results indicate that MIQuant can be regarded as a better alternative to manual measurement. Conclusions: We conclude that MIQuant is reliable and easy-to-use software for infarct size quantification. The widespread use of MIQuant will contribute towards the standardization of infarct size assessment across studies and, therefore, to the systematization of the evaluation of the cardiac regenerative potential of emerging therapies. PMID:21980376

  2. TargetSearch--a Bioconductor package for the efficient preprocessing of GC-MS metabolite profiling data.

    PubMed

    Cuadros-Inostroza, Alvaro; Caldana, Camila; Redestig, Henning; Kusano, Miyako; Lisec, Jan; Peña-Cortés, Hugo; Willmitzer, Lothar; Hannah, Matthew A

    2009-12-16

    Metabolite profiling, the simultaneous quantification of multiple metabolites in an experiment, is becoming increasingly popular, particularly with the rise of systems-level biology. The workhorse in this field is gas chromatography hyphenated with mass spectrometry (GC-MS). The high throughput of this technology, coupled with a demand for large experiments, has led to data pre-processing, i.e. the quantification of metabolites across samples, becoming a major bottleneck. Existing software has several limitations, including restricted maximum sample size, systematic errors and low flexibility. However, the biggest limitation is that the resulting data usually require extensive hand-curation, which is subjective and can typically take several days to weeks. We introduce the TargetSearch package, an open source tool which is a flexible and accurate method for pre-processing even very large numbers of GC-MS samples within hours. We developed a novel strategy to iteratively correct and update retention time indices for searching and identifying metabolites. The package is written in the R programming language, with computationally intensive functions written in C for speed and performance. The package includes a graphical user interface to allow easy use by those unfamiliar with R. TargetSearch allows fast and accurate data pre-processing for GC-MS experiments and overcomes the sample number limitations and manual curation requirements of existing software. We validate our method by carrying out an analysis against both a set of known chemical standard mixtures and a biological experiment. In addition, we demonstrate its capabilities and speed by comparing it with other GC-MS pre-processing tools. We believe this package will greatly ease current bottlenecks and facilitate the analysis of metabolic profiling data.
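Retention-time index correction of the kind TargetSearch performs rests on mapping observed retention times onto an index scale via marker compounds. A minimal piecewise-linear sketch (the marker values are invented; TargetSearch itself is an R package and its actual algorithm is iterative):

```python
from bisect import bisect_right

# Hypothetical RI markers: (retention time in s, retention index) pairs,
# as might come from retention-time standards in a GC-MS metabolomics run.
markers = [(252.0, 262320), (390.0, 323120), (533.0, 381020), (701.0, 487220)]

def retention_index(rt):
    """Piecewise-linear interpolation of RI from an observed retention time,
    extrapolating from the nearest segment outside the marker range."""
    times = [t for t, _ in markers]
    i = bisect_right(times, rt) - 1
    i = max(0, min(i, len(markers) - 2))  # clamp to a valid segment
    (t0, ri0), (t1, ri1) = markers[i], markers[i + 1]
    return ri0 + (rt - t0) * (ri1 - ri0) / (t1 - t0)

print(f"RI at 321 s: {retention_index(321.0):.0f}")
```

Because the index scale is anchored to the markers rather than to absolute retention times, metabolite identification becomes robust to run-to-run retention drift.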

  3. TargetSearch - a Bioconductor package for the efficient preprocessing of GC-MS metabolite profiling data

    PubMed Central

    2009-01-01

    Background Metabolite profiling, the simultaneous quantification of multiple metabolites in an experiment, is becoming increasingly popular, particularly with the rise of systems-level biology. The workhorse in this field is gas-chromatography hyphenated with mass spectrometry (GC-MS). The high-throughput of this technology coupled with a demand for large experiments has led to data pre-processing, i.e. the quantification of metabolites across samples, becoming a major bottleneck. Existing software has several limitations, including restricted maximum sample size, systematic errors and low flexibility. However, the biggest limitation is that the resulting data usually require extensive hand-curation, which is subjective and can typically take several days to weeks. Results We introduce the TargetSearch package, an open source tool which is a flexible and accurate method for pre-processing even very large numbers of GC-MS samples within hours. We developed a novel strategy to iteratively correct and update retention time indices for searching and identifying metabolites. The package is written in the R programming language with computationally intensive functions written in C for speed and performance. The package includes a graphical user interface to allow easy use by those unfamiliar with R. Conclusions TargetSearch allows fast and accurate data pre-processing for GC-MS experiments and overcomes the sample number limitations and manual curation requirements of existing software. We validate our method by carrying out an analysis against both a set of known chemical standard mixtures and of a biological experiment. In addition we demonstrate its capabilities and speed by comparing it with other GC-MS pre-processing tools. We believe this package will greatly ease current bottlenecks and facilitate the analysis of metabolic profiling data. PMID:20015393

  4. Design, implementation and multisite evaluation of a system suitability protocol for the quantitative assessment of instrument performance in liquid chromatography-multiple reaction monitoring-MS (LC-MRM-MS).

    PubMed

    Abbatiello, Susan E; Mani, D R; Schilling, Birgit; Maclean, Brendan; Zimmerman, Lisa J; Feng, Xingdong; Cusack, Michael P; Sedransk, Nell; Hall, Steven C; Addona, Terri; Allen, Simon; Dodder, Nathan G; Ghosh, Mousumi; Held, Jason M; Hedrick, Victoria; Inerowicz, H Dorota; Jackson, Angela; Keshishian, Hasmik; Kim, Jong Won; Lyssand, John S; Riley, C Paige; Rudnick, Paul; Sadowski, Pawel; Shaddox, Kent; Smith, Derek; Tomazela, Daniela; Wahlander, Asa; Waldemarson, Sofia; Whitwell, Corbin A; You, Jinsam; Zhang, Shucha; Kinsinger, Christopher R; Mesri, Mehdi; Rodriguez, Henry; Borchers, Christoph H; Buck, Charles; Fisher, Susan J; Gibson, Bradford W; Liebler, Daniel; Maccoss, Michael; Neubert, Thomas A; Paulovich, Amanda; Regnier, Fred; Skates, Steven J; Tempst, Paul; Wang, Mu; Carr, Steven A

    2013-09-01

    Multiple reaction monitoring (MRM) mass spectrometry coupled with stable isotope dilution (SID) and liquid chromatography (LC) is increasingly used in biological and clinical studies for precise and reproducible quantification of peptides and proteins in complex sample matrices. Robust LC-SID-MRM-MS-based assays that can be replicated across laboratories and ultimately in clinical laboratory settings require standardized protocols to demonstrate that the analysis platforms are performing adequately. We developed a system suitability protocol (SSP), which employs a predigested mixture of six proteins, to facilitate performance evaluation of LC-SID-MRM-MS instrument platforms, configured with nanoflow-LC systems interfaced to triple quadrupole mass spectrometers. The SSP was designed for use with low multiplex analyses as well as high multiplex approaches when software-driven scheduling of data acquisition is required. Performance was assessed by monitoring a range of chromatographic and mass spectrometric metrics, including peak width, chromatographic resolution, peak capacity, and the variability in peak area and analyte retention time (RT) stability. The SSP, which was evaluated in 11 laboratories on a total of 15 different instruments, enabled early diagnosis of LC and MS anomalies that indicated suboptimal LC-MRM-MS performance. The observed range in variation of each of the metrics scrutinized serves to define the criteria for optimized LC-SID-MRM-MS platforms for routine use, with pass/fail criteria for system suitability performance measures defined as peak area coefficient of variation <0.15, peak width coefficient of variation <0.15, standard deviation of RT <0.15 min (9 s), and RT drift <0.5 min (30 s). The deleterious effect of a marginally performing LC-SID-MRM-MS system on the limit of quantification (LOQ) in targeted quantitative assays illustrates the use of and need for an SSP to establish robust and reliable system performance. Use of an SSP helps to ensure that analyte quantification measurements can be replicated with good precision within and across multiple laboratories and should facilitate more widespread use of MRM-MS technology by the basic biomedical and clinical laboratory research communities.
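The pass/fail criteria quoted in the abstract lend themselves to a direct check over replicate injections. A minimal sketch (the replicate values are illustrative, not from the study):

```python
from statistics import mean, stdev

def system_suitability(peak_areas, peak_widths, rts_min):
    """Evaluate replicate injections against the stated SSP pass/fail
    criteria: peak-area CV < 0.15, peak-width CV < 0.15, RT standard
    deviation < 0.15 min, and RT drift (max - min) < 0.5 min."""
    cv = lambda xs: stdev(xs) / mean(xs)  # coefficient of variation
    return {
        "peak_area_cv": cv(peak_areas) < 0.15,
        "peak_width_cv": cv(peak_widths) < 0.15,
        "rt_sd": stdev(rts_min) < 0.15,
        "rt_drift": max(rts_min) - min(rts_min) < 0.5,
    }

# Four hypothetical replicate injections of one peptide
checks = system_suitability([100.0, 102.0, 98.0, 101.0],
                            [0.30, 0.31, 0.29, 0.30],
                            [12.00, 12.05, 12.02, 12.03])
```

A platform passes only when every metric passes, i.e. `all(checks.values())` is true.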

  5. SWB Groundwater Recharge Analysis, Catalina Island, California: Assessing Spatial and Temporal Recharge Patterns Within a Mediterranean Climate Zone

    NASA Astrophysics Data System (ADS)

    Harlow, J.

    2017-12-01

    Groundwater recharge quantification is a key parameter for sustainable groundwater management. Many recharge quantification techniques have been devised, each with advantages and disadvantages. A free, GIS-based recharge quantification tool, the Soil Water Balance (SWB) model, was developed by the USGS to produce fine-tuned recharge constraints in watersheds and illuminate spatial and temporal dynamics of recharge. The subject of this research is to examine SWB within a Mediterranean climate zone, focusing on Catalina Island, California. This project relied on publicly available online resources with the exception of the geospatial processing software, ArcGIS. Daily climate station precipitation and temperature data were obtained from the Desert Research Institute for the years 2008-2014. Precipitation interpolations were performed with ArcGIS using the Natural Neighbor method. The USGS National Map Viewer (NMV) website provided a 30-meter DEM, used to interpolate high and low temperature ASCII grids with the Temperature Lapse Rate (TLR) method, to construct a D-8 flow direction grid for downhill redirection of soil-moisture-saturated runoff toward non-saturated cells, and for aesthetic map creation. NMV also provided a modified Anderson land cover classification raster. The US Department of Agriculture-Natural Resources Conservation Service (NRCS) Web Soil Survey website provided shapefiles of soil water capacity and hydrologic soil groups. The Hargreaves and Samani method was implemented to determine evapotranspiration rates. The resulting SWB output data, in the form of ASCII grids, are easily added to ArcGIS for quick visualization and data analysis (Figure 1). Calculated average recharge for 2008-2014 was 3537 inches/year, or 0.0174 acre feet/year. Recharge was 10.2% of the island's gross precipitation. 
The spatial distribution of the most significant recharge is in hotspots that dominate the residential hills above Avalon, followed by grassy/unvegetated areas associated with dirt roads, and then higher-elevation southeast- and east-facing slopes. The greatest large-scale concentration of recharge is centered in the area from Two Harbors to Blackjack Mountain. Further examination within this project will determine parameter significance to recharge and runoff.
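The Hargreaves-Samani temperature-based estimate mentioned above has a simple closed form, ET0 = 0.0023 * Ra * (Tmean + 17.8) * sqrt(Tmax - Tmin); a sketch with illustrative inputs:

```python
import math

def hargreaves_samani_et0(t_max_c, t_min_c, ra_mm_per_day):
    """Daily reference evapotranspiration (mm/day) via Hargreaves-Samani.

    t_max_c, t_min_c : daily max/min air temperature (deg C)
    ra_mm_per_day    : extraterrestrial radiation expressed as an
                       evaporation equivalent (mm/day)
    """
    t_mean = (t_max_c + t_min_c) / 2.0
    return 0.0023 * ra_mm_per_day * (t_mean + 17.8) * math.sqrt(t_max_c - t_min_c)

# A mild day: Tmax 25 C, Tmin 10 C, Ra equivalent to 12 mm/day
et0 = hargreaves_samani_et0(25.0, 10.0, 12.0)  # roughly 3.8 mm/day
```

Its appeal in a study like this one is that it needs only the temperature grids already interpolated from station data, not radiation or humidity measurements.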

  6. Design, Implementation and Multisite Evaluation of a System Suitability Protocol for the Quantitative Assessment of Instrument Performance in Liquid Chromatography-Multiple Reaction Monitoring-MS (LC-MRM-MS)*

    PubMed Central

    Abbatiello, Susan E.; Mani, D. R.; Schilling, Birgit; MacLean, Brendan; Zimmerman, Lisa J.; Feng, Xingdong; Cusack, Michael P.; Sedransk, Nell; Hall, Steven C.; Addona, Terri; Allen, Simon; Dodder, Nathan G.; Ghosh, Mousumi; Held, Jason M.; Hedrick, Victoria; Inerowicz, H. Dorota; Jackson, Angela; Keshishian, Hasmik; Kim, Jong Won; Lyssand, John S.; Riley, C. Paige; Rudnick, Paul; Sadowski, Pawel; Shaddox, Kent; Smith, Derek; Tomazela, Daniela; Wahlander, Asa; Waldemarson, Sofia; Whitwell, Corbin A.; You, Jinsam; Zhang, Shucha; Kinsinger, Christopher R.; Mesri, Mehdi; Rodriguez, Henry; Borchers, Christoph H.; Buck, Charles; Fisher, Susan J.; Gibson, Bradford W.; Liebler, Daniel; MacCoss, Michael; Neubert, Thomas A.; Paulovich, Amanda; Regnier, Fred; Skates, Steven J.; Tempst, Paul; Wang, Mu; Carr, Steven A.

    2013-01-01

    Multiple reaction monitoring (MRM) mass spectrometry coupled with stable isotope dilution (SID) and liquid chromatography (LC) is increasingly used in biological and clinical studies for precise and reproducible quantification of peptides and proteins in complex sample matrices. Robust LC-SID-MRM-MS-based assays that can be replicated across laboratories and ultimately in clinical laboratory settings require standardized protocols to demonstrate that the analysis platforms are performing adequately. We developed a system suitability protocol (SSP), which employs a predigested mixture of six proteins, to facilitate performance evaluation of LC-SID-MRM-MS instrument platforms, configured with nanoflow-LC systems interfaced to triple quadrupole mass spectrometers. The SSP was designed for use with low multiplex analyses as well as high multiplex approaches when software-driven scheduling of data acquisition is required. Performance was assessed by monitoring of a range of chromatographic and mass spectrometric metrics including peak width, chromatographic resolution, peak capacity, and the variability in peak area and analyte retention time (RT) stability. The SSP, which was evaluated in 11 laboratories on a total of 15 different instruments, enabled early diagnoses of LC and MS anomalies that indicated suboptimal LC-MRM-MS performance. The observed range in variation of each of the metrics scrutinized serves to define the criteria for optimized LC-SID-MRM-MS platforms for routine use, with pass/fail criteria for system suitability performance measures defined as peak area coefficient of variation <0.15, peak width coefficient of variation <0.15, standard deviation of RT <0.15 min (9 s), and the RT drift <0.5min (30 s). The deleterious effect of a marginally performing LC-SID-MRM-MS system on the limit of quantification (LOQ) in targeted quantitative assays illustrates the use and need for a SSP to establish robust and reliable system performance. 
Use of a SSP helps to ensure that analyte quantification measurements can be replicated with good precision within and across multiple laboratories and should facilitate more widespread use of MRM-MS technology by the basic biomedical and clinical laboratory research communities. PMID:23689285

  7. Lessons from Astrobiological Planetary Analogue Exploration in Iceland: Biomarker Assay Performance and Downselection

    NASA Technical Reports Server (NTRS)

    Gentry, D. M.; Amador, E. S.; Cable, M. L.; Cantrell, T.; Chaudry, N.; Cullen, T.; Duca, Z.; Kirby, J.; Jacobsen, M.; McCaig, H.; et al.

    2017-01-01

    Understanding the sensitivity of biomarker assays to the local physicochemical environment, and the underlying spatial distribution of the target biomarkers in 'homogeneous' environments, can increase mission science return. We have conducted four expeditions to Icelandic Mars analogue sites in which an increasingly refined battery of physicochemical measurements and biomarker assays were performed, staggered with scouting of further sites. Completed expeditions took place in 2012 (location scouting and field assay use testing), 2013 (sampling of two major sites with three assays and observational physicochemical measurements), 2015 (repeat sampling of prior sites and one new site, scouting of new sites, three assays and three instruments), and 2016 (preliminary sampling of new sites with analysis of returned samples). Target sites were geologically recent basaltic lava flows, and sample loci were arranged in hierarchically nested grids at 10 cm, 1 m, 10 m, 100 m, and >1 km order scales, subject to field constraints. Assays were intended to represent a diversity of potential biomarker types (cell counting via nucleic acid staining and fluorescence microscopy, ATP quantification via luciferase luminescence, and relative DNA quantification with simple domain-level primers) rather than a specific mission science target, and were selected to reduce laboratory overhead, require limited consumables, and allow rapid turnaround. All analytical work was performed in situ or in a field laboratory within a day's travel of the field sites unless otherwise noted. We have demonstrated the feasibility of performing ATP quantification and qPCR analysis in a field-based laboratory with single-day turnaround. The ATP assay was generally robust and reliable and required minimal field equipment and training to produce a large amount of useful data. 
DNA was successfully extracted from all samples, but the serial-batch nature of qPCR significantly limited the number of primers (hence classifications) and replicates that could be run in a single day. Fluorescence microscopy did not prove feasible under the same constraints, primarily due to the large number of person-hours required to view, analyze, and record results from the images; however, this could be mitigated with higher-quality imaging instruments and appropriate image analysis software.

  8. Recent application of quantification II in Japanese medical research.

    PubMed Central

    Suzuki, T; Kudo, A

    1979-01-01

    Hayashi's Quantification II is a method of multivariate discriminant analysis that handles attribute data as predictor variables. It is very useful in the medical research field for estimation, diagnosis, prognosis, evaluation of epidemiological factors, and other problems based on multiplicity of attribute data. In Japan, this method is so well known that most computer program packages include the Hayashi Quantification, but the method still seems unfamiliar to researchers outside Japan. In view of this situation, we introduce 19 selected articles on recent applications of Quantification II in Japanese medical research. In reviewing these papers, particular attention is paid to clarifying how satisfied the researchers were with the findings provided by the method. At the same time, some recommendations are made about terminology and program packages. A brief discussion of the background of the quantification methods is also given, with special reference to the Behaviormetric Society of Japan. PMID:540587

  9. Rapid and Easy Protocol for Quantification of Next-Generation Sequencing Libraries.

    PubMed

    Hawkins, Steve F C; Guest, Paul C

    2018-01-01

    The emergence of next-generation sequencing (NGS) over the last 10 years has increased the efficiency of DNA sequencing in terms of speed, ease, and price. However, exact quantification of an NGS library is crucial in order to obtain good data on sequencing platforms developed by the current market leader Illumina. Several approaches for DNA quantification are currently available; the most commonly used are based on analysis of the physical properties of the DNA through spectrophotometric or fluorometric methods. Although these methods are technically simple, they do not allow quantification as exact as that achieved using a real-time quantitative PCR (qPCR) approach. A qPCR protocol for DNA quantification with applications in NGS library preparation studies is presented here. It can be applied in various fields of study, such as medical disorders resulting from nutritional programming disturbances.
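A qPCR quantification of this kind works against a standard curve: quantification cycles (Cq) of known dilutions are fit to Cq = slope * log10(conc) + intercept, and the curve is inverted for unknowns. A minimal sketch (the standard concentrations and Cq values are illustrative):

```python
import math

def fit_standard_curve(concs, cqs):
    """Least-squares fit of Cq = slope * log10(conc) + intercept from
    standards of known concentration."""
    xs = [math.log10(c) for c in concs]
    n = len(xs)
    mx, my = sum(xs) / n, sum(cqs) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, cqs))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

def quantify(cq, slope, intercept):
    """Invert the standard curve to estimate an unknown's concentration."""
    return 10 ** ((cq - intercept) / slope)

# Ideal four-point dilution series with slope -3.32 (~100% PCR efficiency)
slope, intercept = fit_standard_curve(
    [10.0, 1.0, 0.1, 0.01], [16.68, 20.0, 23.32, 26.64])
```

A slope near -3.32 corresponds to perfect doubling per cycle, which is one of the usual sanity checks on such a curve.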

  10. Automated quantification of myocardial perfusion SPECT using simplified normal limits.

    PubMed

    Slomka, Piotr J; Nishina, Hidetaka; Berman, Daniel S; Akincioglu, Cigdem; Abidov, Aiden; Friedman, John D; Hayes, Sean W; Germano, Guido

    2005-01-01

    To simplify development of normal limits for myocardial perfusion SPECT (MPS), we implemented a quantification scheme in which normal limits are derived without visual scoring of abnormal scans or optimization of regional thresholds. Normal limits were derived from same-day Tl-201 rest/Tc-99m-sestamibi stress scans of male (n = 40) and female (n = 40) low-likelihood patients. Defect extent, total perfusion deficit (TPD), and regional perfusion extents were derived by comparison to normal limits in polar-map coordinates. MPS scans from 256 consecutive patients without known coronary artery disease, who underwent coronary angiography, were analyzed. The new method of quantification (TPD) was compared with our previously developed quantification system and visual scoring. The receiver operator characteristic area under the curve for detection of 50% or greater stenoses by TPD (0.88 +/- 0.02) was higher than by visual scoring (0.83 +/- 0.03) (P = .039) or standard quantification (0.82 +/- 0.03) (P = .004). For detection of 70% or greater stenoses, it was higher for TPD (0.89 +/- 0.02) than for standard quantification (0.85 +/- 0.02) (P = .014). Sensitivity and specificity were 93% and 79%, respectively, for TPD; 81% and 85%, respectively, for visual scoring; and 80% and 73%, respectively, for standard quantification. The use of stress mode-specific normal limits did not improve performance. Simplified quantification achieves performance better than or equivalent to visual scoring or quantification based on per-segment visual optimization of abnormality thresholds.
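The TPD idea, combining defect extent and severity against per-sample normal limits on the polar map, can be sketched as follows. This is a simplified illustration of the concept, not the actual clinical implementation:

```python
def total_perfusion_deficit(counts, normal_mean, normal_sd, z=3.0):
    """Score each polar-map sample by how far it falls below the normal
    limit (mean - z*SD), cap the per-sample severity at 1, and express
    the summed severity as a percentage of the whole map."""
    deficit = 0.0
    for v, m, s in zip(counts, normal_mean, normal_sd):
        limit = m - z * s
        if v < limit:
            deficit += min((limit - v) / (z * s), 1.0)
    return 100.0 * deficit / len(counts)

# One of four samples is well below its normal limit of 70 counts
tpd = total_perfusion_deficit([50.0, 100.0, 100.0, 100.0],
                              [100.0] * 4, [10.0] * 4)
```

Because the per-sample score grows continuously below the threshold, a metric of this shape captures both how large and how severe a defect is, which is what distinguishes TPD from a pure defect-extent measure.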

  11. Crossing the chasm: how to develop weather and climate models for next generation computers?

    NASA Astrophysics Data System (ADS)

    Lawrence, Bryan N.; Rezny, Michael; Budich, Reinhard; Bauer, Peter; Behrens, Jörg; Carter, Mick; Deconinck, Willem; Ford, Rupert; Maynard, Christopher; Mullerworth, Steven; Osuna, Carlos; Porter, Andrew; Serradell, Kim; Valcke, Sophie; Wedi, Nils; Wilson, Simon

    2018-05-01

    Weather and climate models are complex pieces of software which include many individual components, each of which is evolving under pressure to exploit advances in computing to enhance some combination of a range of possible improvements (higher spatio-temporal resolution, increased fidelity in terms of resolved processes, more quantification of uncertainty, etc.). However, after many years of a relatively stable computing environment with little choice in processing architecture or programming paradigm (basically X86 processors using MPI for parallelism), the existing menu of processor choices includes significant diversity, and more is on the horizon. This computational diversity, coupled with ever increasing software complexity, leads to the very real possibility that weather and climate modelling will arrive at a chasm which will separate scientific aspiration from our ability to develop and/or rapidly adapt codes to the available hardware. In this paper we review the hardware and software trends which are leading us towards this chasm, before describing current progress in addressing some of the tools which we may be able to use to bridge the chasm. This brief introduction to current tools and plans is followed by a discussion outlining the scientific requirements for quality model codes which have satisfactory performance and portability, while simultaneously supporting productive scientific evolution. We assert that the existing method of incremental model improvements employing small steps which adjust to the changing hardware environment is likely to be inadequate for crossing the chasm between aspiration and hardware at a satisfactory pace, in part because institutions cannot have all the relevant expertise in house. 
Instead, we outline a methodology based on large community efforts in engineering and standardisation, which will depend on identifying a taxonomy of key activities - perhaps based on existing efforts to develop domain-specific languages, identify common patterns in weather and climate codes, and develop community approaches to commonly needed tools and libraries - and then collaboratively building up those key components. Such a collaborative approach will depend on institutions, projects, and individuals adopting new interdependencies and ways of working.

  12. Benchmarking quantitative label-free LC-MS data processing workflows using a complex spiked proteomic standard dataset.

    PubMed

    Ramus, Claire; Hovasse, Agnès; Marcellin, Marlène; Hesse, Anne-Marie; Mouton-Barbosa, Emmanuelle; Bouyssié, David; Vaca, Sebastian; Carapito, Christine; Chaoui, Karima; Bruley, Christophe; Garin, Jérôme; Cianférani, Sarah; Ferro, Myriam; Van Dorssaeler, Alain; Burlet-Schiltz, Odile; Schaeffer, Christine; Couté, Yohann; Gonzalez de Peredo, Anne

    2016-01-30

    Proteomic workflows based on nanoLC-MS/MS data-dependent-acquisition analysis have progressed tremendously in recent years. High-resolution and fast sequencing instruments have enabled the use of label-free quantitative methods, based either on spectral counting or on MS signal analysis, which appear as an attractive way to analyze differential protein expression in complex biological samples. However, the computational processing of the data for label-free quantification still remains a challenge. Here, we used a proteomic standard composed of an equimolar mixture of 48 human proteins (Sigma UPS1) spiked at different concentrations into a background of yeast cell lysate to benchmark several label-free quantitative workflows, involving different software packages developed in recent years. This experimental design allowed us to finely assess their performance in terms of sensitivity and false discovery rate, by measuring the numbers of true and false positives (UPS1 and yeast background proteins, respectively, found as differential). The spiked standard dataset has been deposited to the ProteomeXchange repository with the identifier PXD001819 and can be used to benchmark other label-free workflows, adjust software parameter settings, improve algorithms for extraction of the quantitative metrics from raw MS data, or evaluate downstream statistical methods. Bioinformatic pipelines for label-free quantitative analysis must be objectively evaluated in their ability to detect variant proteins with good sensitivity and low false discovery rate in large-scale proteomic studies. This can be done through the use of complex spiked samples, for which the "ground truth" of variant proteins is known, allowing a statistical evaluation of the performance of the data processing workflow. 
We provide here such a controlled standard dataset and used it to evaluate the performance of several label-free bioinformatics tools (including MaxQuant, Skyline, MFPaQ, IRMa-hEIDI and Scaffold) in different workflows, for detection of variant proteins with different absolute expression levels and fold-change values. The dataset presented here can be useful for tuning software tool parameters, testing new algorithms for label-free quantitative analysis, and evaluating downstream statistical methods. Copyright © 2015 Elsevier B.V. All rights reserved.
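With a spiked standard of this design, sensitivity and false discovery rate fall out of a simple count over the differential calls, since any yeast background protein called differential is by construction a false positive. A sketch (the protein identifiers are illustrative):

```python
def benchmark_calls(differential_calls, n_spiked=48):
    """differential_calls: iterable of (protein_id, species) pairs flagged
    as differential by a workflow. UPS1 spike-ins count as true positives,
    yeast background proteins as false positives."""
    calls = list(differential_calls)
    tp = sum(1 for _, species in calls if species == "UPS1")
    fp = len(calls) - tp
    sensitivity = tp / n_spiked            # fraction of spike-ins recovered
    fdr = fp / len(calls) if calls else 0.0
    return sensitivity, fdr

# Two spike-ins recovered, one yeast protein wrongly called differential
sens, fdr = benchmark_calls(
    [("P00915", "UPS1"), ("P00918", "UPS1"), ("YGR192C", "yeast")])
```

The same counting works for any spiked dataset where the ground-truth variant set is known.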

  13. Fully automated system for the quantification of human osteoarthritic knee joint effusion volume using magnetic resonance imaging

    PubMed Central

    2010-01-01

    Introduction Joint effusion is frequently associated with osteoarthritis (OA) flare-up and is an important marker of therapeutic response. This study aimed at developing and validating a fully automated system based on magnetic resonance imaging (MRI) for the quantification of joint effusion volume in knee OA patients. Methods MRI examinations consisted of two axial sequences: a T2-weighted true fast imaging with steady-state precession and a T1-weighted gradient echo. An automated joint effusion volume quantification system using MRI was developed and validated (a) with calibrated phantoms (cylinder and sphere) and effusion from knee OA patients; (b) with assessment by manual quantification; and (c) by direct aspiration. Twenty-five knee OA patients with joint effusion were included in the study. Results The automated joint effusion volume quantification was developed as a four stage sequencing process: bone segmentation, filtering of unrelated structures, segmentation of joint effusion, and subvoxel volume calculation. Validation experiments revealed excellent coefficients of variation with the calibrated cylinder (1.4%) and sphere (0.8%) phantoms. Comparison of the OA knee joint effusion volume assessed by the developed automated system and by manual quantification was also excellent (r = 0.98; P < 0.0001), as was the comparison with direct aspiration (r = 0.88; P = 0.0008). Conclusions The newly developed fully automated MRI-based system provided precise quantification of OA knee joint effusion volume with excellent correlation with data from phantoms, a manual system, and joint aspiration. Such an automated system will be instrumental in improving the reproducibility/reliability of the evaluation of this marker in clinical application. PMID:20846392
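The final "subvoxel volume calculation" stage of the pipeline reduces to summing per-voxel fluid fractions and scaling by voxel size. A minimal sketch of that idea (the fraction values are illustrative):

```python
def effusion_volume_ml(voxel_fractions, voxel_dims_mm):
    """Volume (mL) from a segmentation that assigns each voxel a fluid
    fraction in [0, 1] (the subvoxel idea: boundary voxels contribute
    partially). voxel_dims_mm = (dx, dy, dz) in millimetres; 1 mL = 1000 mm^3."""
    dx, dy, dz = voxel_dims_mm
    return sum(voxel_fractions) * dx * dy * dz / 1000.0

# 1000 fully fluid-filled 1 mm isotropic voxels -> exactly 1 mL
vol = effusion_volume_ml([1.0] * 1000, (1.0, 1.0, 1.0))
```

Allowing fractional voxels at the fluid boundary is what gives sub-voxel precision relative to a hard binary mask.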

  14. The effect of orthostasis on recurrence quantification analysis of heart rate and blood pressure dynamics.

    PubMed

    Javorka, M; Turianikova, Z; Tonhajzerova, I; Javorka, K; Baumert, M

    2009-01-01

    The purpose of this paper is to investigate the effect of orthostatic challenge on recurrence plot based complexity measures of heart rate and blood pressure variability (HRV and BPV). HRV and BPV complexities were assessed in 28 healthy subjects over 15 min in the supine and standing positions. The complexity of HRV and BPV was assessed based on recurrence quantification analysis. HRV complexity was reduced along with the HRV magnitude after changing from the supine to the standing position. In contrast, the BPV magnitude increased and BPV complexity decreased upon standing. Recurrence quantification analysis (RQA) of HRV and BPV is sensitive to orthostatic challenge and might therefore be suited to assess changes in autonomic neural outflow to the cardiovascular system.
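Two of the standard RQA measures, recurrence rate and determinism, can be computed from a thresholded distance matrix. A compact sketch on a one-dimensional series (real RR-interval analyses first apply time-delay embedding, which is omitted here):

```python
def recurrence_matrix(x, eps):
    """R[i][j] = 1 when samples i and j are closer than eps."""
    n = len(x)
    return [[1 if abs(x[i] - x[j]) < eps else 0 for j in range(n)]
            for i in range(n)]

def recurrence_rate(R):
    """Density of recurrent points in the plot."""
    n = len(R)
    return sum(map(sum, R)) / (n * n)

def determinism(R, lmin=2):
    """Fraction of off-diagonal recurrent points lying on diagonal lines
    of length >= lmin (the line of identity is excluded)."""
    n = len(R)
    recurrent = in_lines = 0
    for k in range(-(n - 1), n):
        if k == 0:
            continue
        diag = [R[i][i + k] for i in range(max(0, -k), min(n, n - k))]
        recurrent += sum(diag)
        run = 0
        for v in diag + [0]:          # trailing 0 flushes the last run
            if v:
                run += 1
            else:
                if run >= lmin:
                    in_lines += run
                run = 0
    return in_lines / recurrent if recurrent else 0.0

R = recurrence_matrix([0, 1, 0, 1, 0, 1, 0, 1], 0.5)  # strictly periodic
```

A strictly periodic series gives determinism 1.0; the loss of such diagonal structure upon standing is the kind of complexity change the study measures.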

  15. Lowering the quantification limit of the QubitTM RNA HS assay using RNA spike-in.

    PubMed

    Li, Xin; Ben-Dov, Iddo Z; Mauro, Maurizio; Williams, Zev

    2015-05-06

    RNA quantification is often a prerequisite for most RNA analyses such as RNA sequencing. However, the relatively low sensitivity and large sample consumption of traditional RNA quantification methods such as UV spectrophotometry, and even of the much more sensitive fluorescence-based RNA quantification assays such as the Qubit™ RNA HS Assay, are often inadequate for measuring minute levels of RNA isolated from limited cell and tissue samples and biofluids. Thus, there is a pressing need for a more sensitive method to reliably and robustly detect trace levels of RNA without interference from DNA. To improve the quantification limit of the Qubit™ RNA HS Assay, we spiked in a known quantity of RNA to achieve the minimum reading required by the assay. Samples containing trace amounts of RNA were then added to the spike-in and measured as a reading increase over the RNA spike-in baseline. We determined the accuracy and precision of reading increases between 1 and 20 pg/μL as well as RNA specificity in this range, and compared them to those of RiboGreen®, another sensitive fluorescence-based RNA quantification assay. We then applied the Qubit™ Assay with RNA spike-in to quantify plasma RNA samples. RNA spike-in improved the quantification limit of the Qubit™ RNA HS Assay 5-fold, from 25 pg/μL down to 5 pg/μL, while maintaining high specificity to RNA. This enabled quantification of RNA with an original concentration as low as 55.6 pg/μL, compared to 250 pg/μL for the standard assay, and decreased sample consumption from 5 to 1 ng. Plasma RNA samples that were not measurable by the Qubit™ RNA HS Assay were measurable by our modified method. The Qubit™ RNA HS Assay with RNA spike-in is able to quantify RNA with high specificity at 5-fold lower concentration and uses 5-fold less sample quantity than the standard Qubit™ Assay.
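The spike-in arithmetic is a baseline subtraction under the assumption that the fluorescence reading is linear in RNA concentration. A sketch (the readings and volumes are illustrative, not the paper's protocol):

```python
def sample_concentration(combined_pg_ul, spikein_baseline_pg_ul,
                         sample_vol_ul, total_vol_ul):
    """Estimate the original sample concentration (pg/uL) from the reading
    increase over the spike-in baseline, scaled back through the dilution
    of the sample into the assay volume."""
    increase = combined_pg_ul - spikein_baseline_pg_ul
    return increase * total_vol_ul / sample_vol_ul

# A 5 pg/uL reading increase after adding 1 uL of sample into an 11 uL
# assay implies ~55 pg/uL in the original sample
conc = sample_concentration(30.0, 25.0, 1.0, 11.0)
```

The measurable reading increase, rather than the absolute reading, is what sets the improved quantification limit.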

  16. WE-AB-204-05: Harmonizing PET/CT Quantification in Multicenter Studies: A Case Study

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Marques da Silva, A; Fischer, A

    2015-06-15

    Purpose: To present the implementation of a strategy to harmonize FDG PET/CT quantification (SUV) performed with different scanner models and manufacturers. Methods: The strategy was based on Boellaard (2011) and the EARL FDG-PET/CT accreditation program, which propose quality control measurements for harmonizing scanner performance. A NEMA IEC Body phantom study was performed using four different devices: PHP-1 (Gemini TF Base, Philips); PHP-2 (Gemini GXL, Philips); GEH (Discovery 600, General Electric); SMS (Biograph Hi-Rez 16, Siemens). The SUV Recovery Coefficient (RC) was calculated using the clinical protocol and other clinically relevant reconstruction parameters. The most appropriate reconstruction parameters (MARP) for SUV harmonization in each scanner are those which achieve the EARL harmonizing standards. They were identified using the lowest root mean square errors (RMSE). To evaluate the strategy's effectiveness, the Maximum Differences (MD) between the clinical and MARP RC values were calculated. Results: The reconstruction parameters that obtained the lowest RMSE are: FBP 5mm (PHP-1); LOR-RAMLA 2i0.008l (PHP-2); VuePointHD 2i32s10mm (GEH); and FORE+OSEM 4i8s6mm (SMS). Thus, to ensure that quantitative PET image measurements are interchangeable between these sites, images must be reconstructed with the above-mentioned parameters. However, a decoupling between the best image for PET/CT qualitative analysis and the best image for quantification studies was observed. The MD showed that the strategy was effective in reducing the variability of SUV quantification for small structures (<17 mm). Conclusion: The harmonization strategy for SUV quantification implemented with these devices was effective in reducing the variability of small-structure quantification, minimizing inter-scanner and inter-institution differences in quantification. 
However, it is essential that, in addition to the harmonization of quantification, the standardization of the methodology of patient preparation be maintained, in order to minimize SUV variability due to biological factors. Financial support by CAPES.
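The MARP selection step reduces to computing an RMSE between each candidate reconstruction's recovery coefficients and the harmonizing target values, then keeping the minimum. A sketch with made-up RC numbers and parameter labels:

```python
import math

def rmse(values, target):
    """Root mean square error between measured and target values."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(values, target))
                     / len(values))

def pick_harmonizing_params(candidates, target):
    """candidates: dict mapping a reconstruction-parameter label to its
    measured sphere recovery coefficients; returns the label whose RCs
    have the lowest RMSE against the harmonizing targets."""
    return min(candidates, key=lambda label: rmse(candidates[label], target))

best = pick_harmonizing_params(
    {"OSEM 2i32s 10mm": [0.92, 0.98, 1.02],
     "OSEM 4i8s 6mm":   [0.70, 0.80, 0.88]},
    target=[1.0, 1.0, 1.0])
```

Running the same selection independently per scanner is what lets sites with different hardware converge on interchangeable SUV measurements.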

  17. Quantifying construction and demolition waste: An analytical review

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    Wu, Zezhou; Yu, Ann T.W., E-mail: bsannyu@polyu.edu.hk; Shen, Liyin

    2014-09-15

    Highlights: • Prevailing C and D waste quantification methodologies are identified and compared. • One specific methodology cannot fulfill all waste quantification scenarios. • A relevance tree for appropriate quantification methodology selection is proposed. • More attention should be paid to civil and infrastructural works. • Classified information is suggested for making an effective waste management plan. - Abstract: Quantifying construction and demolition (C and D) waste generation is regarded as a prerequisite for the implementation of successful waste management. In the literature, various methods have been employed to quantify C and D waste generation at both regional and project levels. However, an integrated review that systematically describes and analyses all the existing methods has yet to be conducted. To bridge this research gap, an analytical review is conducted. Fifty-seven papers are retrieved based on a set of rigorous procedures. The characteristics of the selected papers are classified according to the following criteria: waste generation activity, estimation level and quantification methodology. Six categories of existing C and D waste quantification methodologies are identified, including the site visit method, waste generation rate method, lifetime analysis method, classification system accumulation method, variables modelling method and other particular methods. A critical comparison of the identified methods is given according to their characteristics and implementation constraints. Moreover, a decision tree is proposed for aiding the selection of the most appropriate quantification method in different scenarios. Based on the analytical review, limitations of previous studies and recommendations of potential future research directions are further suggested.
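Among the categories listed, the waste generation rate method is the most common: it multiplies an activity quantity by an empirical per-unit rate. A minimal sketch (the rate value is illustrative, not taken from the review):

```python
def waste_by_generation_rate(gross_floor_area_m2, rate_kg_per_m2):
    """Waste-generation-rate method: estimated C&D waste (kg) = activity
    quantity (here, gross floor area in m2) x an empirical generation
    rate (kg/m2) for the project type and region."""
    return gross_floor_area_m2 * rate_kg_per_m2

# A hypothetical 2500 m2 building at an assumed 30 kg/m2 -> 75 tonnes
waste_kg = waste_by_generation_rate(2500.0, 30.0)
```

The method's accuracy depends entirely on how well the chosen rate matches the local construction practice, which is why the review stresses selecting a methodology to fit the scenario.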

  18. CLMSVault: A Software Suite for Protein Cross-Linking Mass-Spectrometry Data Analysis and Visualization.

    PubMed

    Courcelles, Mathieu; Coulombe-Huntington, Jasmin; Cossette, Émilie; Gingras, Anne-Claude; Thibault, Pierre; Tyers, Mike

    2017-07-07

    Protein cross-linking mass spectrometry (CL-MS) enables the sensitive detection of protein interactions and the inference of protein complex topology. The detection of chemical cross-links between protein residues can identify intra- and interprotein contact sites or provide physical constraints for molecular modeling of protein structure. Recent innovations in cross-linker design, sample preparation, mass spectrometry, and software tools have significantly improved CL-MS approaches. Although a number of algorithms now exist for the identification of cross-linked peptides from mass spectral data, a dearth of user-friendly analysis tools represents a practical bottleneck to the broad adoption of the approach. To facilitate the analysis of CL-MS data, we developed CLMSVault, a software suite designed to leverage existing CL-MS algorithms and provide intuitive and flexible tools for cross-platform data interpretation. CLMSVault stores and combines complementary information obtained from different cross-linkers and search algorithms. CLMSVault provides filtering, comparison, and visualization tools to support CL-MS analyses and includes a workflow for label-free quantification of cross-linked peptides. An embedded 3D viewer enables the visualization of quantitative data and the mapping of cross-linked sites onto PDB structural models. We demonstrate the application of CLMSVault for the analysis of a noncovalent Cdc34-ubiquitin protein complex cross-linked under different conditions. CLMSVault is open-source software (available at https://gitlab.com/courcelm/clmsvault.git), and a live demo is available at http://democlmsvault.tyerslab.com/.

  19. MetaQuant: a tool for the automatic quantification of GC/MS-based metabolome data.

    PubMed

    Bunk, Boyke; Kucklick, Martin; Jonas, Rochus; Münch, Richard; Schobert, Max; Jahn, Dieter; Hiller, Karsten

    2006-12-01

    MetaQuant is a Java-based program for the automatic and accurate quantification of GC/MS-based metabolome data. In contrast to other programs, MetaQuant is able to quantify hundreds of substances simultaneously with minimal manual intervention. An integrated automatic calibration function allows fast, parallel calibration for several metabolites. Finally, MetaQuant is able to import GC/MS data in the common NetCDF format and to export the results of the quantification into Systems Biology Markup Language (SBML), Comma Separated Values (CSV) or Microsoft Excel (XLS) format. MetaQuant is written in Java and is available under an open source license. Precompiled packages for installation on Windows or Linux operating systems are freely available for download. The source code as well as the installation packages are available at http://bioinformatics.org/metaquant
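The automatic calibration described above amounts to fitting a response curve per metabolite and inverting it to quantify unknowns. A minimal sketch, assuming a linear response and made-up calibration standards (not MetaQuant's actual algorithm):

```python
# Illustrative per-metabolite calibration: fit peak area vs. concentration by
# ordinary least squares, then invert the line to quantify an unknown sample.
# All data points are invented for the example.

def fit_line(xs, ys):
    """Ordinary least-squares slope and intercept."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

# Calibration standards: known concentrations and measured peak areas,
# lying on area = 100*conc + 10.
conc = [1.0, 2.0, 5.0, 10.0]
area = [110.0, 210.0, 510.0, 1010.0]

slope, intercept = fit_line(conc, area)
unknown_area = 760.0
unknown_conc = (unknown_area - intercept) / slope
print(round(unknown_conc, 2))   # -> 7.5
```

Running one such fit per metabolite is what makes "parallel calibration for several metabolites" cheap to automate.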

  20. Detection and quantification of serum or plasma HCV RNA: mini review of commercially available assays.

    PubMed

    Le Guillou-Guillemette, Helene; Lunel-Fabiani, Francoise

    2009-01-01

    The treatment schedule (combination of compounds, doses, and duration) and the virological follow-up for management of antiviral treatment in patients chronically infected by HCV are now well standardized, but to ensure good monitoring of treated patients, physicians need rapid, reproducible, and sensitive molecular virological tools with a wide range of detection and quantification of HCV RNA in blood samples. Several assays for detection and/or quantification of HCV RNA are currently commercially available. Here, all these assays are detailed, and a brief description of each step of the assay is provided. They are divided into two categories by method: those based on signal amplification and those based on target amplification. These two categories are then divided into qualitative and quantitative detection assays. The real-time reverse-transcription polymerase chain reaction (RT-PCR)-based assays are the most promising strategy in the HCV virological area.

  1. A simple and efficient method for poly-3-hydroxybutyrate quantification in diazotrophic bacteria within 5 minutes using flow cytometry

    PubMed Central

    Alves, L.P.S.; Almeida, A.T.; Cruz, L.M.; Pedrosa, F.O.; de Souza, E.M.; Chubatsu, L.S.; Müller-Santos, M.; Valdameri, G.

    2017-01-01

    The conventional method for quantification of polyhydroxyalkanoates based on whole-cell methanolysis and gas chromatography (GC) is laborious and time-consuming. In this work, a method based on flow cytometry of Nile red stained bacterial cells was established to quantify poly-3-hydroxybutyrate (PHB) production by the diazotrophic and plant-associated bacteria, Herbaspirillum seropedicae and Azospirillum brasilense. The method consists of three steps: i) cell permeabilization, ii) Nile red staining, and iii) analysis by flow cytometry. The method was optimized step-by-step and can be carried out in less than 5 min. The final results indicated a high correlation coefficient (R2=0.99) compared to a standard method based on methanolysis and GC. This method was successfully applied to the quantification of PHB in epiphytic bacteria isolated from rice roots. PMID:28099582
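The cross-validation against the GC reference can be reproduced in miniature with a plain Pearson correlation; the paired values below are invented for illustration, not the paper's data:

```python
# Sketch of the validation step in the abstract: correlate flow-cytometry
# fluorescence readings against GC-based PHB measurements and report R^2.
# Paired values are made up for illustration.

def pearson_r(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

gc_phb = [5.0, 12.0, 20.0, 33.0, 41.0]        # % PHB of cell dry weight (GC)
cytometry = [1500, 3600, 6100, 9800, 12400]   # median Nile red fluorescence

r = pearson_r(gc_phb, cytometry)
print(round(r ** 2, 3))   # high R^2 indicates the two methods agree
```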

  2. Multiparameter fluorescence imaging for quantification of TH-1 and TH-2 cytokines at the single-cell level

    NASA Astrophysics Data System (ADS)

    Fekkar, Hakim; Benbernou, N.; Esnault, S.; Shin, H. C.; Guenounou, Moncef

    1998-04-01

    Immune responses are strongly influenced by cytokines following antigenic stimulation. Distinct cytokine-producing T cell subsets are well known to play a major role in immune responses and to be differentially regulated during immunological disorders, although the characterization and quantification of the TH-1/TH-2 cytokine pattern in T cells remain poorly defined. Expression of cytokines by T lymphocytes is a highly balanced process, involving stimulatory and inhibitory intracellular signaling pathways. The aims of this study were (1) to quantify cytokine expression in T cells at the single-cell level using optical imaging, and (2) to analyze the influence of the cyclic AMP-dependent signal transduction pathway on the balance between the TH-1 and TH-2 cytokine profiles. We studied several cytokines (IL-2, IFN-gamma, IL-4, IL-10 and IL-13) in peripheral blood mononuclear cells. Cells were prestimulated in vitro using phytohemagglutinin and phorbol ester for 36 h, and then further cultured for 8 h in the presence of monensin. Cells were permeabilized and then single-, double- or triple-labeled with the corresponding specific fluorescent monoclonal antibodies. The cell phenotype was also determined by analyzing the expression of each of CD4, CD8, CD45RO and CD45RA together with cytokine expression. Conventional images of cells were recorded with a Peltier-cooled CCD camera (B/W C5985, Hamamatsu Photonics) through an inverted microscope equipped with epi-fluorescence (Diaphot 300, Nikon). Images were digitized using a video acquisition interface (Oculus TCX, Coreco) at 762 by 570 pixels coded in 8 bits (256 gray levels), and then analyzed on an Intel Pentium-based IBM PC with dedicated software (Visilog 4, Noesis). The first image-processing step is the extraction of cell areas using edge detection and binary thresholding. To reduce the background noise of fluorescence, we performed an opening of the original image using a structuring element; the opened image was then subtracted from the original one, and the gray intensities were subsequently measured. Fluorescence intensities were mapped in a multidimensional representation using Matlab software. Consequently, quantitative comparison of intracellular cytokine and cell-membrane marker expression was achieved. Using this technique, we showed that CD4+ and CD8+ T lymphocytes expressed a large panel of cytokines, and that the protein kinase A (PKA) activation pathway induced a polarization of activated human T cells toward the TH-2 profile. The data also showed different sensitivities of CD45RO/CD45RA lymphocytes to PKA activation, suggesting the implication of memory CD4+ and CD8+ T cells in T cell-specific immune and inflammatory processes and their control by the PKA activation pathway. Finally, this method represents a powerful tool for the detection and quantification of intracellular cytokine expression and for the analysis of the functional properties of T lymphocytes during immune responses.
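The background-reduction step described above (opening with a structuring element, then subtracting the opened image from the original) is a morphological top-hat transform. A minimal pure-Python sketch on a toy image:

```python
# Grayscale opening (erosion then dilation) with a 3x3 structuring element,
# subtracted from the original image: a "white top-hat" transform that keeps
# small bright features such as stained cells. The image is a toy 5x5 array.

def _neighborhood(img, r, c):
    h, w = len(img), len(img[0])
    return [img[i][j]
            for i in range(max(0, r - 1), min(h, r + 2))
            for j in range(max(0, c - 1), min(w, c + 2))]

def erode(img):
    return [[min(_neighborhood(img, r, c)) for c in range(len(img[0]))]
            for r in range(len(img))]

def dilate(img):
    return [[max(_neighborhood(img, r, c)) for c in range(len(img[0]))]
            for r in range(len(img))]

def top_hat(img):
    opened = dilate(erode(img))
    return [[orig - op for orig, op in zip(row_o, row_p)]
            for row_o, row_p in zip(img, opened)]

# Flat background of 10 with one bright single-pixel "cell" at the center.
image = [[10] * 5 for _ in range(5)]
image[2][2] = 200
result = top_hat(image)
print(result[2][2])   # the small bright feature survives background removal
```

Because the opening removes features smaller than the structuring element, the subtraction leaves the cells while flattening the slowly varying background, which is exactly why the intensities measured afterwards are more reliable.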

  3. Validation of a DIXON-based fat quantification technique for the measurement of visceral fat using a CT-based reference standard.

    PubMed

    Heckman, Katherine M; Otemuyiwa, Bamidele; Chenevert, Thomas L; Malyarenko, Dariya; Derstine, Brian A; Wang, Stewart C; Davenport, Matthew S

    2018-06-27

    The purpose of the study is to determine whether a novel semi-automated DIXON-based fat quantification algorithm can reliably quantify visceral fat using a CT-based reference standard. This was an IRB-approved retrospective cohort study of 27 subjects who underwent abdominopelvic CT within 7 days of proton density fat fraction (PDFF) mapping on a 1.5T MRI. Cross-sectional visceral fat area per slice (cm²) was measured in blinded fashion in each modality at intervertebral disc levels from T12 to L4. CT estimates were obtained using a previously published semi-automated computational image processing system that sums pixels with attenuation −205 to −51 HU. MR estimates were obtained using two novel semi-automated DIXON-based fat quantification algorithms that measure visceral fat area by spatially regularizing non-uniform fat-only signal intensity or de-speckling PDFF 2D images and summing pixels with PDFF ≥ 50%. Pearson's correlations and Bland-Altman analyses were performed. Visceral fat area per slice ranged from 9.2 to 429.8 cm² for MR and from 1.6 to 405.5 cm² for CT. There was a strong correlation between CT and MR methods in measured visceral fat area across all studied vertebral body levels (r = 0.97; n = 101 observations); the lowest correlation (r = 0.93) was at T12. Bland-Altman analysis revealed a bias of 31.7 cm² (95% CI: −27.1 to 90.4 cm²), indicating modestly higher visceral fat assessed by MR. MR- and CT-based visceral fat quantification are highly correlated and have good cross-modality reliability, indicating that visceral fat quantification by either method can yield a stable and reliable biomarker.
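The two pixel-summation rules quoted in the abstract are easy to state in code. The pixel size and the toy maps below are assumptions for illustration:

```python
# CT counts pixels with attenuation between -205 and -51 HU; MR counts pixels
# with PDFF >= 50%. Pixel area and the toy maps are invented for this sketch.

PIXEL_AREA_CM2 = 0.01   # assuming 1 mm x 1 mm pixels

def ct_fat_area(hu_map):
    n = sum(1 for row in hu_map for hu in row if -205 <= hu <= -51)
    return n * PIXEL_AREA_CM2

def mr_fat_area(pdff_map):
    n = sum(1 for row in pdff_map for f in row if f >= 0.50)
    return n * PIXEL_AREA_CM2

hu = [[-100, -60, 40], [-210, -51, 0]]          # Hounsfield units
pdff = [[0.9, 0.55, 0.1], [0.2, 0.5, 0.49]]     # proton density fat fractions

print(ct_fat_area(hu), mr_fat_area(pdff))
```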

  4. Quantitative Proteomics via High Resolution MS Quantification: Capabilities and Limitations

    PubMed Central

    Higgs, Richard E.; Butler, Jon P.; Han, Bomie; Knierman, Michael D.

    2013-01-01

    Recent improvements in the mass accuracy and resolution of mass spectrometers have led to renewed interest in label-free quantification using data from the primary mass spectrum (MS1) acquired from data-dependent proteomics experiments. The capacity for higher specificity quantification of peptides from samples enriched for proteins of biological interest offers distinct advantages for hypothesis-generating experiments relative to immunoassay detection methods or prespecified peptide ions measured by multiple reaction monitoring (MRM) approaches. Here we describe an evaluation of different methods to post-process peptide level quantification information to support protein level inference. We characterize the methods by examining their ability to recover a known dilution of a standard protein in background matrices of varying complexity. Additionally, the MS1 quantification results are compared to a standard, targeted, MRM approach on the same samples under equivalent instrument conditions. We show the existence of multiple peptides with MS1 quantification sensitivity similar to the best MRM peptides for each of the background matrices studied. Based on these results, we provide recommendations on preferred approaches to leveraging quantitative measurements of multiple peptides to improve protein level inference. PMID:23710359
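One simple instance of "post-processing peptide level quantification to support protein level inference" is a median rollup of peptide intensities per protein. The study evaluates several such methods; this sketch, with invented intensities, shows only the simplest:

```python
# Median rollup: summarize each protein's abundance as the median of its
# peptide intensities. Protein/peptide names and intensities are invented.

from statistics import median

def protein_abundance(peptide_rows):
    """peptide_rows: list of (protein, peptide, intensity) tuples."""
    by_protein = {}
    for protein, _peptide, intensity in peptide_rows:
        by_protein.setdefault(protein, []).append(intensity)
    return {p: median(v) for p, v in by_protein.items()}

rows = [
    ("ALBU", "pep1", 1.0e6),
    ("ALBU", "pep2", 1.4e6),
    ("ALBU", "pep3", 0.9e6),
    ("TRFE", "pepA", 2.0e5),
    ("TRFE", "pepB", 3.0e5),
]
print(protein_abundance(rows))
```

The median is robust to a single poorly ionizing or interfered peptide, which is one reason rollup choice matters when recovering a known dilution series.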

  5. The Knowledge-Based Software Assistant: Beyond CASE

    NASA Technical Reports Server (NTRS)

    Carozzoni, Joseph A.

    1993-01-01

    This paper will outline the similarities and differences between two paradigms of software development. Both support the whole software life cycle and provide automation for most of the software development process, but have different approaches. The CASE approach is based on a set of tools linked by a central data repository. This tool-based approach is data driven and views software development as a series of sequential steps, each resulting in a product. The Knowledge-Based Software Assistant (KBSA) approach, a radical departure from existing software development practices, is knowledge driven and centers around a formalized software development process. KBSA views software development as an incremental, iterative, and evolutionary process with development occurring at the specification level.

  6. Rapid quantification of soilborne pathogen communities in wheat-based long-term field experiments

    USDA-ARS?s Scientific Manuscript database

    Traditional isolation and quantification of inoculum density is difficult for most soilborne pathogens. Quantitative PCR methods have been developed to rapidly identify and quantify many of these pathogens using a single DNA extract from soil. Rainfed experiments operated continuously for up to 84 y...

  7. IDENTIFICATION AND QUANTIFICATION OF AEROSOL POLAR OXYGENATED COMPOUNDS BEARING CARBOXYLIC AND/OR HYDROXYL GROUPS. 1. METHOD DEVELOPMENT

    EPA Science Inventory

    In this study, a new analytical technique was developed for the identification and quantification of multi-functional compounds containing simultaneously at least one hydroxyl or one carboxylic group, or both. This technique is based on derivatizing first the carboxylic group(s) ...

  8. Integrated protocol for reliable and fast quantification and documentation of electrophoresis gels.

    PubMed

    Rehbein, Peter; Schwalbe, Harald

    2015-06-01

    Quantitative analysis of electrophoresis gels is an important part of molecular cloning, as well as of protein expression and purification. Parallel quantifications of yield and purity can be most conveniently obtained from densitometric analysis. This communication reports a comprehensive, reliable and simple protocol for gel quantification and documentation, applicable to single samples and with special features for protein expression screens. As the major component of the protocol, the fully annotated code of a purpose-built open-source computer program for semi-automatic densitometric quantification of digitized electrophoresis gels is disclosed. The program ("GelQuant") is implemented in the C-based macro language of the widespread IGOR Pro integrated development environment. Copyright © 2014 Elsevier Inc. All rights reserved.
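Densitometric quantification of the kind GelQuant performs can be illustrated as integrating a lane intensity profile over a band and subtracting a local background estimate; the profile and band window below are invented:

```python
# Toy densitometry: sum background-corrected intensities over a band region
# of a 1-D lane profile. The local background is estimated from the flanking
# pixels just outside the band. All values are invented.

def band_volume(lane, start, stop):
    """Integrate rows start..stop of a lane profile after subtracting a
    crude local background (mean of up to 3 pixels on each flank)."""
    region = lane[start:stop]
    flank = lane[max(0, start - 3):start] + lane[stop:stop + 3]
    background = sum(flank) / len(flank)
    return sum(v - background for v in region)

profile = [10, 10, 11, 10, 80, 120, 85, 11, 10, 10, 9]
volume = band_volume(profile, 4, 7)
print(round(volume, 1))
```

Yield then follows from comparing band volumes against a standard, and purity from the band's share of the total lane signal.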

  9. ASAP (Automatic Software for ASL Processing): A toolbox for processing Arterial Spin Labeling images.

    PubMed

    Mato Abad, Virginia; García-Polo, Pablo; O'Daly, Owen; Hernández-Tamames, Juan Antonio; Zelaya, Fernando

    2016-04-01

    The method of Arterial Spin Labeling (ASL) has experienced a significant rise in its application to functional imaging, since it is the only technique capable of measuring blood perfusion in a truly non-invasive manner. Currently, there are no commercial packages for processing ASL data and there is no recognized standard for normalizing ASL data to a common frame of reference. This work describes a new Automated Software for ASL Processing (ASAP) that can automatically process several ASL datasets. ASAP includes functions for all stages of image pre-processing: quantification, skull-stripping, co-registration, partial volume correction and normalization. To assess the applicability and validity of the toolbox, this work shows its application in the study of hypoperfusion in a sample of healthy subjects at risk of progressing to Alzheimer's disease. ASAP requires limited user intervention, minimizing the possibility of random and systematic errors, and produces cerebral blood flow maps that are ready for statistical group analysis. The software is easy to operate and results in excellent quality of spatial normalization. The results found in this evaluation study are consistent with previous studies that find decreased perfusion in Alzheimer's patients in similar regions and demonstrate the applicability of ASAP. Copyright © 2015 Elsevier Inc. All rights reserved.
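ASAP's quantification stage converts label/control signal differences into cerebral blood flow maps. As a hedged illustration, and not necessarily ASAP's exact model, here is the standard single-compartment pCASL equation from the ASL consensus literature, with typical parameter defaults:

```python
# Single-compartment pCASL quantification (consensus-style model; parameter
# values are typical defaults, not ASAP's). delta_m is the label-control
# difference signal and m0 the proton-density reference, in the same units.

from math import exp

def cbf_pcasl(delta_m, m0, pld=1.8, tau=1.8, t1_blood=1.65,
              alpha=0.85, lambda_bp=0.9):
    """Return CBF in ml/100 g/min; pld (post-labeling delay), tau (label
    duration) and t1_blood are in seconds; alpha is labeling efficiency and
    lambda_bp the blood-brain partition coefficient (ml/g)."""
    num = 6000.0 * lambda_bp * delta_m * exp(pld / t1_blood)
    den = 2.0 * alpha * t1_blood * m0 * (1.0 - exp(-tau / t1_blood))
    return num / den

# A ~1% label-control difference yields a physiologically plausible CBF.
print(round(cbf_pcasl(delta_m=12.0, m0=1000.0), 1))
```

Applied voxelwise, this is the step that turns raw difference images into the cerebral blood flow maps used for group analysis.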

  10. Qualis-SIS: automated standard curve generation and quality assessment for multiplexed targeted quantitative proteomic experiments with labeled standards.

    PubMed

    Mohammed, Yassene; Percy, Andrew J; Chambers, Andrew G; Borchers, Christoph H

    2015-02-06

    Multiplexed targeted quantitative proteomics typically utilizes multiple reaction monitoring and allows the optimized quantification of a large number of proteins. One challenge, however, is the large amount of data that needs to be reviewed, analyzed, and interpreted. Different vendors provide software for their instruments, which determine the recorded responses of the heavy and endogenous peptides and perform the response-curve integration. Bringing multiplexed data together and generating standard curves is often an off-line step accomplished, for example, with spreadsheet software. This can be laborious, as it requires determining the concentration levels that meet the required accuracy and precision criteria in an iterative process. We present here a computer program, Qualis-SIS, that generates standard curves from multiplexed MRM experiments and determines analyte concentrations in biological samples. Multiple level-removal algorithms and acceptance criteria for concentration levels are implemented. When used to apply the standard curve to new samples, the software flags each measurement according to its quality. From the user's perspective, the data processing is instantaneous due to the reactivity paradigm used, and the user can download the results of the stepwise calculations for further processing, if necessary. This allows for more consistent data analysis and can dramatically accelerate the downstream data analysis.
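The iterative determination of acceptable concentration levels can be sketched as: fit a linear standard curve, back-calculate each level, drop the worst level that fails an accuracy criterion, and refit. The 20% threshold and data below are illustrative, not Qualis-SIS defaults:

```python
# Iterative level removal for a standard curve: drop concentration levels
# whose back-calculated concentration deviates from nominal by more than
# max_error, then refit. Thresholds and data are invented for the sketch.

def fit_line(xs, ys):
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

def build_curve(levels, max_error=0.20):
    """levels: list of (nominal_conc, response) pairs."""
    pts = list(levels)
    while len(pts) > 2:
        slope, icpt = fit_line([c for c, _ in pts], [r for _, r in pts])
        errors = [abs(((r - icpt) / slope - c) / c) for c, r in pts]
        worst = max(range(len(pts)), key=errors.__getitem__)
        if errors[worst] <= max_error:
            return slope, icpt, pts
        del pts[worst]
    slope, icpt = fit_line([c for c, _ in pts], [r for _, r in pts])
    return slope, icpt, pts

# The lowest level sits below the linear range, so it should be removed.
levels = [(0.1, 90.0), (1.0, 110.0), (5.0, 520.0), (10.0, 1010.0)]
slope, icpt, kept = build_curve(levels)
print(len(kept))   # levels retained after the accuracy screen
```

Automating this loop per peptide is what removes the laborious spreadsheet iteration described above.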

  11. Preliminary experience with a novel method of three-dimensional co-registration of prostate cancer digital histology and in vivo multiparametric MRI.

    PubMed

    Orczyk, C; Rusinek, H; Rosenkrantz, A B; Mikheev, A; Deng, F-M; Melamed, J; Taneja, S S

    2013-12-01

    To assess a novel method of three-dimensional (3D) co-registration of prostate cancer digital histology and in-vivo multiparametric magnetic resonance imaging (mpMRI) image sets for clinical usefulness. A software platform was developed to achieve 3D co-registration. This software was prospectively applied to three patients who underwent radical prostatectomy. Data comprised in-vivo mpMRI [T2-weighted, dynamic contrast-enhanced weighted images (DCE); apparent diffusion coefficient (ADC)], ex-vivo T2-weighted imaging, the 3D-rebuilt pathological specimen, and digital histology. Internal landmarks from the zonal anatomy served as reference points for assessing co-registration accuracy and precision. Applying a method of deformable transformation based on 22 internal landmarks, a 1.6 mm accuracy was reached in aligning T2-weighted images and the 3D-rebuilt pathological specimen, a 32% improvement over rigid transformation (p = 0.003). The 22 zonal anatomy landmarks were more accurately mapped using deformable transformation than rigid transformation (p = 0.0008). An automatic method based on mutual information enabled automation of the process and the inclusion of perfusion and diffusion MRI images. Co-registration accuracy evaluated using the volume overlap index (Dice index) met clinically relevant requirements, ranging from 0.81 to 0.96 for the sequences tested. Ex-vivo images of the specimen did not significantly improve co-registration accuracy. This preliminary analysis suggests that deformable transformation based on zonal anatomy landmarks is accurate for the co-registration of mpMRI and histology. Including diffusion and perfusion sequences in the same 3D space as histology provides essential further clinical information. The ability to localize cancer in 3D space may improve targeting for image-guided biopsy, focal therapy, and disease quantification in surveillance protocols. Copyright © 2013 The Royal College of Radiologists. Published by Elsevier Ltd. All rights reserved.
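The Dice overlap index used above is simple to compute from two binary masks, here represented as sets of voxel coordinates:

```python
# Dice index: 2|A ∩ B| / (|A| + |B|), ranging from 0 (no overlap) to
# 1 (identical masks). The two toy masks below are invented.

def dice(a, b):
    return 2 * len(a & b) / (len(a) + len(b))

mri_mask = {(x, y) for x in range(10) for y in range(10)}        # 10x10 region
histo_mask = {(x, y) for x in range(1, 11) for y in range(10)}   # shifted by 1

print(round(dice(mri_mask, histo_mask), 2))
```

A one-pixel shift of a 10×10 mask still yields a Dice of 0.9, which is why values of 0.81-0.96 indicate clinically useful alignment.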

  12. Radiological Monitoring Equipment For Real-Time Quantification Of Area Contamination In Soils And Facility Decommissioning

    DOE Office of Scientific and Technical Information (OSTI.GOV)

    M. V. Carpenter; Jay A. Roach; John R Giles

    2005-09-01

    The environmental restoration industry offers several systems that perform scan-type characterization of radiologically contaminated areas. The Idaho National Laboratory (INL) has developed and deployed a suite of field systems that rapidly scan, characterize, and analyse radiological contamination in surface soils. The base system consists of a detector, such as a sodium iodide (NaI) spectrometer, a global positioning system (GPS), and an integrated user-friendly computer interface. This mobile concept was initially developed to provide precertification analyses of soils contaminated with uranium, thorium, and radium at the Fernald Closure Project, near Cincinnati, Ohio. INL has expanded the functionality of this basic system to create a suite of integrated field-deployable analytical systems. Using its engineering and radiation measurement expertise, aided by computer hardware and software support, INL has streamlined the data acquisition and analysis process to provide real-time information presented on wireless screens and in the form of coverage maps immediately available to field technicians. In addition, custom software offers a user-friendly interface with user-selectable alarm levels and automated data quality monitoring functions that validate the data. This system is deployed from various platforms, depending on the nature of the survey. The deployment platforms include a small all-terrain vehicle used to survey large, relatively flat areas, a hand-pushed unit for areas where manoeuvrability is important, an excavator-mounted system used to scan pits and trenches where personnel access is restricted, and backpack-mounted systems to survey rocky shoreline features and other physical settings that preclude vehicle-based deployment. Variants of the base system include sealed proportional counters for measuring actinides (i.e., plutonium-238 and americium-241) in building demolitions, soil areas, roadbeds, and process line routes at the Miamisburg Closure Project near Dayton, Ohio. In addition, INL supports decontamination operations at the Oak Ridge National Laboratory.

  13. Software tool for mining liquid chromatography/multi-stage mass spectrometry data for comprehensive glycerophospholipid profiling.

    PubMed

    Hein, Eva-Maria; Bödeker, Bertram; Nolte, Jürgen; Hayen, Heiko

    2010-07-30

    Electrospray ionization mass spectrometry (ESI-MS) has emerged as an indispensable tool in the field of lipidomics. Despite the growing interest in lipid analysis, there are only a few software tools available for data evaluation, compared, for example, to proteomics applications. This makes comprehensive lipid analysis a complex challenge. Thus, a computational tool for harnessing the raw data from liquid chromatography/mass spectrometry (LC/MS) experiments was developed in this study and is available from the authors on request. The Profiler-Merger-Viewer tool is a software package for automatic processing of raw data from data-dependent experiments, measured by high-performance liquid chromatography hyphenated to electrospray ionization hybrid linear ion trap Fourier transform mass spectrometry (FTICR-MS and Orbitrap) in single and multi-stage mode. The software contains three parts: processing of the raw data by Profiler for lipid identification, summarization of replicate measurements by Merger, and visualization of all relevant data (chromatograms as well as mass spectra) for validation of the results by Viewer. The tool is easily accessible, since it is implemented in Java and uses Microsoft Excel (XLS) as output format. The motivation was to develop a tool which significantly supports and accelerates manual data evaluation (identification and relative quantification) but does not perform a complete data analysis within a black-box system. The software's mode of operation, usage and options are demonstrated on a lipid extract of baker's yeast (S. cerevisiae). In this study, we focused on three important representatives of lipids: glycerophospholipids, lyso-glycerophospholipids and free fatty acids. Copyright 2010 John Wiley & Sons, Ltd.

  14. PyQuant: A Versatile Framework for Analysis of Quantitative Mass Spectrometry Data.

    PubMed

    Mitchell, Christopher J; Kim, Min-Sik; Na, Chan Hyun; Pandey, Akhilesh

    2016-08-01

    Quantitative mass spectrometry data necessitates an analytical pipeline that captures the accuracy and comprehensiveness of the experiments. Currently, data analysis is often coupled to specific software packages, which restricts the analysis to a given workflow and precludes a more thorough characterization of the data by other complementary tools. To address this, we have developed PyQuant, a cross-platform mass spectrometry data quantification application that is compatible with existing frameworks and can be used as a stand-alone quantification tool. PyQuant supports most types of quantitative mass spectrometry data including SILAC, NeuCode, (15)N, (13)C, or (18)O and chemical methods such as iTRAQ or TMT, and provides the option of adding custom labeling strategies. In addition, PyQuant can perform specialized analyses such as quantifying isotopically labeled samples where the label has been metabolized into other amino acids and targeted quantification of selected ions independent of spectral assignment. PyQuant is capable of quantifying search results from popular proteomic frameworks such as MaxQuant, Proteome Discoverer, and the Trans-Proteomic Pipeline, in addition to several stand-alone search engines. We have found that PyQuant routinely quantifies a greater proportion of spectral assignments, with increases ranging from 25% to 45% in this study. Finally, PyQuant is capable of complementing spectral assignments between replicates to quantify ions missed because of lack of MS/MS fragmentation or omitted because of issues such as spectral quality or false discovery rates. This results in an increase of biologically useful data available for interpretation. In summary, PyQuant is a flexible mass spectrometry data quantification platform that is capable of interfacing with a variety of existing formats and is highly customizable, which permits easy configuration for custom analysis.
© 2016 by The American Society for Biochemistry and Molecular Biology, Inc.

  15. Study of fault tolerant software technology for dynamic systems

    NASA Technical Reports Server (NTRS)

    Caglayan, A. K.; Zacharias, G. L.

    1985-01-01

    The major aim of this study is to investigate the feasibility of using systems-based failure detection, isolation, and compensation (FDIC) techniques in building fault-tolerant software and extending them, whenever possible, to the domain of software fault tolerance. First, it is shown that systems-based FDIC methods can be extended to develop software error detection techniques by using system models for software modules. In particular, it is demonstrated that systems-based FDIC techniques can yield consistency checks that are easier to implement than acceptance tests based on software specifications. Next, it is shown that systems-based failure compensation techniques can be generalized to the domain of software fault tolerance in developing software error recovery procedures. Third, the feasibility of using fault-tolerant software in flight software is investigated; in particular, possible system and version instabilities and functional performance degradation that may occur in N-Version programming applications to flight software are illustrated. Finally, a comparative analysis of N-Version and recovery block techniques in the context of generic blocks in flight software is presented.
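The N-Version programming scheme discussed above can be sketched as a majority vote over independently written versions, followed by a systems-style consistency check on the voted result; the toy versions below are stand-ins:

```python
# N-version programming in miniature: run independently developed versions
# of the same routine, vote on the result, then apply a cheap consistency
# check of the kind the study advocates. version_c carries a seeded fault.

from collections import Counter

def vote(results):
    """Majority vote over version outputs; raise if no majority exists."""
    value, count = Counter(results).most_common(1)[0]
    if count <= len(results) // 2:
        raise RuntimeError("no majority: versions disagree")
    return value

version_a = lambda x: x ** 0.5
version_b = lambda x: x ** 0.5
version_c = lambda x: x * 0.5          # faulty version

def fault_tolerant_sqrt(x):
    result = vote([round(v(x), 9) for v in (version_a, version_b, version_c)])
    # consistency check on the output itself: result^2 should recover x
    assert abs(result * result - x) < 1e-6 * max(1.0, x)
    return result

print(fault_tolerant_sqrt(9.0))   # the faulty version is outvoted
```

Note how the consistency check is a property of the output (result² ≈ x), which is often easier to state than a full acceptance test derived from the specification, mirroring the paper's argument.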

  16. 28 CFR 100.18 - Audit.

    Code of Federal Regulations, 2012 CFR

    2012-07-01

    ... such as disk, tape) or type (e.g., data bases, applications software, data base management software...., data bases, applications software, data base management software, utilities), sufficient to reflect... timeliness of the cost data, the FBI or other representatives of the Government shall have the right to...
