A multicenter study benchmarks software tools for label-free proteome quantification.
Navarro, Pedro; Kuharev, Jörg; Gillet, Ludovic C; Bernhardt, Oliver M; MacLean, Brendan; Röst, Hannes L; Tate, Stephen A; Tsou, Chih-Chiang; Reiter, Lukas; Distler, Ute; Rosenberger, George; Perez-Riverol, Yasset; Nesvizhskii, Alexey I; Aebersold, Ruedi; Tenzer, Stefan
2016-11-01
Consistent and accurate quantification of proteins by mass spectrometry (MS)-based proteomics depends on the performance of instruments, acquisition methods and data analysis software. In collaboration with the software developers, we evaluated OpenSWATH, SWATH 2.0, Skyline, Spectronaut and DIA-Umpire, five of the most widely used software methods for processing data from sequential window acquisition of all theoretical fragment-ion spectra (SWATH)-MS, which uses data-independent acquisition (DIA) for label-free protein quantification. We analyzed high-complexity test data sets from hybrid proteome samples of defined quantitative composition acquired on two different MS instruments using different SWATH isolation-window setups. For consistent evaluation, we developed LFQbench, an R package, to calculate metrics of precision and accuracy in label-free quantitative MS and report the identification performance, robustness and specificity of each software tool. Our reference data sets enabled developers to improve their software tools. After optimization, all tools provided highly convergent identification and reliable quantification performance, underscoring their robustness for label-free quantitative proteomics.
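A minimal sketch of the kind of accuracy and precision metrics described for hybrid-proteome benchmarks, assuming protein-level intensities from two sample mixtures with known species ratios. LFQbench itself is an R package; the column names, species labels, and expected log-ratios below are illustrative assumptions, not its actual interface.

```python
# Illustrative only: accuracy (bias of observed vs. expected log-ratios per species)
# and precision (dispersion of those ratios) for a hybrid-proteome benchmark.
import numpy as np
import pandas as pd

# Assumed expected log2(A/B) mixing ratios per species (hypothetical design).
EXPECTED_LOG2 = {"HUMAN": 0.0, "YEAST": 1.0, "ECOLI": -2.0}

def lfq_metrics(df: pd.DataFrame) -> pd.DataFrame:
    """df needs columns: 'species', 'intensity_A', 'intensity_B' (protein-level)."""
    valid = (df["intensity_A"] > 0) & (df["intensity_B"] > 0)
    d = df[valid].copy()
    d["log2_ratio"] = np.log2(d["intensity_A"] / d["intensity_B"])
    rows = []
    for species, grp in d.groupby("species"):
        expected = EXPECTED_LOG2.get(species)
        if expected is None:
            continue
        rows.append({
            "species": species,
            "n_proteins": len(grp),
            # Accuracy: median deviation from the known mixing ratio.
            "median_bias": grp["log2_ratio"].median() - expected,
            # Precision: spread of the observed ratios.
            "ratio_sd": grp["log2_ratio"].std(ddof=1),
        })
    return pd.DataFrame(rows)

# Toy example:
toy = pd.DataFrame({
    "species": ["HUMAN", "HUMAN", "YEAST", "ECOLI"],
    "intensity_A": [1e6, 2e6, 4e5, 1e5],
    "intensity_B": [1.1e6, 1.9e6, 2e5, 4.2e5],
})
print(lfq_metrics(toy))
```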
Li, Zhucui; Lu, Yan; Guo, Yufeng; Cao, Haijie; Wang, Qinhong; Shui, Wenqing
2018-10-31
Data analysis represents a key challenge for untargeted metabolomics studies, as it commonly requires extensive processing of thousands of metabolite peaks contained in raw high-resolution MS data. Although a number of software packages have been developed to facilitate untargeted data processing, they have not been comprehensively scrutinized for their capability in feature detection, quantification and marker selection using a well-defined benchmark sample set. In this study, we acquired a benchmark dataset from standard mixtures consisting of 1100 compounds with specified concentration ratios, including 130 compounds with significant variation of concentrations. The five software packages evaluated here (MS-Dial, MZmine 2, XCMS, MarkerView, and Compound Discoverer) showed similar performance in detection of true features derived from compounds in the mixtures. However, significant differences between untargeted metabolomics software packages were observed in relative quantification of true features in the benchmark dataset. MZmine 2 outperformed the other software in terms of quantification accuracy and reported the most true discriminating markers together with the fewest false markers. Furthermore, we assessed the selection of discriminating markers by the different software packages using both the benchmark dataset and a real-case metabolomics dataset, and propose the combined use of two packages to increase confidence in biomarker identification. Our findings from this comprehensive evaluation of untargeted metabolomics software should help guide future improvements of these widely used bioinformatics tools and enable users to properly interpret their metabolomics results. Copyright © 2018 Elsevier B.V. All rights reserved.
Compositional Solution Space Quantification for Probabilistic Software Analysis
NASA Technical Reports Server (NTRS)
Borges, Mateus; Pasareanu, Corina S.; Filieri, Antonio; d'Amorim, Marcelo; Visser, Willem
2014-01-01
Probabilistic software analysis aims at quantifying how likely a target event is to occur during program execution. Current approaches rely on symbolic execution to identify the conditions to reach the target event and try to quantify the fraction of the input domain satisfying these conditions. Precise quantification is usually limited to linear constraints, while only approximate solutions can be provided in general through statistical approaches. However, statistical approaches may fail to converge to an acceptable accuracy within a reasonable time. We present a compositional statistical approach for the efficient quantification of solution spaces for arbitrarily complex constraints over bounded floating-point domains. The approach leverages interval constraint propagation to improve the accuracy of the estimation by focusing the sampling on the regions of the input domain containing the sought solutions. Preliminary experiments show significant improvement on previous approaches both in results accuracy and analysis time.
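As a baseline for intuition, the sketch below estimates the fraction of a bounded floating-point input domain that satisfies a path condition by plain Monte Carlo sampling. This is only an illustration of the quantity being estimated; the paper's approach additionally uses interval constraint propagation and compositional, focused sampling, which are not reproduced here. The path condition and domain bounds are invented.

```python
# Plain Monte Carlo estimate of the probability that a path condition holds
# over a bounded input domain (illustration only, not the paper's algorithm).
import random

def satisfies(x: float, y: float) -> bool:
    # Hypothetical path condition reaching the target event.
    return x * x + y * y < 1.0 and x > 0.2

def estimate_fraction(n_samples: int = 100_000, seed: int = 0) -> float:
    rng = random.Random(seed)
    lo, hi = -2.0, 2.0          # assumed bounded domain for both inputs
    hits = 0
    for _ in range(n_samples):
        x = rng.uniform(lo, hi)
        y = rng.uniform(lo, hi)
        hits += satisfies(x, y)
    return hits / n_samples      # estimated probability of the target event

if __name__ == "__main__":
    print(f"estimated fraction: {estimate_fraction():.4f}")
```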
The Effect of AOP on Software Engineering, with Particular Attention to OIF and Event Quantification
NASA Technical Reports Server (NTRS)
Havelund, Klaus; Filman, Robert; Korsmeyer, David (Technical Monitor)
2003-01-01
We consider the impact of Aspect-Oriented Programming on Software Engineering, and, in particular, analyze two AOP systems, one of which does component wrapping and the other, quantification over events, for their software engineering effects.
Designing Control System Application Software for Change
NASA Technical Reports Server (NTRS)
Boulanger, Richard
2001-01-01
The Unified Modeling Language (UML) was used to design the Environmental Systems Test Stand (ESTS) control system software. The UML was chosen for its ability to facilitate a clear dialog between software designer and customer, from which requirements are discovered and documented in a manner that transposes directly to program objects. Applying the UML to control system software design has resulted in a baseline set of documents from which change, and the effort of that change, can be accurately measured. As the Environmental Systems Test Stand evolves, accurate estimates of the time and effort required to change the control system software will be made. Accurate quantification of the cost of a software change can thus be made before implementation, improving schedule and budget accuracy.
The Infeasibility of Experimental Quantification of Life-Critical Software Reliability
NASA Technical Reports Server (NTRS)
Butler, Ricky W.; Finelli, George B.
1991-01-01
This paper affirms that quantification of life-critical software reliability is infeasible using statistical methods, whether applied to standard software or fault-tolerant software. The key assumption of software fault tolerance, that separately programmed versions fail independently, is shown to be problematic. This assumption cannot be justified by experimentation in the ultra-reliability region, and subjective arguments in its favor are not sufficiently strong to justify it as an axiom. Also, the implications of the recent multi-version software experiments support this affirmation.
Bokhart, Mark T; Nazari, Milad; Garrard, Kenneth P; Muddiman, David C
2018-01-01
A major update to the mass spectrometry imaging (MSI) software MSiReader is presented, offering a multitude of newly added features critical to MSI analyses. MSiReader is a free, open-source, and vendor-neutral software written in the MATLAB platform and is capable of analyzing most common MSI data formats. A standalone version of the software, which does not require a MATLAB license, is also distributed. The newly incorporated data analysis features expand the utility of MSiReader beyond simple visualization of molecular distributions. The MSiQuantification tool allows researchers to calculate absolute concentrations from quantification MSI experiments exclusively through MSiReader software, significantly reducing data analysis time. An image overlay feature allows complementary imaging modalities to be displayed together with the MSI data. A polarity filter has also been incorporated into the data loading step, allowing the facile analysis of polarity-switching experiments without the need for data parsing prior to loading the data file into MSiReader. A quality assurance feature to generate a mass measurement accuracy (MMA) heatmap for an analyte of interest has also been added to allow for the investigation of MMA across the imaging experiment. Most importantly, as new features have been added, performance has not degraded; in fact, it has been dramatically improved. These new tools and the improvements to the performance in MSiReader v1.0 enable the MSI community to evaluate their data in greater depth and in less time.
Nasso, Sara; Goetze, Sandra; Martens, Lennart
2015-09-04
Selected reaction monitoring (SRM) MS is a highly selective and sensitive technique to quantify protein abundances in complex biological samples. To enhance the pace of SRM large studies, a validated, robust method to fully automate absolute quantification and to substitute for interactive evaluation would be valuable. To address this demand, we present Ariadne, a Matlab software. To quantify monitored targets, Ariadne exploits metadata imported from the transition lists, and targets can be filtered according to mProphet output. Signal processing and statistical learning approaches are combined to compute peptide quantifications. To robustly estimate absolute abundances, the external calibration curve method is applied, ensuring linearity over the measured dynamic range. Ariadne was benchmarked against mProphet and Skyline by comparing its quantification performance on three different dilution series, featuring either noisy/smooth traces without background or smooth traces with complex background. Results, evaluated as efficiency, linearity, accuracy, and precision of quantification, showed that Ariadne's performance is independent of data smoothness and complex background presence and that Ariadne outperforms mProphet on the noisier data set and improved 2-fold Skyline's accuracy and precision for the lowest abundant dilution with complex background. Remarkably, Ariadne could statistically distinguish from each other all different abundances, discriminating dilutions as low as 0.1 and 0.2 fmol. These results suggest that Ariadne offers reliable and automated analysis of large-scale SRM differential expression studies.
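A small sketch of the external-calibration-curve idea used for absolute quantification: fit a line through known dilution points and invert it to estimate the absolute amount of an unknown sample. The dilution amounts and peak areas below are invented; Ariadne is a Matlab tool and its exact fitting procedure is not reproduced here.

```python
# Sketch: external calibration curve (known fmol vs. measured SRM peak area),
# then inverse prediction of an unknown amount. Values are hypothetical.
import numpy as np

amounts = np.array([0.1, 0.2, 0.5, 1.0, 5.0, 10.0])      # injected amount (fmol)
areas   = np.array([1.1e3, 2.3e3, 5.4e3, 1.05e4, 5.2e4, 1.01e5])  # peak areas

# Ordinary least-squares line through the calibration points.
slope, intercept = np.polyfit(amounts, areas, deg=1)

def absolute_amount(measured_area: float) -> float:
    """Invert the calibration curve: peak area -> fmol."""
    return (measured_area - intercept) / slope

print(f"R^2 = {np.corrcoef(amounts, areas)[0, 1] ** 2:.4f}")
print(f"unknown with area 3.0e3 -> {absolute_amount(3.0e3):.2f} fmol")
```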
Deng, Ning; Li, Zhenye; Pan, Chao; Duan, Huilong
2015-01-01
The study of complex proteomes places higher demands on quantification methods based on mass spectrometry. In this paper, we present a mass spectrometry label-free quantification tool for complex proteomes, called freeQuant, which effectively integrates quantification with functional analysis. freeQuant consists of two well-integrated modules: label-free quantification and functional analysis with biomedical knowledge. freeQuant supports label-free quantitative analysis that makes full use of tandem mass spectrometry (MS/MS) spectral counts, protein sequence length, shared peptides, and ion intensity. It adopts spectral counting for quantitative analysis and introduces a new method for shared peptides to accurately evaluate the abundance of isoforms. For proteins with low abundance, the MS/MS total ion count coupled with the spectral count is included to ensure accurate protein quantification. Furthermore, freeQuant supports large-scale functional annotation for complex proteomes. Mitochondrial proteomes from the mouse heart, the mouse liver, and the human heart were used to evaluate the usability and performance of freeQuant. The evaluation showed that the quantitative algorithms implemented in freeQuant can improve the accuracy of quantification with a better dynamic range.
Gallo-Oller, Gabriel; Ordoñez, Raquel; Dotor, Javier
2018-06-01
Since its first description, Western blot has been widely used in molecular biology labs. It constitutes a multistep method that allows the detection and/or quantification of proteins from simple to complex protein mixtures. The quantification step of Western blot is critical for obtaining accurate and reproducible results. Because of the technical knowledge required for densitometry analysis, together with limited resource availability, standard office scanners are often used for image acquisition of developed Western blot films. Furthermore, the use of semi-quantitative software such as ImageJ (Java-based image-processing and analysis software) is clearly increasing in different scientific fields. In this work, we describe the use of an office scanner coupled with the ImageJ software, together with a new image background subtraction method, for accurate Western blot quantification. The proposed method represents an affordable, accurate and reproducible approach that can be used when resources are limited. Copyright © 2018 Elsevier B.V. All rights reserved.
Santos, Hugo M; Reboiro-Jato, Miguel; Glez-Peña, Daniel; Nunes-Miranda, J D; Fdez-Riverola, Florentino; Carvallo, R; Capelo, J L
2010-09-15
The decision peptide-driven (DPD) tool implements a software application for assisting the user in a protocol for accurate protein quantification based on the following steps: (1) protein separation through gel electrophoresis; (2) in-gel protein digestion; (3) direct and inverse 18O-labeling and (4) matrix-assisted laser desorption ionization time-of-flight (MALDI-TOF) mass spectrometry analysis. The DPD software compares the MALDI results of the direct and inverse 18O-labeling experiments and quickly identifies those peptides with paralleled losses in different sets of a typical proteomic workflow. Those peptides are used for subsequent accurate protein quantification. The interpretation of the MALDI data from direct and inverse labeling experiments is time-consuming, requiring a significant amount of time to do all comparisons manually. The DPD software shortens and simplifies the search for the peptides that must be used for quantification from a week to just a few minutes. To do so, it takes as input several MALDI spectra and aids the researcher in an automatic mode (i) to compare data from direct and inverse 18O-labeling experiments, calculating the corresponding ratios to determine those peptides with paralleled losses throughout different sets of experiments; and (ii) to use those peptides as internal standards for subsequent accurate protein quantification using 18O-labeling. In this work the DPD software is presented and explained with the quantification of the protein carbonic anhydrase. Copyright (c) 2010 Elsevier B.V. All rights reserved.
The role of PET quantification in cardiovascular imaging.
Slomka, Piotr; Berman, Daniel S; Alexanderson, Erick; Germano, Guido
2014-08-01
Positron Emission Tomography (PET) has several clinical and research applications in cardiovascular imaging. Myocardial perfusion imaging with PET allows accurate global and regional measurements of myocardial perfusion, myocardial blood flow and function at stress and rest in one exam. Simultaneous assessment of function and perfusion by PET with quantitative software is currently the routine practice. Combination of ejection fraction reserve with perfusion information may improve the identification of severe disease. Myocardial viability can be estimated by quantitative comparison of fluorodeoxyglucose (18F-FDG) and rest perfusion imaging. Myocardial blood flow and coronary flow reserve measurements are becoming routinely included in the clinical assessment due to the enhanced dynamic imaging capabilities of the latest PET/CT scanners. Absolute flow measurements allow evaluation of coronary microvascular dysfunction and provide additional prognostic and diagnostic information for coronary disease. Standard quantitative approaches to compute myocardial blood flow from kinetic PET data in an automated and rapid fashion have been developed for 13N-ammonia, 15O-water and 82Rb radiotracers. The agreement between software methods available for such analysis is excellent. Relative quantification of 82Rb PET myocardial perfusion, based on comparisons to normal databases, demonstrates high performance for the detection of obstructive coronary disease. New tracers, such as 18F-flurpiridaz, may allow further improvements in disease detection. Computerized analysis of perfusion at stress and rest reduces the variability of the assessment as compared to visual analysis. PET quantification can be enhanced by precise coregistration with CT angiography. In emerging clinical applications, the potential to identify vulnerable plaques by quantification of atherosclerotic plaque uptake of 18F-FDG and 18F-sodium fluoride tracers in the carotids, aorta and coronary arteries has been demonstrated.
CASL Dakota Capabilities Summary
DOE Office of Scientific and Technical Information (OSTI.GOV)
Adams, Brian M.; Simmons, Chris; Williams, Brian J.
2017-10-10
The Dakota software project serves the mission of Sandia National Laboratories and supports a worldwide user community by delivering state-of-the-art research and robust, usable software for optimization and uncertainty quantification. These capabilities enable advanced exploration and risk-informed prediction with a wide range of computational science and engineering models. Dakota is the verification and validation (V&V) / uncertainty quantification (UQ) software delivery vehicle for CASL, allowing analysts across focus areas to apply these capabilities to myriad nuclear engineering analyses.
Provost, Karine; Leblond, Antoine; Gauthier-Lemire, Annie; Filion, Édith; Bahig, Houda; Lord, Martin
2017-09-01
Planar perfusion scintigraphy with 99mTc-labeled macroaggregated albumin is often used for pretherapy quantification of regional lung perfusion in lung cancer patients, particularly those with poor respiratory function. However, subdividing lung parenchyma into rectangular regions of interest, as done on planar images, is a poor reflection of true lobar anatomy. New tridimensional methods using SPECT and SPECT/CT have been introduced, including semiautomatic lung segmentation software. The present study evaluated inter- and intraobserver agreement on quantification using SPECT/CT software and compared the results for regional lung contribution obtained with SPECT/CT and planar scintigraphy. Methods: Thirty lung cancer patients underwent ventilation-perfusion scintigraphy with 99mTc-macroaggregated albumin and 99mTc-Technegas. The regional lung contribution to perfusion and ventilation was measured on both planar scintigraphy and SPECT/CT using semiautomatic lung segmentation software by 2 observers. Interobserver and intraobserver agreement for the SPECT/CT software was assessed using the intraclass correlation coefficient, Bland-Altman plots, and absolute differences in measurements. Measurements from planar and tridimensional methods were compared using the paired-sample t test and mean absolute differences. Results: Intraclass correlation coefficients were in the excellent range (above 0.9) for both interobserver and intraobserver agreement using the SPECT/CT software. Bland-Altman analyses showed very narrow limits of agreement. Absolute differences were below 2.0% in 96% of both interobserver and intraobserver measurements. There was a statistically significant difference between planar and SPECT/CT methods (P < 0.001) for quantification of perfusion and ventilation for all right lung lobes, with a maximal mean absolute difference of 20.7% for the right middle lobe. There was no statistically significant difference in quantification of perfusion and ventilation for the left lung lobes using either method; however, absolute differences reached 12.0%. The total right and left lung contributions were similar for the two methods, with a mean difference of 1.2% for perfusion and 2.0% for ventilation. Conclusion: Quantification of regional lung perfusion and ventilation using SPECT/CT-based lung segmentation software is highly reproducible. This tridimensional method yields statistically significant differences in measurements for right lung lobes when compared with planar scintigraphy. We recommend that SPECT/CT-based quantification be used for all lung cancer patients undergoing pretherapy evaluation of regional lung function. © 2017 by the Society of Nuclear Medicine and Molecular Imaging.
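For readers unfamiliar with the agreement statistics cited above, the sketch below computes a Bland-Altman mean bias and 95% limits of agreement (bias ± 1.96 SD of the paired differences) for repeated measurements. The lobar-contribution values are invented for illustration.

```python
# Minimal Bland-Altman analysis: mean bias and 95% limits of agreement.
import numpy as np

def bland_altman(measure_1, measure_2):
    a, b = np.asarray(measure_1, float), np.asarray(measure_2, float)
    diff = a - b
    bias = diff.mean()
    sd = diff.std(ddof=1)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# Hypothetical lobar perfusion contributions (%) measured twice by one observer.
obs_run1 = [18.2, 9.5, 27.3, 21.0, 24.0]
obs_run2 = [18.9, 9.1, 26.8, 21.4, 23.6]
bias, loa = bland_altman(obs_run1, obs_run2)
print(f"bias = {bias:+.2f}%, 95% LoA = ({loa[0]:.2f}%, {loa[1]:.2f}%)")
```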
Automated quantification of neuronal networks and single-cell calcium dynamics using calcium imaging
Patel, Tapan P.; Man, Karen; Firestein, Bonnie L.; Meaney, David F.
2017-01-01
Background: Recent advances in genetically engineered calcium and membrane potential indicators provide the potential to estimate the activation dynamics of individual neurons within larger, mesoscale networks (100s–1000+ neurons). However, a fully integrated, automated workflow for the analysis and visualization of neural microcircuits from high-speed fluorescence imaging data is lacking. New method: Here we introduce FluoroSNNAP, the Fluorescence Single Neuron and Network Analysis Package. FluoroSNNAP is an open-source, interactive software developed in MATLAB for automated quantification of numerous biologically relevant features of both the calcium dynamics of single cells and network activity patterns. FluoroSNNAP integrates and improves upon existing tools for spike detection, synchronization analysis, and inference of functional connectivity, making it most useful to experimentalists with little or no programming knowledge. Results: We apply FluoroSNNAP to characterize the activity patterns of neuronal microcircuits undergoing developmental maturation in vitro. Separately, we highlight the utility of single-cell analysis for phenotyping a mixed population of neurons expressing a human mutant variant of the microtubule-associated protein tau and wild-type tau. Comparison with existing method(s): We show the performance of semi-automated cell segmentation using spatiotemporal independent component analysis and significant improvement in detecting calcium transients using a template-based algorithm in comparison to peak-based or wavelet-based detection methods. Our software further enables automated analysis of microcircuits, which is an improvement over existing methods. Conclusions: We expect the dissemination of this software will facilitate a comprehensive analysis of neuronal networks, promoting the rapid interrogation of circuits in health and disease. PMID:25629800
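A toy illustration of the general idea behind template-based transient detection: slide a normalized template (fast rise, exponential decay) along a ΔF/F trace and flag positions where the correlation exceeds a threshold. The template shape, decay constant, and threshold are assumptions for illustration only; FluoroSNNAP's actual algorithm and defaults are not reproduced here.

```python
# Toy template-matching detector for calcium transients in a ΔF/F trace.
import numpy as np

def detect_transients(trace, template, corr_thresh=0.85):
    n, m = len(trace), len(template)
    t = (template - template.mean()) / template.std()
    hits = []
    for i in range(n - m + 1):
        win = trace[i:i + m]
        if win.std() == 0:
            continue
        w = (win - win.mean()) / win.std()
        if np.dot(w, t) / m > corr_thresh:   # Pearson correlation with template
            hits.append(i)
    return hits

# Synthetic ΔF/F trace with one transient inserted at frame 50.
rng = np.random.default_rng(1)
trace = rng.normal(0, 0.02, 200)
template = np.exp(-np.arange(20) / 5.0)      # instantaneous rise, tau = 5 frames
trace[50:70] += 0.5 * template
print(detect_transients(trace, template)[:5])
```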
Brunner, J; Krummenauer, F; Lehr, H A
2000-04-01
Study end-points in microcirculation research are usually video-taped images rather than numeric computer print-outs. Analysis of these video-taped images for the quantification of microcirculatory parameters usually requires computer-based image analysis systems. Most software programs for image analysis are custom-made, expensive, and limited in their applicability to selected parameters and study end-points. We demonstrate herein that an inexpensive, commercially available computer software (Adobe Photoshop), run on a Macintosh G3 computer with inbuilt graphic capture board provides versatile, easy to use tools for the quantification of digitized video images. Using images obtained by intravital fluorescence microscopy from the pre- and postischemic muscle microcirculation in the skinfold chamber model in hamsters, Photoshop allows simple and rapid quantification (i) of microvessel diameters, (ii) of the functional capillary density and (iii) of postischemic leakage of FITC-labeled high molecular weight dextran from postcapillary venules. We present evidence of the technical accuracy of the software tools and of a high degree of interobserver reliability. Inexpensive commercially available imaging programs (i.e., Adobe Photoshop) provide versatile tools for image analysis with a wide range of potential applications in microcirculation research.
Automated lobar quantification of emphysema in patients with severe COPD.
Revel, Marie-Pierre; Faivre, Jean-Baptiste; Remy-Jardin, Martine; Deken, Valérie; Duhamel, Alain; Marquette, Charles-Hugo; Tacelli, Nunzia; Bakai, Anne-Marie; Remy, Jacques
2008-12-01
Automated lobar quantification of emphysema has not yet been evaluated. Unenhanced 64-slice MDCT was performed in 47 patients evaluated before bronchoscopic lung-volume reduction. CT images reconstructed with a standard (B20) and high-frequency (B50) kernel were analyzed using a dedicated prototype software (MevisPULMO) allowing lobar quantification of emphysema extent. Lobar quantification was obtained following (a) a fully automatic delineation of the lobar limits by the software and (b) a semiautomatic delineation with manual correction of the lobar limits when necessary and was compared with the visual scoring of emphysema severity per lobe. No statistically significant difference existed between automated and semiautomated lobar quantification (p > 0.05 in the five lobes), with differences ranging from 0.4 to 3.9%. The agreement between the two methods (intraclass correlation coefficient, ICC) was excellent for left upper lobe (ICC = 0.94), left lower lobe (ICC = 0.98), and right lower lobe (ICC = 0.80). The agreement was good for right upper lobe (ICC = 0.68) and moderate for middle lobe (IC = 0.53). The Bland and Altman plots confirmed these results. A good agreement was observed between the software and visually assessed lobar predominance of emphysema (kappa 0.78; 95% CI 0.64-0.92). Automated and semiautomated lobar quantifications of emphysema are concordant and show good agreement with visual scoring.
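For context, emphysema extent on CT is commonly quantified with a density mask: the emphysema index is the percentage of lung voxels below an attenuation threshold (−950 HU is widely used for inspiratory CT). The threshold and the toy voxel data below are assumptions; the abstract does not state the exact settings used by MevisPULMO.

```python
# Density-mask emphysema index: % of lobe voxels below a HU threshold.
import numpy as np

def emphysema_index(hu_values, threshold: float = -950.0) -> float:
    """hu_values: 1-D array of HU values for the voxels of one lobe."""
    hu = np.asarray(hu_values, float)
    return 100.0 * np.count_nonzero(hu < threshold) / hu.size

# Toy lobe: mostly normal lung around -860 HU with an emphysematous tail.
rng = np.random.default_rng(0)
lobe = np.concatenate([rng.normal(-860, 40, 9000), rng.normal(-970, 15, 1000)])
print(f"EI = {emphysema_index(lobe):.1f}%")
```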
VAVUQ, Python and Matlab freeware for Verification and Validation, Uncertainty Quantification
NASA Astrophysics Data System (ADS)
Courtney, J. E.; Zamani, K.; Bombardelli, F. A.; Fleenor, W. E.
2015-12-01
A package of scripts is presented for automated Verification and Validation (V&V) and Uncertainty Quantification (UQ) for engineering codes that approximate partial differential equations (PDEs). The code post-processes model results to produce V&V and UQ information. This information can be used to assess model performance. Automated information on code performance can allow for a systematic methodology to assess the quality of model approximations. The software implements common and accepted code verification schemes. The software uses the Method of Manufactured Solutions (MMS), the Method of Exact Solution (MES), Cross-Code Verification, and Richardson Extrapolation (RE) for solution (calculation) verification. It also includes common statistical measures that can be used for model skill assessment. Complete RE can be conducted for complex geometries by implementing high-order non-oscillating numerical interpolation schemes within the software. Model approximation uncertainty is quantified by calculating lower and upper bounds of numerical error from the RE results. The software is also able to calculate the Grid Convergence Index (GCI), and to handle adaptive meshes and models that implement mixed-order schemes. Four examples are provided to demonstrate the use of the software for code and solution verification, model validation and uncertainty quantification. The software is used for code verification of a mixed-order compact difference heat transport solver; the solution verification of a 2D shallow-water-wave solver for tidal flow modeling in estuaries; the model validation of a two-phase flow computation in a hydraulic jump compared to experimental data; and numerical uncertainty quantification for 3D CFD modeling of the flow patterns in a Gust erosion chamber.
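A short sketch of the grid-convergence machinery mentioned above: estimate the observed order of accuracy from three systematically refined grids (constant refinement ratio r), Richardson-extrapolate the solution, and compute a GCI. The formulas follow the standard Roache definitions; the three solution values and the safety factor are illustrative assumptions, not output of VAVUQ.

```python
# Richardson extrapolation and Grid Convergence Index from three grid solutions.
import math

def grid_convergence(f_fine, f_med, f_coarse, r=2.0, safety=1.25):
    # Observed order of accuracy p from the three solutions.
    p = math.log(abs(f_coarse - f_med) / abs(f_med - f_fine)) / math.log(r)
    # Richardson-extrapolated (approximately grid-independent) estimate.
    f_exact = f_fine + (f_fine - f_med) / (r ** p - 1.0)
    # Grid Convergence Index on the fine grid, as a relative percentage.
    gci_fine = safety * abs((f_fine - f_med) / f_fine) / (r ** p - 1.0) * 100.0
    return p, f_exact, gci_fine

p, f_exact, gci = grid_convergence(0.9713, 0.9682, 0.9563, r=2.0)
print(f"observed order p = {p:.2f}, extrapolated value = {f_exact:.4f}, "
      f"GCI_fine = {gci:.2f}%")
```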
Find Pairs: The Module for Protein Quantification of the PeakQuant Software Suite
Eisenacher, Martin; Kohl, Michael; Wiese, Sebastian; Hebeler, Romano; Meyer, Helmut E.
2012-01-01
Accurate quantification of proteins is one of the major tasks in current proteomics research. To address this issue, a wide range of stable isotope labeling techniques have been developed, allowing one to quantitatively study thousands of proteins by means of mass spectrometry. In this article, the FindPairs module of the PeakQuant software suite is detailed. It facilitates the automatic determination of protein abundance ratios based on the automated analysis of stable isotope-coded mass spectrometric data. Furthermore, it implements statistical methods to determine outliers due to biological as well as technical variance of proteome data obtained in replicate experiments. This provides an important means to evaluate the significance in obtained protein expression data. For demonstrating the high applicability of FindPairs, we focused on the quantitative analysis of proteome data acquired in 14N/15N labeling experiments. We further provide a comprehensive overview of the features of the FindPairs software, and compare these with existing quantification packages. The software presented here supports a wide range of proteomics applications, allowing one to quantitatively assess data derived from different stable isotope labeling approaches, such as 14N/15N labeling, SILAC, and iTRAQ. The software is publicly available at http://www.medizinisches-proteom-center.de/software and free for academic use. PMID:22909347
Lim, Hyun-ju; Weinheimer, Oliver; Wielpütz, Mark O.; Dinkel, Julien; Hielscher, Thomas; Gompelmann, Daniela; Kauczor, Hans-Ulrich; Heussel, Claus Peter
2016-01-01
Objectives: Surgical or bronchoscopic lung volume reduction (BLVR) techniques can be beneficial for heterogeneous emphysema. Post-processing software tools for lobar emphysema quantification are useful for patient and target lobe selection, treatment planning and post-interventional follow-up. We aimed to evaluate the inter-software variability of emphysema quantification using fully automated lobar segmentation prototypes. Material and Methods: 66 patients with moderate to severe COPD who underwent CT for planning of BLVR were included. Emphysema quantification was performed using 2 modified versions of in-house software (without and with prototype advanced lung vessel segmentation; programs 1 [YACTA v.2.3.0.2] and 2 [YACTA v.2.4.3.1]), as well as 1 commercial program 3 [Pulmo3D VA30A_HF2] and 1 pre-commercial prototype 4 [CT COPD ISP ver7.0]. The following parameters were computed for each segmented anatomical lung lobe and the whole lung: lobar volume (LV), mean lobar density (MLD), 15th percentile of lobar density (15th), emphysema volume (EV) and emphysema index (EI). Bland-Altman analysis (limits of agreement, LoA) and linear random effects models were used for comparison between the software. Results: Segmentation using programs 1, 3 and 4 was unsuccessful in 1 (1%), 7 (10%) and 5 (7%) patients, respectively. Program 2 could analyze all datasets. The 53 patients with successful segmentation by all 4 programs were included for further analysis. For LV, programs 1 and 4 showed the largest mean difference of 72 ml and the widest LoA of [-356, 499 ml] (p<0.05). Programs 3 and 4 showed the largest mean difference of 4% and the widest LoA of [-7, 14%] for EI (p<0.001). Conclusions: Only a single software program was able to successfully analyze all scheduled datasets. Although the mean bias of LV and EV was relatively low in lobar quantification, the ranges of disagreement were substantial for both. For longitudinal emphysema monitoring, not only the scanning protocol but also the quantification software needs to be kept constant. PMID:27029047
2017-01-01
Unique Molecular Identifiers (UMIs) are random oligonucleotide barcodes that are increasingly used in high-throughput sequencing experiments. Through a UMI, identical copies arising from distinct molecules can be distinguished from those arising through PCR amplification of the same molecule. However, bioinformatic methods to leverage the information from UMIs have yet to be formalized. In particular, sequencing errors in the UMI sequence are often ignored or else resolved in an ad hoc manner. We show that errors in the UMI sequence are common and introduce network-based methods to account for these errors when identifying PCR duplicates. Using these methods, we demonstrate improved quantification accuracy both under simulated conditions and real iCLIP and single-cell RNA-seq data sets. Reproducibility between iCLIP replicates and single-cell RNA-seq clustering are both improved using our proposed network-based method, demonstrating the value of properly accounting for errors in UMIs. These methods are implemented in the open source UMI-tools software package. PMID:28100584
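A simplified sketch in the spirit of the network-based ("directional") deduplication idea described above: connect UMI A to B when they differ by one substitution and A is sufficiently more abundant than B (count(A) ≥ 2·count(B) − 1), then count connected groups as unique molecules. This is an illustration of the principle, not the UMI-tools implementation, and the read counts are invented.

```python
# Simplified directional-network UMI deduplication (illustration only).
from collections import Counter

def hamming1(a: str, b: str) -> bool:
    return len(a) == len(b) and sum(x != y for x, y in zip(a, b)) == 1

def count_unique_molecules(umi_counts: Counter) -> int:
    umis = sorted(umi_counts, key=umi_counts.get, reverse=True)
    assigned = set()
    groups = 0
    for u in umis:                      # visit most abundant UMIs first
        if u in assigned:
            continue
        groups += 1
        stack, assigned_now = [u], {u}
        assigned.add(u)
        while stack:                    # absorb likely sequencing-error UMIs
            cur = stack.pop()
            for v in umis:
                if v in assigned or not hamming1(cur, v):
                    continue
                if umi_counts[cur] >= 2 * umi_counts[v] - 1:
                    assigned.add(v)
                    stack.append(v)
    return groups

reads = Counter({"ATGC": 100, "ATGA": 2, "TTTT": 50, "ATGG": 1})
print(count_unique_molecules(reads))    # -> 2 inferred true molecules
```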
OPAD-EDIFIS Real-Time Processing
NASA Technical Reports Server (NTRS)
Katsinis, Constantine
1997-01-01
The Optical Plume Anomaly Detection (OPAD) system detects engine hardware degradation in flight vehicles through identification and quantification of elemental species found in the plume, by analyzing the plume emission spectra in real time. Real-time performance of OPAD relies on extensive software which must report metal amounts in the plume faster than once every 0.5 sec. The OPAD software previously written by NASA scientists performed most necessary functions at speeds far below what is needed for real-time operation. The research presented in this report improved the execution speed of the software by optimizing the code without changing the algorithms and by converting it into a parallelized form executed on a shared-memory multiprocessor system. The resulting code was subjected to extensive timing analysis. The report also provides suggestions for further performance improvement by (1) identifying areas of algorithm optimization, (2) recommending commercially available multiprocessor architectures and operating systems to support real-time execution, and (3) presenting an initial study of fault-tolerance requirements.
Grasso, Chiara; Trevisan, Morena; Fiano, Valentina; Tarallo, Valentina; De Marco, Laura; Sacerdote, Carlotta; Richiardi, Lorenzo; Merletti, Franco; Gillio-Tos, Anna
2016-01-01
Pyrosequencing has emerged as an alternative method of nucleic acid sequencing, well suited for many applications which aim to characterize single nucleotide polymorphisms, mutations, microbial types and CpG methylation in the target DNA. The commercially available pyrosequencing systems can harbor two different types of software which allow analysis in AQ or CpG mode, respectively, both widely employed for DNA methylation analysis. The aim of the study was to assess the performance of these two modes for DNA methylation analysis at CpG sites. Although CpG mode was specifically designed for CpG methylation quantification, many investigations on this topic have been carried out with AQ mode. As proof of equivalent performance of the two modes for this type of analysis is not available, the focus of this paper was to evaluate whether the two modes currently used for CpG methylation assessment by pyrosequencing give overlapping results. We compared the performance of the two modes in quantifying DNA methylation in the promoters of selected genes (GSTP1, MGMT, LINE-1) by testing two case series, which include DNA from paraffin-embedded prostate cancer tissues (PC study, N = 36) and DNA from blood fractions of healthy people (DD study, N = 28), respectively. We found discrepancies between the two modes in the quality assignment of DNA methylation assays. Compared to the software for analysis in AQ mode, less permissive criteria are applied by the Pyro Q-CpG software, which enables analysis in CpG mode. CpG mode warns the operators about potentially unsatisfactory performance of the assay and ensures a more accurate quantitative evaluation of DNA methylation at CpG sites. The implementation of CpG mode is strongly advisable in order to improve the reliability of the methylation results achievable by pyrosequencing.
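For orientation, both modes ultimately estimate the same per-site quantity after bisulfite conversion: the methylated fraction at a CpG, which can be expressed as C / (C + T) from the pyrogram peak intensities. The peak values and the three-site layout below are invented; neither AQ nor CpG mode's internal processing is reproduced.

```python
# Toy per-CpG methylation percentage from pyrogram peak intensities.
def percent_methylation(c_peak: float, t_peak: float) -> float:
    if c_peak + t_peak == 0:
        raise ValueError("no signal at this CpG site")
    return 100.0 * c_peak / (c_peak + t_peak)

# Hypothetical peak heights (C, T) for three CpG sites of a GSTP1 amplicon.
sites = [(12.4, 88.1), (55.0, 47.2), (3.1, 97.6)]
for i, (c, t) in enumerate(sites, 1):
    print(f"CpG {i}: {percent_methylation(c, t):.1f}% methylated")
```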
Schmerberg, Claire M; Liang, Zhidan; Li, Lingjun
2015-01-21
Food consumption is an important behavior that is regulated by an intricate array of neuropeptides (NPs). Although many feeding-related NPs have been identified in mammals, precise mechanisms are unclear and difficult to study in mammals, as current methods are not highly multiplexed and require extensive a priori knowledge about analytes. New advances in data-independent acquisition (DIA) MS/MS and the open-source quantification software Skyline have opened up the possibility to identify hundreds of compounds and quantify them from a single DIA MS/MS run. An untargeted DIA MS(E) quantification method using Skyline software for multiplexed, discovery-driven quantification was developed and found to produce linear calibration curves for peptides at physiologically relevant concentrations using a protein digest as internal standard. By using this method, preliminary relative quantification of the crab Cancer borealis neuropeptidome (<2 kDa, 137 peptides from 18 families) was possible in microdialysates from 8 replicate feeding experiments. Of these NPs, 55 were detected with an average mass error below 10 ppm. The time-resolved profiles of relative concentration changes for 6 are shown, and there is great potential for the use of this method in future experiments to aid in correlation of NP changes with behavior. This work presents an unbiased approach to winnowing candidate NPs related to a behavior of interest in a functionally relevant manner, and demonstrates the success of such a UPLC-MS(E) quantification method using the open source software Skyline.
Turner, Clare E; Russell, Bruce R; Gant, Nicholas
2015-11-01
Magnetic resonance spectroscopy (MRS) is an analytical procedure that can be used to non-invasively measure the concentration of a range of neural metabolites. Creatine is an important neurometabolite, with dietary supplementation offering therapeutic potential for neurological disorders with dysfunctional energetic processes. Neural creatine concentrations can be probed using proton MRS and quantified using a range of software packages based on different analytical methods. This experiment examines the differences in quantification performance of two commonly used analysis packages following a creatine supplementation strategy with potential therapeutic application. Human participants followed a seven-day dietary supplementation regime in a placebo-controlled, cross-over design interspersed with a five-week wash-out period. Spectroscopy data were acquired the day immediately following supplementation and analyzed with two commonly used software packages which employ vastly different quantification methods. Results demonstrate that neural creatine concentration was augmented following creatine supplementation when analyzed using the peak-fitting method of quantification (105.9% ± 10.1). In contrast, no change in neural creatine levels was detected with supplementation when analysis was conducted using the basis-spectrum method of quantification (102.6% ± 8.6). Results suggest that software packages that employ the peak-fitting procedure for spectral quantification are possibly more sensitive to subtle changes in neural creatine concentrations. The relative simplicity of the spectroscopy sequence and the data analysis procedure suggests that peak-fitting procedures may be the most effective means of metabolite quantification when detection of subtle alterations in neural metabolites is necessary. The straightforward technique can be used on a clinical magnetic resonance imaging system. Copyright © 2015 Elsevier Inc. All rights reserved.
Braun, Martin; Kirsten, Robert; Rupp, Niels J; Moch, Holger; Fend, Falko; Wernert, Nicolas; Kristiansen, Glen; Perner, Sven
2013-05-01
Quantification of protein expression based on immunohistochemistry (IHC) is an important step for translational research and clinical routine. Several manual ('eyeballing') scoring systems are used in order to semi-quantify protein expression based on chromogenic intensities and distribution patterns. However, manual scoring systems are time-consuming and subject to significant intra- and interobserver variability. The aim of our study was to explore whether new image analysis software proves to be sufficient as an alternative tool to quantify protein expression. For IHC experiments, one nucleus-specific marker (i.e., ERG antibody), one cytoplasm-specific marker (i.e., SLC45A3 antibody), and one marker expressed in both compartments (i.e., TMPRSS2 antibody) were chosen. Stainings were applied on TMAs containing tumor material of 630 prostate cancer patients. A pathologist visually quantified all IHC stainings in a blinded manner, applying a four-step scoring system. For digital quantification, image analysis software (Tissue Studio v.2.1, Definiens AG, Munich, Germany) was applied to obtain a continuous spectrum of average staining intensity. For each of the three antibodies we found a strong correlation between the manual protein expression score and the score of the image analysis software. Spearman's rank correlation coefficient was 0.94, 0.92, and 0.90 for ERG, SLC45A3, and TMPRSS2, respectively (p < 0.01). Our data suggest that the image analysis software Tissue Studio is a powerful tool for quantification of protein expression in IHC stainings. Further, since the digital analysis is precise and reproducible, computer-supported protein quantification might help to overcome intra- and interobserver variability and increase the objectivity of IHC-based protein assessment.
Otani, Kyoko; Nakazono, Akemi; Salgo, Ivan S; Lang, Roberto M; Takeuchi, Masaaki
2016-10-01
Echocardiographic determination of left heart chamber volumetric parameters by using manual tracings during multiple beats is tedious in atrial fibrillation (AF). The aim of this study was to determine the usefulness of fully automated left chamber quantification software with single-beat three-dimensional transthoracic echocardiographic data sets in patients with AF. Single-beat full-volume three-dimensional transthoracic echocardiographic data sets were prospectively acquired during consecutive multiple cardiac beats (≥10 beats) in 88 patients with AF. In protocol 1, left ventricular volumes, left ventricular ejection fraction, and maximal left atrial volume were validated using automated quantification against the manual tracing method in identical beats in 10 patients. In protocol 2, automated quantification-derived averaged values from multiple beats were compared with the corresponding values obtained from the indexed beat in all patients. Excellent correlations of left chamber parameters between automated quantification and the manual method were observed (r = 0.88-0.98) in protocol 1. The time required for the analysis with the automated quantification method (5 min) was significantly less compared with the manual method (27 min) (P < .0001). In protocol 2, there were excellent linear correlations between the averaged left chamber parameters and the corresponding values obtained from the indexed beat (r = 0.94-0.99), and test-retest variability of left chamber parameters was low (3.5%-4.8%). Three-dimensional transthoracic echocardiography with fully automated quantification software is a rapid and reliable way to measure averaged values of left heart chamber parameters during multiple consecutive beats. Thus, it is a potential new approach for left chamber quantification in patients with AF in daily routine practice. Copyright © 2016 American Society of Echocardiography. Published by Elsevier Inc. All rights reserved.
Final Report: Quantification of Uncertainty in Extreme Scale Computations (QUEST)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Marzouk, Youssef; Conrad, Patrick; Bigoni, Daniele
QUEST (www.quest-scidac.org) is a SciDAC Institute that is focused on uncertainty quantification (UQ) in large-scale scientific computations. Our goals are to (1) advance the state of the art in UQ mathematics, algorithms, and software; and (2) provide modeling, algorithmic, and general UQ expertise, together with software tools, to other SciDAC projects, thereby enabling and guiding a broad range of UQ activities in their respective contexts. QUEST is a collaboration among six institutions (Sandia National Laboratories, Los Alamos National Laboratory, the University of Southern California, Massachusetts Institute of Technology, the University of Texas at Austin, and Duke University) with a history of joint UQ research. Our vision encompasses all aspects of UQ in leadership-class computing. This includes the well-founded setup of UQ problems; characterization of the input space given available data/information; local and global sensitivity analysis; adaptive dimensionality and order reduction; forward and inverse propagation of uncertainty; handling of application code failures, missing data, and hardware/software fault tolerance; and model inadequacy, comparison, validation, selection, and averaging. The nature of the UQ problem requires the seamless combination of data, models, and information across this landscape in a manner that provides a self-consistent quantification of requisite uncertainties in predictions from computational models. Accordingly, our UQ methods and tools span an interdisciplinary space across applied math, information theory, and statistics. The MIT QUEST effort centers on statistical inference and methods for surrogate or reduced-order modeling. MIT personnel have been responsible for the development of adaptive sampling methods, methods for approximating computationally intensive models, and software for both forward uncertainty propagation and statistical inverse problems. A key software product of the MIT QUEST effort is the MIT Uncertainty Quantification library, called MUQ (muq.mit.edu).
Chae, Soo Young; Suh, Sangil; Ryoo, Inseon; Park, Arim; Noh, Kyoung Jin; Shim, Hackjoon; Seol, Hae Young
2017-05-01
We developed a semi-automated volumetric software, NPerfusion, to segment brain tumors and quantify perfusion parameters on whole-brain CT perfusion (WBCTP) images. The purpose of this study was to assess the feasibility of the software and to validate its performance compared with manual segmentation. Twenty-nine patients with pathologically proven brain tumors who underwent preoperative WBCTP between August 2012 and February 2015 were included. Three perfusion parameters, arterial flow (AF), equivalent blood volume (EBV), and Patlak flow (PF, which is a measure of permeability of capillaries), of brain tumors were generated by a commercial software and then quantified volumetrically by NPerfusion, which also semi-automatically segmented tumor boundaries. The quantification was validated by comparison with that of manual segmentation in terms of the concordance correlation coefficient and Bland-Altman analysis. With NPerfusion, we successfully performed segmentation and quantified whole volumetric perfusion parameters of all 29 brain tumors, which showed perfusion trends consistent with previous studies. The validation of the perfusion parameter quantification exhibited almost perfect agreement with manual segmentation, with Lin concordance correlation coefficients (ρc) for AF, EBV, and PF of 0.9988, 0.9994, and 0.9976, respectively. On Bland-Altman analysis, most differences between this software and manual segmentation on the commercial software were within the limits of agreement. NPerfusion successfully performs segmentation of brain tumors and calculates perfusion parameters of brain tumors. We validated this semi-automated segmentation software by comparing it with manual segmentation. NPerfusion can be used to calculate volumetric perfusion parameters of brain tumors from WBCTP.
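The agreement measure cited above, Lin's concordance correlation coefficient, has a simple closed form; the sketch below computes it for a pair of measurement series. The formula is standard, but the parameter values in the example are invented and are not taken from the study.

```python
# Lin's concordance correlation coefficient (CCC) for two measurement series.
import numpy as np

def lin_ccc(x, y):
    x, y = np.asarray(x, float), np.asarray(y, float)
    mx, my = x.mean(), y.mean()
    vx, vy = x.var(), y.var()                 # population variances
    cov = ((x - mx) * (y - my)).mean()
    return 2.0 * cov / (vx + vy + (mx - my) ** 2)

manual = [32.1, 18.4, 55.0, 40.2, 27.9]       # e.g. a perfusion parameter, manual ROI
auto   = [31.5, 19.0, 54.2, 41.0, 28.3]       # same tumors, semi-automated ROI
print(f"CCC = {lin_ccc(manual, auto):.4f}")
```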
Folks, Russell D; Garcia, Ernest V; Taylor, Andrew T
2007-03-01
Quantitative nuclear renography has numerous potential sources of error. We previously reported the initial development of a computer software module for comprehensively addressing the issue of quality control (QC) in the analysis of radionuclide renal images. The objective of this study was to prospectively test the QC software. The QC software works in conjunction with standard quantitative renal image analysis using a renal quantification program. The software saves a text file that summarizes QC findings as possible errors in user-entered values, calculated values that may be unreliable because of the patient's clinical condition, and problems relating to acquisition or processing. To test the QC software, a technologist not involved in software development processed 83 consecutive nontransplant clinical studies. The QC findings of the software were then tabulated. QC events were defined as technical (study descriptors that were out of range or were entered and then changed, unusually sized or positioned regions of interest, or missing frames in the dynamic image set) or clinical (calculated functional values judged to be erroneous or unreliable). Technical QC events were identified in 36 (43%) of 83 studies. Clinical QC events were identified in 37 (45%) of 83 studies. Specific QC events included starting the camera after the bolus had reached the kidney, dose infiltration, oversubtraction of background activity, and missing frames in the dynamic image set. QC software has been developed to automatically verify user input, monitor calculation of renal functional parameters, summarize QC findings, and flag potentially unreliable values for the nuclear medicine physician. Incorporation of automated QC features into commercial or local renal software can reduce errors and improve technologist performance and should improve the efficiency and accuracy of image interpretation.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Gaschen, Brian Keith; Bloch, Jeffrey Joseph; Porter, Reid
Morphological signatures of bulk SNM materials have significant promise, but these potential signatures are not fully utilized. This document describes software tools, collectively called the MAMA (Morphological Analysis for Material Attribution) software, that can help provide robust and accurate quantification of morphological features in bulk material microscopy images (optical, SEM). Although many of the specific tools are not unique to MAMA, the software package has been designed specifically for nuclear material morphological analysis, and is at a point where it can be easily adapted (by Los Alamos or by collaborators) in response to new, different, or changing forensics needs. The current release of the MAMA software only includes the image quantification, description, and annotation functionality. Only limited information on a sample, its pedigree, and its chemistry is recorded inside this part of the software. This was a decision based on initial feedback and the fact that there are several analytical chemistry databases being developed within the community. Currently MAMA is a standalone program that can export quantification results in a basic text format that can be imported into other programs such as Excel and Access. There is also a basic report-generating feature that produces HTML-formatted pages of the same information. We will be working with collaborators to provide better integration of MAMA into their particular systems, databases and workflows.
NASA Technical Reports Server (NTRS)
Dill, Loren H.; Choo, Yung K. (Technical Monitor)
2004-01-01
Software was developed to construct approximating NURBS curves for iced airfoil geometries. Users specify a tolerance that determines the extent to which the approximating curve follows the rough ice. The user can therefore smooth the ice geometry in a controlled manner, thereby enabling the generation of grids suitable for numerical aerodynamic simulations. Ultimately, this ability to smooth the ice geometry will permit studies of the effects of smoothing upon the aerodynamics of iced airfoils. The software was applied to several different types of iced airfoil data collected in the Icing Research Tunnel at NASA Glenn Research Center, and in all cases was found to efficiently generate suitable approximating NURBS curves. This method is an improvement over the current "control point formulation" of Smaggice (v.1.2). In this report, we present the relevant theory of approximating NURBS curves and discuss typical results of the software.
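To illustrate the idea of a user-controlled tolerance for following a rough outline, the sketch below smooths a noisy 2-D curve with a SciPy B-spline, where the smoothing factor plays the role of the tolerance. The actual tool constructs approximating NURBS curves; this is only an analogous B-spline example with invented data, not the software's method.

```python
# Illustrative B-spline smoothing of a rough 2-D outline with a tolerance-like
# smoothing factor: larger values give a smoother approximating curve.
import numpy as np
from scipy.interpolate import splprep, splev

# Rough outline: a noisy closed-ish curve standing in for an iced airfoil trace.
t = np.linspace(0.0, 2.0 * np.pi, 200)
rng = np.random.default_rng(42)
x = np.cos(t) + rng.normal(0, 0.02, t.size)
y = 0.3 * np.sin(t) + rng.normal(0, 0.02, t.size)

tolerance = 0.05                              # assumed user tolerance
tck, u = splprep([x, y], s=tolerance * len(x))
x_smooth, y_smooth = splev(np.linspace(0, 1, 400), tck)

print(f"fitted spline with {len(tck[0])} knots from {len(x)} noisy points")
```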
A Database for Propagation Models and Conversion to C++ Programming Language
NASA Technical Reports Server (NTRS)
Kantak, Anil V.; Angkasa, Krisjani; Rucker, James
1996-01-01
The telecommunications system design engineer generally needs a quantification of the effects of the propagation medium (definition of the propagation channel) to design an optimal communications system. To obtain the definition of the channel, the systems engineer generally has a few choices. A search of the relevant publications such as the IEEE Transactions, CCIR publications, the NASA propagation handbook, etc., may be conducted to find the desired channel values. This method may require excessive amounts of time and effort on the systems engineer's part, and the search may not even yield the needed results. To help researchers and systems engineers, the conference participants of NASA Propagation Experimenters (NAPEX) XV (London, Ontario, Canada, June 28 and 29, 1991) recommended that software be produced containing propagation models and the necessary prediction methods for most propagation phenomena. Moreover, the software should be flexible enough for the user to make slight changes to the models without expending substantial effort in programming. In the past few years, software was produced to fit these requirements as well as could be done. The software was distributed to all NAPEX participants for evaluation and use; participant reactions, suggestions, etc., were gathered and used to improve subsequent releases of the software. The existing database program is implemented in the Microsoft Excel application software and works well within the guidelines of that environment; however, questions have recently been raised about the robustness and survivability of the Excel software in the ever changing (hopefully improving) world of software packages.
Measuring the complexity of design in real-time imaging software
NASA Astrophysics Data System (ADS)
Sangwan, Raghvinder S.; Vercellone-Smith, Pamela; Laplante, Phillip A.
2007-02-01
Due to the intricacies of the algorithms involved, the design of imaging software is considered to be more complex than that of non-image-processing software (Sangwan et al., 2005). A recent investigation (Larsson and Laplante, 2006) examined the complexity of several image-processing and non-image-processing software packages along a wide variety of metrics, including those postulated by McCabe (1976), Chidamber and Kemerer (1994), and Martin (2003). This work found that it was not always possible to quantitatively compare the complexity of imaging applications and non-image-processing systems. Newer research and an accompanying tool (Structure 101, 2006), however, provide a greatly simplified approach to measuring software complexity. It may therefore be possible to definitively quantify the complexity differences between imaging and non-imaging software, between imaging and real-time imaging software, and between software programs of the same application type. In this paper, we review prior results and describe the methodology for measuring complexity in imaging systems. We then apply this new complexity measurement methodology to several sets of imaging and non-imaging code in order to compare the complexity differences between the two types of applications. The benefit of such quantification is far-reaching, for example, leading to more easily measured performance improvement and quality in real-time imaging code.
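For readers unfamiliar with the metrics mentioned above, the short sketch below approximates one of them, McCabe's cyclomatic complexity, by counting decision points in Python source with the standard ast module. This is only a rough illustration of the metric, not the Structure101 methodology or the tooling used in the paper.

```python
# Rough sketch of McCabe's cyclomatic complexity for a snippet of Python code:
# complexity = 1 + number of decision points. Illustration of the metric only.
import ast

DECISION_NODES = (ast.If, ast.For, ast.While, ast.ExceptHandler,
                  ast.BoolOp, ast.IfExp)

def cyclomatic_complexity(source: str) -> int:
    tree = ast.parse(source)
    decisions = sum(isinstance(node, DECISION_NODES) for node in ast.walk(tree))
    return 1 + decisions

example = """
def clamp(v, lo, hi):
    if v < lo:
        return lo
    if v > hi:
        return hi
    return v
"""
print(cyclomatic_complexity(example))  # 3: one linear path plus two branches
```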
Romero, Peggy; Miller, Ted; Garakani, Arman
2009-12-01
Current methods to assess neurodegeneration in dorsal root ganglion cultures as a model for neurodegenerative diseases are imprecise and time-consuming. Here we describe two new methods to quantify neuroprotection in these cultures. The neurite quality index (NQI) builds upon earlier manual methods, incorporating additional morphological events to increase sensitivity for the detection of early degeneration events. Neurosight is a machine vision-based method that recapitulates many of the strengths of NQI while enabling high-throughput screening applications at decreased cost.
Kim, Song Soo; Seo, Joon Beom; Kim, Namkug; Chae, Eun Jin; Lee, Young Kyung; Oh, Yeon Mok; Lee, Sang Do
2014-01-01
To determine the improvement of emphysema quantification with density correction and to determine the optimal site for air density correction on volumetric computed tomography (CT). Seventy-eight CT scans of COPD patients (GOLD II-IV, smoking history 39.2±25.3 pack-years) were obtained from several single-vendor 16-MDCT scanners. After density measurement of the aorta, tracheal air and external air, volumetric CT density correction was conducted (two reference values: air, -1,000 HU; blood, +50 HU). Using in-house software, the emphysema index (EI) and mean lung density (MLD) were calculated. Differences in air densities, MLD and EI before and after density correction were evaluated (paired t-test). Correlations between those parameters and FEV1 and FEV1/FVC were compared (age- and sex-adjusted partial correlation analysis). Measured densities (HU) of tracheal and external air differed significantly (-990 ± 14 vs. -1016 ± 9; P<0.001). MLD and EI on the original CT data and after density correction using tracheal and external air also differed significantly (MLD: -874.9 ± 27.6 vs. -882.3 ± 24.9 vs. -860.5 ± 26.6; EI: 16.8 ± 13.4 vs. 21.1 ± 14.5 vs. 9.7 ± 10.5, respectively; P<0.001). The correlation coefficients between CT quantification indices and FEV1 and FEV1/FVC increased after density correction, and tracheal air correction showed better results than external air correction. Density correction of volumetric CT data can improve the correlation between emphysema quantification and PFT. Copyright © 2012 Elsevier Ireland Ltd. All rights reserved.
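A minimal sketch of the kind of two-point density correction described above is given below. The exact formula used by the in-house software is not stated in the abstract, so the linear rescaling, the measured values, and the -950 HU emphysema threshold are assumptions for illustration.

```python
import numpy as np

def density_correct(hu, air_measured, blood_measured,
                    air_ref=-1000.0, blood_ref=50.0):
    """Linearly rescale CT numbers so that the measured air and blood densities
    map onto the reference values (-1000 HU and +50 HU). This two-point form is
    an assumption; the paper's in-house software may differ."""
    scale = (blood_ref - air_ref) / (blood_measured - air_measured)
    return (hu - air_measured) * scale + air_ref

# Hypothetical example: tracheal air measured at -990 HU, aortic blood at +40 HU.
lung = np.array([-980.0, -955.0, -910.0, -870.0])
corrected = density_correct(lung, air_measured=-990.0, blood_measured=40.0)
print(corrected)
print((corrected < -950).mean())  # emphysema fraction below -950 HU (assumed threshold)
```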
Levy, Franck; Dan Schouver, Elie; Iacuzio, Laura; Civaia, Filippo; Rusek, Stephane; Dommerc, Carinne; Marechaux, Sylvestre; Dor, Vincent; Tribouilloy, Christophe; Dreyfus, Gilles
2017-11-01
Three-dimensional (3D) transthoracic echocardiography (TTE) is superior to the two-dimensional Simpson's method for assessment of left ventricular (LV) volumes and LV ejection fraction (LVEF). Nevertheless, 3D TTE is not incorporated into everyday practice, as current LV chamber quantification software products are time-consuming. To evaluate the feasibility, accuracy and reproducibility of new fully automated fast 3D TTE software (HeartModel A.I.; Philips Healthcare, Andover, MA, USA) for quantification of LV volumes and LVEF in routine practice; to compare the 3D LV volumes and LVEF obtained with a cardiac magnetic resonance (CMR) reference; and to optimize automated default border settings with CMR as reference. Sixty-three consecutive patients, who had comprehensive 3D TTE and CMR examinations within 24 hours, were eligible for inclusion. Nine patients (14%) were excluded because of insufficient echogenicity in the 3D TTE. Thus, 54 patients (40 men; mean age 63±13 years) were prospectively included in the study. The inter- and intraobserver reproducibilities of 3D TTE were excellent (coefficient of variation <10%) for end-diastolic volume (EDV), end-systolic volume (ESV) and LVEF. Despite a slight underestimation of EDV using 3D TTE compared with CMR (bias=-22±34 mL; P<0.0001), a significant correlation was found between the two measurements (r=0.93; P=0.0001). Enlarging default border detection settings led to frequent volume overestimation in the general population, but improved agreement with CMR in patients with LVEF≤50%. Correlations between 3D TTE and CMR for ESV and LVEF were excellent (r=0.93 and r=0.91, respectively; P<0.0001). 3D TTE using new-generation fully automated software is a feasible, fast, reproducible and accurate imaging modality for LV volumetric quantification in routine practice. Optimization of border detection settings may increase agreement with CMR for EDV assessment in dilated ventricles. Copyright © 2017 Elsevier Masson SAS. All rights reserved.
Dunet, Vincent; Klein, Ran; Allenbach, Gilles; Renaud, Jennifer; deKemp, Robert A; Prior, John O
2016-06-01
Several analysis software packages for myocardial blood flow (MBF) quantification from cardiac PET studies exist, but they have not been compared using concordance analysis, which can characterize precision and bias separately. Reproducible measurements are needed for quantification to fully develop its clinical potential. Fifty-one patients underwent dynamic Rb-82 PET at rest and during adenosine stress. Data were processed with PMOD and FlowQuant (Lortie model). MBF and myocardial flow reserve (MFR) polar maps were quantified and analyzed using a 17-segment model. Comparisons used Pearson's correlation ρ (measuring precision), Bland-Altman limits of agreement and Lin's concordance correlation ρc = ρ·Cb (Cb measuring systematic bias). Lin's concordance and Pearson's correlation values were very similar, suggesting no systematic bias between software packages, with excellent precision for MBF (ρ = 0.97, ρc = 0.96, Cb = 0.99) and good precision for MFR (ρ = 0.83, ρc = 0.76, Cb = 0.92). On a per-segment basis, no mean bias was observed on Bland-Altman plots, although PMOD provided slightly higher values than FlowQuant at higher MBF and MFR values (P < .0001). Concordance between software packages was excellent for MBF and MFR, despite higher values by PMOD at higher MBF values. Both software packages can be used interchangeably for quantification in daily practice of Rb-82 cardiac PET.
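The concordance measures quoted above can be reproduced from paired data with a few lines of code. The following sketch computes Pearson's ρ, Lin's ρc and the bias-correction factor Cb from hypothetical per-segment MBF values; the numbers are placeholders, not study data.

```python
import numpy as np

def lin_ccc(x, y):
    """Lin's concordance correlation coefficient rho_c = rho * C_b, where rho
    is Pearson's correlation (precision) and C_b measures systematic bias."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    mx, my = x.mean(), y.mean()
    vx, vy = x.var(), y.var()                 # population variances
    cov = ((x - mx) * (y - my)).mean()
    rho = cov / np.sqrt(vx * vy)
    rho_c = 2.0 * cov / (vx + vy + (mx - my) ** 2)
    return rho, rho_c, rho_c / rho            # rho, rho_c, C_b

# Hypothetical paired per-segment MBF values (mL/min/g) from two software packages.
package_a = [0.9, 1.1, 2.4, 2.0, 3.1, 1.5]
package_b = [0.8, 1.0, 2.3, 2.1, 2.9, 1.4]
print(lin_ccc(package_a, package_b))
```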
Liu, Ruolin; Dickerson, Julie
2017-11-01
We propose a novel method and software tool, Strawberry, for transcript reconstruction and quantification from RNA-Seq data under the guidance of genome alignment and independent of gene annotation. Strawberry consists of two modules: assembly and quantification. The novelty of Strawberry is that the two modules use different optimization frameworks but utilize the same data graph structure, which allows a highly efficient, expandable and accurate algorithm for dealing with large data. The assembly module parses aligned reads into splicing graphs, and uses network flow algorithms to select the most likely transcripts. The quantification module uses a latent class model to assign read counts from the nodes of splicing graphs to transcripts. Strawberry simultaneously estimates the transcript abundances and corrects for sequencing bias through an EM algorithm. Based on simulations, Strawberry outperforms Cufflinks and StringTie in terms of both assembly and quantification accuracy. In an evaluation on a real data set, the transcript expression estimated by Strawberry has the highest correlation with NanoString probe counts, an independent experimental measure of transcript expression. Strawberry is written in C++14, and is available as open source software at https://github.com/ruolin/strawberry under the MIT license.
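To make the EM step concrete, the toy sketch below runs the classic expectation-maximization scheme for transcript abundance estimation on a hypothetical read-to-transcript compatibility matrix. It illustrates the general idea only and is not Strawberry's latent class model or splicing-graph data structure.

```python
# Toy EM for transcript abundance estimation: each read is compatible with a
# subset of transcripts; the E-step fractionally assigns reads, the M-step
# re-estimates abundances (effective lengths ignored). Illustration only.
import numpy as np

# Hypothetical compatibility matrix: rows = reads, cols = transcripts (1 = compatible).
C = np.array([[1, 1, 0],
              [1, 0, 0],
              [0, 1, 1],
              [1, 1, 1],
              [0, 0, 1]], dtype=float)

theta = np.full(C.shape[1], 1.0 / C.shape[1])    # initial abundances
for _ in range(200):
    weighted = C * theta                          # E-step: transcript responsibility
    resp = weighted / weighted.sum(axis=1, keepdims=True)
    theta_new = resp.sum(axis=0) / resp.sum()     # M-step: expected read counts
    if np.max(np.abs(theta_new - theta)) < 1e-9:
        theta = theta_new
        break
    theta = theta_new

print(theta)   # estimated relative transcript abundances
```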
Volumetric quantification of lung nodules in CT with iterative reconstruction (ASiR and MBIR).
Chen, Baiyu; Barnhart, Huiman; Richard, Samuel; Robins, Marthony; Colsher, James; Samei, Ehsan
2013-11-01
Volume quantification of lung nodules with multidetector computed tomography (CT) images provides useful information for monitoring nodule development. The accuracy and precision of the volume quantification, however, can be impacted by imaging and reconstruction parameters. This study aimed to investigate the impact of iterative reconstruction algorithms on the accuracy and precision of volume quantification, with dose and slice thickness as additional variables. Repeated CT images were acquired from an anthropomorphic chest phantom with synthetic nodules (9.5 and 4.8 mm) at six dose levels, and reconstructed with three reconstruction algorithms [filtered backprojection (FBP), adaptive statistical iterative reconstruction (ASiR), and model-based iterative reconstruction (MBIR)] into three slice thicknesses. The nodule volumes were measured with two clinical software packages (A: Lung VCAR, B: iNtuition), and analyzed for accuracy and precision. Precision was found to be generally comparable between FBP and iterative reconstruction, with no statistically significant difference noted for different dose levels, slice thicknesses, and segmentation software. Accuracy was found to be more variable. For large nodules, the accuracy was significantly different between ASiR and FBP for all slice thicknesses with both software packages, and significantly different between MBIR and FBP for 0.625 mm slice thickness with Software A and for all slice thicknesses with Software B. For small nodules, the accuracy was more similar between FBP and iterative reconstruction, with the exception of ASiR vs FBP at 1.25 mm with Software A and MBIR vs FBP at 0.625 mm with Software A. The systematic difference between the accuracy of FBP and iterative reconstructions highlights the importance of extending current segmentation software to accommodate the image characteristics of iterative reconstructions. In addition, a calibration process may help reduce the dependency of accuracy on reconstruction algorithms, such that volumes quantified from scans with different reconstruction algorithms can be compared. The small difference found between the precision of FBP and iterative reconstructions could be a result of both iterative reconstruction's diminished noise reduction at the edge of the nodules and the loss of resolution at high noise levels with iterative reconstruction. The findings do not rule out a potential advantage of IR that might be evident in a study that uses a larger number of nodules or repeated scans.
Quantifying reproducibility in computational biology: the case of the tuberculosis drugome.
Garijo, Daniel; Kinnings, Sarah; Xie, Li; Xie, Lei; Zhang, Yinliang; Bourne, Philip E; Gil, Yolanda
2013-01-01
How easy is it to reproduce the results found in a typical computational biology paper? Either through experience or intuition, the reader will already know that the answer is: with difficulty, or not at all. In this paper we attempt to quantify this difficulty by reproducing a previously published paper for different classes of users (ranging from users with little expertise to domain experts) and suggest ways in which the situation might be improved. Quantification is achieved by estimating the time required to reproduce each of the steps in the method described in the original paper and to make them part of an explicit workflow that reproduces the original results. Reproducing the method took several months of effort, and required using new versions and new software that posed challenges to reconstructing and validating the results. The quantification leads to "reproducibility maps" that reveal that novice researchers would only be able to reproduce a few of the steps in the method, and that only expert researchers with advance knowledge of the domain would be able to reproduce the method in its entirety. The workflow itself is published as an online resource together with supporting software and data. The paper concludes with a brief discussion of the complexities of requiring reproducibility in terms of cost versus benefit, and a set of desiderata with our observations and guidelines for improving reproducibility. This has implications not only for reproducing the work of others from published papers, but also for reproducing work from one's own laboratory.
Yalcin, Hulya; Valenta, Ines; Zhao, Min; Tahari, Abdel; Lu, Dai-Yin; Higuchi, Takahiro; Yalcin, Fatih; Kucukler, Nagehan; Soleimanifard, Yalda; Zhou, Yun; Pomper, Martin G; Abraham, Theodore P; Tsui, Ben; Lodge, Martin A; Schindler, Thomas H; Roselle Abraham, M
2018-01-22
Quantification of myocardial blood flow (MBF) by positron emission tomography (PET) is important for investigation of angina in hypertrophic cardiomyopathy (HCM). Several software programs exist for MBF quantification, but they have been evaluated mostly in patients (with normal cardiac geometry) referred for evaluation of coronary artery disease (CAD). Software performance has not been evaluated in HCM patients, who frequently have hyperdynamic LV function, LV outflow tract (LVOT) obstruction, small LV cavity size, and variation in the degree/location of LV hypertrophy. We compared results of MBF obtained using PMod, which permits manual segmentation, to those obtained by the FDA-approved QPET software, which has an automated segmentation algorithm. 13N-ammonia PET perfusion data were acquired in list mode at rest and during pharmacologic vasodilation in 76 HCM patients and 10 non-HCM patients referred for evaluation of CAD (CAD group). Data were resampled to create static, ECG-gated and 36-frame dynamic images. Myocardial flow reserve (MFR) and MBF (in ml/min/g) were calculated using the QPET and PMod software packages. All HCM patients had asymmetric septal hypertrophy, and 50% had evidence of LVOT obstruction, whereas non-HCM patients (CAD group) had normal wall thickness and ejection fraction. PMod yielded significantly higher values for global and regional stress-MBF and MFR than QPET in HCM. Reasonably fair correlation was observed for global rest-MBF, stress-MBF, and MFR between the two software packages (rest-MBF: r = 0.78; stress-MBF: r = 0.66; MFR: r = 0.7) in HCM patients. Agreement between global MBF and MFR values improved when HCM patients with high spillover fractions (> 0.65) were excluded from the analysis (rest-MBF: r = 0.84; stress-MBF: r = 0.72; MFR: r = 0.8). Regionally, the highest agreement between PMod and QPET was observed in the LAD territory (rest-MBF: r = 0.82; stress-MBF: r = 0.68), where the spillover fraction was lowest. Unlike HCM patients, the non-HCM patients (CAD group) demonstrated excellent agreement in MBF/MFR values obtained by the two software packages when patients with high spillover fractions were excluded (rest-MBF: r = 0.95; stress-MBF: r = 0.92; MFR: r = 0.95). Anatomic characteristics specific to HCM hearts contribute to lower correlations between MBF/MFR values obtained by PMod and QPET, compared with non-HCM patients. These differences indicate that PMod and QPET cannot be used interchangeably for MBF/MFR analyses in HCM patients.
Preconditioning of the background error covariance matrix in data assimilation for the Caspian Sea
NASA Astrophysics Data System (ADS)
Arcucci, Rossella; D'Amore, Luisa; Toumi, Ralf
2017-06-01
Data Assimilation (DA) is an uncertainty quantification technique used for improving numerically forecasted results by incorporating observed data into prediction models. Because a crucial issue in DA models is the ill-conditioning of the covariance matrices involved, it is essential to introduce preconditioning methods into DA software. Here we present first studies concerning the introduction of two different preconditioning methods in a DA software we are developing (named S3DVAR), which implements a Scalable Three Dimensional Variational Data Assimilation model for assimilating sea surface temperature (SST) values collected in the Caspian Sea, using the Regional Ocean Modeling System (ROMS) with observations provided by the Group for High Resolution Sea Surface Temperature (GHRSST). We also present the algorithmic strategies we employ.
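For context, the generic 3D-Var formulation and the usual control-variable-transform preconditioning look as follows; the specific preconditioners studied for S3DVAR are not detailed in the abstract, so this is only the standard textbook form (B the background-error covariance, R the observation-error covariance, H the observation operator, x_b the background state).

```latex
J(\mathbf{x}) = \tfrac{1}{2}(\mathbf{x}-\mathbf{x}_b)^{\mathsf T}\mathbf{B}^{-1}(\mathbf{x}-\mathbf{x}_b)
 + \tfrac{1}{2}\bigl(\mathbf{y}-\mathcal{H}(\mathbf{x})\bigr)^{\mathsf T}\mathbf{R}^{-1}\bigl(\mathbf{y}-\mathcal{H}(\mathbf{x})\bigr)
```

Writing B = VV^T and x = x_b + Vv removes the explicit inverse of the ill-conditioned B and improves the conditioning of the minimization:

```latex
\tilde{J}(\mathbf{v}) = \tfrac{1}{2}\,\mathbf{v}^{\mathsf T}\mathbf{v}
 + \tfrac{1}{2}\bigl(\mathbf{y}-\mathcal{H}(\mathbf{x}_b+\mathbf{V}\mathbf{v})\bigr)^{\mathsf T}\mathbf{R}^{-1}\bigl(\mathbf{y}-\mathcal{H}(\mathbf{x}_b+\mathbf{V}\mathbf{v})\bigr).
```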
Toward the S3DVAR data assimilation software for the Caspian Sea
NASA Astrophysics Data System (ADS)
Arcucci, Rossella; Celestino, Simone; Toumi, Ralf; Laccetti, Giuliano
2017-07-01
Data Assimilation (DA) is an uncertainty quantification technique used to incorporate observed data into a prediction model in order to improve numerically forecasted results. The forecasting model used for producing oceanographic predictions for the Caspian Sea is the Regional Ocean Modeling System (ROMS). Here we describe the computational issues we are facing in a DA software we are developing (named S3DVAR), which implements a Scalable Three Dimensional Variational Data Assimilation model for assimilating sea surface temperature (SST) values collected in the Caspian Sea, with observations provided by the Group for High Resolution Sea Surface Temperature (GHRSST). We present the algorithmic strategies we employ and the numerical issues on data collected in two of the months that show the most significant variability in water temperature: August and March.
Jiřík, Miroslav; Bartoš, Martin; Tomášek, Petr; Malečková, Anna; Kural, Tomáš; Horáková, Jana; Lukáš, David; Suchý, Tomáš; Kochová, Petra; Hubálek Kalbáčová, Marie; Králíčková, Milena; Tonar, Zbyněk
2018-06-01
Quantification of the structure and composition of biomaterials using micro-CT requires image segmentation due to the low contrast and overlapping radiopacity of biological materials. The amount of bias introduced by segmentation procedures is generally unknown. We aim to develop software that generates three-dimensional models of fibrous and porous structures with known volumes, surfaces, lengths, and object counts in fibrous materials, and to provide a software tool that calibrates quantitative micro-CT assessments. Virtual image stacks were generated using the newly developed software TeIGen, enabling the simulation of micro-CT scans of unconnected tubes, connected tubes, and porosities. A realistic noise generator was incorporated. Forty image stacks were evaluated using micro-CT, and the error between the true known and estimated data was quantified. Starting with geometric primitives, the error of the numerical estimation of surfaces and volumes was eliminated, thereby enabling the quantification of volumes and surfaces of colliding objects. Analysis of the sensitivity of thresholding to the parameters of the generated test image sets revealed the effects of decreasing resolution and increasing noise on the accuracy of micro-CT quantification. The size of the error increased with decreasing resolution once the voxel size exceeded 1/10 of the typical object size, which indicated the smallest details that could still be reliably quantified. Open-source software for calibrating quantitative micro-CT assessments by producing and saving virtually generated image data sets with known morphometric data was made freely available to researchers involved in morphometry of three-dimensional fibrillar and porous structures in micro-CT scans. © 2018 Wiley Periodicals, Inc.
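In the same spirit as the virtual phantoms described above, the following minimal sketch rasterizes a single spherical pore of analytically known volume, adds noise, and compares the thresholded voxel-count estimate with the ground truth. Grid size, voxel size, radius, noise level and threshold are all hypothetical choices, not TeIGen parameters.

```python
import numpy as np

# Minimal calibration-phantom sketch: one spherical "pore" of known volume is
# rasterized into a voxel grid, Gaussian noise is added, and the thresholded
# voxel-count volume is compared with the analytic ground truth.
rng = np.random.default_rng(42)
n, voxel_size, radius = 128, 1.0, 20.0            # voxels, units, units

z, y, x = np.mgrid[:n, :n, :n].astype(float)
center = (n - 1) / 2.0
dist = np.sqrt((x - center) ** 2 + (y - center) ** 2 + (z - center) ** 2)
phantom = (dist <= radius).astype(float)          # 1 inside the pore, 0 outside

noisy = phantom + rng.normal(scale=0.2, size=phantom.shape)
segmented = noisy > 0.5                           # simple global threshold

true_volume = 4.0 / 3.0 * np.pi * radius ** 3
estimated_volume = segmented.sum() * voxel_size ** 3
print(true_volume, estimated_volume, (estimated_volume - true_volume) / true_volume)
```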
Standardless quantification by parameter optimization in electron probe microanalysis
NASA Astrophysics Data System (ADS)
Limandri, Silvina P.; Bonetto, Rita D.; Josa, Víctor Galván; Carreras, Alejo C.; Trincavelli, Jorge C.
2012-11-01
A method for standardless quantification by parameter optimization in electron probe microanalysis is presented. The method consists of minimizing the quadratic differences between an experimental spectrum and an analytical function proposed to describe it, by optimizing the parameters involved in the analytical prediction. This algorithm, implemented in the software POEMA (Parameter Optimization in Electron Probe Microanalysis), allows the determination of the elemental concentrations, along with their uncertainties. The method was tested on a set of 159 elemental constituents corresponding to 36 spectra of standards (mostly minerals) that include trace elements. The results were compared with those obtained with the commercial software GENESIS Spectrum® for standardless quantification. The quantifications performed with the method proposed here are better in 74% of the cases studied. In addition, the performance of the proposed method is compared with the first-principles standardless analysis procedure DTSA for a different data set, which excludes trace elements. The relative deviations with respect to the nominal concentrations are lower than 0.04, 0.08 and 0.35 in 66% of the cases for POEMA, GENESIS and DTSA, respectively.
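The core "minimize quadratic differences by optimizing model parameters" idea can be sketched as follows for a toy spectrum (a single Gaussian peak on a linear background) using SciPy. POEMA's actual analytical model of an EPMA spectrum is far richer, so everything below is an illustrative assumption.

```python
import numpy as np
from scipy.optimize import least_squares

# Toy analytical model: one characteristic peak on a linear background.
def model(params, energy):
    area, center, width, b0, b1 = params
    peak = area / (width * np.sqrt(2 * np.pi)) * np.exp(-0.5 * ((energy - center) / width) ** 2)
    return peak + b0 + b1 * energy

def residuals(params, energy, counts):
    return model(params, energy) - counts

energy = np.linspace(1.0, 10.0, 500)                       # keV, hypothetical grid
true = (5000.0, 6.4, 0.12, 40.0, -2.0)
counts = model(true, energy) + np.random.default_rng(0).normal(scale=5.0, size=energy.size)

fit = least_squares(residuals, x0=(1000.0, 6.0, 0.2, 10.0, 0.0), args=(energy, counts))
print(fit.x)   # optimized parameters: peak area, position, width, background terms
```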
Toward improved peptide feature detection in quantitative proteomics using stable isotope labeling.
Nilse, Lars; Sigloch, Florian Christoph; Biniossek, Martin L; Schilling, Oliver
2015-08-01
Reliable detection of peptides in LC-MS data is a key algorithmic step in the analysis of quantitative proteomics experiments. While highly abundant peptides can be detected reliably by most modern software tools, there is much less agreement on medium and low-intensity peptides in a sample. The choice of software tools can have a big impact on the quantification of proteins, especially for proteins that appear in lower concentrations. However, in many experiments, it is precisely this region of less abundant but substantially regulated proteins that holds the biggest potential for discoveries. This is particularly true for discovery proteomics in the pharmacological sector with a specific interest in key regulatory proteins. In this viewpoint article, we discuss how the development of novel software algorithms allows us to study this region of the proteome with increased confidence. Reliable results are one of many aspects to be considered when deciding on a bioinformatics software platform. Deployment into existing IT infrastructures, compatibility with other software packages, scalability, automation, flexibility, and support need to be considered and are briefly addressed in this viewpoint article. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Veit, Johannes; Sachsenberg, Timo; Chernev, Aleksandar; Aicheler, Fabian; Urlaub, Henning; Kohlbacher, Oliver
2016-09-02
Modern mass spectrometry setups used in today's proteomics studies generate vast amounts of raw data, calling for highly efficient data processing and analysis tools. Software for analyzing these data is either monolithic (easy to use, but sometimes too rigid) or workflow-driven (easy to customize, but sometimes complex). Thermo Proteome Discoverer (PD) is a powerful software for workflow-driven data analysis in proteomics which, in our eyes, achieves a good trade-off between flexibility and usability. Here, we present two open-source plugins for PD providing additional functionality: LFQProfiler for label-free quantification of peptides and proteins, and RNP(xl) for UV-induced peptide-RNA cross-linking data analysis. LFQProfiler interacts with existing PD nodes for peptide identification and validation and takes care of the entire quantitative part of the workflow. We show that it performs at least on par with other state-of-the-art software solutions for label-free quantification in a recently published benchmark (Ramus, C.; J. Proteomics 2016, 132, 51-62). The second workflow, RNP(xl), represents the first software solution to date for identification of peptide-RNA cross-links, including automatic localization of the cross-links at amino acid resolution and localization scoring. It comes with a customized integrated cross-link fragment spectrum viewer for convenient manual inspection and validation of the results.
Kuharev, Jörg; Navarro, Pedro; Distler, Ute; Jahn, Olaf; Tenzer, Stefan
2015-09-01
Label-free quantification (LFQ) based on data-independent acquisition workflows is currently experiencing increasing popularity. Several software tools have recently been published or are commercially available. The present study focuses on the evaluation of three different software packages (Progenesis, synapter, and ISOQuant) supporting ion-mobility-enhanced data-independent acquisition data. In order to benchmark the LFQ performance of the different tools, we generated two hybrid proteome samples of defined quantitative composition containing tryptically digested proteomes of three different species (mouse, yeast, Escherichia coli). This model dataset simulates complex biological samples containing large numbers of both unregulated (background) proteins and up- and downregulated proteins with exactly known ratios between samples. We determined the number and dynamic range of quantifiable proteins and analyzed the influence of applied algorithms (retention time alignment, clustering, normalization, etc.) on quantification results. Analysis of technical reproducibility revealed median coefficients of variation of reported protein abundances below 5% for MS(E) data for Progenesis and ISOQuant. Regarding accuracy of LFQ, evaluation with synapter and ISOQuant yielded superior results compared to Progenesis. In addition, we discuss reporting formats and user friendliness of the software packages. The data generated in this study have been deposited to the ProteomeXchange Consortium with identifier PXD001240 (http://proteomecentral.proteomexchange.org/dataset/PXD001240). © 2014 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
Normal Databases for the Relative Quantification of Myocardial Perfusion
Rubeaux, Mathieu; Xu, Yuan; Germano, Guido; Berman, Daniel S.; Slomka, Piotr J.
2016-01-01
Purpose of review: Myocardial perfusion imaging (MPI) with SPECT is performed clinically worldwide to detect and monitor coronary artery disease (CAD). MPI allows an objective quantification of myocardial perfusion at stress and rest. This established technique relies on normal databases to compare patient scans against reference normal limits. In this review, we aim to introduce the process of MPI quantification with normal databases and describe the associated perfusion quantitative measures that are used. Recent findings: New equipment and new software reconstruction algorithms have been introduced which require the development of new normal limits. The appearance and regional count variations of a normal MPI scan may differ between these new scanners and standard Anger cameras. Therefore, these new systems may require the determination of new normal limits to achieve optimal accuracy in relative myocardial perfusion quantification. Accurate diagnostic and prognostic results rivaling those obtained by expert readers can be obtained by this widely used technique. Summary: Throughout this review, we emphasize the importance of the different normal databases and the need for specific databases relative to distinct imaging procedures. Use of appropriate normal limits allows optimal quantification of MPI by taking into account subtle image differences due to the hardware and software used, and the population studied. PMID:28138354
Brestrich, Nina; Briskot, Till; Osberghaus, Anna; Hubbuch, Jürgen
2014-07-01
Selective quantification of co-eluting proteins in chromatography is usually performed by offline analytics. This is time-consuming and can lead to late detection of irregularities in chromatography processes. To overcome this analytical bottleneck, a methodology for selective protein quantification in multicomponent mixtures by means of spectral data and partial least squares regression was presented in two previous studies. In this paper, a powerful integration of software and chromatography hardware is introduced that enables the applicability of this methodology to selective inline quantification of co-eluting proteins in chromatography. A specific setup consisting of a conventional liquid chromatography system, a diode array detector, and a software interface to Matlab® was developed. The established tool for selective inline quantification was successfully applied for peak deconvolution of a co-eluting ternary protein mixture consisting of lysozyme, ribonuclease A, and cytochrome c on SP Sepharose FF. Compared to common offline analytics based on collected fractions, no loss of information regarding the retention volumes and peak flanks was observed. A comparison between the mass balances of both analytical methods showed that the inline quantification tool can be applied for rapid determination of pool yields. Finally, the achieved inline peak deconvolution was successfully applied to make product-purity-based real-time pooling decisions. This makes the established tool for selective inline quantification a valuable approach for inline monitoring and control of chromatographic purification steps and just-in-time reaction to process irregularities. © 2014 Wiley Periodicals, Inc.
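A minimal sketch of the spectra-to-concentrations step is shown below, calibrating a partial least squares model on synthetic mixture spectra with known concentrations and predicting the composition of a co-eluting mixture. The data, dimensions, and use of scikit-learn's PLSRegression are assumptions for illustration, not the published setup.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

# Calibrate a PLS model on mixture spectra with known protein concentrations,
# then predict the concentrations of co-eluting proteins from a new spectrum.
# All spectra and concentrations are synthetic placeholders.
rng = np.random.default_rng(1)
wavelengths = 120                        # number of detector channels
pure = rng.random((3, wavelengths))      # pseudo pure-component spectra (3 proteins)

C_train = rng.random((40, 3))            # known concentrations (calibration set)
X_train = C_train @ pure + rng.normal(scale=0.01, size=(40, wavelengths))

pls = PLSRegression(n_components=3)
pls.fit(X_train, C_train)

X_new = np.array([[0.2, 0.5, 0.3]]) @ pure   # spectrum of a co-eluting mixture
print(pls.predict(X_new))                     # should recover roughly [0.2, 0.5, 0.3]
```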
Cortesi, Marilisa; Bandiera, Lucia; Pasini, Alice; Bevilacqua, Alessandro; Gherardi, Alessandro; Furini, Simone; Giordano, Emanuele
2017-01-01
Quantifying gene expression at the single cell level is fundamental for the complete characterization of synthetic gene circuits, due to the significant impact of noise and inter-cellular variability on the system's functionality. Commercial set-ups that allow the acquisition of fluorescent signal at the single cell level (flow cytometers or quantitative microscopes) are expensive apparatuses that are hardly affordable by small laboratories. A protocol that makes a standard optical microscope able to acquire quantitative, single cell, fluorescent data from a bacterial population transformed with synthetic gene circuitry is presented. Single cell fluorescence values, acquired with the microscope set-up and processed with custom-made software, are compared with results obtained with a flow cytometer in a bacterial population transformed with the same gene circuitry. The high correlation between data from the two experimental set-ups, with a correlation coefficient computed over the tested dynamic range > 0.99, proves that a standard optical microscope, when coupled with appropriate software for image processing, might be used for quantitative single-cell fluorescence measurements. The calibration of the set-up, together with its validation, is described. The experimental protocol described in this paper makes quantitative measurement of single cell fluorescence accessible to laboratories equipped with standard optical microscope set-ups. Our method allows for an affordable measurement/quantification of intercellular variability, and a better understanding of this phenomenon will improve our comprehension of cellular behaviors and the design of synthetic gene circuits. All the required software is freely available to the synthetic biology community (MUSIQ: Microscope flUorescence SIngle cell Quantification).
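The kind of per-cell fluorescence quantification described above can be sketched with standard image-processing building blocks. The snippet below (threshold, label, measure with scikit-image) is a generic illustration and not the MUSIQ pipeline, whose processing steps are not detailed in the abstract.

```python
import numpy as np
from skimage import filters, measure, morphology

# Generic single-cell quantification sketch: segment bright objects in a
# fluorescence frame by Otsu thresholding, label connected components, and
# report area and mean fluorescence per object. `image` is a placeholder.
rng = np.random.default_rng(0)
image = rng.normal(loc=100.0, scale=5.0, size=(256, 256))   # background
image[60:70, 40:55] += 400.0                                 # two fake "cells"
image[150:158, 200:220] += 250.0

mask = image > filters.threshold_otsu(image)
mask = morphology.remove_small_objects(mask, min_size=20)
labels = measure.label(mask)

for region in measure.regionprops(labels, intensity_image=image):
    print(region.label, region.area, region.mean_intensity)
```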
Tranquart, F; Mercier, L; Frinking, P; Gaud, E; Arditi, M
2012-07-01
With contrast-enhanced ultrasound (CEUS) now established as a valuable imaging modality for many applications, a more specific demand has recently emerged for quantifying perfusion and using measured parameters as objective indicators for various disease states. However, CEUS perfusion quantification remains challenging and is not well integrated in daily clinical practice. The development of VueBox™ alleviates existing limitations and enables quantification in a standardized way. VueBox™ operates as an off-line software application, after dynamic contrast-enhanced ultrasound (DCE-US) is performed. It enables linearization of DICOM clips, assessment of perfusion using patented curve-fitting models, and generation of parametric images by synthesizing perfusion information at the pixel level using color coding. VueBox™ is compatible with most of the available ultrasound platforms (nonlinear contrast-enabled), has the ability to process both bolus and disruption-replenishment kinetics loops, allows analysis results and their context to be saved, and generates analysis reports automatically. Specific features have been added to VueBox™, such as fully automatic in-plane motion compensation and an easy-to-use clip editor. Processing time has been reduced as a result of parallel programming optimized for multi-core processors. A long list of perfusion parameters is available for each of the two administration modes to address all possible demands currently reported in the literature for diagnosis or treatment monitoring. In conclusion, VueBox™ is a valid and robust quantification tool to be used for standardizing perfusion quantification and to improve the reproducibility of results across centers. © Georg Thieme Verlag KG Stuttgart · New York.
RNA-Skim: a rapid method for RNA-Seq quantification at transcript level
Zhang, Zhaojun; Wang, Wei
2014-01-01
Motivation: The RNA-Seq technique has been demonstrated to be a revolutionary means for exploring the transcriptome because it provides deep coverage and base pair-level resolution. RNA-Seq quantification has proven to be an efficient alternative to the microarray technique in gene expression studies, and it is a critical component of RNA-Seq differential expression analysis. Most existing RNA-Seq quantification tools require the alignment of fragments to either a genome or a transcriptome, entailing a time-consuming and intricate alignment step. To improve the performance of RNA-Seq quantification, an alignment-free method, Sailfish, has been recently proposed to quantify transcript abundances using all k-mers in the transcriptome, demonstrating the feasibility of designing an efficient alignment-free method for transcriptome quantification. Even though Sailfish is substantially faster than alternative alignment-dependent methods such as Cufflinks, using all k-mers in transcriptome quantification impedes the scalability of the method. Results: We propose a novel RNA-Seq quantification method, RNA-Skim, which partitions the transcriptome into disjoint transcript clusters based on sequence similarity, and introduces the notion of sig-mers, which are a special type of k-mers uniquely associated with each cluster. We demonstrate that the sig-mer counts within a cluster are sufficient for estimating transcript abundances with accuracy comparable to any state-of-the-art method. This enables RNA-Skim to perform transcript quantification on each cluster independently, reducing a complex optimization problem into smaller optimization tasks that can be run in parallel. As a result, RNA-Skim uses <4% of the k-mers and <10% of the CPU time required by Sailfish. It is able to finish transcriptome quantification in <10 min per sample by using just a single thread on a commodity computer, which represents a >100× speedup over the state-of-the-art alignment-based methods, while delivering comparable or higher accuracy. Availability and implementation: The software is available at http://www.csbio.unc.edu/rs. Contact: weiwang@cs.ucla.edu Supplementary information: Supplementary data are available at Bioinformatics online. PMID:24931995
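The sig-mer idea can be illustrated with a toy example: keep only the k-mers that occur in exactly one transcript cluster and count those in the reads. The sequences, k value, and data structures below are placeholders; RNA-Skim's actual implementation is engineered for genome-scale data.

```python
from collections import Counter, defaultdict

# Toy illustration of sig-mers: k-mers that belong to exactly one transcript
# cluster are kept, and only those are counted in the reads.
K = 5

clusters = {
    "clusterA": ["ACGTACGTGGA", "ACGTACGTTTA"],
    "clusterB": ["TTGCACCAGGT", "TTGCACCATCA"],
}

def kmers(seq, k=K):
    return {seq[i:i + k] for i in range(len(seq) - k + 1)}

# Map each k-mer to the set of clusters it appears in.
owner = defaultdict(set)
for name, transcripts in clusters.items():
    for t in transcripts:
        for km in kmers(t):
            owner[km].add(name)

sig_mers = {km: next(iter(names)) for km, names in owner.items() if len(names) == 1}

reads = ["ACGTACGTGG", "TTGCACCAGG", "ACGTACGTTT"]
counts = Counter()
for read in reads:
    for km in kmers(read):
        if km in sig_mers:
            counts[sig_mers[km]] += 1

print(counts)   # per-cluster sig-mer counts used for abundance estimation
```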
Dakota Graphical User Interface v. 1.0
DOE Office of Scientific and Technical Information (OSTI.GOV)
Friedman-Hill, Ernest; Glickman, Matthew; Gibson, Marcus
Graphical analysis environment for Sandia's Dakota software for optimization and uncertainty quantification. The Dakota GUI is an interactive graphical analysis environment for creating, running, and interpreting Dakota optimization and uncertainty quantification studies. It includes problem (Dakota study) set-up, option specification, simulation interfacing, analysis execution, and results visualization. Through the use of wizards, templates, and views, the Dakota GUI helps users navigate Dakota's complex capability landscape.
Negative dielectrophoresis spectroscopy for rare analyte quantification in biological samples
NASA Astrophysics Data System (ADS)
Kirmani, Syed Abdul Mannan; Gudagunti, Fleming Dackson; Velmanickam, Logeeshan; Nawarathna, Dharmakeerthi; Lima, Ivan T., Jr.
2017-03-01
We propose the use of negative dielectrophoresis (DEP) spectroscopy as a technique to improve the detection limit of rare analytes in biological samples. We observe a significant dependence of the negative DEP force on functionalized polystyrene beads at the edges of interdigitated electrodes with respect to the frequency of the electric field. We measured this velocity of repulsion for 0% and 0.8% conjugation of avidin with biotin-functionalized polystyrene beads using our automated software, which performs real-time image processing to monitor the Rayleigh scattering from the beads. A significant difference in the velocity of the beads was observed in the presence of as few as 80 molecules of avidin per biotin-functionalized bead. This technology can be applied to the detection and quantification of rare analytes, which can be useful in the diagnosis and treatment of diseases such as cancer and myocardial infarction, with the use of polystyrene beads functionalized with antibodies for the target biomarkers.
NASA Astrophysics Data System (ADS)
Fang, Y.; Hou, J.; Engel, D.; Lin, G.; Yin, J.; Han, B.; Fang, Z.; Fountoulakis, V.
2011-12-01
In this study, we introduce an uncertainty quantification (UQ) software framework for carbon sequestration, with a focus on the effect of spatial heterogeneity of reservoir properties on CO2 migration. We use a sequential Gaussian simulation method (SGSIM) to generate realizations of permeability fields with various spatial statistical attributes. To deal with the computational difficulties, we integrate the following ideas/approaches: 1) we use three different sampling approaches (probabilistic collocation, quasi-Monte Carlo, and adaptive sampling) to reduce the required forward calculations while exploring the parameter space and quantifying the input uncertainty; 2) we use eSTOMP as the forward modeling simulator. eSTOMP is implemented using the Global Arrays toolkit (GA), which is based on one-sided inter-processor communication and supports a shared-memory programming style on distributed-memory platforms. It provides highly scalable performance. It uses a data model to partition most of the large-scale data structures into a relatively small number of distinct classes. The lower-level simulator infrastructure (e.g., meshing support, associated data structures, and data mapping to processors) is separated from the higher-level physics and chemistry algorithmic routines using a grid component interface; and 3) besides the faster model and more efficient algorithms to speed up the forward calculation, we built an adaptive system infrastructure to select the best possible data transfer mechanisms, to optimally allocate system resources to improve performance, and to integrate software packages and data for composing carbon sequestration simulation, computation, analysis, estimation and visualization. We will demonstrate the framework with a given CO2 injection scenario in a heterogeneous sandstone reservoir.
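As an illustration of the quasi-Monte Carlo sampling step mentioned above, the sketch below draws a scrambled Sobol' design over three uncertain geostatistical parameters and feeds each sample to a placeholder forward model. The parameter names, bounds, and sample size are hypothetical, and run_forward_model merely stands in for an eSTOMP simulation.

```python
import numpy as np
from scipy.stats import qmc

# Quasi-Monte Carlo sampling sketch: a Sobol' design over uncertain parameters,
# scaled to hypothetical bounds, with a placeholder forward model.
sampler = qmc.Sobol(d=3, scramble=True, seed=7)
unit_samples = sampler.random_base2(m=6)          # 2**6 = 64 points in [0, 1)^3

# Hypothetical bounds: log10 mean permeability, variance, correlation length (m).
lower = np.array([-14.0, 0.1, 10.0])
upper = np.array([-12.0, 2.0, 200.0])
samples = qmc.scale(unit_samples, lower, upper)

def run_forward_model(theta):
    # Placeholder for an eSTOMP run returning, e.g., a CO2 plume metric.
    return np.sum(theta)

results = np.array([run_forward_model(theta) for theta in samples])
print(results.mean(), results.std())
```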
Bassanese, Danielle N; Conlan, Xavier A; Barnett, Neil W; Stevenson, Paul G
2015-05-01
This paper explores the analytical figures of merit of two-dimensional high-performance liquid chromatography for the separation of antioxidant standards. The cumulative two-dimensional high-performance liquid chromatography peak area was calculated for 11 antioxidants by two different methods: from the areas reported by the control software and by fitting the data with a Gaussian model; these methods were evaluated for precision and sensitivity. Both methods demonstrated excellent precision with regard to retention time in the second dimension (%RSD below 1.16%) and cumulative second-dimension peak area (%RSD below 3.73% for the instrument software and 5.87% for the Gaussian method). Combining areas reported by the high-performance liquid chromatography control software gave superior limits of detection, on the order of 1 × 10(-6) M, almost an order of magnitude lower than the Gaussian method for some analytes. The introduction of the countergradient eliminated the strong solvent mismatch between dimensions, leading to a much improved peak shape and better detection limits for quantification. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
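The "Gaussian model" route to a peak area can be sketched as follows: fit a Gaussian to a synthetic second-dimension peak and take its analytic area a·σ·√(2π). The chromatogram, initial guesses, and noise level are assumptions for illustration, not data from the study.

```python
import numpy as np
from scipy.optimize import curve_fit

# Fit a Gaussian to a synthetic second-dimension peak and report its analytic
# area a * sigma * sqrt(2*pi). All values are placeholders.
def gaussian(t, a, t0, sigma):
    return a * np.exp(-0.5 * ((t - t0) / sigma) ** 2)

t = np.linspace(0.0, 60.0, 600)                              # seconds
signal = gaussian(t, a=1.2, t0=31.0, sigma=2.5)
signal += np.random.default_rng(3).normal(scale=0.02, size=t.size)

popt, _ = curve_fit(gaussian, t, signal, p0=(1.0, 30.0, 3.0))
a, t0, sigma = popt
area = a * sigma * np.sqrt(2.0 * np.pi)
print(f"retention time {t0:.2f} s, peak area {area:.3f}")
```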
Semi-automatic computerized approach to radiological quantification in rheumatoid arthritis
NASA Astrophysics Data System (ADS)
Steiner, Wolfgang; Schoeffmann, Sylvia; Prommegger, Andrea; Boegl, Karl; Klinger, Thomas; Peloschek, Philipp; Kainberger, Franz
2004-04-01
Rheumatoid Arthritis (RA) is a common systemic disease predominantly involving the joints. Precise diagnosis and follow-up therapy require objective quantification. For this purpose, radiological analyses using standardized scoring systems are considered to be the most appropriate method. The aim of our study is to develop semi-automatic image analysis software especially applicable to the scoring of joints in rheumatic disorders. The X-Ray RheumaCoach software provides various scoring systems (Larsen score and Ratingen-Rau score) that can be applied by the scorer. In addition to the qualitative assessment of joints performed by the radiologist, a semi-automatic image analysis for joint detection and measurements of bone diameters and swollen tissue supports the image assessment process. More than 3000 radiographs from hands and feet of more than 200 RA patients were collected, analyzed, and statistically evaluated. Radiographs were quantified using the conventional paper-based Larsen score and the X-Ray RheumaCoach software. The use of the software shortened the scoring time by about 25 percent and reduced the rate of erroneous scorings in all our studies. Compared to paper-based scoring methods, the X-Ray RheumaCoach software offers several advantages: (i) structured data analysis and input that minimizes variance by standardization, (ii) faster and more precise calculation of sum scores and indices, (iii) permanent data storage and fast access to the software's database, (iv) the possibility of cross-calculation to other scores, (v) semi-automatic assessment of images, and (vi) reliable documentation of results in the form of graphical printouts.
Development and validation of an open source quantification tool for DSC-MRI studies.
Gordaliza, P M; Mateos-Pérez, J M; Montesinos, P; Guzmán-de-Villoria, J A; Desco, M; Vaquero, J J
2015-03-01
This work presents the development of an open source tool for the quantification of dynamic susceptibility-weighted contrast-enhanced (DSC) perfusion studies. The development of this tool is motivated by the lack of open source tools implemented on open platforms that allow external developers to implement their own quantification methods easily and without the need to pay for a development license. This quantification tool was developed as a plugin for the ImageJ image analysis platform using the Java programming language. A modular approach was used in the implementation of the components, in such a way that new methods can be added without breaking any of the existing functionality. For the validation process, images from seven patients with brain tumors were acquired and quantified with the presented tool and with a widely used clinical software package. The resulting perfusion parameters were then compared. Perfusion parameters and the corresponding parametric images were obtained. When no gamma-fitting is used, an excellent agreement with the tool used as a gold standard was obtained (R(2)>0.8 and values are within 95% CI limits in Bland-Altman plots). An open source tool that performs quantification of perfusion studies using magnetic resonance imaging has been developed and validated using a clinical software package. It works as an ImageJ plugin and the source code has been published with an open source license. Copyright © 2015 Elsevier Ltd. All rights reserved.
Targeted Feature Detection for Data-Dependent Shotgun Proteomics
2017-01-01
Label-free quantification of shotgun LC–MS/MS data is the prevailing approach in quantitative proteomics but remains computationally nontrivial. The central data analysis step is the detection of peptide-specific signal patterns, called features. Peptide quantification is facilitated by associating signal intensities in features with peptide sequences derived from MS2 spectra; however, missing values due to imperfect feature detection are a common problem. A feature detection approach that directly targets identified peptides (minimizing missing values) but also offers robustness against false-positive features (by assigning meaningful confidence scores) would thus be highly desirable. We developed a new feature detection algorithm within the OpenMS software framework, leveraging ideas and algorithms from the OpenSWATH toolset for DIA/SRM data analysis. Our software, FeatureFinderIdentification (“FFId”), implements a targeted approach to feature detection based on information from identified peptides. This information is encoded in an MS1 assay library, based on which ion chromatogram extraction and detection of feature candidates are carried out. Significantly, when analyzing data from experiments comprising multiple samples, our approach distinguishes between “internal” and “external” (inferred) peptide identifications (IDs) for each sample. On the basis of internal IDs, two sets of positive (true) and negative (decoy) feature candidates are defined. A support vector machine (SVM) classifier is then trained to discriminate between the sets and is subsequently applied to the “uncertain” feature candidates from external IDs, facilitating selection and confidence scoring of the best feature candidate for each peptide. This approach also enables our algorithm to estimate the false discovery rate (FDR) of the feature selection step. We validated FFId based on a public benchmark data set, comprising a yeast cell lysate spiked with protein standards that provide a known ground-truth. The algorithm reached almost complete (>99%) quantification coverage for the full set of peptides identified at 1% FDR (PSM level). Compared with other software solutions for label-free quantification, this is an outstanding result, which was achieved at competitive quantification accuracy and reproducibility across replicates. The FDR for the feature selection was estimated at a low 1.5% on average per sample (3% for features inferred from external peptide IDs). The FFId software is open-source and freely available as part of OpenMS (www.openms.org). PMID:28673088
Targeted Feature Detection for Data-Dependent Shotgun Proteomics.
Weisser, Hendrik; Choudhary, Jyoti S
2017-08-04
Label-free quantification of shotgun LC-MS/MS data is the prevailing approach in quantitative proteomics but remains computationally nontrivial. The central data analysis step is the detection of peptide-specific signal patterns, called features. Peptide quantification is facilitated by associating signal intensities in features with peptide sequences derived from MS2 spectra; however, missing values due to imperfect feature detection are a common problem. A feature detection approach that directly targets identified peptides (minimizing missing values) but also offers robustness against false-positive features (by assigning meaningful confidence scores) would thus be highly desirable. We developed a new feature detection algorithm within the OpenMS software framework, leveraging ideas and algorithms from the OpenSWATH toolset for DIA/SRM data analysis. Our software, FeatureFinderIdentification ("FFId"), implements a targeted approach to feature detection based on information from identified peptides. This information is encoded in an MS1 assay library, based on which ion chromatogram extraction and detection of feature candidates are carried out. Significantly, when analyzing data from experiments comprising multiple samples, our approach distinguishes between "internal" and "external" (inferred) peptide identifications (IDs) for each sample. On the basis of internal IDs, two sets of positive (true) and negative (decoy) feature candidates are defined. A support vector machine (SVM) classifier is then trained to discriminate between the sets and is subsequently applied to the "uncertain" feature candidates from external IDs, facilitating selection and confidence scoring of the best feature candidate for each peptide. This approach also enables our algorithm to estimate the false discovery rate (FDR) of the feature selection step. We validated FFId based on a public benchmark data set, comprising a yeast cell lysate spiked with protein standards that provide a known ground-truth. The algorithm reached almost complete (>99%) quantification coverage for the full set of peptides identified at 1% FDR (PSM level). Compared with other software solutions for label-free quantification, this is an outstanding result, which was achieved at competitive quantification accuracy and reproducibility across replicates. The FDR for the feature selection was estimated at a low 1.5% on average per sample (3% for features inferred from external peptide IDs). The FFId software is open-source and freely available as part of OpenMS ( www.openms.org ).
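The classification step described above can be sketched with scikit-learn: train an SVM on candidates labeled by internal IDs (targets vs. decoys) and score the uncertain candidates from external IDs. The feature descriptors and data below are hypothetical, and this is not the OpenMS FeatureFinderIdentification implementation.

```python
import numpy as np
from sklearn.svm import SVC

# Train an SVM on feature candidates labeled by internal IDs (1 = target,
# 0 = decoy) and score candidates from external (inferred) IDs. The descriptor
# columns (e.g. RT deviation, peak-shape score, log intensity) are hypothetical.
rng = np.random.default_rng(0)
X_true = rng.normal(loc=[0.2, 0.9, 5.0], scale=0.2, size=(200, 3))
X_decoy = rng.normal(loc=[1.5, 0.4, 2.0], scale=0.4, size=(200, 3))
X_train = np.vstack([X_true, X_decoy])
y_train = np.array([1] * 200 + [0] * 200)

clf = SVC(kernel="rbf", probability=True).fit(X_train, y_train)

X_external = rng.normal(loc=[0.5, 0.8, 4.0], scale=0.5, size=(5, 3))
scores = clf.predict_proba(X_external)[:, 1]    # confidence per candidate
print(scores)   # the best-scoring candidate per peptide would be selected
```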
The Infeasibility of Quantifying the Reliability of Life-Critical Real-Time Software
NASA Technical Reports Server (NTRS)
Butler, Ricky W.; Finelli, George B.
1991-01-01
This paper affirms that the quantification of life-critical software reliability is infeasible using statistical methods, whether applied to standard software or fault-tolerant software. The classical methods of estimating reliability are shown to lead to exorbitant amounts of testing when applied to life-critical software. Reliability growth models are examined and also shown to be incapable of overcoming the need for excessive amounts of testing. The key assumption of software fault tolerance, that separately programmed versions fail independently, is shown to be problematic. This assumption cannot be justified by experimentation in the ultrareliability region, and subjective arguments in its favor are not sufficiently strong to justify it as an axiom. Also, the implications of the recent multiversion software experiments support this affirmation.
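The scale of the testing problem can be seen from a standard zero-failure life-testing calculation (an illustration consistent with, but not reproduced from, the paper): to claim a failure rate λ ≤ λ₀ with confidence 1−α after T failure-free test hours,

```latex
e^{-\lambda_0 T} \le \alpha
\quad\Longrightarrow\quad
T \ge \frac{\ln(1/\alpha)}{\lambda_0},
\qquad\text{so for } \lambda_0 = 10^{-9}\,\mathrm{h^{-1}},\ \alpha = 0.01:\quad
T \ge \frac{\ln 100}{10^{-9}} \approx 4.6\times 10^{9}\ \mathrm{hours}
\approx 5\times 10^{5}\ \mathrm{years}.
```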
Statistical modeling of software reliability
NASA Technical Reports Server (NTRS)
Miller, Douglas R.
1992-01-01
This working paper discusses the statistical simulation part of a controlled software development experiment being conducted under the direction of the System Validation Methods Branch, Information Systems Division, NASA Langley Research Center. The experiment uses guidance and control software (GCS) aboard a fictitious planetary landing spacecraft: real-time control software operating on a transient mission. Software execution is simulated to study the statistical aspects of reliability and other failure characteristics of the software during development, testing, and random usage. Quantification of software reliability is a major goal. Various reliability concepts are discussed. Experiments are described for performing simulations and collecting appropriate simulated software performance and failure data. This data is then used to make statistical inferences about the quality of the software development and verification processes as well as inferences about the reliability of software versions and reliability growth under random testing and debugging.
pyQms enables universal and accurate quantification of mass spectrometry data.
Leufken, Johannes; Niehues, Anna; Sarin, L Peter; Wessel, Florian; Hippler, Michael; Leidel, Sebastian A; Fufezan, Christian
2017-10-01
Quantitative mass spectrometry (MS) is a key technique in many research areas (1), including proteomics, metabolomics, glycomics, and lipidomics. Because all of the corresponding molecules can be described by chemical formulas, universal quantification tools are highly desirable. Here, we present pyQms, an open-source software for accurate quantification of all types of molecules measurable by MS. pyQms uses isotope pattern matching, which offers an accurate quality assessment of all quantifications and the ability to directly incorporate mass spectrometer accuracy. pyQms is, due to its universal design, applicable to every research field, labeling strategy, and acquisition technique. This opens ultimate flexibility for researchers to design experiments employing innovative and hitherto unexplored labeling strategies. Importantly, pyQms performs very well in accurately quantifying partially labeled proteomes at large scale and high throughput, the most challenging task for a quantification algorithm. © 2017 by The American Society for Biochemistry and Molecular Biology, Inc.
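A toy version of isotope-pattern matching is sketched below: measured peaks are matched to a theoretical isotope envelope within an m/z tolerance and scored by cosine similarity. This is not pyQms's actual scoring scheme (which also incorporates machine accuracy), and all peak values and tolerances are hypothetical.

```python
import numpy as np

# Toy isotope-pattern matching: compare measured peak intensities against a
# theoretical isotope envelope within an m/z tolerance and report a
# cosine-similarity score. All values below are hypothetical placeholders.
theoretical = [(500.26, 1.00), (500.59, 0.55), (500.93, 0.18)]   # (m/z, rel. intensity)
measured = [(500.259, 8.2e5), (500.592, 4.3e5), (500.934, 1.6e5), (501.30, 2.0e4)]
tol = 5e-3   # m/z tolerance (hypothetical)

matched = []
for mz_t, _ in theoretical:
    candidates = [i for mz_m, i in measured if abs(mz_m - mz_t) <= tol]
    matched.append(max(candidates) if candidates else 0.0)

t = np.array([i for _, i in theoretical])
m = np.array(matched)
score = float(np.dot(t, m) / (np.linalg.norm(t) * np.linalg.norm(m))) if m.any() else 0.0
print(score)   # close to 1.0 for a well-matched isotope pattern
```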
A new semiquantitative method for evaluation of metastasis progression.
Volarevic, A; Ljujic, B; Volarevic, V; Milovanovic, M; Kanjevac, T; Lukic, A; Arsenijevic, N
2012-01-01
Although recent technical advancements are directed toward developing novel assays and methods for the detection of micro- and macrometastasis, there are still no reports of reliable, simple-to-use imaging software that could be used for the detection and quantification of metastasis in tissue sections. We herein report a new semiquantitative method for the evaluation of metastasis progression in a well-established 4T1 orthotopic mouse model of breast cancer metastasis. The new semiquantitative method presented here was implemented using the Autodesk AutoCAD 2012 program, a computer-aided design program used primarily for preparing technical drawings in two dimensions. By using the AutoCAD 2012 software-aided graphical evaluation, we managed to detect each metastatic lesion and precisely calculated the average percentage of lung and liver tissue parenchyma with metastasis in 4T1 tumor-bearing mice. The data were highly specific and relevant to descriptive histological analysis, confirming the reliability and accuracy of the AutoCAD 2012 software as a new method for the quantification of metastatic lesions. The new semiquantitative method using AutoCAD 2012 software provides a novel approach for the estimation of metastatic progression in histological tissue sections.
Rizk, Aurélien; Paul, Grégory; Incardona, Pietro; Bugarski, Milica; Mansouri, Maysam; Niemann, Axel; Ziegler, Urs; Berger, Philipp; Sbalzarini, Ivo F
2014-03-01
Detection and quantification of fluorescently labeled molecules in subcellular compartments is a key step in the analysis of many cell biological processes. Pixel-wise colocalization analyses, however, are not always suitable, because they do not provide object-specific information, and they are vulnerable to noise and background fluorescence. Here we present a versatile protocol for a method named 'Squassh' (segmentation and quantification of subcellular shapes), which is used for detecting, delineating and quantifying subcellular structures in fluorescence microscopy images. The workflow is implemented in freely available, user-friendly software. It works on both 2D and 3D images, accounts for the microscope optics and for uneven image background, computes cell masks and provides subpixel accuracy. The Squassh software enables both colocalization and shape analyses. The protocol can be applied in batch, on desktop computers or computer clusters, and it usually requires <1 min and <5 min for 2D and 3D images, respectively. Basic computer-user skills and some experience with fluorescence microscopy are recommended to successfully use the protocol.
An overview of the mathematical and statistical analysis component of RICIS
NASA Technical Reports Server (NTRS)
Hallum, Cecil R.
1987-01-01
Mathematical and statistical analysis components of RICIS (Research Institute for Computing and Information Systems) can be used in the following problem areas: (1) quantification and measurement of software reliability; (2) assessment of changes in software reliability over time (reliability growth); (3) analysis of software-failure data; and (4) decision logic for whether to continue or stop testing software. Other areas of interest to NASA/JSC where mathematical and statistical analysis can be successfully employed include: math modeling of physical systems, simulation, statistical data reduction, evaluation methods, optimization, algorithm development, and mathematical methods in signal processing.
UQTools: The Uncertainty Quantification Toolbox - Introduction and Tutorial
NASA Technical Reports Server (NTRS)
Kenny, Sean P.; Crespo, Luis G.; Giesy, Daniel P.
2012-01-01
UQTools is the short name for the Uncertainty Quantification Toolbox, a software package designed to efficiently quantify the impact of parametric uncertainty on engineering systems. UQTools is a MATLAB-based software package and was designed to be discipline independent, employing very generic representations of the system models and uncertainty. Specifically, UQTools accepts linear and nonlinear system models and permits arbitrary functional dependencies between the system's measures of interest and the probabilistic or non-probabilistic parametric uncertainty. One of the most significant features incorporated into UQTools is the theoretical development centered on homothetic deformations and their application to set bounding and approximating failure probabilities. Beyond the set bounding technique, UQTools provides a wide range of probabilistic and uncertainty-based tools to solve key problems in science and engineering.
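UQTools itself is MATLAB-based and relies on the homothetic-deformation machinery mentioned above; purely to illustrate what approximating a failure probability under parametric uncertainty means in the simplest case, the sketch below uses Monte Carlo sampling with made-up distributions and limits.

    import numpy as np

    rng = np.random.default_rng(0)
    n = 100_000

    # Hypothetical uncertain parameters of a second-order system: natural
    # frequency wn (rad/s) and damping ratio zeta, sampled from assumed distributions.
    wn = rng.normal(10.0, 0.5, n)
    zeta = rng.uniform(0.02, 0.08, n)

    # Hypothetical requirements: the resonant peak gain must stay below a limit
    # and the natural frequency must not drop below 9 rad/s.
    peak_gain = 1.0 / (2.0 * zeta * np.sqrt(1.0 - zeta**2))
    fail = (peak_gain > 15.0) | (wn < 9.0)

    print(f"Estimated failure probability: {fail.mean():.4f}")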
Wallot, Sebastian; Roepstorff, Andreas; Mønster, Dan
2016-01-01
We introduce Multidimensional Recurrence Quantification Analysis (MdRQA) as a tool to analyze multidimensional time-series data. We show how MdRQA can be used to capture the dynamics of high-dimensional signals, and how MdRQA can be used to assess coupling between two or more variables. In particular, we describe applications of the method in research on joint and collective action, as it provides a coherent analysis framework to systematically investigate dynamics at different group levels—from individual dynamics, to dyadic dynamics, up to global group-level of arbitrary size. The Appendix in Supplementary Material contains a software implementation in MATLAB to calculate MdRQA measures. PMID:27920748
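The MATLAB implementation in the Appendix is the authoritative one; as a rough illustration of the underlying computation, the sketch below builds a recurrence matrix for a multidimensional time series (using a Euclidean norm and an arbitrary radius, both choices made only for this example) and reports the recurrence rate.

    import numpy as np

    def mdrqa_recurrence_rate(x, radius):
        """x: (n_samples, n_dims) multidimensional time series.
        Returns the recurrence matrix and the recurrence rate (%REC)."""
        d = np.linalg.norm(x[:, None, :] - x[None, :, :], axis=-1)  # pairwise distances
        rec = (d <= radius).astype(int)                              # recurrence matrix
        n = len(x)
        # Exclude the main diagonal (trivial self-recurrences) from the rate.
        rr = (rec.sum() - n) / (n * n - n)
        return rec, rr

    # Hypothetical 3-dimensional group signal (e.g., three participants' data streams).
    rng = np.random.default_rng(1)
    t = np.linspace(0, 10, 200)
    signal = np.column_stack([np.sin(t), np.sin(t + 0.3), np.sin(t + 0.6)])
    signal += 0.05 * rng.standard_normal(signal.shape)

    rec, rr = mdrqa_recurrence_rate(signal, radius=0.2)
    print(f"Recurrence rate: {rr:.3f}")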
DOE Office of Scientific and Technical Information (OSTI.GOV)
Smidts, Carol; Huang, Funqun; Li, Boyuan
With the current transition from analog to digital instrumentation and control systems in nuclear power plants, the number and variety of software-based systems have significantly increased. The sophisticated nature and increasing complexity of software make trust in these systems a significant challenge. The trust placed in a software system is typically termed software dependability. Software dependability analysis faces uncommon challenges since software systems' characteristics differ from those of hardware systems. The lack of systematic science-based methods for quantifying the dependability attributes in software-based instrumentation as well as control systems in safety critical applications has proved itself to be a significant inhibitor to the expanded use of modern digital technology in the nuclear industry. Dependability refers to the ability of a system to deliver a service that can be trusted. Dependability is commonly considered as a general concept that encompasses different attributes, e.g., reliability, safety, security, availability and maintainability. Dependability research has progressed significantly over the last few decades. For example, various assessment models and/or design approaches have been proposed for software reliability, software availability and software maintainability. Advances have also been made to integrate multiple dependability attributes, e.g., integrating security with other dependability attributes, measuring availability and maintainability, modeling reliability and availability, quantifying reliability and security, exploring the dependencies between security and safety and developing integrated analysis models. However, there is still a lack of understanding of the dependencies between various dependability attributes as a whole and of how such dependencies are formed. To address the need for quantification and give a more objective basis to the review process -- therefore reducing regulatory uncertainty -- measures and methods are needed to assess dependability attributes early on, as well as throughout the life-cycle process of software development. In this research, extensive expert opinion elicitation is used to identify the measures and methods for assessing software dependability. Semi-structured questionnaires were designed to elicit expert knowledge. A new notation system, Causal Mechanism Graphing, was developed to extract and represent such knowledge. The Causal Mechanism Graphs were merged, thus obtaining the consensus knowledge shared by the domain experts. In this report, we focus on how software contributes to dependability. However, software dependability is not discussed separately from the context of systems or socio-technical systems. Specifically, this report focuses on software dependability, reliability, safety, security, availability, and maintainability. Our research was conducted in the sequence of stages found below. Each stage is further examined in its corresponding chapter. Stage 1 (Chapter 2): Elicitation of causal maps describing the dependencies between dependability attributes. These causal maps were constructed using expert opinion elicitation. This chapter describes the expert opinion elicitation process, the questionnaire design, the causal map construction method and the causal maps obtained. Stage 2 (Chapter 3): Elicitation of the causal map describing the occurrence of the event of interest for each dependability attribute.
The causal mechanisms for the “event of interest” were extracted for each of the software dependability attributes. The “event of interest” for a dependability attribute is generally considered to be the “attribute failure”, e.g. security failure. The extraction was based on the analysis of expert elicitation results obtained in Stage 1. Stage 3 (Chapter 4): Identification of relevant measurements. Measures for the “events of interest” and their causal mechanisms were obtained from expert opinion elicitation for each of the software dependability attributes. The measures extracted are presented in this chapter. Stage 4 (Chapter 5): Assessment of the coverage of the causal maps via measures. Coverage was assessed to determine whether the measures obtained were sufficient to quantify software dependability, and what measures are further required. Stage 5 (Chapter 6): Identification of “missing” measures and measurement approaches for concepts not covered. New measures, for concepts that had not been covered sufficiently as determined in Stage 4, were identified using supplementary expert opinion elicitation as well as literature reviews. Stage 6 (Chapter 7): Building of a detailed quantification model based on the causal maps and measurements obtained. Ability to derive such a quantification model shows that the causal models and measurements derived from the previous stages (Stage 1 to Stage 5) can form the technical basis for developing dependability quantification models. Scope restrictions have led us to prioritize this demonstration effort. The demonstration was focused on a critical system, i.e. the reactor protection system. For this system, a ranking of the software dependability attributes by nuclear stakeholders was developed. As expected for this application, the stakeholder ranking identified safety as the most critical attribute to be quantified. A safety quantification model limited to the requirements phase of development was built. Two case studies were conducted for verification. A preliminary control gate for software safety for the requirements stage was proposed and applied to the first case study. The control gate allows a cost-effective selection of the duration of the requirements phase.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Parreiras Nogueira, Liebert; Barroso, Regina Cely; Pereira de Almeida, Andre
2012-05-17
This work aims to evaluate histomorphometric quantification by synchrotron radiation computed microtomography in bones of human and rat specimens. Bone specimens are classified as normal and pathological (for human samples) and irradiated and non-irradiated (for rat ones). Human bones are specimens that either were or were not affected by some injury; rat bones are specimens that either were or were not irradiated, simulating radiotherapy procedures. Images were obtained on the SYRMEP beamline at the Elettra Synchrotron Laboratory in Trieste, Italy. The system generated 14 μm tomographic images. The quantification of bone structures was performed directly on the 3D rendered images using home-made software. The resolution achieved was excellent, which facilitated the quantification of bone microstructures.
Technical advances in proteomics: new developments in data-independent acquisition.
Hu, Alex; Noble, William S; Wolf-Yadlin, Alejandro
2016-01-01
The ultimate aim of proteomics is to fully identify and quantify the entire complement of proteins and post-translational modifications in biological samples of interest. For the last 15 years, liquid chromatography-tandem mass spectrometry (LC-MS/MS) in data-dependent acquisition (DDA) mode has been the standard for proteomics when sampling breadth and discovery were the main objectives; multiple reaction monitoring (MRM) LC-MS/MS has been the standard for targeted proteomics when precise quantification, reproducibility, and validation were the main objectives. Recently, improvements in mass spectrometer design and bioinformatics algorithms have resulted in the rediscovery and development of another sampling method: data-independent acquisition (DIA). DIA comprehensively and repeatedly samples every peptide in a protein digest, producing a complex set of mass spectra that is difficult to interpret without external spectral libraries. Currently, DIA approaches the identification breadth of DDA while achieving the reproducible quantification characteristic of MRM or its newest version, parallel reaction monitoring (PRM). In comparative de novo identification and quantification studies in human cell lysates, DIA identified up to 89% of the proteins detected in a comparable DDA experiment while providing reproducible quantification of over 85% of them. DIA analysis aided by spectral libraries derived from prior DIA experiments or auxiliary DDA data produces identification and quantification as reproducible and precise as that achieved by MRM/PRM, except on low‑abundance peptides that are obscured by stronger signals. DIA is still a work in progress toward the goal of sensitive, reproducible, and precise quantification without external spectral libraries. New software tools applied to DIA analysis have to deal with deconvolution of complex spectra as well as proper filtering of false positives and false negatives. However, the future outlook is positive, and various researchers are working on novel bioinformatics techniques to address these issues and increase the reproducibility, fidelity, and identification breadth of DIA.
UROKIN: A Software to Enhance Our Understanding of Urogenital Motion.
Czyrnyj, Catriona S; Labrosse, Michel R; Graham, Ryan B; McLean, Linda
2018-05-01
Transperineal ultrasound (TPUS) allows for objective quantification of mid-sagittal urogenital mechanics, yet current practice omits dynamic motion information in favor of analyzing only a rest and a peak motion frame. This work details the development of UROKIN, semi-automated software that calculates kinematic curves of urogenital landmark motion. A proof-of-concept analysis was performed using UROKIN on TPUS videos recorded from 20 women with and 10 women without stress urinary incontinence (SUI) as they performed maximum voluntary contractions of the pelvic floor muscles. The anorectal angle and bladder neck were tracked, while the motion of the pubic symphysis was used to compensate for the error incurred by TPUS probe motion during imaging. Kinematic curves of landmark motion were generated for each video, and curves were smoothed, time-normalized, and averaged within groups. Kinematic data yielded by the UROKIN software showed statistically significant differences between women with and without SUI in terms of magnitude and timing characteristics of the kinematic curves depicting landmark motion. Results provide insight into the ways in which UROKIN may be useful to study differences in pelvic floor muscle contraction mechanics between women with and without SUI and other pelvic floor disorders. The UROKIN software improves on methods described in the literature and provides unique capacity to further our understanding of urogenital biomechanics.
Conklin, Emily E; Lee, Kathyann L; Schlabach, Sadie A; Woods, Ian G
2015-01-01
Differences in nervous system function can result in differences in behavioral output. Measurements of animal locomotion enable the quantification of these differences. Automated tracking of animal movement is less labor-intensive and bias-prone than direct observation, and allows for simultaneous analysis of multiple animals, high spatial and temporal resolution, and data collection over extended periods of time. Here, we present a new video-tracking system built on Python-based software that is free, open source, and cross-platform, and that can analyze video input from widely available video capture devices such as smartphone cameras and webcams. We validated this software through four tests on a variety of animal species, including larval and adult zebrafish (Danio rerio), Siberian dwarf hamsters (Phodopus sungorus), and wild birds. These tests highlight the capacity of our software for long-term data acquisition, parallel analysis of multiple animals, and application to animal species of different sizes and movement patterns. We applied the software to an analysis of the effects of ethanol on thigmotaxis (wall-hugging) behavior on adult zebrafish, and found that acute ethanol treatment decreased thigmotaxis behaviors without affecting overall amounts of motion. The open source nature of our software enables flexibility, customization, and scalability in behavioral analyses. Moreover, our system presents a free alternative to commercial video-tracking systems and is thus broadly applicable to a wide variety of educational settings and research programs.
ERIC Educational Resources Information Center
Rushinek, Avi; Rushinek, Sara
1984-01-01
Describes results of a system rating study in which users responded to WPS (word processing software) questions. Study objectives were data collection and evaluation of variables; statistical quantification of WPS's contribution (along with other variables) to user satisfaction; design of an expert system to evaluate WPS; and database update and…
IsobariQ: software for isobaric quantitative proteomics using IPTL, iTRAQ, and TMT.
Arntzen, Magnus Ø; Koehler, Christian J; Barsnes, Harald; Berven, Frode S; Treumann, Achim; Thiede, Bernd
2011-02-04
Isobaric peptide labeling plays an important role in relative quantitative comparisons of proteomes. Isobaric labeling techniques utilize MS/MS spectra for relative quantification, which can be either based on the relative intensities of reporter ions in the low mass region (iTRAQ and TMT) or on the relative intensities of quantification signatures throughout the spectrum due to isobaric peptide termini labeling (IPTL). Due to the increased quantitative information found in MS/MS fragment spectra generated by the recently developed IPTL approach, new software was required to extract the quantitative information. IsobariQ was specifically developed for this purpose; however, support for the reporter ion techniques iTRAQ and TMT is also included. In addition, to address recently emphasized issues about heterogeneity of variance in proteomics data sets, IsobariQ employs the statistical software package R and variance stabilizing normalization (VSN) algorithms available therein. Finally, the functionality of IsobariQ is validated with data sets of experiments using 6-plex TMT and IPTL. Notably, protein substrates resulting from cleavage by proteases can be identified as shown for caspase targets in apoptosis.
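IsobariQ's own pipeline adds VSN normalization in R and is not reproduced here; just to make the reporter-ion idea concrete, the sketch below extracts 6-plex reporter intensities from a single hypothetical MS/MS peak list and forms ratios against the first channel (the reporter m/z values are nominal, and the tolerance and peaks are invented).

    import numpy as np

    # Nominal 6-plex reporter m/z values (illustrative; real channel masses differ slightly).
    REPORTERS = np.array([126.13, 127.13, 128.13, 129.14, 130.14, 131.14])

    def reporter_ratios(mz, intensity, tol=0.01):
        """Pick the most intense peak within +/- tol of each reporter m/z and
        return intensities and ratios relative to the first channel."""
        values = []
        for r in REPORTERS:
            sel = np.abs(mz - r) <= tol
            values.append(intensity[sel].max() if sel.any() else 0.0)
        values = np.array(values)
        ratios = values / values[0] if values[0] > 0 else np.full_like(values, np.nan)
        return values, ratios

    # Hypothetical low-mass region of one MS/MS spectrum.
    mz        = np.array([126.128, 127.131, 128.135, 129.139, 130.141, 131.139, 132.5])
    intensity = np.array([1.0e5,   0.9e5,   1.4e5,   1.1e5,   0.4e5,   0.8e5,   1.0e3])

    vals, ratios = reporter_ratios(mz, intensity)
    print(np.round(ratios, 2))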
Sheng, Quanhu; Li, Rongxia; Dai, Jie; Li, Qingrun; Su, Zhiduan; Guo, Yan; Li, Chen; Shyr, Yu; Zeng, Rong
2015-01-01
Isobaric labeling techniques coupled with high-resolution mass spectrometry have been widely employed in proteomic workflows requiring relative quantification. For each high-resolution tandem mass spectrum (MS/MS), isobaric labeling techniques can be used not only to quantify the peptide from different samples by reporter ions, but also to identify the peptide it is derived from. Because the ions related to isobaric labeling may act as noise in database searching, the MS/MS spectrum should be preprocessed before peptide or protein identification. In this article, we demonstrate that there are a lot of high-frequency, high-abundance isobaric related ions in the MS/MS spectrum, and removing isobaric related ions combined with deisotoping and deconvolution in MS/MS preprocessing procedures significantly improves the peptide/protein identification sensitivity. The user-friendly software package TurboRaw2MGF (v2.0) has been implemented for converting raw TIC data files to mascot generic format files and can be downloaded for free from https://github.com/shengqh/RCPA.Tools/releases as part of the software suite ProteomicsTools. The data have been deposited to the ProteomeXchange with identifier PXD000994. PMID:25435543
Design and validation of Segment--freely available software for cardiovascular image analysis.
Heiberg, Einar; Sjögren, Jane; Ugander, Martin; Carlsson, Marcus; Engblom, Henrik; Arheden, Håkan
2010-01-11
Commercially available software for cardiovascular image analysis often has limited functionality and frequently lacks the careful validation that is required for clinical studies. We have already implemented a cardiovascular image analysis software package and released it as freeware for the research community. However, it was distributed as a stand-alone application and other researchers could not extend it by writing their own custom image analysis algorithms. We believe that the work required to make a clinically applicable prototype can be reduced by making the software extensible, so that researchers can develop their own modules or improvements. Such an initiative might then serve as a bridge between image analysis research and cardiovascular research. The aim of this article is therefore to present the design and validation of a cardiovascular image analysis software package (Segment) and to announce its release in a source code format. Segment can be used for image analysis in magnetic resonance imaging (MRI), computed tomography (CT), single photon emission computed tomography (SPECT) and positron emission tomography (PET). Some of its main features include loading of DICOM images from all major scanner vendors, simultaneous display of multiple image stacks and plane intersections, automated segmentation of the left ventricle, quantification of MRI flow, tools for manual and general object segmentation, quantitative regional wall motion analysis, myocardial viability analysis and image fusion tools. Here we present an overview of the validation results and validation procedures for the functionality of the software. We describe a technique to ensure continued accuracy and validity of the software by implementing and using a test script that tests the functionality of the software and validates the output. The software has been made freely available for research purposes in a source code format on the project home page http://segment.heiberg.se. Segment is a well-validated comprehensive software package for cardiovascular image analysis. It is freely available for research purposes provided that relevant original research publications related to the software are cited.
David, R.; Stoessel, A.; Berthoz, A.; Spoor, F.; Bennequin, D.
2016-01-01
The semicircular duct system is part of the sensory organ of balance and essential for navigation and spatial awareness in vertebrates. Its function in detecting head rotations has been modelled with increasing sophistication, but the biomechanics of actual semicircular duct systems has rarely been analyzed, foremost because the fragile membranous structures in the inner ear are hard to visualize undistorted and in full. Here we present a new, easy-to-apply and non-invasive method for three-dimensional in-situ visualization and quantification of the semicircular duct system, using X-ray micro tomography and tissue staining with phosphotungstic acid. Moreover, we introduce Ariadne, a software toolbox which provides comprehensive and improved morphological and functional analysis of any visualized duct system. We demonstrate the potential of these methods by presenting results for the duct system of humans, the squirrel monkey and the rhesus macaque, making comparisons with past results from neurophysiological, oculometric and biomechanical studies. Ariadne is freely available at http://www.earbank.org. PMID:27604473
NASA Astrophysics Data System (ADS)
Holmes, David W.; Sheehan, Madoc; Birks, Melanie; Smithson, John
2018-01-01
Mapping the curriculum of a professional degree to the associated competency standard ensures graduates have the competence to perform as professionals. Existing approaches to competence mapping vary greatly in depth, complexity, and effectiveness, and a standardised approach remains elusive. This paper describes a new mapping software tool that streamlines and standardises the competency mapping process. The available analytics facilitate ongoing programme review, management, and accreditation. The complete mapping and analysis of an Australian mechanical engineering degree programme is described as a case study. Each subject is mapped by evaluating the amount and depth of competence development present. Combining subject results then enables highly detailed programme level analysis. The mapping process is designed to be administratively light, with aspects of professional development embedded in the software. The effective competence mapping described in this paper enables quantification of learning within a professional degree programme, and provides a mechanism for holistic programme improvement.
Quantitative CT: technique dependence of volume estimation on pulmonary nodules
NASA Astrophysics Data System (ADS)
Chen, Baiyu; Barnhart, Huiman; Richard, Samuel; Colsher, James; Amurao, Maxwell; Samei, Ehsan
2012-03-01
Current estimation of lung nodule size typically relies on uni- or bi-dimensional techniques. While new three-dimensional volume estimation techniques using MDCT have improved size estimation of nodules with irregular shapes, the effect of acquisition and reconstruction parameters on accuracy (bias) and precision (variance) of the new techniques has not been fully investigated. To characterize the volume estimation performance dependence on these parameters, an anthropomorphic chest phantom containing synthetic nodules was scanned and reconstructed with protocols across various acquisition and reconstruction parameters. Nodule volumes were estimated by a clinical lung analysis software package, LungVCAR. Precision and accuracy of the volume assessment were calculated across the nodules and compared between protocols via a generalized estimating equation analysis. Results showed that the precision and accuracy of nodule volume quantifications were dependent on slice thickness, with different dependences for different nodule characteristics. Other parameters including kVp, pitch, and reconstruction kernel had lower impact. Determining these technique dependences enables better volume quantification via protocol optimization and highlights the importance of consistent imaging parameters in sequential examinations.
Martinon, Alice; Cronin, Ultan P; Wilkinson, Martin G
2012-01-01
In this article, four types of standards were assessed in a SYBR Green-based real-time PCR procedure for the quantification of Staphylococcus aureus (S. aureus) in DNA samples. The standards were purified S. aureus genomic DNA (type A), circular plasmid DNA containing a thermonuclease (nuc) gene fragment (type B), DNA extracted from defined populations of S. aureus cells generated by Fluorescence Activated Cell Sorting (FACS) technology with (type C) or without purification of DNA by boiling (type D). The optimal efficiency of 2.016 was obtained on Roche LightCycler(®) 4.1. software for type C standards, whereas the lowest efficiency (1.682) corresponded to type D standards. Type C standards appeared to be more suitable for quantitative real-time PCR because of the use of defined populations for construction of standard curves. Overall, Fieller Confidence Interval algorithm may be improved for replicates having a low standard deviation in Cycle Threshold values such as found for type B and C standards. Stabilities of diluted PCR standards stored at -20°C were compared after 0, 7, 14 and 30 days and were lower for type A or C standards compared with type B standards. However, FACS generated standards may be useful for bacterial quantification in real-time PCR assays once optimal storage and temperature conditions are defined.
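Independent of the specific standard types compared in this study, the way such standard curves convert Ct values into copy numbers can be sketched as follows: fit Ct against log10 of the known standard concentrations, derive the amplification efficiency from the slope, and interpolate unknowns. All numbers below are invented.

    import numpy as np

    # Hypothetical standard curve: known S. aureus genome copies and measured Ct.
    copies = np.array([1e6, 1e5, 1e4, 1e3, 1e2])
    ct     = np.array([15.1, 18.5, 21.9, 25.4, 28.8])

    # Linear fit of Ct versus log10(copies).
    slope, intercept = np.polyfit(np.log10(copies), ct, 1)

    # Amplification efficiency: E = 10^(-1/slope); 2.0 corresponds to perfect doubling.
    efficiency = 10 ** (-1.0 / slope)

    # Quantify an unknown sample from its Ct value.
    ct_unknown = 23.0
    copies_unknown = 10 ** ((ct_unknown - intercept) / slope)

    print(f"slope = {slope:.2f}, efficiency = {efficiency:.3f}")
    print(f"Estimated copies in unknown: {copies_unknown:.0f}")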
Babar, Muhammad Imran; Ghazali, Masitah; Jawawi, Dayang N. A.; Zaheer, Kashif Bin
2015-01-01
Value-based requirements engineering plays a vital role in the development of value-based software (VBS). Stakeholders are the key players in the requirements engineering process, and the selection of critical stakeholders for the VBS systems is highly desirable. Based on the stakeholder requirements, the innovative or value-based idea is realized. The quality of the VBS system is associated with the concrete set of valuable requirements, and the valuable requirements can only be obtained if all the relevant valuable stakeholders participate in the requirements elicitation phase. The existing value-based approaches focus on the design of the VBS systems. However, the focus on the valuable stakeholders and requirements is inadequate. The current stakeholder identification and quantification (SIQ) approaches are neither state-of-the-art nor systematic for the VBS systems. The existing approaches are time-consuming, complex and inconsistent which makes the initiation process difficult. Moreover, the main motivation of this research is that the existing SIQ approaches do not provide the low level implementation details for SIQ initiation and stakeholder metrics for quantification. Hence, keeping in view the existing SIQ problems, this research contributes in the form of a new SIQ framework called ‘StakeMeter’. The StakeMeter framework is verified and validated through case studies. The proposed framework provides low-level implementation guidelines, attributes, metrics, quantification criteria and application procedure as compared to the other methods. The proposed framework solves the issues of stakeholder quantification or prioritization, higher time consumption, complexity, and process initiation. The framework helps in the selection of highly critical stakeholders for the VBS systems with less judgmental error. PMID:25799490
Murugaiyan, Jayaseelan; Eravci, Murat; Weise, Christoph; Roesler, Uwe
2017-06-01
Here, we provide the dataset associated with our research article 'Label-free quantitative proteomic analysis of harmless and pathogenic strains of infectious microalgae, Prototheca spp.' (Murugaiyan et al., 2017) [1]. This dataset describes liquid chromatography-mass spectrometry (LC-MS)-based protein identification and quantification of a non-infectious strain, Prototheca zopfii genotype 1, and two strains associated with severe and mild infections, respectively, P. zopfii genotype 2 and Prototheca blaschkeae. Protein identification and label-free quantification were carried out by analysing MS raw data using the MaxQuant-Andromeda software suite. The expression-level differences of the identified proteins among the strains were computed using Perseus software and the results were presented in [1]. This DiB provides the MaxQuant output file and raw data deposited in the PRIDE repository with the dataset identifier PXD005305.
AGScan: a pluggable microarray image quantification software based on the ImageJ library.
Cathelin, R; Lopez, F; Klopp, Ch
2007-01-15
Many different programs are available to analyze microarray images. Most programs are commercial packages, some are free. In the latter group only few propose automatic grid alignment and batch mode. More often than not a program implements only one quantification algorithm. AGScan is an open source program that works on all major platforms. It is based on the ImageJ library [Rasband (1997-2006)] and offers a plug-in extension system to add new functions to manipulate images, align grid and quantify spots. It is appropriate for daily laboratory use and also as a framework for new algorithms. The program is freely distributed under X11 Licence. The install instructions can be found in the user manual. The software can be downloaded from http://mulcyber.toulouse.inra.fr/projects/agscan/. The questions and plug-ins can be sent to the contact listed below.
Mavar-Haramija, Marija; Prats-Galino, Alberto; Méndez, Juan A Juanes; Puigdelívoll-Sánchez, Anna; de Notaris, Matteo
2015-10-01
A three-dimensional (3D) model of the skull base was reconstructed from the pre- and post-dissection head CT images and embedded in a Portable Document Format (PDF) file, which can be opened by freely available software and used offline. The CT images were segmented using a specific 3D software platform for biomedical data, and the resulting 3D geometrical models of anatomical structures were used for dual purpose: to simulate the extended endoscopic endonasal transsphenoidal approaches and to perform the quantitative analysis of the procedures. The analysis consisted of bone removal quantification and the calculation of quantitative parameters (surgical freedom and exposure area) of each procedure. The results are presented in three PDF documents containing JavaScript-based functions. The 3D-PDF files include reconstructions of the nasal structures (nasal septum, vomer, middle turbinates), the bony structures of the anterior skull base and maxillofacial region and partial reconstructions of the optic nerve, the hypoglossal and vidian canals and the internal carotid arteries. Alongside the anatomical model, axial, sagittal and coronal CT images are shown. Interactive 3D presentations were created to explain the surgery and the associated quantification methods step-by-step. The resulting 3D-PDF files allow the user to interact with the model through easily available software, free of charge and in an intuitive manner. The files are available for offline use on a personal computer and no previous specialized knowledge in informatics is required. The documents can be downloaded at http://hdl.handle.net/2445/55224 .
EasyLCMS: an asynchronous web application for the automated quantification of LC-MS data
2012-01-01
Background Downstream applications in metabolomics, as well as mathematical modelling, require data in a quantitative format, which may also necessitate the automated and simultaneous quantification of numerous metabolites. Although numerous applications have been previously developed for metabolomics data handling, automated calibration and calculation of the concentrations in terms of μmol have not been carried out. Moreover, most of the metabolomics applications are designed for GC-MS, and would not be suitable for LC-MS, since in LC, the deviation in the retention time is not linear, which is not taken into account in these applications. Moreover, only a few are web-based applications, which could improve stand-alone software in terms of compatibility, sharing capabilities and hardware requirements, even though a strong bandwidth is required. Furthermore, none of these incorporate asynchronous communication to allow real-time interaction with pre-processed results. Findings Here, we present EasyLCMS (http://www.easylcms.es/), a new application for automated quantification which was validated using more than 1000 concentration comparisons in real samples with manual operation. The results showed that only 1% of the quantifications presented a relative error higher than 15%. Using clustering analysis, the metabolites with the highest relative error distributions were identified and studied to solve recurrent mistakes. Conclusions EasyLCMS is a new web application designed to quantify numerous metabolites, simultaneously integrating LC distortions and asynchronous web technology to present a visual interface with dynamic interaction which allows checking and correction of LC-MS raw data pre-processing results. Moreover, quantified data obtained with EasyLCMS are fully compatible with numerous downstream applications, as well as for mathematical modelling in the systems biology field. PMID:22884039
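EasyLCMS performs calibration and retention-time handling internally; the essential quantification step it automates, converting a peak area into a concentration via an external calibration curve, reduces to something like the sketch below, with invented areas and standard concentrations.

    import numpy as np

    # Hypothetical calibration standards for one metabolite: known concentration
    # (umol/L) versus integrated LC-MS peak area.
    conc_std = np.array([5.0, 10.0, 25.0, 50.0, 100.0])
    area_std = np.array([1.2e4, 2.5e4, 6.1e4, 1.22e5, 2.45e5])

    # Ordinary least-squares line through the calibration points.
    slope, intercept = np.polyfit(conc_std, area_std, 1)

    def quantify(area):
        """Convert a measured peak area into a concentration (umol/L)."""
        return (area - intercept) / slope

    sample_area = 8.0e4
    print(f"Estimated concentration: {quantify(sample_area):.1f} umol/L")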
Tutorial examples for uncertainty quantification methods.
DOE Office of Scientific and Technical Information (OSTI.GOV)
De Bord, Sarah
2015-08-01
This report details the work accomplished during my 2015 SULI summer internship at Sandia National Laboratories in Livermore, CA. During this internship, I worked on multiple tasks with the common goal of making uncertainty quantification (UQ) methods more accessible to the general scientific community. As part of my work, I created a comprehensive numerical integration example to incorporate into the user manual of a UQ software package. Further, I developed examples involving heat transfer through a window to incorporate into tutorial lectures that serve as an introduction to UQ methods.
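The report's own window example is not reproduced in this record; as a generic stand-in for that style of tutorial, the sketch below propagates uncertainty in glass conductivity and pane thickness through a steady-state conduction formula by Monte Carlo sampling, with hypothetical distributions.

    import numpy as np

    rng = np.random.default_rng(42)
    n = 50_000

    # Uncertain inputs (hypothetical distributions).
    k = rng.normal(0.96, 0.05, n)          # glass conductivity, W/(m K)
    L = rng.normal(0.004, 0.0002, n)       # pane thickness, m
    area = 1.2                             # m^2, treated as known
    dT = 20.0                              # indoor-outdoor temperature difference, K

    # Steady-state conduction through a single pane: Q = k * A * dT / L.
    Q = k * area * dT / L

    print(f"Mean heat flow: {Q.mean():.0f} W")
    print(f"Std deviation:  {Q.std():.0f} W")
    print(f"95th percentile: {np.percentile(Q, 95):.0f} W")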
MIXING QUANTIFICATION BY VISUAL IMAGING ANALYSIS
This paper reports on development of a method for quantifying two measures of mixing, the scale and intensity of segregation, through flow visualization, video recording, and software analysis. This non-intrusive method analyzes a planar cross section of a flowing system from an ...
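The paper's visualization pipeline is not detailed in this record; a very rough illustration of one of the two measures it mentions, the intensity of segregation computed from a gray-level concentration field, might look like the following (the image and its normalization are invented).

    import numpy as np

    def intensity_of_segregation(c):
        """c: 2D array of local concentrations scaled to [0, 1].
        Returns the variance of c normalized by the maximum possible variance
        for the same mean, so 0 = perfectly mixed and 1 = fully segregated."""
        mean = c.mean()
        max_var = mean * (1.0 - mean)
        return c.var() / max_var if max_var > 0 else 0.0

    # Hypothetical planar cross-section: left half dye-rich, right half dye-poor,
    # with some noise to mimic partial mixing.
    img = np.zeros((100, 100))
    img[:, :50] = 1.0
    img += 0.1 * np.random.default_rng(3).standard_normal(img.shape)
    img = np.clip(img, 0.0, 1.0)

    print(f"Intensity of segregation: {intensity_of_segregation(img):.2f}")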
Aquila, Iolanda; González, Ariana; Fernández-Golfín, Covadonga; Rincón, Luis Miguel; Casas, Eduardo; García, Ana; Hinojar, Rocio; Jiménez-Nacher, José Julio; Zamorano, José Luis
2016-05-17
3D transesophageal echocardiography (TEE) is superior to 2D TEE in quantitative anatomic evaluation of the mitral valve (MV) but it shows limitations regarding automatic quantification. Here, we tested the inter-/intra-observer reproducibility of a novel fully automated software package in the evaluation of MV anatomy compared to manual 3D assessment. Thirty-six out of 61 screened patients referred to our Cardiac Imaging Unit for TEE were retrospectively included. 3D TEE analysis was performed both manually and with the automated software by two independent operators. Mitral annular area, intercommissural distance, anterior leaflet length and posterior leaflet length were assessed. A significant correlation between both methods was found for all variables: intercommissural diameter (r = 0.84, p < 0.01), mitral annular area (r = 0.94, p < 0.01), anterior leaflet length (r = 0.83, p < 0.01) and posterior leaflet length (r = 0.67, p < 0.01). Interobserver variability assessed by the intraclass correlation coefficient was superior for the automatic software: intercommissural distance 0.997 vs. 0.76; mitral annular area 0.957 vs. 0.858; anterior leaflet length 0.963 vs. 0.734 and posterior leaflet length 0.936 vs. 0.838. Intraobserver variability was good for both methods, with a better level of agreement with the automatic software. The novel 3D automated software is reproducible in MV anatomy assessment. The incorporation of this new tool in clinical MV assessment may improve patient selection and outcomes for MV interventions as well as patient diagnosis and prognosis stratification. Yet, high-quality 3D images are indispensable.
Peridigm summary report : lessons learned in development with agile components.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Salinger, Andrew Gerhard; Mitchell, John Anthony; Littlewood, David John
2011-09-01
This report details efforts to deploy Agile Components for rapid development of a peridynamics code, Peridigm. The goal of Agile Components is to enable the efficient development of production-quality software by providing a well-defined, unifying interface to a powerful set of component-based software. Specifically, Agile Components facilitate interoperability among packages within the Trilinos Project, including data management, time integration, uncertainty quantification, and optimization. Development of the Peridigm code served as a testbed for Agile Components and resulted in a number of recommendations for future development. Agile Components successfully enabled rapid integration of Trilinos packages into Peridigm. A cost of this approach, however, was a set of restrictions on Peridigm's architecture which impacted the ability to track history-dependent material data, dynamically modify the model discretization, and interject user-defined routines into the time integration algorithm. These restrictions resulted in modifications to the Agile Components approach, as implemented in Peridigm, and in a set of recommendations for future Agile Components development. Specific recommendations include improved handling of material states, a more flexible flow control model, and improved documentation. A demonstration mini-application, SimpleODE, was developed at the onset of this project and is offered as a potential supplement to Agile Components documentation.
Beyond the proteome: Mass Spectrometry Special Interest Group (MS-SIG) at ISMB/ECCB 2013
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ryu, Soyoung; Payne, Samuel H.; Schaab, Christoph
2014-07-02
Mass spectrometry special interest group (MS-SIG) aims to bring together experts from the global research community to discuss highlights and challenges in the field of mass spectrometry (MS)-based proteomics and computational biology. The rapid technological developments in MS-based proteomics have enabled the generation of a large amount of meaningful information on hundreds to thousands of proteins simultaneously from a biological sample; however, the complexity of the MS data requires sophisticated computational algorithms and software for data analysis and interpretation. This year’s MS-SIG meeting theme was ‘Beyond the Proteome’ with major focuses on improving protein identification/quantification and using proteomics data to solve interesting problems in systems biology and clinical research.
DynamicRoots: A Software Platform for the Reconstruction and Analysis of Growing Plant Roots.
Symonova, Olga; Topp, Christopher N; Edelsbrunner, Herbert
2015-01-01
We present a software platform for reconstructing and analyzing the growth of a plant root system from a time-series of 3D voxelized shapes. It aligns the shapes with each other, constructs a geometric graph representation together with the function that records the time of growth, and organizes the branches into a hierarchy that reflects the order of creation. The software includes the automatic computation of structural and dynamic traits for each root in the system, enabling the quantification of growth at a fine scale. These are important advances in plant phenotyping with applications to the study of genetic and environmental influences on growth.
Automatic Identification and Quantification of Extra-Well Fluorescence in Microarray Images.
Rivera, Robert; Wang, Jie; Yu, Xiaobo; Demirkan, Gokhan; Hopper, Marika; Bian, Xiaofang; Tahsin, Tasnia; Magee, D Mitchell; Qiu, Ji; LaBaer, Joshua; Wallstrom, Garrick
2017-11-03
In recent studies involving NAPPA microarrays, extra-well fluorescence is used as a key measure for identifying disease biomarkers because evidence suggests it correlates better with strong antibody responses than does statistical analysis of intraspot intensity. Because this feature is not well quantified by traditional image analysis software, identification and quantification of extra-well fluorescence are performed manually, which is both time-consuming and highly susceptible to variation between raters. A system that could automate this task efficiently and effectively would greatly improve the process of data acquisition in microarray studies, thereby accelerating the discovery of disease biomarkers. In this study, we experimented with different machine learning methods, as well as novel heuristics, for identifying spots exhibiting extra-well fluorescence (rings) in microarray images and assigning each ring a grade of 1-5 based on its intensity and morphology. The sensitivity of our final system for identifying rings was found to be 72% at 99% specificity and 98% at 92% specificity. Our system performs this task significantly faster than a human, while maintaining high performance, and therefore represents a valuable tool for microarray image analysis.
An online database for plant image analysis software tools.
Lobet, Guillaume; Draye, Xavier; Périlleux, Claire
2013-10-09
Recent years have seen an increase in methods for plant phenotyping using image analyses. These methods require new software solutions for data extraction and treatment. These solutions are instrumental in supporting various research pipelines, ranging from the localisation of cellular compounds to the quantification of tree canopies. However, due to the variety of existing tools and the lack of a central repository, it is challenging for researchers to identify the software that is best suited for their research. We present an online, manually curated database referencing more than 90 plant image analysis software solutions. The website, plant-image-analysis.org, presents each software solution in a uniform and concise manner, enabling users to identify the available solutions for their experimental needs. The website also enables user feedback, evaluations and new software submissions. The plant-image-analysis.org database provides an overview of existing plant image analysis software. The aim of such a toolbox is to help users to find solutions, and to provide developers with a way to exchange and communicate about their work.
Courcelles, Mathieu; Coulombe-Huntington, Jasmin; Cossette, Émilie; Gingras, Anne-Claude; Thibault, Pierre; Tyers, Mike
2017-07-07
Protein cross-linking mass spectrometry (CL-MS) enables the sensitive detection of protein interactions and the inference of protein complex topology. The detection of chemical cross-links between protein residues can identify intra- and interprotein contact sites or provide physical constraints for molecular modeling of protein structure. Recent innovations in cross-linker design, sample preparation, mass spectrometry, and software tools have significantly improved CL-MS approaches. Although a number of algorithms now exist for the identification of cross-linked peptides from mass spectral data, a dearth of user-friendly analysis tools represents a practical bottleneck to the broad adoption of the approach. To facilitate the analysis of CL-MS data, we developed CLMSVault, a software suite designed to leverage existing CL-MS algorithms and provide intuitive and flexible tools for cross-platform data interpretation. CLMSVault stores and combines complementary information obtained from different cross-linkers and search algorithms. CLMSVault provides filtering, comparison, and visualization tools to support CL-MS analyses and includes a workflow for label-free quantification of cross-linked peptides. An embedded 3D viewer enables the visualization of quantitative data and the mapping of cross-linked sites onto PDB structural models. We demonstrate the application of CLMSVault for the analysis of a noncovalent Cdc34-ubiquitin protein complex cross-linked under different conditions. CLMSVault is open-source software (available at https://gitlab.com/courcelm/clmsvault.git), and a live demo is available at http://democlmsvault.tyerslab.com/.
Ahm, Malte; Thorndahl, Søren; Nielsen, Jesper E; Rasmussen, Michael R
2016-12-01
Combined sewer overflow (CSO) structures are constructed to effectively discharge excess water during heavy rainfall, to protect the urban drainage system from hydraulic overload. Consequently, most CSO structures are not constructed according to basic hydraulic principles for ideal measurement weirs. It can, therefore, be a challenge to quantify the discharges from CSOs. Quantification of CSO discharges is important in relation to the increased environmental awareness of the receiving water bodies. Furthermore, CSO discharge quantification is essential for closing the rainfall-runoff mass-balance in combined sewer catchments. A closed mass-balance is an advantage for calibration of all urban drainage models based on mass-balance principles. This study presents three different software sensor concepts based on local water level sensors, which can be used to estimate CSO discharge volumes from hydraulically complex CSO structures. The three concepts were tested and verified under real practical conditions. All three concepts were accurate when compared to electromagnetic flow measurements.
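The three software-sensor concepts themselves are not detailed in this record; the simplest level-to-discharge conversion they build on, a standard rectangular-weir equation applied to a logged water-level series, could be sketched as follows (the weir geometry, discharge coefficient, sampling interval, and levels are assumptions).

    import numpy as np

    def weir_discharge(h, crest_level, width, cd=0.6, g=9.81):
        """Rectangular sharp-crested weir: Q = (2/3) * cd * width * sqrt(2g) * H^1.5,
        where H is the head above the crest (zero when the level is below the crest)."""
        head = np.clip(h - crest_level, 0.0, None)
        return (2.0 / 3.0) * cd * width * np.sqrt(2.0 * g) * head ** 1.5

    # Hypothetical water levels (m) logged every minute during an overflow event.
    levels = np.array([1.18, 1.22, 1.31, 1.40, 1.37, 1.28, 1.21, 1.17])
    q = weir_discharge(levels, crest_level=1.20, width=2.5)   # m^3/s per sample

    # Overflow volume: sum discharge over the 60 s sampling interval.
    volume = (q * 60.0).sum()
    print(f"Estimated CSO volume: {volume:.0f} m^3")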
MorphoGraphX: A platform for quantifying morphogenesis in 4D.
Barbier de Reuille, Pierre; Routier-Kierzkowska, Anne-Lise; Kierzkowski, Daniel; Bassel, George W; Schüpbach, Thierry; Tauriello, Gerardo; Bajpai, Namrata; Strauss, Sören; Weber, Alain; Kiss, Annamaria; Burian, Agata; Hofhuis, Hugo; Sapala, Aleksandra; Lipowczan, Marcin; Heimlicher, Maria B; Robinson, Sarah; Bayer, Emmanuelle M; Basler, Konrad; Koumoutsakos, Petros; Roeder, Adrienne H K; Aegerter-Wilmsen, Tinri; Nakayama, Naomi; Tsiantis, Miltos; Hay, Angela; Kwiatkowska, Dorota; Xenarios, Ioannis; Kuhlemeier, Cris; Smith, Richard S
2015-05-06
Morphogenesis emerges from complex multiscale interactions between genetic and mechanical processes. To understand these processes, the evolution of cell shape, proliferation and gene expression must be quantified. This quantification is usually performed either in full 3D, which is computationally expensive and technically challenging, or on 2D planar projections, which introduces geometrical artifacts on highly curved organs. Here we present MorphoGraphX ( www.MorphoGraphX.org), a software that bridges this gap by working directly with curved surface images extracted from 3D data. In addition to traditional 3D image analysis, we have developed algorithms to operate on curved surfaces, such as cell segmentation, lineage tracking and fluorescence signal quantification. The software's modular design makes it easy to include existing libraries, or to implement new algorithms. Cell geometries extracted with MorphoGraphX can be exported and used as templates for simulation models, providing a powerful platform to investigate the interactions between shape, genes and growth.
CCD Camera Detection of HIV Infection.
Day, John R
2017-01-01
Rapid and precise quantification of the infectivity of HIV is important for molecular virologic studies, as well as for measuring the activities of antiviral drugs and neutralizing antibodies. An indicator cell line, a CCD camera, and image-analysis software are used to quantify HIV infectivity. The cells of the P4R5 line, which express the receptors for HIV infection as well as β-galactosidase under the control of the HIV-1 long terminal repeat, are infected with HIV and then incubated 2 days later with X-gal to stain the infected cells blue. Digital images of monolayers of the infected cells are captured using a high resolution CCD video camera and a macro video zoom lens. A software program is developed to process the images and to count the blue-stained foci of infection. The described method allows for the rapid quantification of the infected cells over a wide range of viral inocula with reproducibility, accuracy and at relatively low cost.
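The publication's own image-analysis program is not shown in this record; counting stained foci is commonly done by thresholding followed by connected-component labeling, as in this sketch, which uses a synthetic image in place of real CCD data and an arbitrary minimum-size cutoff.

    import numpy as np
    from scipy import ndimage

    # Synthetic stand-in for a CCD image of a stained monolayer:
    # a noisy background with a few bright (stained) foci.
    rng = np.random.default_rng(7)
    img = rng.normal(10.0, 2.0, (200, 200))
    for (r, c) in [(40, 50), (120, 80), (160, 170)]:
        img[r - 3:r + 3, c - 3:c + 3] += 60.0          # three foci

    # Threshold and label connected components.
    mask = img > img.mean() + 5 * img.std()
    labels, n_foci = ndimage.label(mask)

    # Ignore tiny specks below a minimum area (in pixels).
    sizes = np.asarray(ndimage.sum(mask, labels, index=range(1, n_foci + 1)))
    n_counted = int((sizes >= 9).sum())
    print(f"Foci counted: {n_counted}")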
Bioinformatics Pertinent to Lipid Analysis in Biological Samples.
Ma, Justin; Arbelo, Ulises; Guerra, Yenifer; Aribindi, Katyayini; Bhattacharya, Sanjoy K; Pelaez, Daniel
2017-01-01
Electrospray ionization mass spectrometry has revolutionized the way lipids are studied. In this work, we present a tutorial for analyzing class-specific lipid spectra obtained from a triple quadrupole mass spectrometer. The open-source software MZmine 2.21 is used, coupled with LIPID MAPS databases. Here, we describe the steps for lipid identification, ratiometric quantification, and briefly address the differences to the analyses when using direct infusion versus tandem liquid chromatography-mass spectrometry (LC-MS). We also provide a tutorial and equations for quantification of lipid amounts using synthetic lipid standards and normalization to a protein amount.
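The tutorial's own equations are given in the chapter itself; the basic ratiometric calculation against a spiked-in synthetic standard, normalized to protein amount, reduces to something like the following (all intensities, amounts, and lipid names are placeholders).

    # Ratiometric quantification against a spiked-in synthetic standard,
    # normalized to the protein content of the sample (placeholder numbers).

    standard_amount_pmol = 100.0     # amount of synthetic lipid standard spiked in
    standard_intensity   = 2.4e6     # MS intensity of the standard
    protein_mg           = 0.5       # protein in the extracted sample

    # Hypothetical intensities of endogenous lipid species from the same class.
    lipid_intensities = {"PC 34:1": 6.1e6, "PC 36:2": 3.3e6, "PC 38:4": 1.2e6}

    for name, intensity in lipid_intensities.items():
        amount_pmol = standard_amount_pmol * intensity / standard_intensity
        normalized = amount_pmol / protein_mg
        print(f"{name}: {amount_pmol:.0f} pmol ({normalized:.0f} pmol/mg protein)")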
Development of magnetic resonance technology for noninvasive boron quantification
DOE Office of Scientific and Technical Information (OSTI.GOV)
Bradshaw, K.M.
1990-11-01
Boron magnetic resonance imaging (MRI) and spectroscopy (MRS) were developed in support of the noninvasive boron quantification task of the Idaho National Engineering Laboratory (INEL) Power Burst Facility/Boron Neutron Capture Therapy (PBF/BNCT) program. The hardware and software described in this report are modifications specific to a GE Signa™ MRI system, release 3.X, and are necessary for boron magnetic resonance operation. The technology developed in this task has been applied to obtaining animal pharmacokinetic data of boron compounds (drug time response) and the in-vivo localization of boron in animal tissue noninvasively. 9 refs., 21 figs.
Leung, Kit-Yi; Lescuyer, Pierre; Campbell, James; Byers, Helen L; Allard, Laure; Sanchez, Jean-Charles; Ward, Malcolm A
2005-08-01
A novel strategy consisting of cleavable Isotope-Coded Affinity Tag (cICAT) combined with MASCOT Distiller was evaluated as a tool for the quantification of proteins in "abnormal" patient plasma, prepared by pooling samples from patients with acute stroke. Quantification of all light and heavy cICAT-labelled peptide ion pairs was obtained using MASCOT Distiller combined with proprietary software. Peptides displaying differences were selected for identification by MS. These preliminary results show the promise of our approach to identify potential biomarkers.
Measurements and analysis in imaging for biomedical applications
NASA Astrophysics Data System (ADS)
Hoeller, Timothy L.
2009-02-01
A Total Quality Management (TQM) approach can be used to analyze data from biomedical optical and imaging platforms of tissues. A shift from individuals to teams, partnerships, and total participation is necessary from health care groups for improved prognostics using measurement analysis. Proprietary measurement analysis software is available for calibrated, pixel-to-pixel measurements of angles and distances in digital images. Feature size, count, and color are determinable on an absolute and comparative basis. Although changes in histomic images arise from numerous complex factors, variations measured in imaging analysis can be correlated with the time, extent, and progression of illness. Statistical methods are preferred. Applications of the proprietary measurement software are available for any imaging platform. Quantification of results provides improved categorization of illness towards better health. As health care practitioners try to use quantified measurement data for patient diagnosis, the techniques reported can be used to track and isolate causes better. Comparisons, norms, and trends are available from processing of measurement data, which is obtained easily and quickly from Scientific Software and methods. Example results for the class actions of Preventative and Corrective Care in Ophthalmology and Dermatology, respectively, are provided. Improved and quantified diagnosis can lead to better health and lower costs associated with health care. Systems support improvements towards Lean and Six Sigma affecting all branches of biology and medicine. As an example of the use of statistics, the major types of variation involving a study of Bone Mineral Density (BMD) are examined. Typically, special causes in medicine relate to illness and activities, whereas common causes are known to be associated with gender, race, size, and genetic make-up. Such a strategy of Continuous Process Improvement (CPI) involves comparison of patient results to baseline data using F-statistics. Self-pairings over time are also useful. Special and common causes are identified apart from aging in applying the statistical methods. In the future, implementation of imaging measurement methods by research staff, doctors, and concerned patient partners will result in improved health diagnosis, reporting, and cause determination. The long-term prospects for quantified measurements are better quality in imaging analysis with applications of higher utility for health care providers.
DOT National Transportation Integrated Search
2014-05-01
The Federal Aviation Administration, Office of Environment and Energy (FAA-AEE) has developed the Aviation Environmental Design Tool (AEDT) version 2a software system with the support of the following development team: FAA, National Aeronautics and S...
Buck, Thomas; Hwang, Shawn M; Plicht, Björn; Mucci, Ronald A; Hunold, Peter; Erbel, Raimund; Levine, Robert A
2008-06-01
Cardiac ultrasound imaging systems are limited in the noninvasive quantification of valvular regurgitation due to indirect measurements and inaccurate hemodynamic assumptions. We recently demonstrated that the principle of integrating backscattered acoustic Doppler power times velocity can be used for flow quantification in valvular regurgitation directly at the vena contracta of a regurgitant flow jet. We now aimed to implement automated Doppler power flow analysis software on a standard cardiac ultrasound system utilizing novel matrix-array transducer technology, with a detailed description of the system requirements, components and software. This system, based on a 3.5 MHz matrix-array cardiac ultrasound scanner (Sonos 5500, Philips Medical Systems), was validated by means of comprehensive experimental signal generator trials, in vitro flow phantom trials and in vivo testing in 48 patients with mitral regurgitation of different severity and etiology, using magnetic resonance imaging (MRI) for reference. All measurements displayed good correlation to the reference values, indicating successful implementation of automated Doppler power flow analysis on a matrix-array ultrasound imaging system. Systematic underestimation of effective regurgitant orifice areas >0.65 cm(2) and volumes >40 ml was found due to currently limited Doppler beam width, which could be readily overcome by the use of new-generation 2D matrix-array technology. Automated flow quantification in valvular heart disease based on backscattered Doppler power can be fully implemented on board routinely used matrix-array ultrasound imaging systems. Such automated Doppler power flow analysis quantifies valvular regurgitant flow directly, noninvasively, and user-independently, overcoming the practical limitations of current techniques.
An open tool for input function estimation and quantification of dynamic PET FDG brain scans.
Bertrán, Martín; Martínez, Natalia; Carbajal, Guillermo; Fernández, Alicia; Gómez, Álvaro
2016-08-01
Positron emission tomography (PET) analysis of clinical studies is mostly restricted to qualitative evaluation. Quantitative analysis of PET studies is highly desirable to be able to compute an objective measurement of the process of interest in order to evaluate treatment response and/or compare patient data. But implementation of quantitative analysis generally requires the determination of the input function: the arterial blood or plasma activity which indicates how much tracer is available for uptake in the brain. The purpose of our work was to share with the community an open software tool that can assist in the estimation of this input function, and the derivation of a quantitative map from the dynamic PET study. Arterial blood sampling during the PET study is the gold standard method to get the input function, but is uncomfortable and risky for the patient so it is rarely used in routine studies. To overcome the lack of a direct input function, different alternatives have been devised and are available in the literature. These alternatives derive the input function from the PET image itself (image-derived input function) or from data gathered from previous similar studies (population-based input function). In this article, we present ongoing work that includes the development of a software tool that integrates several methods with novel strategies for the segmentation of blood pools and parameter estimation. The tool is available as an extension to the 3D Slicer software. Tests on phantoms were conducted in order to validate the implemented methods. We evaluated the segmentation algorithms over a range of acquisition conditions and vasculature size. Input function estimation algorithms were evaluated against ground truth of the phantoms, as well as on their impact over the final quantification map. End-to-end use of the tool yields quantification maps with [Formula: see text] relative error in the estimated influx versus ground truth on phantoms. The main contribution of this article is the development of an open-source, free to use tool that encapsulates several well-known methods for the estimation of the input function and the quantification of dynamic PET FDG studies. Some alternative strategies are also proposed and implemented in the tool for the segmentation of blood pools and parameter estimation. The tool was tested on phantoms with encouraging results that suggest that even bloodless estimators could provide a viable alternative to blood sampling for quantification using graphical analysis. The open tool is a promising opportunity for collaboration among investigators and further validation on real studies.
Toward a Quantification of the Information/Communication Industries. Publication No. 74-2.
ERIC Educational Resources Information Center
Lavey, Warren G.
A national survey was made to collect data about the information/communication industries in the United States today. Eleven industries were studied: television, radio, telephone, telegraph, postal service, newspaper, periodical, book publishing and printing, motion pictures, computer software, and cable television. The data collection scheme used…
Instrument for Real-Time Digital Nucleic Acid Amplification on Custom Microfluidic Devices
Selck, David A.
2016-01-01
Nucleic acid amplification tests that are coupled with a digital readout enable the absolute quantification of single molecules, even at ultralow concentrations. Digital methods are robust, versatile and compatible with many amplification chemistries including isothermal amplification, making them particularly invaluable to assays that require sensitive detection, such as the quantification of viral load in occult infections or detection of sparse amounts of DNA from forensic samples. A number of microfluidic platforms are being developed for carrying out digital amplification. However, the mechanistic investigation and optimization of digital assays has been limited by the lack of real-time kinetic information about which factors affect the digital efficiency and analytical sensitivity of a reaction. Commercially available instruments that are capable of tracking digital reactions in real-time are restricted to only a small number of device types and sample-preparation strategies. Thus, most researchers who wish to develop, study, or optimize digital assays rely on the rate of the amplification reaction when performed in a bulk experiment, which is now recognized as an unreliable predictor of digital efficiency. To expand our ability to study how digital reactions proceed in real-time and enable us to optimize both the digital efficiency and analytical sensitivity of digital assays, we built a custom large-format digital real-time amplification instrument that can accommodate a wide variety of devices, amplification chemistries and sample-handling conditions. Herein, we validate this instrument, we provide detailed schematics that will enable others to build their own custom instruments, and we include a complete custom software suite to collect and analyze the data retrieved from the instrument. We believe assay optimizations enabled by this instrument will improve the current limits of nucleic acid detection and quantification, improving our fundamental understanding of single-molecule reactions and providing advancements in practical applications such as medical diagnostics, forensics and environmental sampling. PMID:27760148
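As a rough illustration of the kind of real-time analysis such an instrument enables, the sketch below extracts a time-to-threshold for each partition from simulated per-partition fluorescence traces. The array names, threshold and sigmoid traces are illustrative assumptions, not the instrument's actual data format or software.

```python
import numpy as np

def time_to_threshold(traces, times, threshold):
    """For each partition's fluorescence trace, return the first time at which
    the signal crosses `threshold`, or NaN if the partition never amplifies.

    traces : (n_partitions, n_timepoints) array of background-subtracted signal
    times  : (n_timepoints,) array of acquisition times (e.g. minutes)
    """
    crossed = traces >= threshold               # boolean matrix
    first_idx = crossed.argmax(axis=1)          # index of first True (0 if none)
    ever_crossed = crossed.any(axis=1)
    return np.where(ever_crossed, times[first_idx], np.nan)

# Toy example: 4 partitions, 3 of which amplify at different times.
times = np.arange(0, 60, 1.0)                   # one frame per minute
traces = np.vstack([
    1 / (1 + np.exp(-(times - t0) / 2)) if t0 else np.zeros_like(times)
    for t0 in (20, 30, 45, 0)
])
print(time_to_threshold(traces, times, threshold=0.5))
```

Partitions that never amplify are reported as NaN, which is the positive/negative information a digital readout ultimately reduces to.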
User Guidelines and Best Practices for CASL VUQ Analysis Using Dakota
DOE Office of Scientific and Technical Information (OSTI.GOV)
Adams, Brian M.; Coleman, Kayla; Gilkey, Lindsay N.
Sandia’s Dakota software (available at http://dakota.sandia.gov) supports science and engineering transformation through advanced exploration of simulations. Specifically, it manages and analyzes ensembles of simulations to provide broader and deeper perspective for analysts and decision makers. This enables them to enhance understanding of risk, improve products, and assess simulation credibility. In its simplest mode, Dakota can automate typical parameter variation studies through a generic interface to a physics-based computational model. This can lend efficiency and rigor to manual parameter perturbation studies already being conducted by analysts. However, Dakota also delivers advanced parametric analysis techniques enabling design exploration, optimization, model calibration, risk analysis, and quantification of margins and uncertainty with such models. It directly supports verification and validation activities. Dakota algorithms enrich complex science and engineering models, enabling an analyst to answer crucial questions of Sensitivity: Which are the most important input factors or parameters entering the simulation, and how do they influence key outputs?; Uncertainty: What is the uncertainty or variability in simulation output, given uncertainties in input parameters? How safe, reliable, robust, or variable is my system? (quantification of margins and uncertainty, QMU); Optimization: What parameter values yield the best-performing design or operating condition, given constraints?; and Calibration: What models and/or parameters best match experimental data? In general, Dakota is the Consortium for Advanced Simulation of Light Water Reactors (CASL) delivery vehicle for verification, validation, and uncertainty quantification (VUQ) algorithms. It permits ready application of the VUQ methods described above to simulation codes by CASL researchers, code developers, and application engineers.
ddpcr: an R package and web application for analysis of droplet digital PCR data.
Attali, Dean; Bidshahri, Roza; Haynes, Charles; Bryan, Jennifer
2016-01-01
Droplet digital polymerase chain reaction (ddPCR) is a novel platform for exact quantification of DNA which holds great promise in clinical diagnostics. It is increasingly popular due to its digital nature, which provides more accurate quantification and higher sensitivity than traditional real-time PCR. However, clinical adoption has been slowed in part by the lack of software tools available for analyzing ddPCR data. Here, we present ddpcr - a new R package for ddPCR visualization and analysis. In addition, ddpcr includes a web application (powered by the Shiny R package) that allows users to analyze ddPCR data using an interactive graphical interface.
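The absolute quantification that ddPCR provides rests on Poisson statistics over partitioned droplets. The sketch below shows that calculation in Python for illustration only; it is not the ddpcr package's API, and the droplet volume is an assumed value.

```python
import math

def ddpcr_concentration(n_positive, n_total, droplet_volume_ul=0.00085):
    """Estimate target concentration (copies/µL) from droplet counts.

    Uses the Poisson relation lambda = -ln(1 - p), where p is the fraction of
    positive droplets and lambda the mean copies per droplet.  The droplet
    volume (~0.85 nL) is an assumed value used here only for illustration.
    """
    p = n_positive / n_total
    if p >= 1.0:
        raise ValueError("All droplets positive: concentration too high to resolve.")
    lam = -math.log(1.0 - p)          # mean copies per droplet
    return lam / droplet_volume_ul    # copies per microlitre

# Example: 4,500 positive droplets out of 15,000 accepted droplets.
print(f"{ddpcr_concentration(4500, 15000):.1f} copies/µL")
```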
Consistency of flow quantifications in tridirectional phase-contrast MRI
NASA Astrophysics Data System (ADS)
Unterhinninghofen, R.; Ley, S.; Dillmann, R.
2009-02-01
Tridirectionally encoded phase-contrast MRI is a technique to non-invasively acquire time-resolved velocity vector fields of blood flow. These may not only be used to analyze pathological flow patterns, but also to quantify flow at arbitrary positions within the acquired volume. In this paper we examine the validity of this approach by analyzing the consistency of related quantifications instead of comparing it with an external reference measurement. Datasets of the thoracic aorta were acquired from 6 pigs, 1 healthy volunteer and 3 patients with artificial aortic valves. Using in-house software an elliptical flow quantification plane was placed manually at 6 positions along the descending aorta where it was rotated to 5 different angles. For each configuration flow was computed based on the original data and data that had been corrected for phase offsets. Results reveal that quantifications are more dependent on changes in position than on changes in angle. Phase offset correction considerably reduces this dependency. Overall consistency is good with a maximum variation coefficient of 9.9% and a mean variation coefficient of 7.2%.
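A minimal sketch of the two quantities the study relies on, assuming the velocity vectors inside the plane ROI have already been extracted: volumetric flow as the integral of the through-plane velocity component, and the coefficient of variation used to assess consistency across plane positions. Array names and values are illustrative.

```python
import numpy as np

def flow_through_plane(velocity, normal, pixel_area_cm2):
    """Volumetric flow (mL/s) through a plane from a velocity vector field.

    velocity       : (n_pixels, 3) array of velocities inside the ROI, in cm/s
    normal         : (3,) unit normal of the quantification plane
    pixel_area_cm2 : area represented by one pixel, in cm^2
    """
    v_through = velocity @ normal                     # through-plane component, cm/s
    return float(v_through.sum() * pixel_area_cm2)    # cm^3/s == mL/s

def variation_coefficient(values):
    """Coefficient of variation in percent, used to assess consistency."""
    values = np.asarray(values, dtype=float)
    return 100.0 * values.std(ddof=1) / values.mean()

# Toy example: 200 ROI pixels with mostly through-plane flow of ~60 cm/s.
rng = np.random.default_rng(0)
vel = np.column_stack([rng.normal(0, 2, 200),
                       rng.normal(0, 2, 200),
                       rng.normal(60, 5, 200)])
q = flow_through_plane(vel, normal=np.array([0.0, 0.0, 1.0]), pixel_area_cm2=0.02)

# Consistency across 6 plane positions along the descending aorta (mL/s, invented):
flows = [78.2, 74.9, 81.0, 76.4, 79.5, 73.8]
print(f"flow ≈ {q:.1f} mL/s; CV across positions: {variation_coefficient(flows):.1f}%")
```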
Ghosn, Mohamad G; Shah, Dipan J
2014-01-01
Cardiac magnetic resonance has become a well-established imaging modality and is considered the gold standard for myocardial tissue viability assessment and ventricular volume quantification. Recent hardware and software advancements in magnetic resonance imaging technology have allowed the development of new methods that can improve clinical cardiovascular diagnosis and prognosis. The advent of a new generation of higher-magnetic-field scanners can be beneficial to various clinical applications. Also, the development of faster acquisition techniques has allowed mapping of the magnetic relaxation properties T1, T2, and T2* in the myocardium, which can be used to quantify myocardial diffuse fibrosis, determine the presence of edema or inflammation, and measure iron within the myocardium, respectively. Another recent major advancement in CMR has been the introduction of three-dimensional (3D) phase-contrast imaging, also known as 4D flow. The following review discusses key advances in cardiac magnetic resonance technology and their potential to improve clinical cardiovascular diagnosis and outcomes.
Eloi, Juliana Cristina; Epifanio, Matias; de Gonçalves, Marília Maia; Pellicioli, Augusto; Vieira, Patricia Froelich Giora; Dias, Henrique Bregolin; Bruscato, Neide; Soder, Ricardo Bernardi; Santana, João Carlos Batista; Mouzaki, Marialena; Baldisserotto, Matteo
2017-01-01
Computed tomography, which uses ionizing radiation and expensive software packages for analysis of scans, can be used to quantify abdominal fat. The objective of this study is to measure abdominal fat with 3T MRI using free software for image analysis and to correlate these findings with anthropometric and laboratory parameters in adolescents. This prospective observational study included 24 overweight/obese and 33 healthy adolescents (mean age 16.55 years). All participants underwent abdominal MRI exams. Visceral and subcutaneous fat area and percentage were correlated with anthropometric parameters, lipid profile, glucose metabolism, and insulin resistance. Student's t test and the Mann-Whitney test were applied. Pearson's chi-square test was used to compare proportions. To determine associations, Pearson's linear correlation or Spearman's correlation was used. In both groups, waist circumference (WC) was associated with visceral fat area (P = 0.001 and P = 0.01, respectively), and triglycerides were associated with fat percentage (P = 0.046 and P = 0.071, respectively). In obese individuals, the total cholesterol/HDL ratio was associated with visceral fat area (P = 0.03) and percentage (P = 0.09), and insulin and HOMA-IR were associated with visceral fat area (P = 0.001) and percentage (P = 0.005). 3T MRI can provide reliable and good-quality images for quantification of visceral and subcutaneous fat using a free software package. The results demonstrate that WC is a good predictor of visceral fat in obese adolescents and that visceral fat area is associated with the total cholesterol/HDL ratio, insulin and HOMA-IR.
Design and Use of a Full Flow Sampling System (FFS) for the Quantification of Methane Emissions
Johnson, Derek R.; Covington, April N.; Clark, Nigel N.
2016-01-01
The use of natural gas continues to grow with increased discovery and production of unconventional shale resources. At the same time, the natural gas industry faces continued scrutiny for methane emissions from across the supply chain, due to methane's relatively high global warming potential (25-84x that of carbon dioxide, according to the Energy Information Administration). A variety of techniques of varied uncertainties exists to measure or estimate methane emissions from components or facilities. Currently, only one commercial system is available for quantification of component-level emissions, and recent reports have highlighted its weaknesses. In order to improve accuracy and increase measurement flexibility, we have designed, developed, and implemented a novel full flow sampling system (FFS) for quantification of methane emissions and greenhouse gases based on transportation emissions measurement principles. The FFS is a modular system that consists of an explosion-proof blower(s), mass airflow sensor(s) (MAF), thermocouple, sample probe, constant volume sampling pump, laser-based greenhouse gas sensor, data acquisition device, and analysis software. Dependent upon the blower and hose configuration employed, the current FFS is able to achieve a flow rate ranging from 40 to 1,500 standard cubic feet per minute (SCFM). Utilization of laser-based sensors mitigates interference from higher hydrocarbons (C2+). Co-measurement of water vapor allows for humidity correction. The system is portable, with multiple configurations for a variety of applications ranging from being carried by a person to being mounted in a hand-drawn cart, on-road vehicle bed, or from the bed of utility terrain vehicles (UTVs). The FFS is able to quantify methane emission rates with a relative uncertainty of ± 4.4%. The FFS has proven, real-world operation for the quantification of methane emissions occurring in conventional and remote facilities. PMID:27341646
Design and Use of a Full Flow Sampling System (FFS) for the Quantification of Methane Emissions.
Johnson, Derek R; Covington, April N; Clark, Nigel N
2016-06-12
The use of natural gas continues to grow with increased discovery and production of unconventional shale resources. At the same time, the natural gas industry faces continued scrutiny for methane emissions from across the supply chain, due to methane's relatively high global warming potential (25-84x that of carbon dioxide, according to the Energy Information Administration). A variety of techniques of varied uncertainties exists to measure or estimate methane emissions from components or facilities. Currently, only one commercial system is available for quantification of component-level emissions, and recent reports have highlighted its weaknesses. In order to improve accuracy and increase measurement flexibility, we have designed, developed, and implemented a novel full flow sampling system (FFS) for quantification of methane emissions and greenhouse gases based on transportation emissions measurement principles. The FFS is a modular system that consists of an explosion-proof blower(s), mass airflow sensor(s) (MAF), thermocouple, sample probe, constant volume sampling pump, laser-based greenhouse gas sensor, data acquisition device, and analysis software. Dependent upon the blower and hose configuration employed, the current FFS is able to achieve a flow rate ranging from 40 to 1,500 standard cubic feet per minute (SCFM). Utilization of laser-based sensors mitigates interference from higher hydrocarbons (C2+). Co-measurement of water vapor allows for humidity correction. The system is portable, with multiple configurations for a variety of applications ranging from being carried by a person to being mounted in a hand-drawn cart, on-road vehicle bed, or from the bed of utility terrain vehicles (UTVs). The FFS is able to quantify methane emission rates with a relative uncertainty of ± 4.4%. The FFS has proven, real-world operation for the quantification of methane emissions occurring in conventional and remote facilities.
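The quantification principle behind a full flow sampling approach, reduced to its simplest form, is an emission rate computed from the total captured flow and the measured concentration. The sketch below illustrates this under assumed unit conventions; the density constant and example numbers are illustrative, not values from the FFS.

```python
def methane_mass_rate(flow_scfm, ch4_ppm, ch4_density_g_per_scf=19.2):
    """Approximate methane mass emission rate (g/min) from a full flow sample.

    flow_scfm             : total captured flow, standard cubic feet per minute
    ch4_ppm               : measured methane concentration in the sampled flow (ppmv)
    ch4_density_g_per_scf : assumed methane density at standard conditions
                            (~19.2 g per standard cubic foot; the exact value
                            depends on the standard conditions used)
    """
    ch4_volume_fraction = ch4_ppm / 1e6
    ch4_flow_scfm = flow_scfm * ch4_volume_fraction     # SCFM of pure methane
    return ch4_flow_scfm * ch4_density_g_per_scf        # grams per minute

# Example: 800 SCFM of captured flow containing 250 ppmv methane.
print(f"{methane_mass_rate(800, 250):.1f} g/min")
```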
NASA Astrophysics Data System (ADS)
McDougald, Wendy A.; Collins, Richard; Green, Mark; Tavares, Adriana A. S.
2017-10-01
Obtaining accurate quantitative measurements in preclinical Positron Emission Tomography/Computed Tomography (PET/CT) imaging is of paramount importance in biomedical research and helps support efficient translation of preclinical results to the clinic. The purpose of this study was two-fold: (1) to investigate the effects of different CT acquisition protocols on PET/CT image quality and data quantification; and (2) to evaluate the absorbed dose associated with varying CT parameters. Methods: An air/water quality control CT phantom, a tissue equivalent material phantom, an in-house 3D printed phantom and an image quality PET/CT phantom were imaged using a Mediso nanoPET/CT scanner. Collected data were analyzed using PMOD software, VivoQuant software and National Electrical Manufacturers Association (NEMA) software implemented by Mediso. Measured Hounsfield units (HU) in the collected CT images were compared to the known HU values, and image noise was quantified. PET recovery coefficients (RC), uniformity and quantitative bias were also measured. Results: Only less than 2% and 1% of CT acquisition protocols yielded water HU values < -80 and air HU values < -840, respectively. Four out of eleven CT protocols resulted in more than 100 mGy absorbed dose. Different CT protocols did not impact PET uniformity and RC, and resulted in <4% overall bias relative to expected radioactive concentration. Conclusion: Preclinical CT protocols with increased exposure times can result in high absorbed doses to the small animals. These should be avoided, as they do not contribute towards improved microPET/CT image quantitative accuracy and could limit longitudinal scanning of small animals.
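The HU comparison described above follows directly from the definition of the Hounsfield scale, and image noise is commonly summarized as the standard deviation within a uniform ROI. A minimal sketch, with illustrative attenuation and noise values:

```python
import numpy as np

def to_hounsfield(mu, mu_water, mu_air):
    """Convert linear attenuation coefficients to Hounsfield units (HU)."""
    return 1000.0 * (mu - mu_water) / (mu_water - mu_air)

def roi_noise(hu_values):
    """Image noise quantified as the SD of HU inside a uniform ROI."""
    return float(np.std(np.asarray(hu_values, dtype=float), ddof=1))

# Sanity check: water maps to 0 HU and air to -1000 HU by definition.
mu_water, mu_air = 0.195, 0.0002          # illustrative attenuation values, 1/cm
print(to_hounsfield(np.array([mu_water, mu_air, 0.21]), mu_water, mu_air))

rng = np.random.default_rng(0)
water_roi_hu = rng.normal(0.0, 12.0, 500)  # simulated uniform water ROI
print(f"noise (SD of HU in water ROI): {roi_noise(water_roi_hu):.1f} HU")
```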
2013-01-01
Background The production of multiple transcript isoforms from one gene is a major source of transcriptome complexity. RNA-Seq experiments, in which transcripts are converted to cDNA and sequenced, allow the resolution and quantification of alternative transcript isoforms. However, methods to analyze splicing are underdeveloped and errors resulting in incorrect splicing calls occur in every experiment. Results We used RNA-Seq data to develop sequencing and aligner error models. By applying these error models to known input from simulations, we found that errors result from false alignment to minor splice motifs and antisense strands, shifted junction positions, paralog joining, and repeat-induced gaps. By using a series of quantitative and qualitative filters, we eliminated diagnosed errors in the simulation, and applied this to RNA-Seq data from Drosophila melanogaster heads. We used high-confidence junction detections to specifically interrogate local splicing differences between transcripts. This method outperformed commonly used RNA-Seq methods to identify known alternative splicing events in the Drosophila sex determination pathway. We describe a flexible software package to perform these tasks called Splicing Analysis Kit (Spanki), available at http://www.cbcb.umd.edu/software/spanki. Conclusions Splice-junction centric analysis of RNA-Seq data provides advantages in specificity for detection of alternative splicing. Our software provides tools to better understand error profiles in RNA-Seq data and improve inference from this new technology. The splice-junction centric approach that this software enables will provide more accurate estimates of differentially regulated splicing than current tools. PMID:24209455
Sturgill, David; Malone, John H; Sun, Xia; Smith, Harold E; Rabinow, Leonard; Samson, Marie-Laure; Oliver, Brian
2013-11-09
The production of multiple transcript isoforms from one gene is a major source of transcriptome complexity. RNA-Seq experiments, in which transcripts are converted to cDNA and sequenced, allow the resolution and quantification of alternative transcript isoforms. However, methods to analyze splicing are underdeveloped and errors resulting in incorrect splicing calls occur in every experiment. We used RNA-Seq data to develop sequencing and aligner error models. By applying these error models to known input from simulations, we found that errors result from false alignment to minor splice motifs and antisense strands, shifted junction positions, paralog joining, and repeat-induced gaps. By using a series of quantitative and qualitative filters, we eliminated diagnosed errors in the simulation, and applied this to RNA-Seq data from Drosophila melanogaster heads. We used high-confidence junction detections to specifically interrogate local splicing differences between transcripts. This method outperformed commonly used RNA-Seq methods to identify known alternative splicing events in the Drosophila sex determination pathway. We describe a flexible software package to perform these tasks called Splicing Analysis Kit (Spanki), available at http://www.cbcb.umd.edu/software/spanki. Splice-junction centric analysis of RNA-Seq data provides advantages in specificity for detection of alternative splicing. Our software provides tools to better understand error profiles in RNA-Seq data and improve inference from this new technology. The splice-junction centric approach that this software enables will provide more accurate estimates of differentially regulated splicing than current tools.
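Junction-centric quantification of local splicing differences can be illustrated with a percent-spliced-in (PSI) style ratio of junction-supporting reads. The sketch below is a generic illustration, not Spanki's exact metric; the read counts are invented for the example.

```python
def percent_spliced_in(inclusion_reads, exclusion_reads):
    """Percent-spliced-in (PSI) for a binary splicing event from junction counts.

    inclusion_reads : reads supporting the inclusion junction(s)
    exclusion_reads : reads supporting the exclusion (skipping) junction
    Returns PSI in [0, 1], or None when there is no junction coverage.
    """
    total = inclusion_reads + exclusion_reads
    return None if total == 0 else inclusion_reads / total

# Example: compare PSI of a cassette exon between two samples.
psi_sample_a = percent_spliced_in(inclusion_reads=120, exclusion_reads=40)
psi_sample_b = percent_spliced_in(inclusion_reads=30, exclusion_reads=90)
print(f"delta PSI = {psi_sample_a - psi_sample_b:+.2f}")
```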
Space station data system analysis/architecture study. Task 3: Trade studies, DR-5, volume 1
NASA Technical Reports Server (NTRS)
1985-01-01
The primary objective of Task 3 is to provide additional analysis and insight necessary to support key design/programmatic decisions for options quantification and selection for system definition. This includes: (1) the identification of key trade study topics; (2) the definition of a trade study procedure for each topic (issues to be resolved, key inputs, criteria/weighting, methodology); (3) the conduct of tradeoff and sensitivity analyses; and (4) the review/verification of results within the context of evolving system design and definition. The trade study topics addressed in this volume include space autonomy and function automation, software transportability, system network topology, communications standardization, onboard local area networking, distributed operating system, software configuration management, and the software development environment facility.
A Java program for LRE-based real-time qPCR that enables large-scale absolute quantification.
Rutledge, Robert G
2011-03-02
Linear regression of efficiency (LRE) introduced a new paradigm for real-time qPCR that enables large-scale absolute quantification by eliminating the need for standard curves. Developed through the application of sigmoidal mathematics to SYBR Green I-based assays, target quantity is derived directly from fluorescence readings within the central region of an amplification profile. However, a major challenge of implementing LRE quantification is the labor intensive nature of the analysis. Utilizing the extensive resources that are available for developing Java-based software, the LRE Analyzer was written using the NetBeans IDE, and is built on top of the modular architecture and windowing system provided by the NetBeans Platform. This fully featured desktop application determines the number of target molecules within a sample with little or no intervention by the user, in addition to providing extensive database capabilities. MS Excel is used to import data, allowing LRE quantification to be conducted with any real-time PCR instrument that provides access to the raw fluorescence readings. An extensive help set also provides an in-depth introduction to LRE, in addition to guidelines on how to implement LRE quantification. The LRE Analyzer provides the automated analysis and data storage capabilities required by large-scale qPCR projects wanting to exploit the many advantages of absolute quantification. Foremost is the universal perspective afforded by absolute quantification, which among other attributes, provides the ability to directly compare quantitative data produced by different assays and/or instruments. Furthermore, absolute quantification has important implications for gene expression profiling in that it provides the foundation for comparing transcript quantities produced by any gene with any other gene, within and between samples.
A Java Program for LRE-Based Real-Time qPCR that Enables Large-Scale Absolute Quantification
Rutledge, Robert G.
2011-01-01
Background Linear regression of efficiency (LRE) introduced a new paradigm for real-time qPCR that enables large-scale absolute quantification by eliminating the need for standard curves. Developed through the application of sigmoidal mathematics to SYBR Green I-based assays, target quantity is derived directly from fluorescence readings within the central region of an amplification profile. However, a major challenge of implementing LRE quantification is the labor intensive nature of the analysis. Findings Utilizing the extensive resources that are available for developing Java-based software, the LRE Analyzer was written using the NetBeans IDE, and is built on top of the modular architecture and windowing system provided by the NetBeans Platform. This fully featured desktop application determines the number of target molecules within a sample with little or no intervention by the user, in addition to providing extensive database capabilities. MS Excel is used to import data, allowing LRE quantification to be conducted with any real-time PCR instrument that provides access to the raw fluorescence readings. An extensive help set also provides an in-depth introduction to LRE, in addition to guidelines on how to implement LRE quantification. Conclusions The LRE Analyzer provides the automated analysis and data storage capabilities required by large-scale qPCR projects wanting to exploit the many advantages of absolute quantification. Foremost is the universal perspective afforded by absolute quantification, which among other attributes, provides the ability to directly compare quantitative data produced by different assays and/or instruments. Furthermore, absolute quantification has important implications for gene expression profiling in that it provides the foundation for comparing transcript quantities produced by any gene with any other gene, within and between samples. PMID:21407812
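A rough sketch of the idea behind efficiency-based quantification: per-cycle amplification efficiency declines approximately linearly with accumulated fluorescence, so regressing efficiency against fluorescence over the central region of the profile yields parameters from which target quantity can be derived without a standard curve. This is an illustration of the general principle only, using a simulated logistic profile and simplified window selection; it is not the LRE Analyzer's implementation.

```python
import numpy as np

def cycle_efficiencies(fluorescence):
    """Per-cycle amplification efficiency E_c = F_c / F_(c-1) - 1."""
    f = np.asarray(fluorescence, dtype=float)
    return f[1:] / f[:-1] - 1.0

def lre_fit(fluorescence):
    """Regress cycle efficiency against fluorescence over the supplied window.

    Returns (E_max, F_max): the intercept estimates the maximal efficiency and
    the x-intercept the plateau fluorescence.  Window selection and scaling
    are deliberately simplified in this sketch.
    """
    f = np.asarray(fluorescence, dtype=float)
    e = cycle_efficiencies(f)
    slope, intercept = np.polyfit(f[1:], e, 1)   # E = intercept + slope * F
    e_max = intercept
    f_max = -intercept / slope                   # fluorescence where E reaches zero
    return e_max, f_max

# Toy profile generated from a logistic amplification curve.
cycles = np.arange(40)
profile = 100.0 / (1.0 + np.exp(-(cycles - 22) / 1.6))
print(lre_fit(profile[10:35]))   # use readings around the central region
```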
Final Technical Report: Quantification of Uncertainty in Extreme Scale Computations (QUEST)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Knio, Omar M.
QUEST is a SciDAC Institute comprising Sandia National Laboratories, Los Alamos National Laboratory, University of Southern California, Massachusetts Institute of Technology, University of Texas at Austin, and Duke University. The mission of QUEST is to: (1) develop a broad class of uncertainty quantification (UQ) methods/tools, and (2) provide UQ expertise and software to other SciDAC projects, thereby enabling/guiding their UQ activities. The Duke effort focused on the development of algorithms and utility software for non-intrusive sparse UQ representations, and on participation in the organization of annual workshops and tutorials to disseminate UQ tools to the community, and to gather input in order to adapt approaches to the needs of SciDAC customers. In particular, fundamental developments were made in (a) multiscale stochastic preconditioners, (b) gradient-based approaches to inverse problems, (c) adaptive pseudo-spectral approximations, (d) stochastic limit cycles, and (e) sensitivity analysis tools for noisy systems. In addition, large-scale demonstrations were performed, namely in the context of ocean general circulation models.
MorphoGraphX: A platform for quantifying morphogenesis in 4D
Barbier de Reuille, Pierre; Routier-Kierzkowska, Anne-Lise; Kierzkowski, Daniel; Bassel, George W; Schüpbach, Thierry; Tauriello, Gerardo; Bajpai, Namrata; Strauss, Sören; Weber, Alain; Kiss, Annamaria; Burian, Agata; Hofhuis, Hugo; Sapala, Aleksandra; Lipowczan, Marcin; Heimlicher, Maria B; Robinson, Sarah; Bayer, Emmanuelle M; Basler, Konrad; Koumoutsakos, Petros; Roeder, Adrienne HK; Aegerter-Wilmsen, Tinri; Nakayama, Naomi; Tsiantis, Miltos; Hay, Angela; Kwiatkowska, Dorota; Xenarios, Ioannis; Kuhlemeier, Cris; Smith, Richard S
2015-01-01
Morphogenesis emerges from complex multiscale interactions between genetic and mechanical processes. To understand these processes, the evolution of cell shape, proliferation and gene expression must be quantified. This quantification is usually performed either in full 3D, which is computationally expensive and technically challenging, or on 2D planar projections, which introduces geometrical artifacts on highly curved organs. Here we present MorphoGraphX (www.MorphoGraphX.org), a software that bridges this gap by working directly with curved surface images extracted from 3D data. In addition to traditional 3D image analysis, we have developed algorithms to operate on curved surfaces, such as cell segmentation, lineage tracking and fluorescence signal quantification. The software's modular design makes it easy to include existing libraries, or to implement new algorithms. Cell geometries extracted with MorphoGraphX can be exported and used as templates for simulation models, providing a powerful platform to investigate the interactions between shape, genes and growth. DOI: http://dx.doi.org/10.7554/eLife.05864.001 PMID:25946108
Planning of vessel grafts for reconstructive surgery in congenital heart diseases
NASA Astrophysics Data System (ADS)
Rietdorf, U.; Riesenkampff, E.; Schwarz, T.; Kuehne, T.; Meinzer, H.-P.; Wolf, I.
2010-02-01
The Fontan operation is a surgical treatment for patients with severe congenital heart diseases in whom a biventricular correction of the heart cannot be achieved. In these cases, a univentricular system is established. During the last step of surgery, a tunnel segment is placed to connect the inferior caval vein directly with the pulmonary artery, bypassing the right atrium and ventricle. Thus, the existing ventricle drives the systemic circulation, while the venous blood is passively directed to the pulmonary arteries. Fontan tunnels can be placed intra- and extracardially. The location, length and shape of the tunnel must be planned accurately. Furthermore, if the tunnel is placed extracardially, it must be positioned between other anatomical structures without constraining them. We developed a software system to support planning of the tunnel location, shape, and size, making pre-operative preparation of the tunnel material possible. The system allows for interactive placement and adjustment of the tunnel, affords a three-dimensional visualization of the virtual Fontan tunnel inside the thorax, and provides a quantification of the length, circumferences and diameters of the tunnel segments. The visualization and quantification can be used to plan and prepare the tunnel material for surgery in order to reduce the intra-operative time and to improve the fit of the tunnel patch.
Statistical analysis to assess automated level of suspicion scoring methods in breast ultrasound
NASA Astrophysics Data System (ADS)
Galperin, Michael
2003-05-01
A well-defined rule-based system has been developed for scoring the Level of Suspicion (LOS) from 0 to 5 based on a qualitative lexicon describing the ultrasound appearance of breast lesions. The purpose of the research is to assess and select one of the automated quantitative LOS scoring methods developed during preliminary studies on the reduction of benign biopsies. The study used a Computer Aided Imaging System (CAIS) to improve the uniformity and accuracy of applying the LOS scheme by automatically detecting, analyzing and comparing breast masses. The overall goal is to reduce biopsies of masses with lower levels of suspicion, rather than to increase the accuracy of diagnosis of cancers (which will require biopsy anyway). On complex cysts and fibroadenoma cases, experienced radiologists were up to 50% less certain in true negatives than CAIS. Full correlation analysis was applied to determine which of the proposed LOS quantification methods serves CAIS accuracy best. This paper presents current results of applying statistical analysis to automated LOS scoring quantification for breast masses with known biopsy results. It was found that the First Order Ranking method yielded the most accurate results. The CAIS system (Image Companion, Data Companion software) is developed by Almen Laboratories and was used to achieve the results.
NASA Technical Reports Server (NTRS)
Barth, Timothy J.
2014-01-01
This workshop presentation discusses the design and implementation of numerical methods for the quantification of statistical uncertainty, including a-posteriori error bounds, for output quantities computed using CFD methods. Hydrodynamic realizations often contain numerical error arising from finite-dimensional approximation (e.g. numerical methods using grids, basis functions, particles) and statistical uncertainty arising from incomplete information and/or statistical characterization of model parameters and random fields. The first task at hand is to derive formal error bounds for statistics given realizations containing finite-dimensional numerical error [1]. The error in computed output statistics contains contributions from both realization error and the error resulting from the calculation of statistics integrals using a numerical method. A second task is to devise computable a-posteriori error bounds by numerically approximating all terms arising in the error bound estimates. For the same reason that CFD calculations including error bounds but omitting uncertainty modeling are only of limited value, CFD calculations including uncertainty modeling but omitting error bounds are only of limited value. To gain maximum value from CFD calculations, a general software package for uncertainty quantification with quantified error bounds has been developed at NASA. The package provides implementations for a suite of numerical methods used in uncertainty quantification: dense tensorization basis methods [3] and a subscale recovery variant [1] for non-smooth data; sparse tensorization methods [2] utilizing node-nested hierarchies; and sampling methods [4] for high-dimensional random variable spaces.
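For the sampling-method component, the basic computation can be pictured as a Monte Carlo estimate of an output statistic together with a sampling-error bound. The sketch below uses a stand-in analytic model and assumed input distributions; it only illustrates the kind of quantity such a package reports, not NASA's implementation.

```python
import numpy as np

rng = np.random.default_rng(seed=0)

def mc_statistic(n_samples=10_000):
    """Monte Carlo estimate of the mean of an output quantity of interest,
    with a 95% sampling-error half-width.  The 'model' here is a stand-in
    analytic expression; in practice each sample would be a CFD realization."""
    a = rng.normal(1.0, 0.1, n_samples)       # uncertain input parameter
    b = rng.uniform(0.0, 1.0, n_samples)      # second uncertain input
    outputs = np.sin(a) + 0.5 * b**2          # stand-in output quantity
    mean = outputs.mean()
    half_width = 1.96 * outputs.std(ddof=1) / np.sqrt(n_samples)
    return mean, half_width

mean, hw = mc_statistic()
print(f"E[Q] = {mean:.4f} +/- {hw:.4f}")
```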
Quantification of dsDNA using the Hitachi F-7000 Fluorescence Spectrophotometer and PicoGreen dye.
Moreno, Luis A; Cox, Kendra L
2010-11-05
Quantification of DNA, especially in small concentrations, is an important task with a wide range of biological applications including standard molecular biology assays such as synthesis and purification of DNA, diagnostic applications such as quantification of DNA amplification products, and detection of DNA molecules in drug preparations. During this video we will demonstrate the capability of the Hitachi F-7000 Fluorescence Spectrophotometer equipped with a Micro Plate Reader accessory to perform dsDNA quantification using Molecular Probes Quant-it PicoGreen dye reagent kit. The F-7000 Fluorescence Spectrophotometer offers high sensitivity and high speed measurements. It is a highly flexible system capable of measuring fluorescence, luminescence, and phosphorescence. Several measuring modes are available, including wavelength scan, time scan, photometry and 3-D scan measurement. The spectrophotometer has sensitivity in the range of 50 picomoles of fluorescein when using a 300 μL sample volume in the microplate, and is capable of measuring scan speeds of 60,000 nm/minute. It also has a wide dynamic range of up to 5 orders of magnitude which allows for the use of calibration curves over a wide range of concentrations. The optical system uses all reflective optics for maximum energy and sensitivity. The standard wavelength range is 200 to 750 nm, and can be extended to 900 nm when using one of the optional near infrared photomultipliers. The system allows optional temperature control for the plate reader from 5 to 60 degrees Celsius using an optional external temperature controlled liquid circulator. The microplate reader allows for the use of 96 well microplates, and the measuring speed for 96 wells is less than 60 seconds when using the kinetics mode. Software controls for the F-7000 and Microplate Reader are also highly flexible. Samples may be set in either column or row formats, and any combination of wells may be chosen for sample measurements. This allows for optimal utilization of the microplate. Additionally, the software allows importing micro plate sample configurations created in Excel and saved in comma separated values, or "csv" format. Microplate measuring configurations can be saved and recalled by the software for convenience and increased productivity. Data results can be output to a standard report, to Excel, or to an optional Report Generator Program.
Quantification of dsDNA using the Hitachi F-7000 Fluorescence Spectrophotometer and PicoGreen Dye
Moreno, Luis A.; Cox, Kendra L.
2010-01-01
Quantification of DNA, especially in small concentrations, is an important task with a wide range of biological applications including standard molecular biology assays such as synthesis and purification of DNA, diagnostic applications such as quantification of DNA amplification products, and detection of DNA molecules in drug preparations. During this video we will demonstrate the capability of the Hitachi F-7000 Fluorescence Spectrophotometer equipped with a Micro Plate Reader accessory to perform dsDNA quantification using Molecular Probes Quant-it PicoGreen dye reagent kit. The F-7000 Fluorescence Spectrophotometer offers high sensitivity and high speed measurements. It is a highly flexible system capable of measuring fluorescence, luminescence, and phosphorescence. Several measuring modes are available, including wavelength scan, time scan, photometry and 3-D scan measurement. The spectrophotometer has sensitivity in the range of 50 picomoles of fluorescein when using a 300 μL sample volume in the microplate, and is capable of measuring scan speeds of 60,000 nm/minute. It also has a wide dynamic range of up to 5 orders of magnitude which allows for the use of calibration curves over a wide range of concentrations. The optical system uses all reflective optics for maximum energy and sensitivity. The standard wavelength range is 200 to 750 nm, and can be extended to 900 nm when using one of the optional near infrared photomultipliers. The system allows optional temperature control for the plate reader from 5 to 60 degrees Celsius using an optional external temperature controlled liquid circulator. The microplate reader allows for the use of 96 well microplates, and the measuring speed for 96 wells is less than 60 seconds when using the kinetics mode. Software controls for the F-7000 and Microplate Reader are also highly flexible. Samples may be set in either column or row formats, and any combination of wells may be chosen for sample measurements. This allows for optimal utilization of the microplate. Additionally, the software allows importing micro plate sample configurations created in Excel and saved in comma separated values, or "csv" format. Microplate measuring configurations can be saved and recalled by the software for convenience and increased productivity. Data results can be output to a standard report, to Excel, or to an optional Report Generator Program. PMID:21189464
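Fluorometric dsDNA quantification of this kind ultimately reduces to a calibration curve fitted to the standard wells and inverted for the unknowns. A minimal sketch with invented fluorescence readings and a simple linear fit follows; kits and instruments may prescribe different curve models and concentration ranges.

```python
import numpy as np

def fit_calibration(conc_ng_per_ml, fluorescence):
    """Least-squares line F = slope * C + intercept from the standard wells."""
    slope, intercept = np.polyfit(conc_ng_per_ml, fluorescence, 1)
    return slope, intercept

def quantify(fluorescence, slope, intercept):
    """Invert the calibration line to get dsDNA concentration for unknowns."""
    return (np.asarray(fluorescence, dtype=float) - intercept) / slope

# Illustrative standard curve (ng/mL vs arbitrary fluorescence units).
standards_conc = np.array([0, 1, 10, 100, 1000], dtype=float)
standards_fluo = np.array([3, 12, 95, 920, 9100], dtype=float)
slope, intercept = fit_calibration(standards_conc, standards_fluo)
print(quantify([450.0, 2300.0], slope, intercept))   # unknown wells
```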
RNAbrowse: RNA-Seq de novo assembly results browser.
Mariette, Jérôme; Noirot, Céline; Nabihoudine, Ibounyamine; Bardou, Philippe; Hoede, Claire; Djari, Anis; Cabau, Cédric; Klopp, Christophe
2014-01-01
Transcriptome analysis based on a de novo assembly of next-generation RNA sequences is now performed routinely in many laboratories. The generated results, including contig sequences, quantification figures, functional annotations and variation discovery outputs, are usually bulky and quite diverse. This article presents a user-oriented storage and visualisation environment that permits the data to be explored in a top-down manner, going from general graphical views to all possible details. The software package is based on BioMart, easy to install and populate with local data. The software package is available under the GNU General Public License (GPL) at http://bioinfo.genotoul.fr/RNAbrowse.
Improved Strategies and Optimization of Calibration Models for Real-time PCR Absolute Quantification
Real-time PCR absolute quantification applications rely on the use of standard curves to make estimates of DNA target concentrations in unknown samples. Traditional absolute quantification approaches dictate that a standard curve must accompany each experimental run. However, t...
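The traditional standard-curve approach referred to above fits quantification cycle (Cq) against the log of known target quantity and inverts the fit for unknowns, with amplification efficiency derived from the slope. A minimal sketch with an invented dilution series:

```python
import numpy as np

def fit_standard_curve(copies, cq):
    """Fit Cq = slope * log10(copies) + intercept for the standard dilutions."""
    slope, intercept = np.polyfit(np.log10(copies), cq, 1)
    efficiency = 10 ** (-1.0 / slope) - 1.0     # ~1.0 corresponds to 100% efficiency
    return slope, intercept, efficiency

def estimate_copies(cq, slope, intercept):
    """Invert the standard curve for an unknown sample's Cq value."""
    return 10 ** ((cq - intercept) / slope)

# Illustrative 10-fold dilution series.
copies = np.array([1e7, 1e6, 1e5, 1e4, 1e3])
cq = np.array([14.1, 17.5, 20.9, 24.3, 27.8])
slope, intercept, eff = fit_standard_curve(copies, cq)
print(f"efficiency ≈ {eff:.2f}", estimate_copies(22.0, slope, intercept))
```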
Design Analysis Kit for Optimization and Terascale Applications 6.0
DOE Office of Scientific and Technical Information (OSTI.GOV)
2015-10-19
Sandia's Dakota software (available at http://dakota.sandia.gov) supports science and engineering transformation through advanced exploration of simulations. Specifically it manages and analyzes ensembles of simulations to provide broader and deeper perspective for analysts and decision makers. This enables them to: (1) enhance understanding of risk, (2) improve products, and (3) assess simulation credibility. In its simplest mode, Dakota can automate typical parameter variation studies through a generic interface to a computational model. However, Dakota also delivers advanced parametric analysis techniques enabling design exploration, optimization, model calibration, risk analysis, and quantification of margins and uncertainty with such models. It directly supports verification and validation activities. The algorithms implemented in Dakota aim to address challenges in performing these analyses with complex science and engineering models from desktop to high performance computers.
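The simplest mode described above, automating parameter variation through a generic interface to a computational model, can be pictured with a small driver like the one below. This is a generic sketch with a stand-in analytic model, not Dakota's actual interface or input syntax.

```python
import itertools

def run_simulation(params):
    """Stand-in for a computational model invoked once per parameter set.
    In practice this would write an input deck, launch the solver and parse output."""
    x, y = params["x"], params["y"]
    return x**2 + 3.0 * y          # illustrative response quantity

def parameter_study(grid):
    """Evaluate the model over the Cartesian product of the parameter values."""
    names = list(grid)
    results = []
    for values in itertools.product(*grid.values()):
        params = dict(zip(names, values))
        results.append((params, run_simulation(params)))
    return results

for params, response in parameter_study({"x": [0.0, 0.5, 1.0], "y": [1.0, 2.0]}):
    print(params, "->", response)
```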
Klauser, A S; Franz, M; Bellmann Weiler, R; Gruber, J; Hartig, F; Mur, E; Wick, M C; Jaschke, W
2011-12-01
To compare joint inflammation assessment using subjective grading of power Doppler ultrasonography (PDUS) and contrast-enhanced ultrasonography (CEUS) versus computer-aided objective CEUS quantification. 37 joints of 28 patients with arthritis of different etiologies underwent B-mode ultrasonography, PDUS, and CEUS using a second-generation contrast agent. Synovial thickness, extent of vascularized pannus and intensity of vascularization were included in a 4-point PDUS and CEUS grading system. Subjective CEUS and PDUS scores were compared to computer-aided objective CEUS quantification using Qontrast® software for the calculation of the signal intensity (SI) and the ratio of SI for contrast enhancement. The interobserver agreement for subjective scoring was good to excellent (κ = 0.8 - 1.0; P < 0.0001). Computer-aided objective CEUS quantification correlated statistically significantly with subjective CEUS (P < 0.001) and PDUS grading (P < 0.05). The Qontrast® SI ratio correlated with subjective CEUS (P < 0.02) and PDUS grading (P < 0.03). Clinical activity did not correlate with vascularity or synovial thickening (P = N. S.) and no correlation between synovial thickening and vascularity extent could be found, neither using PDUS nor CEUS (P = N. S.). Both subjective CEUS grading and objective CEUS quantification are valuable for assessing joint vascularity in arthritis and computer-aided CEUS quantification may be a suitable objective tool for therapy follow-up in arthritis. © Georg Thieme Verlag KG Stuttgart · New York.
Quantification of Uncertainty in Extreme Scale Computations (QUEST)
DOE Office of Scientific and Technical Information (OSTI.GOV)
Ghanem, Roger
QUEST was a SciDAC Institute comprising Sandia National Laboratories, Los Alamos National Laboratory, the University of Southern California, the Massachusetts Institute of Technology, the University of Texas at Austin, and Duke University. The mission of QUEST was to: (1) develop a broad class of uncertainty quantification (UQ) methods/tools, and (2) provide UQ expertise and software to other SciDAC projects, thereby enabling/guiding their UQ activities. The USC effort centered on the development of reduced models and efficient algorithms for implementing various components of the UQ pipeline. USC personnel were responsible for the development of adaptive bases, adaptive quadrature, and reduced models to be used in estimation and inference.
NASA Astrophysics Data System (ADS)
Cao, Xinhua; Xu, Xiaoyin; Voss, Stephan
2017-03-01
In this paper, we describe an enhanced DICOM Secondary Capture (SC) that integrates Image Quantification (IQ) results, Regions of Interest (ROIs), and Time Activity Curves (TACs) with screen shots by embedding extra medical imaging information into a standard DICOM header. A software toolkit, DICOM IQSC, has been developed to implement this SC-centered integration of quantitative analysis information for routine practice in nuclear medicine. Preliminary experiments show that the DICOM IQSC method is simple and easy to implement, seamlessly integrating post-processing workstations with PACS for archiving and retrieving IQ information. Additional DICOM IQSC applications in routine nuclear medicine and clinical research are also discussed.
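One way to picture embedding quantification results into a Secondary Capture header is through DICOM private data elements. The sketch below uses pydicom for illustration; the private group, creator string, element layout and file names are hypothetical and are not the DICOM IQSC toolkit's actual tag scheme.

```python
import json
from pydicom import dcmread
from pydicom.data import get_testdata_file

# Stand-in for the Secondary Capture produced by the post-processing workstation;
# a bundled pydicom sample file is used here so the sketch runs as-is.
ds = dcmread(get_testdata_file("CT_small.dcm"))

# Reserve a private block and store quantification results as private elements.
# The group (0x0011), creator string and element layout are hypothetical.
block = ds.private_block(0x0011, "EXAMPLE_IQ_RESULTS", create=True)
block.add_new(0x01, "LO", "SUVmax=4.7;SUVmean=2.9")          # simple key/value summary
block.add_new(0x02, "UT", json.dumps({                       # ROI and time-activity curve
    "roi": "lesion-1",
    "tac_minutes": [1, 5, 10, 30],
    "tac_kbq_per_ml": [12.1, 9.4, 7.8, 5.2],
}))

ds.save_as("sc_with_iq.dcm")
print(ds[0x0011, 0x1001].value)   # read the embedded summary back
```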
Quantification of Operational Risk Using A Data Mining
NASA Technical Reports Server (NTRS)
Perera, J. Sebastian
1999-01-01
What is Data Mining? - Data Mining is the process of finding actionable information hidden in raw data. - Data Mining helps find hidden patterns, trends, and important relationships often buried in a sea of data - Typically, automated software tools based on advanced statistical analysis and data modeling technology can be utilized to automate the data mining process
Algorithm Diversity for Resilient Systems
2016-06-27
Subject terms: computer security, software diversity, program transformation. A systematic method for transforming Datalog rules with general universal and existential quantification into efficient algorithms with precise complexity ... worst case in the size of the ground rules. There are numerous choices during the transformation that lead to diverse algorithms and different data structures.
An Open-Source Standard T-Wave Alternans Detector for Benchmarking.
Khaustov, A; Nemati, S; Clifford, Gd
2008-09-14
We describe an open-source algorithm suite for T-Wave Alternans (TWA) detection and quantification. The software consists of Matlab implementations of the widely used Spectral Method and Modified Moving Average (MMA) method, with libraries to read both WFDB and ASCII data under Windows and Linux. The software suite can run in batch mode or with a provided graphical user interface to aid waveform exploration. Our software suite was calibrated using an open-source TWA model, described in a partner paper [1] by Clifford and Sameni. For the PhysioNet/CinC Challenge 2008 we obtained a score of 0.881 for the Spectral Method and 0.400 for the MMA method. However, our objective was not to provide the best TWA detector, but rather a basis for detailed discussion of algorithms.
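The core of the Spectral Method can be illustrated as follows: for a beat-aligned series of T-wave amplitudes, alternans appears as spectral power at 0.5 cycles/beat, which is compared against a nearby noise band. This is a simplified sketch using a synthetic beat series, not the suite's Matlab implementation; the noise-band limits are assumed for illustration.

```python
import numpy as np

def spectral_twa(t_wave_amplitudes):
    """Simplified spectral T-wave alternans estimate from a beat series.

    t_wave_amplitudes : amplitudes of corresponding T-wave samples over N beats
    Returns (alternans_power, k_score); alternans is the spectral power at
    0.5 cycles/beat, compared against a nearby noise band.
    """
    x = np.asarray(t_wave_amplitudes, dtype=float)
    x = x - x.mean()
    spectrum = np.abs(np.fft.rfft(x)) ** 2 / len(x)
    freqs = np.fft.rfftfreq(len(x), d=1.0)          # cycles per beat
    alternans = spectrum[np.argmin(np.abs(freqs - 0.5))]
    noise_band = spectrum[(freqs >= 0.43) & (freqs <= 0.48)]
    k_score = (alternans - noise_band.mean()) / noise_band.std(ddof=1)
    return alternans, k_score

# Toy series: 128 beats with a small alternating component plus noise.
rng = np.random.default_rng(1)
beats = 0.03 * (-1.0) ** np.arange(128) + rng.normal(0, 0.01, 128)
print(spectral_twa(beats))
```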
Data Independent Acquisition analysis in ProHits 4.0.
Liu, Guomin; Knight, James D R; Zhang, Jian Ping; Tsou, Chih-Chiang; Wang, Jian; Lambert, Jean-Philippe; Larsen, Brett; Tyers, Mike; Raught, Brian; Bandeira, Nuno; Nesvizhskii, Alexey I; Choi, Hyungwon; Gingras, Anne-Claude
2016-10-21
Affinity purification coupled with mass spectrometry (AP-MS) is a powerful technique for the identification and quantification of physical interactions. AP-MS requires careful experimental design, appropriate control selection and quantitative workflows to successfully identify bona fide interactors amongst a large background of contaminants. We previously introduced ProHits, a Laboratory Information Management System for interaction proteomics, which tracks all samples in a mass spectrometry facility, initiates database searches and provides visualization tools for spectral counting-based AP-MS approaches. More recently, we implemented Significance Analysis of INTeractome (SAINT) within ProHits to provide scoring of interactions based on spectral counts. Here, we provide an update to ProHits to support Data Independent Acquisition (DIA) with identification software (DIA-Umpire and MSPLIT-DIA), quantification tools (through DIA-Umpire, or externally via targeted extraction), and assessment of quantitative enrichment (through mapDIA) and scoring of interactions (through SAINT-intensity). With additional improvements, notably support of the iProphet pipeline, facilitated deposition into ProteomeXchange repositories and enhanced export and viewing functions, ProHits 4.0 offers a comprehensive suite of tools to facilitate affinity proteomics studies. It remains challenging to score, annotate and analyze proteomics data in a transparent manner. ProHits was previously introduced as a LIMS to enable storing, tracking and analysis of standard AP-MS data. In this revised version, we expand ProHits to include integration with a number of identification and quantification tools based on Data-Independent Acquisition (DIA). ProHits 4.0 also facilitates data deposition into public repositories, and the transfer of data to new visualization tools. Copyright © 2016 Elsevier B.V. All rights reserved.
Park, Min Kyung; Park, Jin Young; Nicolas, Geneviève; Paik, Hee Young; Kim, Jeongseon; Slimani, Nadia
2015-06-14
During the past decades, a rapid nutritional transition has been observed along with economic growth in the Republic of Korea. Since this dramatic change in diet has been frequently associated with cancer and other non-communicable diseases, dietary monitoring is essential to understand the association. Benefiting from pre-existing standardised dietary methodologies, the present study aimed to evaluate the feasibility and describe the development of a Korean version of the international computerised 24 h dietary recall method (GloboDiet software) and its complementary tools, developed at the International Agency for Research on Cancer (IARC), WHO. Following established international Standard Operating Procedures and guidelines, about seventy common and country-specific databases on foods, recipes, dietary supplements, quantification methods and coefficients were customised and translated. The main results of the present study highlight the specific adaptations made to adapt the GloboDiet software for research and dietary surveillance in Korea. New (sub-) subgroups were added into the existing common food classification, and new descriptors were added to the facets to classify and describe specific Korean foods. Quantification methods were critically evaluated and adapted considering the foods and food packages available in the Korean market. Furthermore, a picture book of foods/dishes was prepared including new pictures and food portion sizes relevant to Korean diet. The development of the Korean version of GloboDiet demonstrated that it was possible to adapt the IARC-WHO international dietary tool to an Asian context without compromising its concept of standardisation and software structure. It, thus, confirms that this international dietary methodology, used so far only in Europe, is flexible and robust enough to be customised for other regions worldwide.
NASA Astrophysics Data System (ADS)
Dobnik, David; Štebih, Dejan; Blejec, Andrej; Morisset, Dany; Žel, Jana
2016-10-01
The advantages of digital PCR technology are by now well documented. One way to achieve better cost efficiency of the technique is to use it in a multiplexing strategy. Droplet digital PCR platforms, which include two fluorescence filters, support at least duplex reactions, and with some development and optimization higher multiplexing is possible. The present study not only shows the development of multiplex assays in droplet digital PCR, but also presents a first thorough evaluation of several parameters in such multiplex digital PCR. Two 4-plex assays were developed for quantification of 8 different DNA targets (7 genetically modified maize events and a maize endogene). Per assay, two of the targets were labelled with one fluorophore and two with another. As current analysis software does not support analysis of more than duplex reactions, a new R- and Shiny-based web application analysis tool (http://bit.ly/ddPCRmulti) was developed that automates the analysis of 4-plex results. In conclusion, the two developed multiplex assays are suitable for quantification of GMO maize events, and the same approach can be used in any other field with a need for accurate and reliable quantification of multiple DNA targets.
Dobnik, David; Štebih, Dejan; Blejec, Andrej; Morisset, Dany; Žel, Jana
2016-10-14
The advantages of digital PCR technology are by now well documented. One way to achieve better cost efficiency of the technique is to use it in a multiplexing strategy. Droplet digital PCR platforms, which include two fluorescence filters, support at least duplex reactions, and with some development and optimization higher multiplexing is possible. The present study not only shows the development of multiplex assays in droplet digital PCR, but also presents a first thorough evaluation of several parameters in such multiplex digital PCR. Two 4-plex assays were developed for quantification of 8 different DNA targets (7 genetically modified maize events and a maize endogene). Per assay, two of the targets were labelled with one fluorophore and two with another. As current analysis software does not support analysis of more than duplex reactions, a new R- and Shiny-based web application analysis tool (http://bit.ly/ddPCRmulti) was developed that automates the analysis of 4-plex results. In conclusion, the two developed multiplex assays are suitable for quantification of GMO maize events, and the same approach can be used in any other field with a need for accurate and reliable quantification of multiple DNA targets.
NASA Astrophysics Data System (ADS)
Patra, A. K.; Valentine, G. A.; Bursik, M. I.; Connor, C.; Connor, L.; Jones, M.; Simakov, N.; Aghakhani, H.; Jones-Ivey, R.; Kosar, T.; Zhang, B.
2015-12-01
Over the last 5 years we have created the community collaboratory Vhub.org [Palma et al., J. App. Volc. 3:2, doi:10.1186/2191-5040-3-2] as a place to find volcanology-related resources, a venue for users to disseminate tools, teaching resources and data, and an online platform to support collaborative efforts. As the community (currently > 6000 active users from an estimated community of comparable size) has embedded the collaboratory's tools into educational and research workflows, it has become imperative to: a) redesign tools into robust, open-source, reusable software for online and offline usage/enhancement; b) share large datasets with remote collaborators and other users seamlessly and securely; c) support complex workflows for uncertainty analysis, validation and verification, and data assimilation with large data. The focus on tool development/redevelopment has been twofold: firstly, to use best practices in software engineering and new hardware such as multi-core and graphics processing units; secondly, to enhance capabilities to support inverse modeling, uncertainty quantification using large ensembles and design of experiments, calibration, and validation. The software engineering practices we follow include open-source licensing to facilitate community contributions, modularity and reusability. Our initial targets are four popular tools on Vhub: TITAN2D, TEPHRA2, PUFF and LAVA. Use of tools like these requires many observation-driven data sets, e.g. digital elevation models of topography, satellite imagery, field observations on deposits, etc. These data are often maintained in private repositories and shared by "sneaker-net". As a partial solution to this we tested mechanisms using iRODS software for online sharing of private data with public metadata and access limits. Finally, we adapted workflow engines (e.g. Pegasus) to support the complex data and computing workflows needed for tasks such as uncertainty quantification for hazard analysis using physical models.
Intramyocellular lipid quantification: repeatability with 1H MR spectroscopy.
Torriani, Martin; Thomas, Bijoy J; Halpern, Elkan F; Jensen, Megan E; Rosenthal, Daniel I; Palmer, William E
2005-08-01
To prospectively determine the repeatability and variability of tibialis anterior intramyocellular lipid (IMCL) quantifications performed by using 1.5-T hydrogen 1 (1H) magnetic resonance (MR) spectroscopy in healthy subjects. Institutional review board approval and written informed consent were obtained for this Health Insurance Portability and Accountability Act-compliant study. The authors examined the anterior tibial muscles of 27 healthy subjects aged 19-48 years (12 men, 15 women; mean age, 25 years) by using single-voxel short-echo-time point-resolved 1H MR spectroscopy. During a first visit, the subjects underwent 1H MR spectroscopy before and after being repositioned in the magnet bore, with voxels carefully placed on the basis of osseous landmarks. Measurements were repeated after a mean interval of 12 days. All spectra were fitted by using Java-based MR user interface (jMRUI) and LCModel software, and lipid peaks were scaled to the unsuppressed water peak (at 4.7 ppm) and the total creatine peak (at approximately 3.0 ppm). A one-way random-effects variance components model was used to determine intraday and intervisit coefficients of variation (CVs). A power analysis was performed to determine the detectable percentage change in lipid measurements for two subject sample sizes. Measurements of the IMCL methylene protons peak at a resonance of 1.3 ppm scaled to the unsuppressed water peak (IMCL(W)) that were obtained by using jMRUI software yielded the lowest CVs overall (intraday and intervisit CVs, 13.4% and 14.4%, respectively). The random-effects variance components model revealed that nonbiologic factors (equipment and repositioning) accounted for 50% of the total variability in IMCL quantifications. Power analysis for a sample size of 20 subjects revealed that changes in IMCL(W) of greater than 15% could be confidently detected between 1H MR spectroscopic measurements obtained on different days. 1H MR spectroscopy is feasible for repeatable quantification of IMCL concentrations in longitudinal studies of muscle metabolism.
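As a rough illustration of how repeatability figures like those above translate into detectable changes, the sketch below computes a within-subject coefficient of variation from paired repeat measurements and the approximate percentage change detectable in a paired follow-up design. The formulas are generic textbook approximations and the numbers are invented; this is not the random-effects variance components model used in the study.

```python
import numpy as np

def within_subject_cv(first, second):
    """Within-subject CV from paired repeat measurements (Bland's method):
    s_w^2 = sum((x1 - x2)^2) / (2n), CV = s_w / grand mean."""
    first, second = np.asarray(first, float), np.asarray(second, float)
    s_w = np.sqrt(np.mean((first - second) ** 2) / 2.0)
    return s_w / np.mean(np.concatenate([first, second]))

def detectable_change(cv, n, z_alpha=1.96, z_beta=0.84):
    """Approximate % change detectable between two sessions in n subjects
    (paired design, ~80% power, two-sided alpha = 0.05)."""
    return 100.0 * (z_alpha + z_beta) * cv * np.sqrt(2.0 / n)

# Hypothetical repeat IMCL(W) measurements for a handful of subjects.
visit1 = [2.1, 3.4, 1.8, 2.9, 4.0]
visit2 = [2.3, 3.1, 1.7, 3.2, 3.8]
cv = within_subject_cv(visit1, visit2)
print(f"within-subject CV: {100 * cv:.1f} %")
print(f"detectable change (n=20): {detectable_change(cv, 20):.1f} %")
```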
Röhnisch, Hanna E; Eriksson, Jan; Müllner, Elisabeth; Agback, Peter; Sandström, Corine; Moazzami, Ali A
2018-02-06
A key limiting step for high-throughput NMR-based metabolomics is the lack of rapid and accurate tools for absolute quantification of many metabolites. We developed, implemented, and evaluated an algorithm, AQuA (Automated Quantification Algorithm), for targeted metabolite quantification from complex 1H NMR spectra. AQuA operates based on spectral data extracted from a library consisting of one standard calibration spectrum for each metabolite. It uses one preselected NMR signal per metabolite for determining absolute concentrations and does so by effectively accounting for interferences caused by other metabolites. AQuA was implemented and evaluated using experimental NMR spectra from human plasma. The accuracy of AQuA was tested and confirmed in comparison with a manual spectral fitting approach using the ChenomX software, in which 61 out of 67 metabolites quantified in 30 human plasma spectra showed a goodness-of-fit (r2) close to or exceeding 0.9 between the two approaches. In addition, three quality indicators generated by AQuA, namely, occurrence, interference, and positional deviation, were studied. These quality indicators permit evaluation of the results each time the algorithm is operated. The efficiency was tested and confirmed by implementing AQuA for quantification of 67 metabolites in a large data set comprising 1342 experimental spectra from human plasma, in which the whole computation took less than 1 s.
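One plausible way to picture the interference correction described above is as a small linear system: the calibration library defines how much each metabolite contributes at every preselected signal position, and the observed intensities are then solved for concentrations. The sketch below is a schematic reading of that idea with invented numbers; it is not the published AQuA implementation.

```python
import numpy as np

# Rows: preselected signal positions (one per metabolite);
# columns: metabolites. Entry [i, j] is the response of metabolite j at
# signal position i per unit concentration, taken from one calibration
# spectrum per metabolite (values here are invented for illustration).
library_response = np.array([
    [1.00, 0.05, 0.00],   # signal chosen for metabolite A
    [0.02, 1.00, 0.10],   # signal chosen for metabolite B
    [0.00, 0.08, 1.00],   # signal chosen for metabolite C
])

observed = np.array([0.52, 1.13, 0.87])  # intensities in a plasma spectrum

# Solving the system corrects each preselected signal for the
# contributions of the interfering metabolites.
concentrations, *_ = np.linalg.lstsq(library_response, observed, rcond=None)
print(dict(zip("ABC", np.round(concentrations, 3))))
```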
Desmarais, Samantha M.; Tropini, Carolina; Miguel, Amanda; Cava, Felipe; Monds, Russell D.; de Pedro, Miguel A.; Huang, Kerwyn Casey
2015-01-01
The bacterial cell wall is a network of glycan strands cross-linked by short peptides (peptidoglycan); it is responsible for the mechanical integrity of the cell and shape determination. Liquid chromatography can be used to measure the abundance of the muropeptide subunits composing the cell wall. Characteristics such as the degree of cross-linking and average glycan strand length are known to vary across species. However, a systematic comparison among strains of a given species has yet to be undertaken, making it difficult to assess the origins of variability in peptidoglycan composition. We present a protocol for muropeptide analysis using ultra performance liquid chromatography (UPLC) and demonstrate that UPLC achieves resolution comparable with that of HPLC while requiring orders of magnitude less injection volume and a fraction of the elution time. We also developed a software platform to automate the identification and quantification of chromatographic peaks, which we demonstrate has improved accuracy relative to other software. This combined experimental and computational methodology revealed that peptidoglycan composition was approximately maintained across strains from three Gram-negative species despite taxonomical and morphological differences. Peptidoglycan composition and density were maintained after we systematically altered cell size in Escherichia coli using the antibiotic A22, indicating that cell shape is largely decoupled from the biochemistry of peptidoglycan synthesis. High-throughput, sensitive UPLC combined with our automated software for chromatographic analysis will accelerate the discovery of peptidoglycan composition and the molecular mechanisms of cell wall structure determination. PMID:26468288
Röst, Hannes L; Liu, Yansheng; D'Agostino, Giuseppe; Zanella, Matteo; Navarro, Pedro; Rosenberger, George; Collins, Ben C; Gillet, Ludovic; Testa, Giuseppe; Malmström, Lars; Aebersold, Ruedi
2016-09-01
Next-generation mass spectrometric (MS) techniques such as SWATH-MS have substantially increased the throughput and reproducibility of proteomic analysis, but ensuring consistent quantification of thousands of peptide analytes across multiple liquid chromatography-tandem MS (LC-MS/MS) runs remains a challenging and laborious manual process. To produce highly consistent and quantitatively accurate proteomics data matrices in an automated fashion, we developed TRIC (http://proteomics.ethz.ch/tric/), a software tool that utilizes fragment-ion data to perform cross-run alignment, consistent peak-picking and quantification for high-throughput targeted proteomics. TRIC reduced the identification error compared to a state-of-the-art SWATH-MS analysis without alignment by more than threefold at constant recall while correcting for highly nonlinear chromatographic effects. On a pulsed-SILAC experiment performed on human induced pluripotent stem cells, TRIC was able to automatically align and quantify thousands of light and heavy isotopic peak groups. Thus, TRIC fills a gap in the pipeline for automated analysis of massively parallel targeted proteomics data sets.
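Cross-run alignment of the kind TRIC automates can be pictured, in a much simplified form, as learning a nonlinear mapping between the retention times of peak groups confidently identified in both runs and using it to predict where the remaining analytes should elute. The sketch below uses piecewise-linear interpolation on a handful of assumed shared anchors; it is illustrative only and does not reproduce TRIC's fragment-ion-based scoring.

```python
import numpy as np

# Retention times (s) of high-confidence peak groups shared by two runs.
rt_run_a = np.array([310.0, 652.0, 1104.0, 1710.0, 2405.0])
rt_run_b = np.array([298.0, 660.0, 1150.0, 1795.0, 2510.0])

def map_rt(rt_a):
    """Map a retention time from run A onto run B's time scale by
    piecewise-linear interpolation between the shared anchors."""
    return np.interp(rt_a, rt_run_a, rt_run_b)

# Expected position in run B of an analyte seen at 1400 s in run A,
# used to constrain peak picking to a narrow retention-time window.
print(f"expected RT in run B: {map_rt(1400.0):.0f} s")
```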
Grewal, Dilraj S; Tanna, Angelo P
2013-03-01
With the rapid adoption of spectral domain optical coherence tomography (SDOCT) in clinical practice and the recent advances in software technology, there is a need for a review of the literature on glaucoma detection and progression analysis algorithms designed for the commercially available instruments. Peripapillary retinal nerve fiber layer (RNFL) thickness and macular thickness, including segmental macular thickness calculation algorithms, have been demonstrated to be repeatable and reproducible, and have a high degree of diagnostic sensitivity and specificity in discriminating between healthy and glaucomatous eyes across the glaucoma continuum. Newer software capabilities such as glaucoma progression detection algorithms provide an objective analysis of longitudinally obtained structural data that enhances our ability to detect glaucomatous progression. RNFL measurements obtained with SDOCT appear more sensitive than time domain OCT (TDOCT) for glaucoma progression detection; however, agreement with the assessments of visual field progression is poor. Over the last few years, several studies have been performed to assess the diagnostic performance of SDOCT structural imaging and its validity in assessing glaucoma progression. Most evidence suggests that SDOCT performs similarly to TDOCT for glaucoma diagnosis; however, SDOCT may be superior for the detection of early stage disease. With respect to progression detection, SDOCT represents an important technological advance because of its improved resolution and repeatability. Advancements in RNFL thickness quantification, segmental macular thickness calculation and progression detection algorithms, when used correctly, may help to improve our ability to diagnose and manage glaucoma.
E-learning for textile enterprises innovation improvement
NASA Astrophysics Data System (ADS)
Blaga, M.; Harpa, R.; Radulescu, I. R.; Stepjanovic, Z.
2017-10-01
The Erasmus+ project TEXMatrix, “Matrix of knowledge for innovation and competitiveness in textile enterprises”, financed through the Erasmus+ Programme, Strategic Partnerships KA2 for Vocational Education and Training, aims at spreading a creative and innovative organizational culture inside textile enterprises by transferring and implementing methodologies, tools and concepts for improved training. Five European partners form the project consortium: INCDTP - Bucharest, Romania (coordinator), TecMinho - Portugal, Centrocot - Italy, University of Maribor, Slovenia, and the “Gheorghe Asachi” Technical University of Iasi, Romania. These partners will help the textile enterprises involved in the project learn how to apply creative thinking in their organizations and how to develop the capacity for innovation and change. The project aims to bridge the gap between textile enterprises' need for qualified personnel and the young workforce. It develops an innovative knowledge matrix for the tangible and intangible assets of an enterprise and a benchmarking study, on the basis of which a dedicated software tool will be created. This software tool will help decision-making enterprise staff (managers, HR specialists, professionals) as well as trainees (young employees, students, and scholars) cope with the new challenges of innovation and competitiveness in the textile field. The purpose of this paper is to present the main objectives and achievements of the project, according to its declared goals, with a focus on the knowledge matrix of innovation, which is a powerful instrument for the quantification of the intangible assets of textile enterprises.
Modelling and analysis of FMS productivity variables by ISM, SEM and GTMA approach
NASA Astrophysics Data System (ADS)
Jain, Vineet; Raj, Tilak
2014-09-01
Productivity has often been cited as a key factor in flexible manufacturing system (FMS) performance, and actions to increase it are said to improve profitability and the wage-earning capacity of employees. Improving productivity is seen as a key issue for the long-term survival and success of a manufacturing system. The purpose of this paper is to model and analyse the productivity variables of FMS. The study was performed using several approaches, viz. interpretive structural modelling (ISM), structural equation modelling (SEM), the graph theory and matrix approach (GTMA) and a cross-sectional survey of manufacturing firms in India. ISM was used to develop a model of the productivity variables, which was then analysed. Exploratory factor analysis (EFA) and confirmatory factor analysis (CFA) are powerful statistical techniques; CFA is carried out via SEM. EFA was applied to extract the factors in FMS using the Statistical Package for the Social Sciences (SPSS 20) software, and these factors were confirmed by CFA using the Analysis of Moment Structures (AMOS 20) software. Twenty productivity variables were identified from the literature and four factors were extracted that capture the productivity of FMS: people, quality, machine and flexibility. SEM using AMOS 20 was used to fit the first-order four-factor structure. GTMA is a multiple attribute decision making (MADM) methodology used to quantify the intensity of productivity variables in an organization. An FMS productivity index is proposed to quantify the factors which affect FMS productivity.
Perez de Souza, Leonardo; Naake, Thomas; Tohge, Takayuki; Fernie, Alisdair R
2017-01-01
The grand challenge currently facing metabolomics is the expansion of the coverage of the metabolome from a minor percentage of the metabolic complement of the cell toward the level of coverage afforded by other post-genomic technologies such as transcriptomics and proteomics. In plants, this problem is exacerbated by the sheer diversity of chemicals that constitute the metabolome, with the number of metabolites in the plant kingdom generally considered to be in excess of 200 000. In this review, we focus on web resources that can be exploited in order to improve analyte and ultimately metabolite identification and quantification. There is a wide range of available software that not only aids in this but also in the related area of peak alignment; however, for the uninitiated, choosing which program to use is a daunting task. For this reason, we provide an overview of the pros and cons of the software as well as comments regarding the level of programing skills required to effectively exploit their basic functions. In addition, the torrent of available genome and transcriptome sequences that followed the advent of next-generation sequencing has opened up further valuable resources for metabolite identification. All things considered, we posit that only via a continued communal sharing of information such as that deposited in the databases described within the article are we likely to be able to make significant headway toward improving our coverage of the plant metabolome. PMID:28520864
Quantitative Proteomics via High Resolution MS Quantification: Capabilities and Limitations
Higgs, Richard E.; Butler, Jon P.; Han, Bomie; Knierman, Michael D.
2013-01-01
Recent improvements in the mass accuracy and resolution of mass spectrometers have led to renewed interest in label-free quantification using data from the primary mass spectrum (MS1) acquired from data-dependent proteomics experiments. The capacity for higher specificity quantification of peptides from samples enriched for proteins of biological interest offers distinct advantages for hypothesis generating experiments relative to immunoassay detection methods or prespecified peptide ions measured by multiple reaction monitoring (MRM) approaches. Here we describe an evaluation of different methods to post-process peptide level quantification information to support protein level inference. We characterize the methods by examining their ability to recover a known dilution of a standard protein in background matrices of varying complexity. Additionally, the MS1 quantification results are compared to a standard, targeted, MRM approach on the same samples under equivalent instrument conditions. We show the existence of multiple peptides with MS1 quantification sensitivity similar to the best MRM peptides for each of the background matrices studied. Based on these results we provide recommendations on preferred approaches to leveraging quantitative measurements of multiple peptides to improve protein level inference. PMID:23710359
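One common way to post-process peptide-level MS1 quantities into a protein-level estimate, among the kinds of approaches compared in studies like the one above, is to average the few most intense peptides per protein. A minimal sketch follows; the top-3 rule and the example intensities are assumptions, not the authors' recommended method.

```python
from collections import defaultdict

def protein_abundance(peptide_rows, top_n=3):
    """Roll peptide intensities up to proteins by averaging the top_n
    most intense peptides of each protein."""
    by_protein = defaultdict(list)
    for protein, peptide, intensity in peptide_rows:
        by_protein[protein].append(intensity)
    return {
        protein: sum(sorted(vals, reverse=True)[:top_n]) / min(top_n, len(vals))
        for protein, vals in by_protein.items()
    }

rows = [
    ("P1", "pepA", 2.3e6), ("P1", "pepB", 1.1e6),
    ("P1", "pepC", 0.4e6), ("P1", "pepD", 0.1e6),
    ("P2", "pepE", 5.0e5), ("P2", "pepF", 3.2e5),
]
print(protein_abundance(rows))
```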
A survey of tools for the analysis of quantitative PCR (qPCR) data.
Pabinger, Stephan; Rödiger, Stefan; Kriegner, Albert; Vierlinger, Klemens; Weinhäusel, Andreas
2014-09-01
Real-time quantitative polymerase chain reaction (qPCR) is a standard technique in most laboratories, used for various applications in basic research. Analysis of qPCR data is a crucial part of the entire experiment, which has led to the development of a plethora of methods. The released tools either cover specific parts of the workflow or provide complete analysis solutions. Here, we surveyed 27 open-access software packages and tools for the analysis of qPCR data. The survey includes 8 Microsoft Windows tools, 5 web-based tools, 9 R-based packages and 5 tools for other platforms. The reviewed packages and tools support the analysis of different qPCR applications, such as RNA quantification, DNA methylation, genotyping, identification of copy number variations, and digital PCR. We report an overview of the functionality, features and specific requirements of the individual software tools, such as data exchange formats, availability of a graphical user interface, included procedures for graphical data presentation, and offered statistical methods. In addition, we provide an overview of quantification strategies and report various applications of qPCR. Our comprehensive survey showed that most tools use their own file format and only a fraction of the currently existing tools support the standardized data exchange format RDML. To allow a more streamlined and comparable analysis of qPCR data, more vendors and tools need to adopt the standardized format and thus encourage the exchange of data between instrument software, analysis tools, and researchers.
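As an example of the quantification strategies such tools implement, the widely used 2^(-ddCq) method expresses a target's expression relative to a reference gene and a control sample. A minimal sketch, assuming 100% amplification efficiency (i.e. a doubling per cycle) for both assays:

```python
def fold_change(cq_target_sample, cq_ref_sample, cq_target_control, cq_ref_control):
    """Relative quantification by the 2^(-ddCq) method, assuming 100%
    amplification efficiency for both assays."""
    d_cq_sample = cq_target_sample - cq_ref_sample
    d_cq_control = cq_target_control - cq_ref_control
    dd_cq = d_cq_sample - d_cq_control
    return 2.0 ** (-dd_cq)

# Target is ~4-fold up-regulated in the treated sample in this toy example.
print(fold_change(22.0, 18.0, 26.0, 20.0))
```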
Hessling, Bernd; Büttner, Knut; Hecker, Michael; Becher, Dörte
2013-01-01
Quantitative LC-MALDI is an underrepresented method, especially in large-scale experiments. The additional fractionation step that is needed for most MALDI-TOF-TOF instruments, the comparatively long analysis time, and the very limited number of established software tools for the data analysis render LC-MALDI a niche application for large quantitative analyses beside the widespread LC–electrospray ionization workflows. Here, we used LC-MALDI in a relative quantification analysis of Staphylococcus aureus for the first time on a proteome-wide scale. Samples were analyzed in parallel with an LTQ-Orbitrap, which allowed cross-validation with a well-established workflow. With nearly 850 proteins identified in the cytosolic fraction and quantitative data for more than 550 proteins obtained with the MASCOT Distiller software, we were able to prove that LC-MALDI is able to process highly complex samples. The good correlation of quantities determined via this method and the LTQ-Orbitrap workflow confirmed the high reliability of our LC-MALDI approach for global quantification analysis. Because the existing literature reports differences for MALDI and electrospray ionization preferences and the respective experimental work was limited by technical or methodological constraints, we systematically compared biochemical attributes of peptides identified with either instrument. This genome-wide, comprehensive study revealed biases toward certain peptide properties for both MALDI-TOF-TOF- and LTQ-Orbitrap-based approaches. These biases are based on almost 13,000 peptides and result in a general complementarity of the two approaches that should be exploited in future experiments. PMID:23788530
Hessling, Bernd; Büttner, Knut; Hecker, Michael; Becher, Dörte
2013-10-01
Quantitative LC-MALDI is an underrepresented method, especially in large-scale experiments. The additional fractionation step that is needed for most MALDI-TOF-TOF instruments, the comparatively long analysis time, and the very limited number of established software tools for the data analysis render LC-MALDI a niche application for large quantitative analyses beside the widespread LC-electrospray ionization workflows. Here, we used LC-MALDI in a relative quantification analysis of Staphylococcus aureus for the first time on a proteome-wide scale. Samples were analyzed in parallel with an LTQ-Orbitrap, which allowed cross-validation with a well-established workflow. With nearly 850 proteins identified in the cytosolic fraction and quantitative data for more than 550 proteins obtained with the MASCOT Distiller software, we were able to prove that LC-MALDI is able to process highly complex samples. The good correlation of quantities determined via this method and the LTQ-Orbitrap workflow confirmed the high reliability of our LC-MALDI approach for global quantification analysis. Because the existing literature reports differences for MALDI and electrospray ionization preferences and the respective experimental work was limited by technical or methodological constraints, we systematically compared biochemical attributes of peptides identified with either instrument. This genome-wide, comprehensive study revealed biases toward certain peptide properties for both MALDI-TOF-TOF- and LTQ-Orbitrap-based approaches. These biases are based on almost 13,000 peptides and result in a general complementarity of the two approaches that should be exploited in future experiments.
IMS software developments for the detection of chemical warfare agent
NASA Technical Reports Server (NTRS)
Klepel, ST.; Graefenhain, U.; Lippe, R.; Stach, J.; Starrock, V.
1995-01-01
Interference compounds like gasoline, diesel, burning wood or fuel, etc. are present in common battlefield situations. These compounds can cause detectors to respond with a false positive or interfere with the detector's ability to respond to target compounds such as chemical warfare agents. To ensure proper response of the ion mobility spectrometer to chemical warfare agents, two special software packages were developed and incorporated into the Bruker RAID-1. The programs suppress interfering signals caused by car exhaust or smoke gases resulting from burning materials and correct for the influence of variable sample gas humidity, which is important for the detection and quantification of blister agents like mustard gas or lewisite.
RNAbrowse: RNA-Seq De Novo Assembly Results Browser
Mariette, Jérôme; Noirot, Céline; Nabihoudine, Ibounyamine; Bardou, Philippe; Hoede, Claire; Djari, Anis; Cabau, Cédric; Klopp, Christophe
2014-01-01
Transcriptome analysis based on a de novo assembly of next-generation RNA sequences is now performed routinely in many laboratories. The generated results, including contig sequences, quantification figures, functional annotations and variation discovery outputs, are usually bulky and quite diverse. This article presents a user-oriented storage and visualisation environment permitting users to explore the data in a top-down manner, going from general graphical views to all possible details. The software package is based on BioMart and is easy to install and populate with local data. The software package is available under the GNU General Public License (GPL) at http://bioinfo.genotoul.fr/RNAbrowse. PMID:24823498
Plasma protein absolute quantification by nano-LC Q-TOF UDMSE for clinical biomarker verification
Ilies, Maria; Iuga, Cristina Adela; Loghin, Felicia; Dhople, Vishnu Mukund; Hammer, Elke
2017-01-01
Background and aims Proteome-based biomarker studies are targeting proteins that could serve as diagnostic, prognosis, and prediction molecules. In the clinical routine, immunoassays are currently used for the absolute quantification of such biomarkers, with the major limitation that only one molecule can be targeted per assay. The aim of our study was to test a mass spectrometry based absolute quantification method for the verification of plasma protein sets which might serve as reliable biomarker panels for the clinical practice. Methods Six EDTA plasma samples were analyzed after tryptic digestion using a high throughput data independent acquisition nano-LC Q-TOF UDMSE proteomics approach. Synthetic Escherichia coli standard peptides were spiked in each sample for the absolute quantification. Data analysis was performed using ProgenesisQI v2.0 software (Waters Corporation). Results Our method ensured absolute quantification of 242 non redundant plasma proteins in a single run analysis. The dynamic range covered was 10^5. 86% were represented by classical plasma proteins. The overall median coefficient of variation was 0.36, while a set of 63 proteins was found to be highly stable. Absolute protein concentrations strongly correlated with values reviewed in the literature. Conclusions Nano-LC Q-TOF UDMSE proteomic analysis can be used for a simple and rapid determination of absolute amounts of plasma proteins. A large number of plasma proteins could be analyzed, while a wide dynamic range was covered with low coefficient of variation at protein level. The method proved to be a reliable tool for the quantification of protein panel for biomarker verification in the clinical practice. PMID:29151793
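Label-free absolute quantification with a spiked standard is often done by deriving a universal response factor from the standard's most intense peptides and applying it to every other protein (the "Hi3" idea). The sketch below illustrates that calculation with assumed numbers; the study's exact ProgenesisQI workflow is not reproduced here.

```python
def hi3_signal(peptide_intensities, n=3):
    """Average of the n most intense peptides of a protein."""
    top = sorted(peptide_intensities, reverse=True)[:n]
    return sum(top) / len(top)

# Spiked E. coli standard: known amount (fmol on column) and its peptide signals.
spike_amount_fmol = 50.0
spike_hi3 = hi3_signal([8.1e6, 7.4e6, 6.9e6, 2.0e6])
response_factor = spike_amount_fmol / spike_hi3   # fmol per intensity unit

# Any plasma protein is then quantified from its own Hi3 signal.
albumin_hi3 = hi3_signal([9.5e8, 8.8e8, 8.1e8, 5.0e7])
print(f"estimated amount: {albumin_hi3 * response_factor:.0f} fmol")
```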
NASA Astrophysics Data System (ADS)
Cook, G. D.; Liedloff, A. C.; Richards, A. E.; Meyer, M.
2016-12-01
Australia is the only OECD country with a significant area of tropical savannas within its borders. Approximately 220 000 km2 of these savannas burn every year, releasing 2 to 4% of Australia's accountable greenhouse gas emissions. Reducing uncertainty in the quantification of these emissions of methane and nitrous oxide has been fundamental to improving both the national GHG inventory and developing approaches to manage land so as to reduce these emissions. Projects to reduce pyrogenic emissions have been adopted across 30% of Australia's high-rainfall savannas. Recent work has focussed on quantifying the additional benefit of increased carbon stocks in fine fuel and coarse woody debris (CWD) resulting from improvements in fire management. An integrated set of equations has been developed to enable seamless quantification of emissions and sequestration in these frequently burnt savannas. These show that the increase in carbon stored in fine fuel and CWD comprises about 3 times the emissions abatement from improvements in fire management that have been achieved in a project area of 28 000 km2. Future work is focussing on improving the understanding of spatial and temporal variation in fire behaviour across Australia's savanna biome, improving quantification of the carbon dynamics of CWD, and improving quantification of the effects of fire on carbon dynamics in savanna soils.
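Pyrogenic emission estimates of this kind are usually built from a simple bookkeeping of burned area, fuel load, burning efficiency and species emission factors. The sketch below shows that generic (Seiler-Crutzen style) calculation with purely illustrative parameter values; the Australian savanna accounting method uses its own calibrated equations and factors, which are not reproduced here.

```python
def pyrogenic_emission_t(area_km2, fuel_load_t_ha, burning_efficiency,
                         emission_factor_g_per_kg):
    """Generic fire-emission bookkeeping: mass of fuel burnt =
    area x fuel load x burning efficiency, scaled by a species emission
    factor. Returns tonnes of the emitted species."""
    area_ha = area_km2 * 100.0
    fuel_burnt_t = area_ha * fuel_load_t_ha * burning_efficiency
    return fuel_burnt_t * emission_factor_g_per_kg / 1000.0

# Illustrative values only (not the factors used in the Australian accounts).
ch4_t = pyrogenic_emission_t(area_km2=28000, fuel_load_t_ha=4.0,
                             burning_efficiency=0.7,
                             emission_factor_g_per_kg=2.2)
print(f"CH4 emitted: {ch4_t:,.0f} t")
```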
Uncertainty quantification in Rothermel's Model using an efficient sampling method
Edwin Jimenez; M. Yousuff Hussaini; Scott L. Goodrick
2007-01-01
The purpose of the present work is to quantify parametric uncertainty in Rothermel's wildland fire spread model (implemented in software such as BehavePlus3 and FARSITE), which is undoubtedly among the most widely used fire spread models in the United States. This model consists of a nonlinear system of equations that relates environmental variables (input parameter...
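Parametric uncertainty quantification of a fire-spread model can be sketched as sampling the uncertain inputs from assumed distributions and summarizing the spread of the model output. The placeholder spread-rate function below stands in for the actual Rothermel equations, which are not reproduced here, and the input distributions are likewise illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def spread_rate(wind, moisture, slope):
    """Placeholder surrogate for a fire-spread model: increases with wind
    and slope, decreases with fuel moisture (NOT Rothermel's equations)."""
    return 0.5 * (1.0 + 0.3 * wind) * (1.0 + 2.0 * slope) * np.exp(-8.0 * moisture)

n = 10_000
wind = rng.normal(5.0, 1.5, n)          # m/s
moisture = rng.uniform(0.05, 0.15, n)   # fraction
slope = rng.uniform(0.0, 0.3, n)        # rise/run

rates = spread_rate(wind, moisture, slope)
lo, hi = np.percentile(rates, [5, 95])
print(f"mean spread rate: {rates.mean():.2f}, 90% interval: [{lo:.2f}, {hi:.2f}]")
```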
Sedgewick, Gerald J.; Ericson, Marna
2015-01-01
Obtaining digital images of color brightfield microscopy is an important aspect of biomedical research and the clinical practice of diagnostic pathology. Although the field of digital pathology has had tremendous advances in whole-slide imaging systems, little effort has been directed toward standardizing color brightfield digital imaging to maintain image-to-image consistency and tonal linearity. Using a single camera and microscope to obtain digital images of three stains, we show that microscope and camera systems inherently produce image-to-image variation. Moreover, we demonstrate that post-processing with a widely used raster graphics editor software program does not completely correct for session-to-session inconsistency. We introduce a reliable method for creating consistent images with a hardware/software solution (ChromaCal™; Datacolor Inc., NJ) along with its features for creating color standardization, preserving linear tonal levels, providing automated white balancing and setting automated brightness to consistent levels. The resulting image consistency using this method will also streamline mean density and morphometry measurements, as images are easily segmented and single thresholds can be used. We suggest that this is a superior method for color brightfield imaging, which can be used for quantification and can be readily incorporated into workflows. PMID:25575568
Integrated smart structures wingbox
NASA Astrophysics Data System (ADS)
Simon, Solomon H.
1993-09-01
One objective of smart structures development is to demonstrate the ability of a mechanical component to monitor its own structural integrity and health. Achievement of this objective requires the integration of different technologies, i.e.: (1) structures, (2) sensors, and (3) artificial intelligence. We coordinated a team of experts from these three fields. These experts used reliable knowledge towards the forefront of their technologies and combined the appropriate features into an integrated hardware/software smart structures wingbox (SSW) test article. A 1/4 in. hole was drilled into the SSW test article. Although the smart structure had never seen damage of this type, it correctly recognized and located the damage. Based on a knowledge-based simulation, quantification and assessment were also carried out. We have demonstrated that the SSW integrated hardware & software test article can perform six related functions: (1) identification of a defect; (2) location of the defect; (3) quantification of the amount of damage; (4) assessment of performance degradation; (5) continued monitoring in spite of damage; and (6) continuous recording of integrity data. We present the successful results of the integrated test article in this paper, along with plans for future development and deployment of the technology.
Müllenbroich, M Caroline; Silvestri, Ludovico; Onofri, Leonardo; Costantini, Irene; Hoff, Marcel Van't; Sacconi, Leonardo; Iannello, Giulio; Pavone, Francesco S
2015-10-01
Comprehensive mapping and quantification of neuronal projections in the central nervous system requires high-throughput imaging of large volumes with microscopic resolution. To this end, we have developed a confocal light-sheet microscope that has been optimized for three-dimensional (3-D) imaging of structurally intact clarified whole-mount mouse brains. We describe the optical and electromechanical arrangement of the microscope and give details on the organization of the microscope management software. The software orchestrates all components of the microscope, coordinates critical timing and synchronization, and has been written in a versatile and modular structure using the LabVIEW language. It can easily be adapted and integrated to other microscope systems and has been made freely available to the light-sheet community. The tremendous amount of data routinely generated by light-sheet microscopy further requires novel strategies for data handling and storage. To complete the full imaging pipeline of our high-throughput microscope, we further elaborate on big data management from streaming of raw images up to stitching of 3-D datasets. The mesoscale neuroanatomy imaged at micron-scale resolution in those datasets allows characterization and quantification of neuronal projections in unsectioned mouse brains.
Exact and Approximate Probabilistic Symbolic Execution
NASA Technical Reports Server (NTRS)
Luckow, Kasper; Pasareanu, Corina S.; Dwyer, Matthew B.; Filieri, Antonio; Visser, Willem
2014-01-01
Probabilistic software analysis seeks to quantify the likelihood of reaching a target event under uncertain environments. Recent approaches compute probabilities of execution paths using symbolic execution, but do not support nondeterminism. Nondeterminism arises naturally when no suitable probabilistic model can capture a program behavior, e.g., for multithreading or distributed systems. In this work, we propose a technique, based on symbolic execution, to synthesize schedulers that resolve nondeterminism to maximize the probability of reaching a target event. To scale to large systems, we also introduce approximate algorithms to search for good schedulers, speeding up established random sampling and reinforcement learning results through the quantification of path probabilities based on symbolic execution. We implemented the techniques in Symbolic PathFinder and evaluated them on nondeterministic Java programs. We show that our algorithms significantly improve upon a state-of-the-art statistical model checking algorithm, originally developed for Markov Decision Processes.
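The core quantity in probabilistic software analysis is the probability mass of the inputs that satisfy a path condition. For small integer domains this can be obtained by exhaustive model counting, as in the toy sketch below; symbolic execution tools compute the same ratio with constraint solvers and model counters rather than by enumeration, and the example program and uniform usage profile are assumptions for illustration.

```python
from itertools import product

def target_hit(x, y):
    """Tiny example program: the 'target event' is returning True."""
    if x > y:
        return x - y > 10
    return False

# Uniform usage profile over a small input domain.
domain = range(0, 32)
hits = sum(target_hit(x, y) for x, y in product(domain, domain))
total = len(domain) ** 2
print(f"P(target) = {hits}/{total} = {hits / total:.4f}")
```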
2011-01-01
Background Since its inception, proteomics has essentially operated in a discovery mode with the goal of identifying and quantifying the maximal number of proteins in a sample. Increasingly, proteomic measurements are also supporting hypothesis-driven studies, in which a predetermined set of proteins is consistently detected and quantified in multiple samples. Selected reaction monitoring (SRM) is a targeted mass spectrometric technique that supports the detection and quantification of specific proteins in complex samples at high sensitivity and reproducibility. Here, we describe ATAQS, an integrated software platform that supports all stages of targeted, SRM-based proteomics experiments including target selection, transition optimization and post acquisition data analysis. This software will significantly facilitate the use of targeted proteomic techniques and contribute to the generation of highly sensitive, reproducible and complete datasets that are particularly critical for the discovery and validation of targets in hypothesis-driven studies in systems biology. Result We introduce a new open source software pipeline, ATAQS (Automated and Targeted Analysis with Quantitative SRM), which consists of a number of modules that collectively support the SRM assay development workflow for targeted proteomic experiments (project management and generation of protein, peptide and transitions and the validation of peptide detection by SRM). ATAQS provides a flexible pipeline for end-users by allowing the workflow to start or end at any point of the pipeline, and for computational biologists, by enabling the easy extension of java algorithm classes for their own algorithm plug-in or connection via an external web site. This integrated system supports all steps in a SRM-based experiment and provides a user-friendly GUI that can be run by any operating system that allows the installation of the Mozilla Firefox web browser. Conclusions Targeted proteomics via SRM is a powerful new technique that enables the reproducible and accurate identification and quantification of sets of proteins of interest. ATAQS is the first open-source software that supports all steps of the targeted proteomics workflow. ATAQS also provides software API (Application Program Interface) documentation that enables the addition of new algorithms to each of the workflow steps. The software, installation guide and sample dataset can be found in http://tools.proteomecenter.org/ATAQS/ATAQS.html PMID:21414234
IFDOTMETER: A New Software Application for Automated Immunofluorescence Analysis.
Rodríguez-Arribas, Mario; Pizarro-Estrella, Elisa; Gómez-Sánchez, Rubén; Yakhine-Diop, S M S; Gragera-Hidalgo, Antonio; Cristo, Alejandro; Bravo-San Pedro, Jose M; González-Polo, Rosa A; Fuentes, José M
2016-04-01
Most laboratories interested in autophagy use different imaging software for managing and analyzing heterogeneous parameters in immunofluorescence experiments (e.g., LC3-puncta quantification and determination of the number and size of lysosomes). One solution would be software that works on a user's laptop or workstation that can access all image settings and provide quick and easy-to-use analysis of data. Thus, we have designed and implemented an application called IFDOTMETER, which can run on all major operating systems because it has been programmed using JAVA (Sun Microsystems). Briefly, IFDOTMETER software has been created to quantify a variety of biological hallmarks, including mitochondrial morphology and nuclear condensation. The program interface is intuitive and user-friendly, making it useful for users not familiar with computer handling. By setting previously defined parameters, the software can automatically analyze a large number of images without the supervision of the researcher. Once analysis is complete, the results are stored in a spreadsheet. Using software for high-throughput cell image analysis offers researchers the possibility of performing comprehensive and precise analysis of a high number of images in an automated manner, making this routine task easier. © 2015 Society for Laboratory Automation and Screening.
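Automated quantification of puncta-like structures in fluorescence images typically reduces to thresholding followed by connected-component labelling. The sketch below runs that generic pipeline with scipy.ndimage on a synthetic image; it is an illustration of the approach, not IFDOTMETER's implementation, and the threshold rule is an assumption.

```python
import numpy as np
from scipy import ndimage as ndi

rng = np.random.default_rng(1)
image = rng.normal(10.0, 2.0, size=(128, 128))      # background noise
for r, c in [(20, 30), (64, 64), (100, 90)]:        # three synthetic puncta
    image[r - 2:r + 3, c - 2:c + 3] += 40.0

threshold = image.mean() + 5.0 * image.std()        # assumed threshold rule
mask = image > threshold
labels, n_puncta = ndi.label(mask)
sizes = np.asarray(ndi.sum(mask, labels, index=np.arange(1, n_puncta + 1)))
print(f"puncta detected: {n_puncta}, mean area (px): {sizes.mean():.1f}")
```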
Amarathunga, J P; Schuetz, M A; Yarlagadda, K V D; Schmutz, B
2015-04-01
Intramedullary nailing is the standard fixation method for displaced diaphyseal fractures of the tibia. Selection of the correct nail insertion point is important for axial alignment of the bone fragments and to avoid iatrogenic fractures. However, the standard entry point (SEP) may not always optimise the bone-nail fit due to geometric variations between bones. This study aimed to investigate the optimal entry point for a given bone-nail pair using the fit quantification software tool previously developed by the authors. The misfit was quantified for 20 bones with two nail designs (ETN and ETN-Proximal Bend) in relation to the SEP and 5 entry points which were 5 mm and 10 mm away from the SEP. The SEP was the optimal entry point for 50% of the bones used. For the remaining bones, the optimal entry point was located 5 mm away from the SEP, which improved the overall fit by 40% on average. However, entry points 10 mm away from the SEP doubled the misfit. The optimised bone-nail fit can be achieved through the SEP and within a 5 mm radius of it, except posteriorly. The study results suggest that the optimal entry point should be selected by considering the fit during insertion and not only at the final position. Copyright © 2015 IPEM. Published by Elsevier Ltd. All rights reserved.
Saha, Tanumoy; Rathmann, Isabel; Galic, Milos
2017-07-11
Filopodia are dynamic, finger-like cellular protrusions associated with migration and cell-cell communication. In order to better understand the complex signaling mechanisms underlying filopodial initiation, elongation and subsequent stabilization or retraction, it is crucial to determine the spatio-temporal protein activity in these dynamic structures. To analyze protein function in filopodia, we recently developed a semi-automated tracking algorithm that adapts to filopodial shape-changes, thus allowing parallel analysis of protrusion dynamics and relative protein concentration along the whole filopodial length. Here, we present a detailed step-by-step protocol for optimized cell handling, image acquisition and software analysis. We further provide instructions for the use of optional features during image analysis and data representation, as well as troubleshooting guidelines for all critical steps along the way. Finally, we also include a comparison of the described image analysis software with other programs available for filopodia quantification. Together, the presented protocol provides a framework for accurate analysis of protein dynamics in filopodial protrusions using image analysis software.
The Use of Variable Q1 Isolation Windows Improves Selectivity in LC-SWATH-MS Acquisition.
Zhang, Ying; Bilbao, Aivett; Bruderer, Tobias; Luban, Jeremy; Strambio-De-Castillia, Caterina; Lisacek, Frédérique; Hopfgartner, Gérard; Varesio, Emmanuel
2015-10-02
As tryptic peptides and metabolites are not equally distributed along the mass range, the probability of cross fragment ion interference is higher in certain windows when fixed Q1 SWATH windows are applied. We evaluated the benefits of utilizing variable Q1 SWATH windows with regards to selectivity improvement. Variable windows based on equalizing the distribution of either the precursor ion population (PIP) or the total ion current (TIC) within each window were generated by an in-house software, swathTUNER. These two variable Q1 SWATH window strategies outperformed, with respect to quantification and identification, the basic approach using a fixed window width (FIX) for proteomic profiling of human monocyte-derived dendritic cells (MDDCs). Thus, 13.8 and 8.4% additional peptide precursors, which resulted in 13.1 and 10.0% more proteins, were confidently identified by SWATH using the strategy PIP and TIC, respectively, in the MDDC proteomic sample. On the basis of the spectral library purity score, some improvement warranted by variable Q1 windows was also observed, albeit to a lesser extent, in the metabolomic profiling of human urine. We show that the novel concept of "scheduled SWATH" proposed here, which incorporates (i) variable isolation windows and (ii) precursor retention time segmentation further improves both peptide and metabolite identifications.
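Equalizing the precursor ion population across Q1 windows amounts to placing window boundaries at quantiles of the precursor m/z distribution observed in a survey run. The sketch below shows that boundary calculation on simulated m/z values; swathTUNER's actual input handling and the TIC-based variant are not reproduced, and the simulated distribution is an assumption.

```python
import numpy as np

rng = np.random.default_rng(0)
# Simulated precursor m/z values; tryptic peptides cluster at lower m/z.
precursor_mz = np.concatenate([
    rng.normal(550, 80, 6000),
    rng.normal(800, 120, 3000),
    rng.uniform(400, 1200, 1000),
])
precursor_mz = precursor_mz[(precursor_mz > 400) & (precursor_mz < 1200)]

n_windows = 32
# Equal-count (variable-width) windows: boundaries at quantiles of the density.
edges = np.quantile(precursor_mz, np.linspace(0.0, 1.0, n_windows + 1))
edges[0], edges[-1] = 400.0, 1200.0     # pin to the acquisition mass range

widths = np.diff(edges)
print(f"narrowest window: {widths.min():.1f} Th, widest: {widths.max():.1f} Th")
```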
Quantitative fluorescence angiography for neurosurgical interventions.
Weichelt, Claudia; Duscha, Philipp; Steinmeier, Ralf; Meyer, Tobias; Kuß, Julia; Cimalla, Peter; Kirsch, Matthias; Sobottka, Stephan B; Koch, Edmund; Schackert, Gabriele; Morgenstern, Ute
2013-06-01
Present methods for quantitative measurement of cerebral perfusion during neurosurgical operations require additional technology for measurement, data acquisition, and processing. This study used conventional fluorescence video angiography--an established method to visualize blood flow in brain vessels--enhanced by a quantifying perfusion software tool. For these purposes, the fluorescence dye indocyanine green is given intravenously, and after activation by a near-infrared light source the fluorescence signal is recorded. Video data are analyzed by software algorithms to allow quantification of the blood flow. Additionally, perfusion is measured intraoperatively by a reference system. Furthermore, comparative reference measurements using a flow phantom were performed to verify the quantitative blood flow results of the software and to validate the software algorithm. Analysis of intraoperative video data provides characteristic biological parameters. These parameters were implemented in the special flow phantom for experimental validation of the developed software algorithms. Furthermore, various factors that influence the determination of perfusion parameters were analyzed by means of mathematical simulation. Comparing patient measurements, phantom experiments, and computer simulations under certain conditions (variable frame rate, vessel diameter, etc.), the results of the software algorithms are within the range of parameter accuracy of the reference methods. Therefore, the software algorithm for calculating cortical perfusion parameters from video data presents a helpful intraoperative tool that does not require complex additional measurement technology.
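Perfusion parameters in fluorescence video angiography are typically read off the time-intensity curve of a region of interest. The sketch below extracts two common ones, time-to-peak and 10-90% rise time, from a synthetic indocyanine green inflow curve; the study's full parameter set and its calibration against the reference system are not reproduced, and the curve shape is an assumption.

```python
import numpy as np

fps = 25.0
t = np.arange(0, 20, 1.0 / fps)                    # seconds
# Synthetic ICG inflow curve: gamma-variate-like bolus shape plus noise.
curve = (t ** 2) * np.exp(-t / 2.0)
curve += np.random.default_rng(0).normal(0, 0.02, t.size)

peak_idx = int(np.argmax(curve))
time_to_peak = t[peak_idx]

peak = curve[peak_idx]
rise = curve[: peak_idx + 1]
t10 = t[np.argmax(rise >= 0.1 * peak)]             # first frame above 10%
t90 = t[np.argmax(rise >= 0.9 * peak)]             # first frame above 90%
print(f"time to peak: {time_to_peak:.2f} s, rise time (10-90%): {t90 - t10:.2f} s")
```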
Trends in data processing of comprehensive two-dimensional chromatography: state of the art.
Matos, João T V; Duarte, Regina M B O; Duarte, Armando C
2012-12-01
The operation of advanced chromatographic systems, namely comprehensive two-dimensional (2D) chromatography coupled to multidimensional detectors, yields a great deal of data that must be processed with special care in order to characterize and quantify the analytes under study as fully as possible. The aim of this review is to identify the main trends, research needs and gaps in the techniques for data processing of multidimensional data sets obtained from comprehensive 2D chromatography. The following topics have been identified as the most promising for new developments in the near future: data acquisition and handling, peak detection and quantification, measurement of the overlap of 2D peaks, and data analysis software for 2D chromatography. The rationale supporting most of the data processing techniques is based on the generalization of one-dimensional (1D) chromatography, although some algorithms, such as the inverted watershed algorithm, use the 2D chromatographic data as such. However, for processing more complex N-way data there is a need for more sophisticated techniques. Apart from applying other concepts from 1D chromatography that have not yet been tested for 2D chromatography, there is still room for improvements and new developments in algorithms and software for dealing with comprehensive 2D chromatographic data. Copyright © 2012 Elsevier B.V. All rights reserved.
Isse, Kumiko; Lesniak, Andrew; Grama, Kedar; Roysam, Badrinath; Minervini, Martha I.; Demetris, Anthony J
2013-01-01
Conventional histopathology is the gold standard for allograft monitoring, but its value proposition is increasingly questioned. “-Omics” analysis of tissues, peripheral blood and fluids and targeted serologic studies provide mechanistic insights into allograft injury not currently provided by conventional histology. Microscopic biopsy analysis, however, provides valuable and unique information: a) spatial-temporal relationships; b) rare events/cells; c) complex structural context; and d) integration into a “systems” model. Nevertheless, except for immunostaining, no transformative advancements have “modernized” routine microscopy in over 100 years. Pathologists now team with hardware and software engineers to exploit remarkable developments in digital imaging, nanoparticle multiplex staining, and computational image analysis software to bridge the traditional histology - global “–omic” analyses gap. Included are side-by-side comparisons, objective biopsy finding quantification, multiplexing, automated image analysis, and electronic data and resource sharing. Current utilization for teaching, quality assurance, conferencing, consultations, research and clinical trials is evolving toward implementation for low-volume, high-complexity clinical services like transplantation pathology. Cost, complexities of implementation, fluid/evolving standards, and unsettled medical/legal and regulatory issues remain as challenges. Regardless, challenges will be overcome and these technologies will enable transplant pathologists to increase information extraction from tissue specimens and contribute to cross-platform biomarker discovery for improved outcomes. PMID:22053785
A semi-automated measurement technique for the assessment of radiolucency.
Pegg, E C; Kendrick, B J L; Pandit, H G; Gill, H S; Murray, D W
2014-07-06
The assessment of radiolucency around an implant is qualitative, poorly defined and has low agreement between clinicians. Accurate and repeatable assessment of radiolucency is essential to prevent misdiagnosis, minimize cases of unnecessary revision, and to correctly monitor and treat patients at risk of loosening and implant failure. The purpose of this study was to examine whether a semi-automated imaging algorithm could improve repeatability and enable quantitative assessment of radiolucency. Six surgeons assessed 38 radiographs of knees after unicompartmental knee arthroplasty for radiolucency, and results were compared with assessments made by the semi-automated program. Large variation was found between the surgeon results, with total agreement in only 9.4% of zones and a kappa value of 0.602; whereas the automated program had total agreement in 81.6% of zones and a kappa value of 0.802. The software had a 'fair to excellent' prediction of the presence or the absence of radiolucency, where the area under the curve of the receiver operating characteristic curves was 0.82 on average. The software predicted radiolucency equally well for cemented and cementless implants (p = 0.996). The identification of radiolucency using an automated method is feasible and these results indicate that it could aid the definition and quantification of radiolucency.
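The inter-observer agreement statistics reported above are Cohen's kappa values, which correct observed agreement for the agreement expected by chance. A minimal two-rater sketch on invented zone assessments:

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters scoring the same items."""
    n = len(rater_a)
    observed = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    freq_a, freq_b = Counter(rater_a), Counter(rater_b)
    expected = sum(freq_a[c] * freq_b[c] for c in set(rater_a) | set(rater_b)) / n ** 2
    return (observed - expected) / (1.0 - expected)

# 0 = no radiolucency, 1 = radiolucency, for ten hypothetical zones.
surgeon = [0, 0, 1, 1, 0, 1, 0, 0, 1, 0]
software = [0, 0, 1, 0, 0, 1, 0, 1, 1, 0]
print(f"kappa = {cohens_kappa(surgeon, software):.2f}")
```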
A Fully Customized Baseline Removal Framework for Spectroscopic Applications.
Giguere, Stephen; Boucher, Thomas; Carey, C J; Mahadevan, Sridhar; Dyar, M Darby
2017-07-01
The task of proper baseline or continuum removal is common to nearly all types of spectroscopy. Its goal is to remove any portion of a signal that is irrelevant to features of interest while preserving any predictive information. Despite the importance of baseline removal, median or guessed default parameters are commonly employed, often using commercially available software supplied with instruments. Several published baseline removal algorithms have been shown to be useful for particular spectroscopic applications but their generalizability is ambiguous. The new Custom Baseline Removal (Custom BLR) method presented here generalizes the problem of baseline removal by combining operations from previously proposed methods to synthesize new correction algorithms. It creates novel methods for each technique, application, and training set, discovering new algorithms that maximize the predictive accuracy of the resulting spectroscopic models. In most cases, these learned methods either match or improve on the performance of the best alternative. Examples of these advantages are shown for three different scenarios: quantification of components in near-infrared spectra of corn and laser-induced breakdown spectroscopy data of rocks, and classification/matching of minerals using Raman spectroscopy. Software to implement this optimization is available from the authors. By removing subjectivity from this commonly encountered task, Custom BLR is a significant step toward completely automatic and general baseline removal in spectroscopic and other applications.
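As a concrete example of the kind of building block such a framework can combine, the sketch below implements asymmetric least squares baseline estimation (Eilers' penalized smoother), one previously published algorithm in this space. It is not the Custom BLR method itself, and the smoothing (lam) and asymmetry (p) parameters are illustrative.

```python
import numpy as np
from scipy import sparse
from scipy.sparse.linalg import spsolve

def als_baseline(y, lam=1e5, p=0.01, n_iter=10):
    """Asymmetric least squares baseline: a smoothness-penalized fit that
    weights points above the baseline weakly (p) and points below it
    strongly (1 - p)."""
    n = len(y)
    d = sparse.diags([1.0, -2.0, 1.0], [0, -1, -2], shape=(n, n - 2))
    penalty = lam * d.dot(d.T)              # second-difference roughness penalty
    w = np.ones(n)
    for _ in range(n_iter):
        weights = sparse.spdiags(w, 0, n, n)
        z = spsolve(weights + penalty, w * y)
        w = p * (y > z) + (1.0 - p) * (y <= z)
    return z

x = np.linspace(0, 100, 500)
spectrum = np.exp(-((x - 40) ** 2) / 4) + 0.01 * x + 0.5   # peak + sloped baseline
corrected = spectrum - als_baseline(spectrum)
print(f"residual baseline after correction: {np.median(corrected):.3f}")
```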
Idilman, Ilkay S; Keskin, Onur; Elhan, Atilla Halil; Idilman, Ramazan; Karcaaltincaba, Musturay
2014-05-01
To determine the utility of sequential MRI-estimated proton density fat fraction (MRI-PDFF) for quantification of the longitudinal changes in liver fat content in individuals with nonalcoholic fatty liver disease (NAFLD). A total of 18 consecutive individuals (M/F: 10/8, mean age: 47.7±9.8 years) diagnosed with NAFLD, who underwent sequential PDFF calculations for the quantification of hepatic steatosis at two different time points, were included in the study. All patients underwent T1-independent volumetric multi-echo gradient-echo imaging with T2* correction and spectral fat modeling. A close correlation for quantification of hepatic steatosis between the initial MRI-PDFF and liver biopsy was observed (rs=0.758, p<0.001). The median interval between two sequential MRI-PDFF measurements was 184 days. From baseline to the end of the follow-up period, serum GGT level and homeostasis model assessment score were significantly improved (p=0.015, p=0.006, respectively), whereas BMI, serum AST, and ALT levels were slightly decreased. MRI-PDFFs were significantly improved (p=0.004). A good correlation between two sequential MRI-PDFF calculations was observed (rs=0.714, p=0.001). With linear regression analyses, only delta serum ALT levels had a significant effect on delta MRI-PDFF calculations (r2=38.6%, p=0.006). At least 5.9% improvement in MRI-PDFF is needed to achieve a normalized abnormal ALT level. The improvement of MRI-PDFF score was associated with the improvement of biochemical parameters in patients who had improvement in delta MRI-PDFF (p<0.05). MRI-PDFF can be used for the quantification of the longitudinal changes of hepatic steatosis. The changes in serum ALT levels significantly reflected changes in MRI-PDFF in patients with NAFLD.
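For reference, the proton density fat fraction reported by such protocols is the fat signal expressed as a fraction of the combined fat and water proton signal, after the T2* and spectral-modeling corrections mentioned above:

```latex
\mathrm{PDFF} \;=\; \frac{S_{\mathrm{fat}}}{S_{\mathrm{fat}} + S_{\mathrm{water}}}
```

where S_fat and S_water are the corrected fat and water signal contributions in each voxel or region of interest.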
NASA Astrophysics Data System (ADS)
Tokkari, Niki; Verdaasdonk, Rudolf M.; Liberton, Niels; Wolff, Jan; den Heijer, Martin; van der Veen, Albert; Klaessens, John H.
2017-02-01
It is difficult to obtain quantitative measurements of surface areas and volumes from standard photographs of patients' body parts, although such measurements are highly desirable for objective follow-up of treatments in, e.g., dermatology and plastic, aesthetic and reconstructive surgery. Recently, 3-D scanners have become available to provide such quantification. Phantoms (3-D printed hand, nose and ear, and a colored bread sculpture) were developed to compare a range of scanners, from low-cost (Sense) and medium (HP Sprout) to high-end (Artec Spider, Vectra M3), using different 3D imaging technologies, with respect to resolution, working range, surface color representation and user friendliness. The 3D scan files (STL, OBJ) were processed with Artec Studio and GOM software to assess deviation from the high-resolution Artec Spider scanner, taken as the 'gold' standard. The HP Sprout, which uses fringe projection, proved to be nearly as good as the Artec but needs to be adapted for clinical use. Photogrammetry as used by the Vectra M3 scanner is limited in providing sufficient data points for accurate surface mapping, although it provides good color/structure representation. Owing to its low performance, the Sense is not recommended for clinical use. The Artec scanner was successfully used to measure structure/volume changes in the face after hormone treatment in transgender patients. 3D scanners can greatly improve quantitative measurements of surfaces and volumes as objective follow-up in clinical studies performed by various clinical specialisms (dermatology, aesthetic and reconstructive surgery). New scanning technologies, like fringe projection, are promising for the development of low-cost, high-precision scanners.
Fónyad, László; Shinoda, Kazunobu; Farkash, Evan A; Groher, Martin; Sebastian, Divya P; Szász, A Marcell; Colvin, Robert B; Yagi, Yukako
2015-03-28
Chronic allograft vasculopathy (CAV) is a major mechanism of graft failure of transplanted organs in humans. Morphometric analysis of coronary arteries enables the quantitation of CAV in mouse models of heart transplantation. However, conventional histological procedures using single 2-dimensional sections limit the accuracy of CAV quantification. The aim of this study is to improve the accuracy of CAV quantification by reconstructing the murine coronary system in 3-dimensions (3D) and using virtual reconstruction and volumetric analysis to precisely assess neointimal thickness. Mouse tissue samples, native heart and transplanted hearts with chronic allograft vasculopathy, were collected and analyzed. Paraffin embedded samples were serially sectioned, stained and digitized using whole slide digital imaging techniques under normal and ultraviolet lighting. Sophisticated software tools were used to generate and manipulate 3D reconstructions of the major coronary arteries and branches. The 3D reconstruction provides not only accurate measurements but also exact volumetric data of vascular lesions. This virtual coronary arteriography demonstrates that the vasculopathy lesions in this model are localized to the proximal coronary segments. In addition, virtual rotation and volumetric analysis enabled more precise measurements of CAV than single, randomly oriented histologic sections, and offer an improved readout for this important experimental model. We believe 3D reconstruction of 2D histological slides will provide new insights into pathological mechanisms in which structural abnormalities play a role in the development of a disease. The techniques we describe are applicable to the analysis of arteries, veins, bronchioles and similar sized structures in a variety of tissue types and disease model systems. The virtual slide(s) for this article can be found here: http://www.diagnosticpathology.diagnomx.eu/vs/3772457541477230 .
Quantifying errors without random sampling.
Phillips, Carl V; LaPole, Luwanna M
2003-06-12
All quantifications of mortality, morbidity, and other health measures involve numerous sources of error. The routine quantification of random sampling error makes it easy to forget that other sources of error can and should be quantified. When a quantification does not involve sampling, error is almost never quantified and results are often reported in ways that dramatically overstate their precision. We argue that the precision implicit in typical reporting is problematic and sketch methods for quantifying the various sources of error, building up from simple examples that can be solved analytically to more complex cases. There are straightforward ways to partially quantify the uncertainty surrounding a parameter that is not characterized by random sampling, such as limiting reported significant figures. We present simple methods for doing such quantifications, and for incorporating them into calculations. More complicated methods become necessary when multiple sources of uncertainty must be combined. We demonstrate that Monte Carlo simulation, using available software, can estimate the uncertainty resulting from complicated calculations with many sources of uncertainty. We apply the method to the current estimate of the annual incidence of foodborne illness in the United States. Quantifying uncertainty from systematic errors is practical. Reporting this uncertainty would more honestly represent study results, help show the probability that estimated values fall within some critical range, and facilitate better targeting of further research.
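The Monte Carlo approach to propagating multiple non-sampling error sources, as described above, can be sketched in a few lines: represent each uncertain correction factor as a probability distribution, multiply random draws together, and report percentiles of the result. The distributions and numbers below are invented placeholders, not the foodborne-illness inputs used by the authors.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# Invented inputs: reported case count times uncertain correction factors
reported_cases = 50_000
underreporting = rng.lognormal(mean=np.log(10), sigma=0.4, size=n)   # cases per reported case
misdiagnosis = rng.uniform(0.8, 1.2, size=n)                         # diagnostic error factor

total_cases = reported_cases * underreporting * misdiagnosis

lo, med, hi = np.percentile(total_cases, [2.5, 50, 97.5])
print(f"estimated incidence: median {med:,.0f} (95% interval {lo:,.0f} - {hi:,.0f})")
```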
Dobnik, David; Štebih, Dejan; Blejec, Andrej; Morisset, Dany; Žel, Jana
2016-01-01
The advantages of digital PCR technology are already well documented. One way to achieve better cost efficiency of the technique is to use it in a multiplexing strategy. Droplet digital PCR platforms, which include two fluorescence filters, support at least duplex reactions, and with some development and optimization higher levels of multiplexing are possible. The present study not only shows the development of multiplex assays in droplet digital PCR, but also presents a first thorough evaluation of several parameters in such multiplex digital PCR. Two 4-plex assays were developed for quantification of 8 different DNA targets (7 genetically modified maize events and a maize endogene). In each assay, two of the targets were labelled with one fluorophore and two with another. As current analysis software does not support analysis beyond duplex reactions, a new R- and Shiny-based web application analysis tool (http://bit.ly/ddPCRmulti) was developed that automates the analysis of 4-plex results. In conclusion, the two developed multiplex assays are suitable for quantification of GM maize events, and the same approach can be used in any other field with a need for accurate and reliable quantification of multiple DNA targets. PMID:27739510
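Droplet digital PCR quantification of the kind described above typically converts the fraction of positive droplets into a target concentration through Poisson statistics. The sketch below illustrates that calculation; the droplet volume and counts are illustrative placeholders, not values from the study.

```python
import math

def ddpcr_concentration(positive, total, droplet_volume_nl=0.85):
    """Estimate target copies per microlitre from droplet counts.

    positive / total: number of positive droplets and total accepted droplets.
    droplet_volume_nl: assumed droplet volume in nanolitres (placeholder value;
    use the volume specified for the actual ddPCR platform).
    """
    if positive >= total:
        raise ValueError("All droplets positive: concentration above dynamic range")
    # Poisson correction: average copies per droplet
    lam = -math.log(1.0 - positive / total)
    # copies per microlitre of reaction (1 uL = 1000 nL)
    return lam * 1000.0 / droplet_volume_nl

# Hypothetical example: 4200 positive droplets out of 15000 accepted
print(round(ddpcr_concentration(4200, 15000), 1), "copies/uL")
```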
NASA Technical Reports Server (NTRS)
Goldstein, J. I.; Williams, D. B.
1992-01-01
This paper reviews and discusses future directions in analytical electron microscopy for microchemical analysis using X-ray and Electron Energy Loss Spectroscopy (EELS). The technique of X-ray microanalysis, using the ratio method and k(sub AB) factors, is outlined. The X-ray absorption correction is the major barrier to the objective of obtaining 1% accuracy and precision in analysis. Spatial resolution and Minimum Detectability Limits (MDL) are considered with present limitations of spatial resolution in the 2 to 3 microns range and of MDL in the 0.1 to 0.2 wt. % range when a Field Emission Gun (FEG) system is used. Future directions of X-ray analysis include improvement in X-ray spatial resolution to the 1 to 2 microns range and MDL as low as 0.01 wt. %. With these improvements the detection of single atoms in the analysis volume will be possible. Other future improvements include the use of clean room techniques for thin specimen preparation, quantification available at the 1% accuracy and precision level with light element analysis quantification available at better than the 10% accuracy and precision level, the incorporation of a compact wavelength dispersive spectrometer to improve X-ray spectral resolution, light element analysis and MDL, and instrument improvements including source stability, on-line probe current measurements, stage stability, and computerized stage control. The paper reviews the EELS technique, recognizing that it has been slow to develop and still remains firmly in research laboratories rather than in applications laboratories. Consideration of microanalysis with core-loss edges is given along with a discussion of the limitations such as specimen thickness. Spatial resolution and MDL are considered, recognizing that single atom detection is already possible. Plasmon loss analysis is discussed as well as fine structure analysis. New techniques for energy-loss imaging are also summarized. Future directions in the EELS technique will be the development of new spectrometers and improvements in thin specimen preparation. The microanalysis technique needs to be simplified and software developed so that the EELS technique approaches the relative simplicity of the X-ray technique. Finally, one can expect major improvements in EELS imaging as data storage and processing improvements occur.
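The ratio method mentioned in the abstract (often called the Cliff-Lorimer method) relates measured characteristic X-ray intensities to composition via C_A/C_B = k_AB * (I_A/I_B). A minimal sketch for a binary specimen is shown below; the k-factor and intensities are illustrative placeholders, not measured values.

```python
def cliff_lorimer_binary(i_a, i_b, k_ab):
    """Return weight fractions (C_A, C_B) for a binary thin specimen.

    Uses the ratio method: C_A/C_B = k_AB * (I_A/I_B), with C_A + C_B = 1.
    Neglects the absorption correction, which the abstract identifies as the
    main barrier to ~1% accuracy.
    """
    ratio = k_ab * (i_a / i_b)        # C_A / C_B
    c_a = ratio / (1.0 + ratio)
    return c_a, 1.0 - c_a

# Hypothetical counts and k-factor
c_fe, c_ni = cliff_lorimer_binary(i_a=12000, i_b=8000, k_ab=1.1)
print(f"Fe: {c_fe:.3f}, Ni: {c_ni:.3f}")
```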
QUESP and QUEST revisited - fast and accurate quantitative CEST experiments.
Zaiss, Moritz; Angelovski, Goran; Demetriou, Eleni; McMahon, Michael T; Golay, Xavier; Scheffler, Klaus
2018-03-01
Chemical exchange saturation transfer (CEST) NMR or MRI experiments allow detection of low-concentration molecules with enhanced sensitivity via their proton exchange with the abundant water pool. Be it endogenous metabolites or exogenous contrast agents, an exact quantification of the actual exchange rate is required to design optimal pulse sequences and/or specific sensitive agents. Refined analytical expressions allow deeper insight and improvement of accuracy for common quantification techniques. The accuracy of standard quantification methodologies, such as quantification of exchange rate using varying saturation power or varying saturation time, is improved especially for the case of nonequilibrium initial conditions and weak labeling conditions, meaning the saturation amplitude is smaller than the exchange rate (γB1 < k). The improved analytical 'quantification of exchange rate using varying saturation power/time' (QUESP/QUEST) equations allow for more accurate exchange rate determination, and provide clear insights on the general principles to execute the experiments and to perform numerical evaluation. The proposed methodology was evaluated on the large-shift regime of paramagnetic chemical-exchange-saturation-transfer agents using simulated data and data of the paramagnetic Eu(III) complex of DOTA-tetraglycineamide. The refined formulas yield improved exchange rate estimation. General convergence intervals of the methods that would apply for smaller shift agents are also discussed. Magn Reson Med 79:1708-1721, 2018. © 2017 International Society for Magnetic Resonance in Medicine.
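One widely used way to extract an exchange rate from saturation-power-dependent CEST data is a linearized ('Omega-plot'-style) QUESP analysis, in which the inverse CEST effect is regressed against 1/ω1². The sketch below is a minimal illustration under strong simplifying assumptions (dilute solute pool, large chemical shift, negligible solute transverse relaxation); it is not the refined QUESP/QUEST formalism of the paper, and the simulated numbers are purely illustrative.

```python
import numpy as np

# Assumed ground truth for the simulation (illustrative only)
k_true = 500.0      # exchange rate [1/s]
f_true = 0.001      # solute proton fraction

# Saturation amplitudes omega1 = gamma*B1 [rad/s], here 100-1600 Hz
omega1 = 2 * np.pi * np.array([100, 200, 400, 800, 1600])

# Idealized exchange-dependent effect: MTR_Rex = f*k*omega1^2 / (omega1^2 + k^2)
mtr_rex = f_true * k_true * omega1**2 / (omega1**2 + k_true**2)

# Linearization: 1/MTR_Rex = 1/(f*k) + (k/f) * (1/omega1^2)
x = 1.0 / omega1**2
y = 1.0 / mtr_rex
slope, intercept = np.polyfit(x, y, 1)

k_fit = np.sqrt(slope / intercept)   # slope/intercept = k^2
f_fit = 1.0 / (intercept * k_fit)
print(f"fitted k = {k_fit:.1f} 1/s, fitted f = {f_fit:.2e}")
```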
Objective characterization of airway dimensions using image processing.
Pepper, Victoria K; Francom, Christian; Best, Cameron A; Onwuka, Ekene; King, Nakesha; Heuer, Eric; Mahler, Nathan; Grischkan, Jonathan; Breuer, Christopher K; Chiang, Tendy
2016-12-01
With the evolution of medical and surgical management for pediatric airway disorders, the development of easily translated techniques for measuring airway dimensions can improve the quantification of outcomes of these interventions. We have developed a technique that improves the ability to characterize endoscopic airway dimensions using common bronchoscopic equipment and an open-source image-processing platform. We validated our technique of Endoscopic Airway Measurement (EAM) using optical instruments in simulation tracheas. We then evaluated EAM in a large animal model (Ovis aries, n = 5), comparing tracheal dimensions obtained with EAM to measurements obtained via 3-D fluoroscopic reconstruction. Each animal then underwent resection of the measured segment, and direct measurement of this segment was performed and compared to the radiographic measurements and those obtained using EAM. The simulation tracheas had direct diameter measurements of 13.6, 18.5, and 24.2 mm. The mean difference in diameter in simulation tracheas between direct measurements and measurements obtained using EAM was 0.70 ± 0.57 mm. The excised ovine tracheas had an average diameter of 18.54 ± 0.68 mm. The percent difference in diameter obtained from EAM and from 3-D fluoroscopic reconstruction when compared to measurement of the excised tracheal segment was 4.98 ± 2.43% and 10.74 ± 4.07%, respectively. Comparison of these three measurements (EAM, measurement of resected trachea, 3-D fluoroscopic reconstruction) with repeated measures ANOVA demonstrated no statistically significant differences. Endoscopic airway measurement (EAM) provides equivalent measurements of the airway with the improved versatility of measuring non-circular and multi-level dimensions. Using optical bronchoscopic instruments and open-source image-processing software, our data support preclinical and clinical translation of an accessible technique to provide objective quantification of airway diameter. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
Multiscale Modeling and Uncertainty Quantification for Nuclear Fuel Performance
DOE Office of Scientific and Technical Information (OSTI.GOV)
Estep, Donald; El-Azab, Anter; Pernice, Michael
2017-03-23
In this project, we will address the challenges associated with constructing high fidelity multiscale models of nuclear fuel performance. We (*) propose a novel approach for coupling mesoscale and macroscale models, (*) devise efficient numerical methods for simulating the coupled system, and (*) devise and analyze effective numerical approaches for error and uncertainty quantification for the coupled multiscale system. As an integral part of the project, we will carry out analysis of the effects of upscaling and downscaling, investigate efficient methods for stochastic sensitivity analysis of the individual macroscale and mesoscale models, and carry out a posteriori error analysis for computed results. We will pursue development and implementation of solutions in software used at Idaho National Laboratories on models of interest to the Nuclear Energy Advanced Modeling and Simulation (NEAMS) program.
NASA Astrophysics Data System (ADS)
Vrugt, J. A.
2012-12-01
In the past decade much progress has been made in the treatment of uncertainty in earth systems modeling. Whereas initial approaches focused mostly on quantification of parameter and predictive uncertainty, recent methods attempt to disentangle the effects of parameter, forcing (input) data, model structural and calibration data errors. In this talk I will highlight some of our recent work involving theory, concepts and applications of Bayesian parameter and/or state estimation. In particular, new methods for sequential Monte Carlo (SMC) and Markov Chain Monte Carlo (MCMC) simulation will be presented with emphasis on massively parallel distributed computing and quantification of model structural errors. The theoretical and numerical developments will be illustrated using model-data synthesis problems in hydrology, hydrogeology and geophysics.
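As a concrete illustration of the Bayesian parameter estimation mentioned above, the sketch below runs a plain Metropolis sampler for a single parameter of a toy model with Gaussian observation errors. It is a generic illustration of MCMC, not the parallel algorithms referred to in the talk; model, data and tuning constants are invented for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Toy "model": y = a * x, observed with Gaussian noise (synthetic data)
x = np.linspace(0, 10, 20)
a_true = 2.5
y_obs = a_true * x + rng.normal(0, 1.0, x.size)

def log_posterior(a, sigma=1.0):
    # Flat prior on a; Gaussian likelihood for the residuals
    resid = y_obs - a * x
    return -0.5 * np.sum((resid / sigma) ** 2)

# Metropolis random-walk sampler
n_iter, step = 5000, 0.05
chain = np.empty(n_iter)
a = 1.0
logp = log_posterior(a)
for i in range(n_iter):
    a_prop = a + rng.normal(0, step)
    logp_prop = log_posterior(a_prop)
    if np.log(rng.random()) < logp_prop - logp:
        a, logp = a_prop, logp_prop
    chain[i] = a

burn = chain[1000:]
print(f"posterior mean a = {burn.mean():.3f} +/- {burn.std():.3f}")
```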
KiT: a MATLAB package for kinetochore tracking.
Armond, Jonathan W; Vladimirou, Elina; McAinsh, Andrew D; Burroughs, Nigel J
2016-06-15
During mitosis, chromosomes are attached to the mitotic spindle via large protein complexes called kinetochores. The motion of kinetochores throughout mitosis is intricate and automated quantitative tracking of their motion has already revealed many surprising facets of their behaviour. Here, we present 'KiT' (Kinetochore Tracking)-an easy-to-use, open-source software package for tracking kinetochores from live-cell fluorescent movies. KiT supports 2D, 3D and multi-colour movies, quantification of fluorescence, integrated deconvolution, parallel execution and multiple algorithms for particle localization. KiT is free, open-source software implemented in MATLAB and runs on all MATLAB supported platforms. KiT can be downloaded as a package from http://www.mechanochemistry.org/mcainsh/software.php The source repository is available at https://bitbucket.org/jarmond/kit and under continuing development. Supplementary data are available at Bioinformatics online. jonathan.armond@warwick.ac.uk. © The Author 2016. Published by Oxford University Press.
cFinder: definition and quantification of multiple haplotypes in a mixed sample.
Niklas, Norbert; Hafenscher, Julia; Barna, Agnes; Wiesinger, Karin; Pröll, Johannes; Dreiseitl, Stephan; Preuner-Stix, Sandra; Valent, Peter; Lion, Thomas; Gabriel, Christian
2015-09-07
Next-generation sequencing allows for determining the genetic composition of a mixed sample. For instance, when performing resistance testing for BCR-ABL1 it is necessary to identify clones and define compound mutations; together with an exact quantification, this may complement diagnosis and therapy decisions with additional information. Moreover, this applies not only to oncological questions but also to the determination of viral, bacterial or fungal infections. Retrieving multiple haplotypes (more than two) and their proportions from such data with conventional software is difficult, cumbersome and demands multiple manual steps. Therefore, we developed a tool called cFinder that is capable of automatic detection of haplotypes and their accurate quantification within one sample. BCR-ABL1 samples containing multiple clones were used for testing, and cFinder could identify all previously found clones together with their abundance and even refine some results. Additionally, reads were simulated using GemSIM with multiple haplotypes; detection was very close to linear (R(2) = 0.96). Our aim is not to deduce haplotype blocks by statistical inference, but to characterize the composition of a single sample precisely. As a result, cFinder reports the connections of variants (haplotypes) together with their read count and relative occurrence (percentage). Download is available at http://sourceforge.net/projects/cfinder/. cFinder is implemented as an efficient algorithm that can be run on a low-performance desktop computer. Furthermore, it considers paired-end information (if available) and is generally open to any current next-generation sequencing technology and alignment strategy. To our knowledge, this is the first software that enables researchers without extensive bioinformatic support to determine multiple haplotypes and how much each contributes to a sample.
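The core output described above (which variant combinations co-occur on reads, and their relative abundance) can be illustrated with a very small sketch that tallies haplotypes from reads already reduced to tuples of alleles at the positions of interest. This is a toy illustration of the idea, not cFinder's algorithm, and the read data are invented.

```python
from collections import Counter

# Each read reduced to its alleles at three variant positions (hypothetical data)
reads = [
    ("T315I", "wt",    "wt"),
    ("T315I", "E255K", "wt"),
    ("T315I", "E255K", "wt"),
    ("wt",    "wt",    "wt"),
    ("T315I", "E255K", "wt"),
    ("wt",    "wt",    "wt"),
]

counts = Counter(reads)
total = sum(counts.values())

# Report each haplotype with its read count and relative occurrence
for haplotype, n in counts.most_common():
    print(f"{'/'.join(haplotype):22s} reads={n}  {100.0 * n / total:.1f}%")
```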
Phipps, Eric T.; D'Elia, Marta; Edwards, Harold C.; ...
2017-04-18
In this study, quantifying simulation uncertainties is a critical component of rigorous predictive simulation. A key component of this is forward propagation of uncertainties in simulation input data to output quantities of interest. Typical approaches involve repeated sampling of the simulation over the uncertain input data, and can require numerous samples when accurately propagating uncertainties from large numbers of sources. Often simulation processes from sample to sample are similar and much of the data generated from each sample evaluation could be reused. We explore a new method for implementing sampling methods that simultaneously propagates groups of samples together in an embedded fashion, which we call embedded ensemble propagation. We show how this approach takes advantage of properties of modern computer architectures to improve performance by enabling reuse between samples, reducing memory bandwidth requirements, improving memory access patterns, improving opportunities for fine-grained parallelization, and reducing communication costs. We describe a software technique for implementing embedded ensemble propagation based on the use of C++ templates and describe its integration with various scientific computing libraries within Trilinos. We demonstrate improved performance, portability and scalability for the approach applied to the simulation of partial differential equations on a variety of CPU, GPU, and accelerator architectures, including up to 131,072 cores on a Cray XK7 (Titan).
Jardine, Griffin J; Holiman, Jeffrey D; Stoeger, Christopher G; Chamberlain, Winston D
2014-09-01
The purpose of this study was to improve accuracy and efficiency in quantifying endothelial cell loss (ECL) in eye bank preparation of corneal endothelial grafts. Eight cadaveric corneas were subjected to Descemet Membrane Endothelial Keratoplasty (DMEK) preparation. The endothelial surfaces were stained with a viability stain, calcein AM dye (CAM), and then captured by a digital camera. The ECL rates were quantified in these images by three separate readers using trainable segmentation, a plug-in feature of the imaging software Fiji. Images were also analyzed with Adobe Photoshop for comparison. Mean times required to process the images were measured for the two modalities. The mean ECL (with standard deviation) as analyzed by Fiji was 22.5% (6.5%) and by Adobe 18.7% (7.0%; p = 0.04). The mean time required to process the images through the two different imaging methods was 19.9 min (7.5) for Fiji and 23.4 min (12.9) for Adobe (p = 0.17). Establishing an accurate, efficient and reproducible means of quantifying ECL in graft preparation and surgical techniques can provide insight into the safety and long-term potential of graft tissues, as well as provide a quality control measure for eye banks and surgeons. Trainable segmentation in Fiji using CAM is a novel approach to measuring ECL that captured a statistically significantly higher percentage of ECL compared with Adobe and was more accurate in standardized testing. Interestingly, ECL as determined using both methods in eye bank-prepared DMEK grafts exceeded 18% on average.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Meng, Da; Zhang, Qibin; Gao, Xiaoli
2014-04-30
We have developed a tool for automated, high-throughput analysis of LC-MS/MS data files, which greatly simplifies LC-MS based lipidomics analysis. Our results showed that LipidMiner is accurate and comprehensive in identification and quantification of lipid molecular species. In addition, the workflow implemented in LipidMiner is not limited to identification and quantification of lipids. If a suitable metabolite library is implemented in the library matching module, LipidMiner could be reconfigured as a tool for general metabolomics data analysis. It is of note that LipidMiner currently is limited to singly charged ions, although it is adequate for the purpose of lipidomics since lipids are rarely multiply charged,[14] even for the polyphosphoinositides. LipidMiner also only processes file formats generated from mass spectrometers from Thermo, i.e. the .RAW format. In the future, we are planning to accommodate file formats generated by mass spectrometers from other predominant instrument vendors to make this tool more universal.
Zhang, Weihua; Yi, Jing; Mekarski, Pawel; Ungar, Kurt; Hauck, Barry; Kramer, Gary H
2011-06-01
The purpose of this study is to investigate the possibility of verifying depleted uranium (DU), natural uranium (NU), low enriched uranium (LEU) and high enriched uranium (HEU) using a newly developed digital gamma-gamma coincidence spectroscopy system. The spectroscopy system consists of two NaI(Tl) scintillators and the XIA LLC Digital Gamma Finder (DGF)/Pixie-4 software and card package. The results demonstrate that the spectroscopy provides an effective method of (235)U and (238)U quantification based on the count rate of their gamma-gamma coincidence counting signatures. The main advantages of this approach over conventional gamma spectrometry include a low background continuum near the coincident signatures of (235)U and (238)U, less interference from other radionuclides owing to the gamma-gamma coincidence counting, and region-of-interest (ROI) image analysis for uranium enrichment determination. Compared to conventional gamma spectrometry, the method offers the additional advantage of requiring minimal calibrations for (235)U and (238)U quantification at different sample geometries. Crown Copyright © 2011. Published by Elsevier Ltd. All rights reserved.
Perez de Souza, Leonardo; Naake, Thomas; Tohge, Takayuki; Fernie, Alisdair R
2017-07-01
The grand challenge currently facing metabolomics is the expansion of the coverage of the metabolome from a minor percentage of the metabolic complement of the cell toward the level of coverage afforded by other post-genomic technologies such as transcriptomics and proteomics. In plants, this problem is exacerbated by the sheer diversity of chemicals that constitute the metabolome, with the number of metabolites in the plant kingdom generally considered to be in excess of 200 000. In this review, we focus on web resources that can be exploited in order to improve analyte and ultimately metabolite identification and quantification. There is a wide range of available software that not only aids in this but also in the related area of peak alignment; however, for the uninitiated, choosing which program to use is a daunting task. For this reason, we provide an overview of the pros and cons of the software as well as comments regarding the level of programing skills required to effectively exploit their basic functions. In addition, the torrent of available genome and transcriptome sequences that followed the advent of next-generation sequencing has opened up further valuable resources for metabolite identification. All things considered, we posit that only via a continued communal sharing of information such as that deposited in the databases described within the article are we likely to be able to make significant headway toward improving our coverage of the plant metabolome. © The Authors 2017. Published by Oxford University Press.
NASA Astrophysics Data System (ADS)
Vesselinov, V. V.; Harp, D.
2010-12-01
The process of decision making to protect groundwater resources requires a detailed estimation of uncertainties in model predictions. Various uncertainties associated with modeling a natural system, such as: (1) measurement and computational errors; (2) uncertainties in the conceptual model and model-parameter estimates; (3) simplifications in model setup and numerical representation of governing processes, contribute to the uncertainties in the model predictions. Due to this combination of factors, the sources of predictive uncertainties are generally difficult to quantify individually. Decision support related to optimal design of monitoring networks requires (1) detailed analyses of existing uncertainties related to model predictions of groundwater flow and contaminant transport, (2) optimization of the proposed monitoring network locations in terms of their efficiency to detect contaminants and provide early warning. We apply existing and newly-proposed methods to quantify predictive uncertainties and to optimize well locations. An important aspect of the analysis is the application of newly-developed optimization technique based on coupling of Particle Swarm and Levenberg-Marquardt optimization methods which proved to be robust and computationally efficient. These techniques and algorithms are bundled in a software package called MADS. MADS (Model Analyses for Decision Support) is an object-oriented code that is capable of performing various types of model analyses and supporting model-based decision making. The code can be executed under different computational modes, which include (1) sensitivity analyses (global and local), (2) Monte Carlo analysis, (3) model calibration, (4) parameter estimation, (5) uncertainty quantification, and (6) model selection. The code can be externally coupled with any existing model simulator through integrated modules that read/write input and output files using a set of template and instruction files (consistent with the PEST I/O protocol). MADS can also be internally coupled with a series of built-in analytical simulators. MADS provides functionality to work directly with existing control files developed for the code PEST (Doherty 2009). To perform the computational modes mentioned above, the code utilizes (1) advanced Latin-Hypercube sampling techniques (including Improved Distributed Sampling), (2) various gradient-based Levenberg-Marquardt optimization methods, (3) advanced global optimization methods (including Particle Swarm Optimization), and (4) a selection of alternative objective functions. The code has been successfully applied to perform various model analyses related to environmental management of real contamination sites. Examples include source identification problems, quantification of uncertainty, model calibration, and optimization of monitoring networks. The methodology and software codes are demonstrated using synthetic and real case studies where monitoring networks are optimized taking into account the uncertainty in model predictions of contaminant transport.
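MADS's Monte Carlo and calibration modes rest on sampling strategies such as Latin-Hypercube sampling. The sketch below shows a plain Latin-Hypercube sampler in NumPy (each parameter range is split into N equal strata and one value is drawn per stratum, with strata shuffled independently per dimension); it is a generic illustration, not MADS's Improved Distributed Sampling, and the parameter names and ranges are invented.

```python
import numpy as np

def latin_hypercube(n_samples, bounds, rng=None):
    """Basic Latin-Hypercube sample.

    bounds: list of (low, high) tuples, one per parameter.
    Returns an (n_samples, n_params) array.
    """
    rng = np.random.default_rng(rng)
    n_params = len(bounds)
    # One point per stratum in [0, 1), strata order shuffled per dimension
    u = (rng.random((n_samples, n_params)) + np.arange(n_samples)[:, None]) / n_samples
    for j in range(n_params):
        rng.shuffle(u[:, j])
    lows = np.array([b[0] for b in bounds])
    highs = np.array([b[1] for b in bounds])
    return lows + u * (highs - lows)

# Hypothetical parameter ranges: hydraulic conductivity [m/d] and porosity [-]
samples = latin_hypercube(10, [(0.1, 10.0), (0.05, 0.35)], rng=42)
print(samples)
```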
Multiplex Droplet Digital PCR Protocols for Quantification of GM Maize Events.
Dobnik, David; Spilsberg, Bjørn; Bogožalec Košir, Alexandra; Štebih, Dejan; Morisset, Dany; Holst-Jensen, Arne; Žel, Jana
2018-01-01
The standard-curve based simplex quantitative polymerase chain reaction (qPCR) has been the gold standard for DNA target quantification for more than a decade. The large and growing number of individual analyses needed to test for genetically modified organisms (GMOs) is reducing the cost-effectiveness of qPCR. Droplet digital PCR (ddPCR) enables absolute quantification without standard curves, avoids the amplification efficiency bias observed with qPCR, allows more accurate estimations at low target copy numbers and, in combination with multiplexing, significantly improves cost efficiency. Here we describe two protocols for multiplex quantification of GM maize events: (1) nondiscriminating, with multiplex quantification of targets as a group (12 GM maize lines) and (2) discriminating, with multiplex quantification of individual targets (events). The first enables the quantification of twelve European Union authorized GM maize events as a group with only two assays, but does not permit determination of the individual events present. The second protocol enables the quantification of four individual targets (three GM events and one endogene) in a single reaction. Both protocols can be modified for quantification of any other DNA target.
Wang, Hanghang; Muehlbauer, Michael J.; O’Neal, Sara K.; Newgard, Christopher B.; Hauser, Elizabeth R.; Shah, Svati H.
2017-01-01
The field of metabolomics as applied to human disease and health is rapidly expanding. In recent efforts of metabolomics research, greater emphasis has been placed on quality control and method validation. In this study, we report an experience with quality control and a practical application of method validation. Specifically, we sought to identify and modify steps in gas chromatography-mass spectrometry (GC-MS)-based, non-targeted metabolomic profiling of human plasma that could influence metabolite identification and quantification. Our experimental design included two studies: (1) a limiting-dilution study, which investigated the effects of dilution on analyte identification and quantification; and (2) a concentration-specific study, which compared the optimal plasma extract volume established in the first study with the volume used in the current institutional protocol. We confirmed that contaminants, concentration, repeatability and intermediate precision are major factors influencing metabolite identification and quantification. In addition, we established methods for improved metabolite identification and quantification, which were summarized to provide recommendations for experimental design of GC-MS-based non-targeted profiling of human plasma. PMID:28841195
Federal Register 2010, 2011, 2012, 2013, 2014
2010-05-14
... Market and Planning Efficiency Through Improved Software; Notice of Technical Conference To Discuss Increasing Market and Planning Efficiency Through Improved Software May 7, 2010. Take notice that Commission... planning efficiency through improved software.
The software-cycle model for re-engineering and reuse
NASA Technical Reports Server (NTRS)
Bailey, John W.; Basili, Victor R.
1992-01-01
This paper reports on the progress of a study which will contribute to our ability to perform high-level, component-based programming by describing means to obtain useful components, methods for the configuration and integration of those components, and an underlying economic model of the costs and benefits associated with this approach to reuse. One goal of the study is to develop and demonstrate methods to recover reusable components from domain-specific software through a combination of tools, to perform the identification, extraction, and re-engineering of components, and domain experts, to direct the applications of those tools. A second goal of the study is to enable the reuse of those components by identifying techniques for configuring and recombining the re-engineered software. This component-recovery or software-cycle model addresses not only the selection and re-engineering of components, but also their recombination into new programs. Once a model of reuse activities has been developed, the quantification of the costs and benefits of various reuse options will enable the development of an adaptable economic model of reuse, which is the principal goal of the overall study. This paper reports on the conception of the software-cycle model and on several supporting techniques of software recovery, measurement, and reuse which will lead to the development of the desired economic model.
2009-09-01
Figures: Phycobiliprotein absorption spectra; Image processing for automated cell counts. Images were acquired with a digital camera and Axiovision 4.6.3 software; images were measured, and cell metrics were determined using the MATLAB image processing toolbox.
Wang, Chunyan; Zhu, Hongbin; Pi, Zifeng; Song, Fengrui; Liu, Zhiqiang; Liu, Shuying
2013-09-15
An analytical method for quantifying underivatized amino acids (AAs) in rat urine samples was developed using liquid chromatography coupled with tandem mass spectrometry (LC-MS/MS). Classification of type 2 diabetic rats was based on urinary amino acid metabolic profiling. LC-MS/MS analysis was performed using chromatographic separation and multiple reaction monitoring (MRM) transitions. Multivariate profile-wide predictive models were constructed using partial least squares discriminant analysis (PLS-DA) with the SIMCA-P 11.5 software package and hierarchical cluster analysis (HCA) with SPSS 18.0 software. Several amino acids in rat urine showed significant changes. The results of the present study demonstrate that this method can quantify free AAs in rat urine by LC-MS/MS. In summary, the PLS-DA and HCA statistical analyses in our study were able to differentiate healthy rats from type 2 diabetic rats based on the quantification of AAs in their urine samples. In addition, compared with the healthy group, the seven amino acids that were elevated in the urine of type 2 diabetic rats returned to normal under acarbose treatment. Copyright © 2013 Elsevier B.V. All rights reserved.
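PLS-DA of the kind used above can be reproduced, in outline, with scikit-learn's PLSRegression by encoding class membership as a 0/1 response and thresholding the prediction. The sketch below uses synthetic "metabolite" data; it only illustrates the modelling step, not the SIMCA-P workflow of the study.

```python
import numpy as np
from sklearn.cross_decomposition import PLSRegression

rng = np.random.default_rng(1)

# Synthetic data: 20 "healthy" and 20 "diabetic" samples, 15 metabolite features;
# the first 5 features are shifted in the diabetic group.
X_healthy = rng.normal(0.0, 1.0, (20, 15))
X_diabetic = rng.normal(0.0, 1.0, (20, 15))
X_diabetic[:, :5] += 1.5
X = np.vstack([X_healthy, X_diabetic])
y = np.array([0] * 20 + [1] * 20)          # 0 = healthy, 1 = diabetic

# PLS-DA: PLS regression against the binary class label
pls = PLSRegression(n_components=2)
pls.fit(X, y)
y_pred = (pls.predict(X).ravel() > 0.5).astype(int)

accuracy = (y_pred == y).mean()
print(f"training accuracy: {accuracy:.2f}")
```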
A novel method to characterize silica bodies in grasses.
Dabney, Clemon; Ostergaard, Jason; Watkins, Eric; Chen, Changbin
2016-01-01
The deposition of silicon into epidermal cells of grass species is thought to be an important mechanism that plants use as a defense against pests and environmental stresses. A number of techniques are available to study the size, density and distribution pattern of silica bodies in grass leaves. However, none of them provides high-throughput analysis, especially for large numbers of samples. We developed a method utilizing the autofluorescence of silica bodies to investigate their size and distribution, along with the number of carbon inclusions within the silica bodies, in the perennial grass species Koeleria macrantha. Fluorescence images were analyzed with the image software Adobe Photoshop CS5 or ImageJ, which greatly facilitated the quantification of silica bodies in the dry ash. We observed three types of silica bodies or silica-body-related mineral structures. Silica bodies were detected on both the abaxial and adaxial epidermis of K. macrantha leaves, although their sizes, density, and distribution patterns differed. No autofluorescence was detected from carbon inclusions. The combination of fluorescence microscopy and image processing software proved efficient for the identification and quantification of silica bodies in K. macrantha leaf tissues, and should be applicable to biological, ecological and geological studies of grasses including forage, turf grasses and cereal crops.
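Fluorescence-based counting of bright objects like silica bodies can be sketched with scikit-image: threshold the image, label connected components, and measure their areas. The snippet below runs on a synthetic image and is only a generic illustration of this kind of quantification, not the protocol of the study.

```python
import numpy as np
from skimage import filters, measure

rng = np.random.default_rng(0)

# Synthetic fluorescence image: dim background plus a few bright blobs
img = rng.normal(10, 2, (200, 200))
for cy, cx in [(50, 60), (120, 150), (160, 40)]:
    yy, xx = np.ogrid[:200, :200]
    img[(yy - cy) ** 2 + (xx - cx) ** 2 < 8 ** 2] += 50

# Otsu threshold, connected-component labelling, per-object measurements
binary = img > filters.threshold_otsu(img)
labels = measure.label(binary)
props = measure.regionprops(labels)

print(f"objects detected: {len(props)}")
for p in props:
    print(f"label {p.label}: area = {p.area} px, centroid = {p.centroid}")
```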
Mapping and Quantification of Vascular Branching in Plants, Animals and Humans by VESGEN Software
NASA Technical Reports Server (NTRS)
Parsons-Wingerter, Patricia A.; Vickerman, Mary B.; Keith, Patricia A.
2010-01-01
Humans face daunting challenges in the successful exploration and colonization of space, including adverse alterations in gravity and radiation. The Earth-determined biology of humans, animals and plants is significantly modified in such extraterrestrial environments. One physiological requirement shared by humans with larger plants and animals is a complex, highly branching vascular system that is dynamically responsive to cellular metabolism, immunological protection and specialized cellular/tissue function. The VESsel GENeration (VESGEN) Analysis has been developed as a mature beta version, pre-release research software for mapping and quantification of the fractal-based complexity of vascular branching. Alterations in vascular branching pattern can provide informative read-outs of altered vascular regulation. Originally developed for biomedical applications in angiogenesis, VESGEN 2D has provided novel insights into the cytokine, transgenic and therapeutic regulation of angiogenesis, lymphangiogenesis and other microvascular remodeling phenomena. Vascular trees, networks and tree-network composites are mapped and quantified. Applications include disease progression from clinical ophthalmic images of the human retina; experimental regulation of vascular remodeling in the mouse retina; avian and mouse coronary vasculature, and other experimental models in vivo. We envision that altered branching in the leaves of plants studied on the ISS, such as Arabidopsis thaliana, can also be analyzed.
Mapping and Quantification of Vascular Branching in Plants, Animals and Humans by VESGEN Software
NASA Technical Reports Server (NTRS)
Parsons-Wingerter, P. A.; Vickerman, M. B.; Keith, P. A.
2010-01-01
Humans face daunting challenges in the successful exploration and colonization of space, including adverse alterations in gravity and radiation. The Earth-determined biology of plants, animals and humans is significantly modified in such extraterrestrial environments. One physiological requirement shared by larger plants and animals with humans is a complex, highly branching vascular system that is dynamically responsive to cellular metabolism, immunological protection and specialized cellular/tissue function. VESsel GENeration (VESGEN) Analysis has been developed as a mature beta version, pre-release research software for mapping and quantification of the fractal-based complexity of vascular branching. Alterations in vascular branching pattern can provide informative read-outs of altered vascular regulation. Originally developed for biomedical applications in angiogenesis, VESGEN 2D has provided novel insights into the cytokine, transgenic and therapeutic regulation of angiogenesis, lymphangiogenesis and other microvascular remodeling phenomena. Vascular trees, networks and tree-network composites are mapped and quantified. Applications include disease progression from clinical ophthalmic images of the human retina; experimental regulation of vascular remodeling in the mouse retina; avian and mouse coronary vasculature, and other experimental models in vivo. We envision that altered branching in the leaves of plants studied on the ISS, such as Arabidopsis thaliana, can also be analyzed.
Quantification of myocardial fibrosis by digital image analysis and interactive stereology
2014-01-01
Background Cardiac fibrosis disrupts the normal myocardial structure and has a direct impact on heart function and survival. Despite already available digital methods, the pathologist’s visual score is still widely considered as ground truth and used as a primary method in histomorphometric evaluations. The aim of this study was to compare the accuracy of digital image analysis tools and the pathologist’s visual scoring for evaluating fibrosis in human myocardial biopsies, based on reference data obtained by point counting performed on the same images. Methods Endomyocardial biopsy material from 38 patients diagnosed with inflammatory dilated cardiomyopathy was used. The extent of total cardiac fibrosis was assessed by image analysis on Masson’s trichrome-stained tissue specimens using automated Colocalization and Genie software, by Stereology grid count and manually by Pathologist’s visual score. Results A total of 116 slides were analyzed. The mean results obtained by the Colocalization software (13.72 ± 12.24%) were closest to the reference value of stereology (RVS), while the Genie software and Pathologist score gave a slight underestimation. RVS values correlated strongly with values obtained using the Colocalization and Genie (r > 0.9, p < 0.001) software as well as the pathologist visual score. Differences in fibrosis quantification by Colocalization and RVS were statistically insignificant. However, significant bias was found in the results obtained by using Genie versus RVS and pathologist score versus RVS with mean difference values of: -1.61% and 2.24%. Bland-Altman plots showed a bidirectional bias dependent on the magnitude of the measurement: Colocalization software overestimated the area fraction of fibrosis in the lower end, and underestimated in the higher end of the RVS values. Meanwhile, Genie software as well as the pathologist score showed more uniform results throughout the values, with a slight underestimation in the mid-range for both. Conclusion Both applied digital image analysis methods revealed almost perfect correlation with the criterion standard obtained by stereology grid count and, in terms of accuracy, outperformed the pathologist’s visual score. Genie algorithm proved to be the method of choice with the only drawback of a slight underestimation bias, which is considered acceptable for both clinical and research evaluations. Virtual slides The virtual slide(s) for this article can be found here: http://www.diagnosticpathology.diagnomx.eu/vs/9857909611227193 PMID:24912374
Quantification of myocardial fibrosis by digital image analysis and interactive stereology.
Daunoravicius, Dainius; Besusparis, Justinas; Zurauskas, Edvardas; Laurinaviciene, Aida; Bironaite, Daiva; Pankuweit, Sabine; Plancoulaine, Benoit; Herlin, Paulette; Bogomolovas, Julius; Grabauskiene, Virginija; Laurinavicius, Arvydas
2014-06-09
Cardiac fibrosis disrupts the normal myocardial structure and has a direct impact on heart function and survival. Despite already available digital methods, the pathologist's visual score is still widely considered as ground truth and used as a primary method in histomorphometric evaluations. The aim of this study was to compare the accuracy of digital image analysis tools and the pathologist's visual scoring for evaluating fibrosis in human myocardial biopsies, based on reference data obtained by point counting performed on the same images. Endomyocardial biopsy material from 38 patients diagnosed with inflammatory dilated cardiomyopathy was used. The extent of total cardiac fibrosis was assessed by image analysis on Masson's trichrome-stained tissue specimens using automated Colocalization and Genie software, by Stereology grid count and manually by Pathologist's visual score. A total of 116 slides were analyzed. The mean results obtained by the Colocalization software (13.72 ± 12.24%) were closest to the reference value of stereology (RVS), while the Genie software and Pathologist score gave a slight underestimation. RVS values correlated strongly with values obtained using the Colocalization and Genie (r>0.9, p<0.001) software as well as the pathologist visual score. Differences in fibrosis quantification by Colocalization and RVS were statistically insignificant. However, significant bias was found in the results obtained by using Genie versus RVS and pathologist score versus RVS with mean difference values of: -1.61% and 2.24%. Bland-Altman plots showed a bidirectional bias dependent on the magnitude of the measurement: Colocalization software overestimated the area fraction of fibrosis in the lower end, and underestimated in the higher end of the RVS values. Meanwhile, Genie software as well as the pathologist score showed more uniform results throughout the values, with a slight underestimation in the mid-range for both. Both applied digital image analysis methods revealed almost perfect correlation with the criterion standard obtained by stereology grid count and, in terms of accuracy, outperformed the pathologist's visual score. Genie algorithm proved to be the method of choice with the only drawback of a slight underestimation bias, which is considered acceptable for both clinical and research evaluations. The virtual slide(s) for this article can be found here: http://www.diagnosticpathology.diagnomx.eu/vs/9857909611227193.
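The Bland-Altman comparison used in the fibrosis study above (agreement between an image-analysis result and the stereology reference) is easy to reproduce in outline: plot the per-case differences against the per-case means and mark the bias and 95% limits of agreement. The sketch below uses invented paired measurements, not the study's data.

```python
import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(3)

# Invented paired fibrosis estimates (%) for 38 cases: reference vs. software
reference = rng.uniform(2, 45, 38)
software = reference - 1.6 + rng.normal(0, 2.0, 38)    # small negative bias

mean_vals = (reference + software) / 2
diffs = software - reference
bias = diffs.mean()
loa = 1.96 * diffs.std(ddof=1)                         # 95% limits of agreement

plt.scatter(mean_vals, diffs, s=15)
plt.axhline(bias, color="k", label=f"bias = {bias:.2f}%")
plt.axhline(bias + loa, color="k", linestyle="--")
plt.axhline(bias - loa, color="k", linestyle="--")
plt.xlabel("mean of methods (% fibrosis)")
plt.ylabel("difference (software - reference, %)")
plt.legend()
plt.show()
```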
NASA Astrophysics Data System (ADS)
Gülşen, Esra; Kurtulus, Bedri; Necati Yaylim, Tolga; Avsar, Ozgur
2017-04-01
In groundwater studies, the quantification and detection of fluid flows in boreholes is an important part of assessing aquifer characteristics at different depths. Monitoring wells disturb the natural flow field, and this disturbance creates different flow paths into an aquifer. Vertical fluid flow analysis is one of the key techniques for detecting and quantifying these vertical flows in boreholes/monitoring wells. The Liwa region is located about 146 km southwest of Abu Dhabi city and about 36 km southwest of Madinat Zayed. The SWSR (Strategic Water Storage & Recovery) Project comprises three schemes (A, B and C); each scheme contains an infiltration basin in the center, 105 recovery wells and 10 clusters, and each cluster comprises 3 monitoring wells of different depths: shallow (about 50 m), intermediate (about 75 m) and deep (about 100 m). The scope of this study is to calculate the transmissivity values at different depths and evaluate the Fluid Flow Log (FFL) data for Scheme A (105 recovery wells) in order to understand the aquifer characteristics at different depths. The transmissivity values at different depth levels are calculated using the Razack and Huntley (1991) equation for vertical flow rates of 30 m3/h, 60 m3/h, 90 m3/h and 120 m3/h, and Empirical Bayesian Kriging is then used for interpolation in Scheme A using ArcGIS 10.2 software. FFLs are drawn with GeODin software. Derivative analysis of the fluid flow data is done with Microsoft Excel. All statistical analyses are calculated with IBM SPSS software. The interpolation results show that the transmissivity values are higher at the top of the aquifer; in other words, the aquifer is more productive in its upper part. We are very grateful to ZETAS Dubai Inc. for financial support and for providing us with the data.
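Specific-capacity approaches such as the Razack and Huntley (1991) relation estimate transmissivity from pumping rate and drawdown through an empirical power law of the form T = a·(Q/s)^b. The sketch below only illustrates that kind of calculation; the coefficients and the flow/drawdown values are placeholders, and the published coefficients (with consistent units) from the original paper should be used in practice.

```python
def transmissivity_from_specific_capacity(q, s, a, b):
    """Empirical power-law estimate T = a * (Q/s)**b.

    q: pumping rate, s: drawdown (consistent units, e.g. m3/day and m).
    a, b: empirical coefficients; the values used below are placeholders, not
    the published Razack and Huntley (1991) coefficients.
    """
    return a * (q / s) ** b

# Hypothetical flow rates (m3/day) and drawdowns (m) at one well
flows = [720, 1440, 2160, 2880]        # 30, 60, 90, 120 m3/h converted to m3/day
drawdowns = [1.2, 2.6, 4.1, 5.9]

for q, s in zip(flows, drawdowns):
    t = transmissivity_from_specific_capacity(q, s, a=1.0, b=0.67)
    print(f"Q = {q:5.0f} m3/day, s = {s:.1f} m  ->  T ~ {t:7.1f} (arbitrary units)")
```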
DOE Office of Scientific and Technical Information (OSTI.GOV)
Li, Weixuan; Lian, Jianming; Engel, Dave
2017-07-27
This paper presents a general uncertainty quantification (UQ) framework that provides a systematic analysis of the uncertainty involved in the modeling of a control system, and helps to improve the performance of a control strategy.
A refined methodology for modeling volume quantification performance in CT
NASA Astrophysics Data System (ADS)
Chen, Baiyu; Wilson, Joshua; Samei, Ehsan
2014-03-01
The utility of the CT lung nodule volume quantification technique depends on the precision of the quantification. To enable the evaluation of quantification precision, we previously developed a mathematical model that related precision to image resolution and noise properties in uniform backgrounds in terms of an estimability index (e'). The e' was shown to predict empirical precision across 54 imaging and reconstruction protocols, but with different correlation qualities for FBP and iterative reconstruction (IR) due to the non-linearity of IR impacted by anatomical structure. To better account for the non-linearity of IR, this study aimed to refine the noise characterization of the model in the presence of textured backgrounds. Repeated scans of an anthropomorphic lung phantom were acquired. Subtracted images were used to measure the image quantum noise, which was then used to adjust the noise component of the e' calculation measured from a uniform region. In addition to the model refinement, the validation of the model was further extended to 2 nodule sizes (5 and 10 mm) and 2 segmentation algorithms. Results showed that the magnitude of IR's quantum noise was significantly higher in structured backgrounds than in uniform backgrounds (ASiR, 30-50%; MBIR, 100-200%). With the refined model, the correlation between e' values and empirical precision no longer depended on reconstruction algorithm. In conclusion, the model with refined noise characterization reflected the nonlinearity of iterative reconstruction in structured backgrounds, and further showed successful prediction of quantification precision across a variety of nodule sizes, dose levels, slice thicknesses, reconstruction algorithms, and segmentation software.
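The noise-refinement step described above (estimating quantum noise from repeated scans by subtracting paired images) can be sketched very simply: the standard deviation of the difference image, divided by sqrt(2), estimates the per-image noise while cancelling the fixed anatomical background. The snippet below demonstrates this on synthetic data; the noise level and "anatomy" are invented.

```python
import numpy as np

rng = np.random.default_rng(7)

# Synthetic "anatomy" (fixed background) plus independent quantum noise per scan
anatomy = 50 * np.sin(np.linspace(0, 8 * np.pi, 256))[None, :] * np.ones((256, 1))
sigma_true = 20.0
scan1 = anatomy + rng.normal(0, sigma_true, anatomy.shape)
scan2 = anatomy + rng.normal(0, sigma_true, anatomy.shape)

# Subtraction cancels the deterministic background; noise variances add
diff = scan1 - scan2
sigma_est = diff.std() / np.sqrt(2)

print(f"true noise sigma = {sigma_true:.1f}, estimated = {sigma_est:.1f}")
```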
RipleyGUI: software for analyzing spatial patterns in 3D cell distributions
Hansson, Kristin; Jafari-Mamaghani, Mehrdad; Krieger, Patrik
2013-01-01
The true revolution in the age of digital neuroanatomy is the ability to extensively quantify anatomical structures and thus investigate structure-function relationships in great detail. To facilitate the quantification of neuronal cell patterns we have developed RipleyGUI, a MATLAB-based software that can be used to detect patterns in the 3D distribution of cells. RipleyGUI uses Ripley's K-function to analyze spatial distributions. In addition the software contains statistical tools to determine quantitative statistical differences, and tools for spatial transformations that are useful for analyzing non-stationary point patterns. The software has a graphical user interface making it easy to use without programming experience, and an extensive user manual explaining the basic concepts underlying the different statistical tools used to analyze spatial point patterns. The described analysis tool can be used for determining the spatial organization of neurons that is important for a detailed study of structure-function relationships. For example, neocortex that can be subdivided into six layers based on cell density and cell types can also be analyzed in terms of organizational principles distinguishing the layers. PMID:23658544
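Ripley's K-function, which RipleyGUI uses, counts the average number of neighbouring points within distance r of a typical point, scaled by point density. A minimal 3D estimator without edge correction is sketched below on synthetic points in a unit cube; it illustrates the statistic only, not RipleyGUI's implementation.

```python
import numpy as np
from scipy.spatial.distance import pdist

rng = np.random.default_rng(0)

# Synthetic 3D point pattern: complete spatial randomness in a unit cube
n = 300
points = rng.random((n, 3))
volume = 1.0

def ripley_k_3d(pts, radii, volume):
    """Naive 3D Ripley K estimate (no edge correction)."""
    d = pdist(pts)                       # all pairwise distances, each pair once
    m = len(pts)
    # Ordered pairs are twice the unordered pairs, hence the factor 2
    return np.array([2.0 * np.sum(d <= r) * volume / (m * (m - 1)) for r in radii])

radii = np.linspace(0.02, 0.2, 10)
k_hat = ripley_k_3d(points, radii, volume)
k_csr = 4.0 / 3.0 * np.pi * radii ** 3   # theoretical K under complete randomness

for r, kh, kc in zip(radii, k_hat, k_csr):
    print(f"r = {r:.2f}: K_hat = {kh:.4f}, K_CSR = {kc:.4f}")
```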
2010-01-01
throughout the entire 3D volume, which made quantification of the different tissues in the breast possible. The peaks representing glandular and fat in...coefficients. Keywords: tissue quantification, absolute attenuation coefficient, scatter correction, computed tomography, tomography... tissue types.(1-4) Accurate measurements of the quantification and differentiation of numerous tissues can be useful to identify disease from
Leveraging transcript quantification for fast computation of alternative splicing profiles.
Alamancos, Gael P; Pagès, Amadís; Trincado, Juan L; Bellora, Nicolás; Eyras, Eduardo
2015-09-01
Alternative splicing plays an essential role in many cellular processes and bears major relevance in the understanding of multiple diseases, including cancer. High-throughput RNA sequencing allows genome-wide analyses of splicing across multiple conditions. However, the increasing number of available data sets represents a major challenge in terms of computation time and storage requirements. We describe SUPPA, a computational tool to calculate relative inclusion values of alternative splicing events, exploiting fast transcript quantification. SUPPA accuracy is comparable and sometimes superior to standard methods using simulated as well as real RNA-sequencing data compared with experimentally validated events. We assess the variability in terms of the choice of annotation and provide evidence that using complete transcripts rather than more transcripts per gene provides better estimates. Moreover, SUPPA coupled with de novo transcript reconstruction methods does not achieve accuracies as high as using quantification of known transcripts, but remains comparable to existing methods. Finally, we show that SUPPA is more than 1000 times faster than standard methods. Coupled with fast transcript quantification, SUPPA provides inclusion values at a much higher speed than existing methods without compromising accuracy, thereby facilitating the systematic splicing analysis of large data sets with limited computational resources. The software is implemented in Python 2.7 and is available under the MIT license at https://bitbucket.org/regulatorygenomicsupf/suppa. © 2015 Alamancos et al.; Published by Cold Spring Harbor Laboratory Press for the RNA Society.
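SUPPA's central quantity, the relative inclusion (PSI) of an alternative splicing event, is computed from transcript abundances: the summed abundance of transcripts that include the event divided by the summed abundance of all transcripts informative for the event. The sketch below shows that calculation on invented TPM values; it mirrors the idea, not SUPPA's file formats or event definitions.

```python
def psi(tpm, inclusion_ids, total_ids):
    """Percent-spliced-in from transcript abundances (TPM).

    inclusion_ids: transcripts that include the alternative form.
    total_ids: all transcripts informative for the event.
    Returns a value in [0, 1], or None if the event is not expressed.
    """
    inc = sum(tpm[t] for t in inclusion_ids)
    tot = sum(tpm[t] for t in total_ids)
    return inc / tot if tot > 0 else None

# Hypothetical transcript quantification for one gene
tpm = {"ENST_A": 12.0, "ENST_B": 4.0, "ENST_C": 0.5}
value = psi(tpm, inclusion_ids=["ENST_A"], total_ids=["ENST_A", "ENST_B", "ENST_C"])
print(f"PSI = {value:.2f}")
```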
Federal Register 2010, 2011, 2012, 2013, 2014
2012-03-30
... Market and Planning Efficiency Through Improved Software; Notice of Technical Conference: Increasing Real-Time and Day- Ahead Market Efficiency Through Improved Software Take notice that Commission staff will...-time and day-ahead market efficiency through improved software. A detailed agenda with the list of and...
Federal Register 2010, 2011, 2012, 2013, 2014
2011-05-13
... Market and Planning Efficiency Through Improved Software; Notice of Technical Conference: Increasing Real-Time and Day- Ahead Market Efficiency Through Improved Software Take notice that Commission staff will... for increasing real-time and day-ahead market efficiency through improved software. This conference...
Visualization of LC-MS/MS proteomics data in MaxQuant.
Tyanova, Stefka; Temu, Tikira; Carlson, Arthur; Sinitcyn, Pavel; Mann, Matthias; Cox, Juergen
2015-04-01
Modern software platforms enable the analysis of shotgun proteomics data in an automated fashion resulting in high quality identification and quantification results. Additional understanding of the underlying data can be gained with the help of advanced visualization tools that allow for easy navigation through large LC-MS/MS datasets potentially consisting of terabytes of raw data. The updated MaxQuant version has a map navigation component that steers the users through mass and retention time-dependent mass spectrometric signals. It can be used to monitor a peptide feature used in label-free quantification over many LC-MS runs and visualize it with advanced 3D graphic models. An expert annotation system aids the interpretation of the MS/MS spectra used for the identification of these peptide features. © 2015 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
[Quantification of pulmonary emphysema in multislice-CT using different software tools].
Heussel, C P; Achenbach, T; Buschsieweke, C; Kuhnigk, J; Weinheimer, O; Hammer, G; Düber, C; Kauczor, H-U
2006-10-01
Thin-section MSCT datasets of the lung, comprising approximately 300 images, are difficult to evaluate manually. A computer-assisted pre-diagnosis can help with reporting. Furthermore, post-processing techniques, for instance for quantification of emphysema on the basis of three-dimensional anatomical information, might be improved and the workflow further automated. The results of 4 programs (Pulmo, Volume, YACTA and PulmoFUNC) for the quantitative analysis of emphysema (lung and emphysema volume, mean lung density and emphysema index) in 30 consecutive thin-section MSCT datasets with different emphysema severity levels were compared. The classification of different emphysema types by the YACTA program was also analyzed. Pulmo and Volume had median operating times of 105 and 59 minutes, respectively, owing to the need for extensive manual correction of the lung segmentation. The largely automated programs PulmoFUNC and YACTA had median runtimes of 26 and 16 minutes, respectively. Evaluation with Pulmo and Volume resulted in implausible values for 2 different datasets. PulmoFUNC crashed reproducibly with 2 other datasets. Only YACTA could evaluate all image datasets. The lung volume, emphysema volume, emphysema index and mean lung density determined by YACTA and PulmoFUNC are significantly larger than the corresponding values from Volume and Pulmo (differences: Volume: 119 cm(3)/65 cm(3)/1 %/17 HU, Pulmo: 60 cm(3)/96 cm(3)/1 %/37 HU). Classification of the emphysema type agreed with that of the radiologist in 26 panlobular, 22 paraseptal and 15 centrilobular emphysema cases. The substantial time required hinders the use of quantitative emphysema analysis in clinical routine. The results of YACTA and PulmoFUNC are affected by the dedicated exclusion of the tracheobronchial system. These fully automatic tools enable not only fast quantification without manual interaction, but also reproducible, user-independent measurements.
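The quantities compared above (lung volume, emphysema volume, emphysema index, mean lung density) follow directly from a segmented lung mask and a density threshold, commonly around -950 HU for emphysematous voxels. The sketch below computes them from a synthetic HU volume; the threshold, voxel size and data are illustrative, not the settings of the evaluated programs.

```python
import numpy as np

rng = np.random.default_rng(5)

# Synthetic CT volume in Hounsfield units and a (here: trivial) lung mask
hu = rng.normal(-850, 60, (40, 128, 128))                  # mostly normal lung density
hu[:, 30:60, 30:60] = rng.normal(-970, 15, (40, 30, 30))   # emphysema-like region
lung_mask = np.ones_like(hu, dtype=bool)

voxel_volume_cm3 = 0.07 * 0.07 * 0.1          # assumed voxel size in cm^3
threshold_hu = -950                           # common emphysema threshold

lung_voxels = hu[lung_mask]
emphysema_voxels = lung_voxels < threshold_hu

lung_volume = lung_voxels.size * voxel_volume_cm3
emphysema_volume = emphysema_voxels.sum() * voxel_volume_cm3
emphysema_index = 100.0 * emphysema_voxels.mean()
mean_lung_density = lung_voxels.mean()

print(f"lung volume:      {lung_volume:8.1f} cm3")
print(f"emphysema volume: {emphysema_volume:8.1f} cm3")
print(f"emphysema index:  {emphysema_index:8.1f} %")
print(f"mean density:     {mean_lung_density:8.1f} HU")
```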
A Python Interface for the Dakota Iterative Systems Analysis Toolkit
NASA Astrophysics Data System (ADS)
Piper, M.; Hutton, E.; Syvitski, J. P.
2016-12-01
Uncertainty quantification is required to improve the accuracy, reliability, and accountability of Earth science models. Dakota is a software toolkit, developed at Sandia National Laboratories, that provides an interface between models and a library of analysis methods, including support for sensitivity analysis, uncertainty quantification, optimization, and calibration techniques. Dakota is a powerful tool, but its learning curve is steep: the user not only must understand the structure and syntax of the Dakota input file, but also must develop intermediate code, called an analysis driver, that allows Dakota to run a model. The CSDMS Dakota interface (CDI) is a Python package that wraps and extends Dakota's user interface. It simplifies the process of configuring and running a Dakota experiment. A user can program to the CDI, allowing a Dakota experiment to be scripted. The CDI creates Dakota input files and provides a generic analysis driver. Any model written in Python that exposes a Basic Model Interface (BMI), as well as any model componentized in the CSDMS modeling framework, automatically works with the CDI. The CDI has a plugin architecture, so models written in other languages, or those that don't expose a BMI, can be accessed by the CDI by programmatically extending a template; an example is provided in the CDI distribution. Currently, six Dakota analysis methods have been implemented for examples from the much larger Dakota library. To demonstrate the CDI, we performed an uncertainty quantification experiment with the HydroTrend hydrological water balance and transport model. In the experiment, we evaluated the response of long-term suspended sediment load at the river mouth (Qs) to uncertainty in two input parameters, annual mean temperature (T) and precipitation (P), over a series of 100-year runs, using the polynomial chaos method. Through Dakota, we calculated moments, local and global (Sobol') sensitivity indices, and probability density and cumulative distribution functions for the response.
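The global (Sobol') sensitivity indices mentioned in the HydroTrend experiment can be estimated without any special library using the pick-freeze (Saltelli-style) scheme: evaluate the model on two independent sample matrices and on hybrids where one column is swapped, then combine the outputs. The sketch below does this for a toy additive model whose analytic first-order indices are known; it is a generic illustration, not the Dakota polynomial chaos workflow.

```python
import numpy as np

rng = np.random.default_rng(2)

def model(x):
    # Toy model: y = x1 + 2*x2 + 3*x3 with independent U(0, 1) inputs.
    # Analytic first-order Sobol indices: 1/14, 4/14, 9/14.
    return x[:, 0] + 2.0 * x[:, 1] + 3.0 * x[:, 2]

n, d = 20000, 3
A = rng.random((n, d))
B = rng.random((n, d))
fA, fB = model(A), model(B)
var_y = np.var(np.concatenate([fA, fB]))

for i in range(d):
    ABi = A.copy()
    ABi[:, i] = B[:, i]                         # "pick-freeze": swap column i only
    fABi = model(ABi)
    s_i = np.mean(fB * (fABi - fA)) / var_y     # Saltelli (2010) first-order estimator
    print(f"S_{i+1} ~ {s_i:.3f}")
```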
Cytometric analysis of retinopathies in retinal trypsin digests
NASA Astrophysics Data System (ADS)
Ghanian, Zahra; Staniszewski, Kevin; Sorenson, Christine M.; Sheibani, Nader; Ranji, Mahsa
2014-03-01
The objective of this work was to design an automated image cytometry tool for determination of various retinal vascular parameters, including extraction of features that are relevant to postnatal retinal vascular development and the progression of diabetic retinopathy. To confirm the utility and accuracy of the software, retinal trypsin digests from TSP1-/- and diabetic Akita/+; TSP1-/- mice were analyzed. TSP1 is a critical inhibitor of the development of retinopathies, and lack of TSP1 exacerbates the progression of early diabetic retinopathy. Loss of vascular cells and gain of acellular capillaries, two major signs of diabetic retinopathy, were used to classify a retina as normal or injured. This software allows quantification and high-throughput assessment of retinopathy changes associated with diabetes.
Boiler: lossy compression of RNA-seq alignments using coverage vectors
Pritt, Jacob; Langmead, Ben
2016-01-01
We describe Boiler, a new software tool for compressing and querying large collections of RNA-seq alignments. Boiler discards most per-read data, keeping only a genomic coverage vector plus a few empirical distributions summarizing the alignments. Since most per-read data is discarded, storage footprint is often much smaller than that achieved by other compression tools. Despite this, the most relevant per-read data can be recovered; we show that Boiler compression has only a slight negative impact on results given by downstream tools for isoform assembly and quantification. Boiler also allows the user to pose fast and useful queries without decompressing the entire file. Boiler is free open source software available from github.com/jpritt/boiler. PMID:27298258
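A genomic coverage vector of the kind Boiler stores can be built from alignment intervals with a simple difference-array trick. The sketch below is a generic illustration with invented interval endpoints and chromosome length; it is not Boiler's file format or compression scheme.

```python
import numpy as np

def coverage_vector(intervals, chrom_len):
    """Per-base read coverage from half-open alignment intervals [start, end)."""
    diff = np.zeros(chrom_len + 1, dtype=np.int64)
    for start, end in intervals:
        diff[start] += 1      # coverage rises at the alignment start
        diff[end] -= 1        # ...and falls just past its end
    return np.cumsum(diff[:-1])

# Toy alignments on a 20-bp "chromosome"
cov = coverage_vector([(2, 10), (5, 12), (8, 15)], chrom_len=20)
print(cov)  # positions 8-9 are covered by all three reads
```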
Pawlowski, Roger P.; Phipps, Eric T.; Salinger, Andrew G.; ...
2012-01-01
A template-based generic programming approach was presented in Part I of this series of papers [Sci. Program. 20 (2012), 197–219] that separates the development effort of programming a physical model from that of computing additional quantities, such as derivatives, needed for embedded analysis algorithms. In this paper, we describe the implementation details for using the template-based generic programming approach for simulation and analysis of partial differential equations (PDEs). We detail several of the hurdles that we have encountered, and some of the software infrastructure developed to overcome them. We end with a demonstration where we present shape optimization and uncertainty quantification results for a 3D PDE application.
Software Engineering Program: Software Process Improvement Guidebook
NASA Technical Reports Server (NTRS)
1996-01-01
The purpose of this document is to provide experience-based guidance in implementing a software process improvement program in any NASA software development or maintenance community. This guidebook details how to define, operate, and implement a working software process improvement program. It describes the concept of the software process improvement program and its basic organizational components. It then describes the structure, organization, and operation of the software process improvement program, illustrating all these concepts with specific NASA examples. The information presented in the document is derived from the experiences of several NASA software organizations, including the SEL, the SEAL, and the SORCE. Their experiences reflect many of the elements of software process improvement within NASA. This guidebook presents lessons learned in a form usable by anyone considering establishing a software process improvement program within his or her own environment. This guidebook attempts to balance general and detailed information. It provides material general enough to be usable by NASA organizations whose characteristics do not directly match those of the sources of the information and models presented herein. It also keeps the ideas sufficiently close to the sources of the practical experiences that have generated the models and information.
Model Uncertainty Quantification Methods In Data Assimilation
NASA Astrophysics Data System (ADS)
Pathiraja, S. D.; Marshall, L. A.; Sharma, A.; Moradkhani, H.
2017-12-01
Data Assimilation involves utilising observations to improve model predictions in a seamless and statistically optimal fashion. Its applications are wide-ranging; from improving weather forecasts to tracking targets such as in the Apollo 11 mission. The use of Data Assimilation methods in high dimensional complex geophysical systems is an active area of research, where there exists many opportunities to enhance existing methodologies. One of the central challenges is in model uncertainty quantification; the outcome of any Data Assimilation study is strongly dependent on the uncertainties assigned to both observations and models. I focus on developing improved model uncertainty quantification methods that are applicable to challenging real world scenarios. These include developing methods for cases where the system states are only partially observed, where there is little prior knowledge of the model errors, and where the model error statistics are likely to be highly non-Gaussian.
Quantitative proteome analysis using isobaric peptide termini labeling (IPTL).
Arntzen, Magnus O; Koehler, Christian J; Treumann, Achim; Thiede, Bernd
2011-01-01
The quantitative comparison of proteome level changes across biological samples has become an essential feature in proteomics that remains challenging. We have recently introduced isobaric peptide termini labeling (IPTL), a novel strategy for isobaric quantification based on the derivatization of peptide termini with complementary isotopically labeled reagents. Unlike non-isobaric quantification methods, sample complexity at the MS level is not increased, providing improved sensitivity and protein coverage. The distinguishing feature of IPTL when comparing it to more established isobaric labeling methods (iTRAQ and TMT) is the presence of quantification signatures in all sequence-determining ions in MS/MS spectra, not only in the low mass reporter ion region. This makes IPTL a quantification method that is accessible to mass spectrometers with limited capabilities in the low mass range. Also, the presence of several quantification points in each MS/MS spectrum increases the robustness of the quantification procedure.
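Because every sequence-determining fragment carries the label, an IPTL-style ratio can be derived from many complementary ion pairs per spectrum rather than from a single reporter-ion region. The sketch below is a minimal, hedged illustration: it takes already-matched intensity pairs for one MS/MS spectrum and summarizes them robustly; the input values are invented and no real spectrum processing is performed.

```python
import numpy as np

def iptl_like_ratio(pairs):
    """Median log2 ratio across complementary fragment-ion intensity pairs.

    pairs: list of (intensity_sample_A, intensity_sample_B) tuples, one per
    matched fragment ion in a single MS/MS spectrum.
    """
    a = np.array([p[0] for p in pairs], dtype=float)
    b = np.array([p[1] for p in pairs], dtype=float)
    keep = (a > 0) & (b > 0)                  # ignore missing fragment signals
    log_ratios = np.log2(a[keep] / b[keep])
    # The median is robust to a few mis-assigned fragments; the spread indicates quality
    return np.median(log_ratios), np.std(log_ratios), int(keep.sum())

ratio, spread, n = iptl_like_ratio([(1200, 600), (800, 410), (0, 150), (2300, 1100)])
print(f"log2(A/B) = {ratio:.2f} from {n} fragment pairs (sd {spread:.2f})")
```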
Ahn, Sung Hee; Bae, Yong Jin; Moon, Jeong Hee; Kim, Myung Soo
2013-09-17
We propose to divide matrix suppression in matrix-assisted laser desorption ionization into two parts, normal and anomalous. In quantification of peptides, the normal effect can be accounted for by constructing the calibration curve in the form of peptide-to-matrix ion abundance ratio versus concentration. The anomalous effect forbids reliable quantification and is noticeable when matrix suppression is larger than 70%. With this 70% rule, matrix suppression becomes a guideline for reliable quantification, rather than a nuisance. A peptide in a complex mixture can be quantified even in the presence of large amounts of contaminants, as long as matrix suppression is below 70%. The theoretical basis for the quantification method using a peptide as an internal standard is presented together with its weaknesses. A systematic method to improve quantification of high concentration analytes has also been developed.
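The 70% rule described here reduces to two straightforward calculations: matrix suppression from the drop in matrix-ion signal, and concentration from a peptide-to-matrix ion-ratio calibration. The sketch below is a hedged, toy implementation with invented intensities and calibration points; it is not the authors' processing code.

```python
import numpy as np

def matrix_suppression(matrix_ion_blank, matrix_ion_sample):
    """Fraction of matrix-ion signal suppressed by the analyte/contaminants."""
    return 1.0 - matrix_ion_sample / matrix_ion_blank

def quantify(peptide_ion, matrix_ion, calib_conc, calib_ratio, suppression, limit=0.70):
    """Concentration from a peptide/matrix ion-ratio calibration, gated by the 70% rule."""
    if suppression > limit:
        raise ValueError("Matrix suppression above 70%: quantification unreliable")
    slope, intercept = np.polyfit(calib_conc, calib_ratio, 1)   # linear calibration curve
    return (peptide_ion / matrix_ion - intercept) / slope

# Invented example numbers
supp = matrix_suppression(matrix_ion_blank=1.0e6, matrix_ion_sample=4.5e5)   # 55% suppression
conc = quantify(peptide_ion=3.0e5, matrix_ion=4.5e5,
                calib_conc=np.array([0.5, 1.0, 2.0, 5.0]),        # assumed units, e.g. pmol/uL
                calib_ratio=np.array([0.21, 0.40, 0.83, 2.05]),
                suppression=supp)
print(f"suppression = {supp:.0%}, estimated concentration = {conc:.2f}")
```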
Streaming fragment assignment for real-time analysis of sequencing experiments
Roberts, Adam; Pachter, Lior
2013-01-01
We present eXpress, a software package for highly efficient probabilistic assignment of ambiguously mapping sequenced fragments. eXpress uses a streaming algorithm with linear run time and constant memory use. It can determine abundances of sequenced molecules in real time, and can be applied to ChIP-seq, metagenomics and other large-scale sequencing data. We demonstrate its use on RNA-seq data, showing greater efficiency than other quantification methods. PMID:23160280
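The core idea, probabilistically assigning each ambiguously mapping fragment as it streams past and updating abundance estimates online, can be caricatured in a few lines. The sketch below is a simplified online-EM-style illustration with made-up mappings; it omits the fragment length models, bias correction and forgetting-mass schedule of the real algorithm.

```python
from collections import defaultdict

def streaming_assign(fragments, pseudocount=1.0):
    """Online soft assignment of ambiguous fragments to transcripts.

    fragments: iterable of lists of candidate transcript ids for each fragment.
    Returns estimated fragment counts per transcript after one streaming pass.
    """
    counts = defaultdict(lambda: pseudocount)   # running soft counts (the only state kept)
    for candidates in fragments:                # constant memory: one fragment at a time
        total = sum(counts[t] for t in candidates)
        for t in candidates:
            counts[t] += counts[t] / total      # responsibility under current estimates
    return dict(counts)

# Toy stream: fragments mapping to one or more of three transcripts
stream = [["tx1"], ["tx1", "tx2"], ["tx2", "tx3"], ["tx1"], ["tx1", "tx3"]]
print(streaming_assign(stream))
```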
Federal Register 2010, 2011, 2012, 2013, 2014
2011-07-15
... Market and Planning Efficiency Through Improved Software; Notice Establishing Date for Comments From June... real-time and day-ahead market efficiency through improved software. Notice of technical conference: increasing real-time and day-ahead market efficiency through improved software, 76 Fed. Reg. 28...
A tool for design of primers for microRNA-specific quantitative RT-qPCR.
Busk, Peter K
2014-01-28
MicroRNAs are small but biologically important RNA molecules. Although different methods can be used for quantification of microRNAs, quantitative PCR is regarded as the reference that is used to validate other methods. Several commercial qPCR assays are available but they often come at a high price and the sequences of the primers are not disclosed. An alternative to commercial assays is to manually design primers but this work is tedious and, hence, not practical for the design of primers for a larger number of targets. I have developed the software miRprimer for automatic design of primers for the method miR-specific RT-qPCR, which is one of the best performing microRNA qPCR methods available. The algorithm is based on an implementation of the previously published rules for manual design of miR-specific primers with the additional feature of evaluating the propensity of formation of secondary structures and primer dimers. Testing of the primers showed that 76 out of 79 primers (96%) worked for quantification of microRNAs by miR-specific RT-qPCR of mammalian RNA samples. This success rate corresponds to the success rate of manual primer design. Furthermore, primers designed by this method have been distributed to several labs and used successfully in published studies. The software miRprimer is an automatic and easy method for design of functional primers for miR-specific RT-qPCR. The application is available as stand-alone software that will work on the MS Windows platform and in a developer version written in the Ruby programming language.
FracPaQ: a MATLAB™ Toolbox for the Quantification of Fracture Patterns
NASA Astrophysics Data System (ADS)
Healy, D.; Rizzo, R. E.; Cornwell, D. G.; Timms, N.; Farrell, N. J.; Watkins, H.; Gomez-Rivas, E.; Smith, M.
2016-12-01
The patterns of fractures in deformed rocks are rarely uniform or random. Fracture orientations, sizes, shapes and spatial distributions often exhibit some kind of order. In detail, there may be relationships among the different fracture attributes e.g. small fractures dominated by one orientation, larger fractures by another. These relationships are important because the mechanical (e.g. strength, anisotropy) and transport (e.g. fluids, heat) properties of rock depend on these fracture patterns and fracture attributes. This presentation describes an open source toolbox to quantify fracture patterns, including distributions in fracture attributes and their spatial variation. Software has been developed to quantify fracture patterns from 2-D digital images, such as thin section micrographs, geological maps, outcrop or aerial photographs or satellite images. The toolbox comprises a suite of MATLAB™ scripts based on published quantitative methods for the analysis of fracture attributes: orientations, lengths, intensity, density and connectivity. An estimate of permeability in 2-D is made using a parallel plate model. The software provides an objective and consistent methodology for quantifying fracture patterns and their variations in 2-D across a wide range of length scales. Our current focus for the application of the software is on quantifying the fracture patterns in and around fault zones. There is a large body of published work on the quantification of relatively simple joint patterns, but fault zones present a bigger, and arguably more important, challenge. The method presented is inherently scale independent, and a key task will be to analyse and integrate quantitative fracture pattern data from micro- to macro-scales. Planned future releases will incorporate multi-scale analyses based on a wavelet method to look for scale transitions, and combining fracture traces from multiple 2-D images to derive the statistically equivalent 3-D fracture pattern.
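The basic attributes a toolbox of this kind reports (trace lengths, orientations and areal intensity) follow directly from digitized trace-segment endpoints. Below is a minimal, generic sketch, not the FracPaQ code, that computes these for a handful of invented 2-D segments; units are arbitrary map units.

```python
import numpy as np

def trace_attributes(segments, map_area):
    """Lengths, orientations and P21 intensity of 2-D fracture trace segments.

    segments: array-like of shape (n, 4) holding x1, y1, x2, y2 per segment.
    map_area: area of the mapped region, in the same (squared) units.
    """
    seg = np.asarray(segments, dtype=float)
    dx, dy = seg[:, 2] - seg[:, 0], seg[:, 3] - seg[:, 1]
    lengths = np.hypot(dx, dy)
    strikes = np.degrees(np.arctan2(dy, dx)) % 180.0   # trace orientation, 0-180 degrees
    p21 = lengths.sum() / map_area                     # total trace length per unit area
    return lengths, strikes, p21

segments = [(0, 0, 4, 1), (1, 3, 1, 7), (2, 2, 5, 5)]   # invented traces
lengths, strikes, p21 = trace_attributes(segments, map_area=10.0 * 10.0)
print(np.round(lengths, 2), np.round(strikes, 1), round(p21, 3))
```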
Stockwell, Simon R; Mittnacht, Sibylle
2014-12-16
Advances in understanding the control mechanisms governing the behavior of cells in adherent mammalian tissue culture models are becoming increasingly dependent on modes of single-cell analysis. Methods which deliver composite data reflecting the mean values of biomarkers from cell populations risk losing subpopulation dynamics that reflect the heterogeneity of the studied biological system. In keeping with this, traditional approaches are being replaced by, or supported with, more sophisticated forms of cellular assay developed to allow assessment by high-content microscopy. These assays potentially generate large numbers of images of fluorescent biomarkers, which, enabled by accompanying proprietary software packages, allow for multi-parametric measurements per cell. However, the relatively high capital costs and overspecialization of many of these devices have limited their accessibility for many investigators. Described here is a universally applicable workflow for the quantification of multiple fluorescent marker intensities from specific subcellular regions of individual cells, suitable for use with images from most fluorescence microscopes. Key to this workflow is the implementation of the freely available CellProfiler software to distinguish individual cells in these images, segment them into defined subcellular regions and deliver fluorescence marker intensity values specific to these regions. The extraction of individual cell intensity values from image data is the central purpose of this workflow and will be illustrated with the analysis of control data from a siRNA screen for G1 checkpoint regulators in adherent human cells. However, the workflow presented here can be applied to analysis of data from other means of cell perturbation (e.g., compound screens) and other forms of fluorescence-based cellular markers, and thus should be useful for a wide range of laboratories.
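As a rough open-source analogue of the per-cell, per-region intensity measurement described (not the CellProfiler pipeline itself), the sketch below segments nuclei from a nuclear-stain channel, grows them into approximate whole-cell regions, and reports mean marker intensity in both compartments using scikit-image. The synthetic images, channel layout and expansion distance are assumptions.

```python
import numpy as np
from skimage import filters, measure, segmentation

def per_cell_intensities(nuclear_img, marker_img, expand_px=15):
    """Mean marker intensity per nucleus and per expanded 'cell' region."""
    nuclei = measure.label(nuclear_img > filters.threshold_otsu(nuclear_img))
    cells = segmentation.expand_labels(nuclei, distance=expand_px)  # crude cytoplasm proxy
    rows = []
    for nuc, cell in zip(measure.regionprops(nuclei, intensity_image=marker_img),
                         measure.regionprops(cells, intensity_image=marker_img)):
        rows.append({"label": nuc.label,
                     "nuclear_mean": nuc.mean_intensity,
                     "cell_mean": cell.mean_intensity})
    return rows

# Synthetic two-channel example standing in for real microscope images
rng = np.random.default_rng(1)
nuclear = rng.normal(0.1, 0.02, (256, 256)); nuclear[100:130, 100:130] += 0.8
marker = rng.normal(0.2, 0.05, (256, 256)); marker[90:140, 90:140] += 0.5
for row in per_cell_intensities(nuclear, marker):
    print(row)
```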
Isse, K; Lesniak, A; Grama, K; Roysam, B; Minervini, M I; Demetris, A J
2012-01-01
Conventional histopathology is the gold standard for allograft monitoring, but its value proposition is increasingly questioned. "-Omics" analysis of tissues, peripheral blood and fluids and targeted serologic studies provide mechanistic insights into allograft injury not currently provided by conventional histology. Microscopic biopsy analysis, however, provides valuable and unique information: (a) spatial-temporal relationships; (b) rare events/cells; (c) complex structural context; and (d) integration into a "systems" model. Nevertheless, except for immunostaining, no transformative advancements have "modernized" routine microscopy in over 100 years. Pathologists now team with hardware and software engineers to exploit remarkable developments in digital imaging, nanoparticle multiplex staining, and computational image analysis software to bridge the traditional histology-global "-omic" analyses gap. Included are side-by-side comparisons, objective biopsy finding quantification, multiplexing, automated image analysis, and electronic data and resource sharing. Current utilization for teaching, quality assurance, conferencing, consultations, research and clinical trials is evolving toward implementation for low-volume, high-complexity clinical services like transplantation pathology. Cost, complexities of implementation, fluid/evolving standards, and unsettled medical/legal and regulatory issues remain as challenges. Regardless, challenges will be overcome and these technologies will enable transplant pathologists to increase information extraction from tissue specimens and contribute to cross-platform biomarker discovery for improved outcomes. ©Copyright 2011 The American Society of Transplantation and the American Society of Transplant Surgeons.
Goñi-Moreno, Ángel; Kim, Juhyun; de Lorenzo, Víctor
2017-02-01
Visualization of the intracellular constituents of individual bacteria while they perform as live biocatalysts is in principle feasible through more or less sophisticated fluorescence microscopy. Unfortunately, rigorous quantitation of the wealth of data embodied in the resulting images requires bioinformatic tools that are not widespread within the community, let alone that they are often subject to licensing that impedes software reuse. In this context we have developed CellShape, a user-friendly platform for image analysis with subpixel precision and a double-threshold segmentation system for quantification of fluorescent signals stemming from single cells. CellShape is entirely coded in Python, a free, open-source programming language with widespread community support. For a developer, CellShape enhances extensibility (ease of software improvements) by acting as an interface to access and use existing Python modules; for an end user, CellShape presents standalone executable files ready to open without installation. We have adopted this platform to analyse in unprecedented detail the tridimensional distribution of the constituents of the gene expression flow (DNA, RNA polymerase, mRNA and ribosomal proteins) in individual cells of the industrial platform strain Pseudomonas putida KT2440. While the first release version of CellShape (v0.8) is readily operational, users and/or developers are enabled to expand the platform further. Copyright © 2017 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
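Double-threshold (hysteresis) segmentation of the kind mentioned here keeps weak signal only where it connects to strong signal, which suppresses background speckle while preserving dim cell edges. The fragment below is a generic scikit-image illustration of that idea on a synthetic image, not CellShape code; the two threshold values are assumptions.

```python
import numpy as np
from skimage import filters, measure

# Synthetic fluorescence image: a bright core with a dim halo on a noisy background
rng = np.random.default_rng(2)
img = rng.normal(0.05, 0.01, (128, 128))
img[60:68, 60:68] += 0.20     # dim halo
img[62:66, 62:66] += 0.60     # bright core

# Double-threshold segmentation: pixels above `low` are kept only if they
# are connected to at least one pixel above `high`
mask = filters.apply_hysteresis_threshold(img, low=0.15, high=0.50)
labels = measure.label(mask)

for region in measure.regionprops(labels, intensity_image=img):
    print(f"object {region.label}: area={region.area} px, "
          f"total intensity={region.area * region.mean_intensity:.2f}")
```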
Novel SPECT Technologies and Approaches in Cardiac Imaging
Slomka, Piotr; Hung, Guang-Uei; Germano, Guido; Berman, Daniel S.
2017-01-01
Recent novel approaches in myocardial perfusion single photon emission CT (SPECT) have been facilitated by new dedicated high-efficiency hardware with solid-state detectors and optimized collimators. New protocols include very low-dose (1 mSv) stress-only, two-position imaging to mitigate attenuation artifacts, and simultaneous dual-isotope imaging. Attenuation correction can be performed by specialized low-dose systems or by previously obtained CT coronary calcium scans. Hybrid protocols using CT angiography have been proposed. Image quality improvements have been demonstrated by novel reconstructions and motion correction. Fast SPECT acquisition facilitates dynamic flow and early function measurements. Image processing algorithms have become automated with virtually unsupervised extraction of quantitative imaging variables. This automation facilitates integration with clinical variables derived by machine learning to predict patient outcome or diagnosis. In this review, we describe new imaging protocols made possible by the new hardware developments. We also discuss several novel software approaches for the quantification and interpretation of myocardial perfusion SPECT scans. PMID:29034066
A Study of Applying Pulsed Remote Field Eddy Current in Ferromagnetic Pipes Testing
Luo, Qingwang; Shi, Yibing; Wang, Zhigang; Zhang, Wei; Li, Yanjun
2017-01-01
Pulsed Remote Field Eddy Current Testing (PRFECT) has attracted attention for the testing of ferromagnetic pipes because of its continuous spectrum. This paper simulated practical PRFECT of pipes using ANSYS software and employed Least Squares Support Vector Regression (LSSVR) to extract the zero-crossing time used to analyze the pipe thickness. A secondary peak is found in the zero-crossing time when the transmitter passes a defect. The secondary peak leads to incorrect quantification and localization of defects, especially when defects are present only at the transmitter location. To eliminate the secondary peaks, double sensing coils are placed in the transition zone and a Wiener deconvolution filter is applied. In the proposed method, the position-dependent response of the differential signals from the double sensing coils is calibrated by zero-mean normalization. The methods proposed in this paper are validated by analyzing the simulation signals and can improve the practicality of PRFECT of ferromagnetic pipes. PMID:28475141
Using iRT, a normalized retention time for more targeted measurement of peptides
Escher, Claudia; Reiter, Lukas; MacLean, Brendan; Ossola, Reto; Herzog, Franz; Chilton, John; MacCoss, Michael J.; Rinner, Oliver
2014-01-01
Multiple reaction monitoring (MRM) has recently become the method of choice for targeted quantitative measurement of proteins using mass spectrometry. The method, however, is limited in the number of peptides that can be measured in one run. This number can be markedly increased by scheduling the acquisition if the accurate retention time (RT) of each peptide is known. Here we present iRT, an empirically derived dimensionless peptide-specific value that allows for highly accurate RT prediction. The iRT of a peptide is a fixed number relative to a standard set of reference iRT-peptides that can be transferred across laboratories and chromatographic systems. We show that iRT facilitates the setup of multiplexed experiments with acquisition windows more than 4 times smaller compared to in silico RT predictions resulting in improved quantification accuracy. iRTs can be determined by any laboratory and shared transparently. The iRT concept has been implemented in Skyline, the most widely used software for MRM experiments. PMID:22577012
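Converting observed retention times to the iRT scale amounts to a linear calibration against the reference peptides spiked into each run. The sketch below shows that calibration step only, with invented retention times; the reference iRT values are placeholders rather than the published kit values.

```python
import numpy as np

# Observed RT (min) of reference peptides in this run and their fixed iRT values
# (numbers here are illustrative placeholders, not the actual iRT kit values)
ref_rt  = np.array([ 5.1,  9.8, 14.2, 19.0, 24.5, 30.1])
ref_irt = np.array([-20.0,  0.0, 25.0, 50.0, 75.0, 100.0])

# Fit the run-specific linear mapping RT -> iRT
slope, intercept = np.polyfit(ref_rt, ref_irt, 1)

def to_irt(rt_minutes):
    """Convert an observed retention time to the dimensionless iRT scale."""
    return slope * np.asarray(rt_minutes) + intercept

def to_rt(irt_value):
    """Predict where a peptide of known iRT should elute in this run,
    which is what enables narrow scheduled acquisition windows."""
    return (irt_value - intercept) / slope

print(to_irt([12.0, 22.3]), round(to_rt(60.0), 2))
```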
Huebsch, Nathaniel; Loskill, Peter; Mandegar, Mohammad A; Marks, Natalie C; Sheehan, Alice S; Ma, Zhen; Mathur, Anurag; Nguyen, Trieu N; Yoo, Jennie C; Judge, Luke M; Spencer, C Ian; Chukka, Anand C; Russell, Caitlin R; So, Po-Lin; Conklin, Bruce R; Healy, Kevin E
2015-05-01
Contractile motion is the simplest metric of cardiomyocyte health in vitro, but unbiased quantification is challenging. We describe a rapid automated method, requiring only standard video microscopy, to analyze the contractility of human-induced pluripotent stem cell-derived cardiomyocytes (iPS-CM). New algorithms for generating and filtering motion vectors combined with a newly developed isogenic iPSC line harboring genetically encoded calcium indicator, GCaMP6f, allow simultaneous user-independent measurement and analysis of the coupling between calcium flux and contractility. The relative performance of these algorithms, in terms of improving signal to noise, was tested. Applying these algorithms allowed analysis of contractility in iPS-CM cultured over multiple spatial scales from single cells to three-dimensional constructs. This open source software was validated with analysis of isoproterenol response in these cells, and can be applied in future studies comparing the drug responsiveness of iPS-CM cultured in different microenvironments in the context of tissue engineering.
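A much simpler stand-in for the motion-vector analysis described (useful only to convey the idea, not the published algorithm) is a frame-difference motion trace with peak detection for beats. The video array, frame rate and peak criteria below are assumptions.

```python
import numpy as np
from scipy.signal import find_peaks

def motion_trace(frames):
    """Mean absolute frame-to-frame pixel change: a crude contraction signal.

    frames: array of shape (n_frames, height, width), grayscale video.
    """
    diffs = np.abs(np.diff(frames.astype(float), axis=0))
    return diffs.mean(axis=(1, 2))

def beat_rate(frames, fps, min_separation_s=0.3):
    trace = motion_trace(frames)
    peaks, _ = find_peaks(trace, distance=int(min_separation_s * fps),
                          height=trace.mean() + trace.std())
    return 60.0 * len(peaks) / (len(frames) / fps), trace

# Synthetic "video": background noise plus a periodic bright contraction artifact
rng = np.random.default_rng(3)
frames = rng.normal(0, 1, (300, 64, 64))
frames[np.arange(0, 300, 30)] += 5.0          # one "beat" every 30 frames
bpm, trace = beat_rate(frames, fps=30)
print(f"estimated rate of roughly {bpm:.0f} beats per minute")
```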
Software Engineering Improvement Activities/Plan
NASA Technical Reports Server (NTRS)
2003-01-01
bd Systems personnel accomplished the technical responsibilities for this reporting period, as planned. A close working relationship was maintained with personnel of the MSFC Avionics Department Software Group (ED14). Work accomplishments included development, evaluation, and enhancement of a software cost model, performing literature search and evaluation of software tools available for code analysis and requirements analysis, and participating in other relevant software engineering activities. Monthly reports were submitted. This support was provided to the Flight Software Group/ED 1 4 in accomplishing the software engineering improvement engineering activities of the Marshall Space Flight Center (MSFC) Software Engineering Improvement Plan.
A Prototype for the Support of Integrated Software Process Development and Improvement
NASA Astrophysics Data System (ADS)
Porrawatpreyakorn, Nalinpat; Quirchmayr, Gerald; Chutimaskul, Wichian
An efficient software development process is one of the key success factors for quality software. Both the appropriate establishment and the continuous improvement of integrated project management and of the software development process contribute to this efficiency. This paper hence proposes a software process maintenance framework that consists of two core components: an integrated PMBOK-Scrum model describing how to establish a comprehensive set of project management and software engineering processes, and a software development maturity model advocating software process improvement. In addition, a prototype tool to support the framework is introduced.
2015-04-15
Residual risk can be exemplified as a quantification of the improved situation faced when the random variable of interest is viewed in concert with a related random vector that helps to manage, predict, and mitigate the risk in the original variable.
2014-05-18
It has been the team mission to implement new and improved techniques with the intention of offering improved software libraries for GNSS signal acquisition.
Amoah, Isaac Dennis; Singh, Gulshan; Stenström, Thor Axel; Reddy, Poovendhree
2017-05-01
It is estimated that over a billion people are infected with soil-transmitted helminths (STHs) globally, with the majority occurring in tropical and subtropical regions of the world. The roundworm (Ascaris lumbricoides), whipworm (Trichuris trichiura), and hookworms (Ancylostoma duodenale and Necator americanus) are the main species infecting people. These infections are mostly acquired through exposure to faecally contaminated water, soil or food, and the risk of infection increases with wastewater and sludge reuse in agriculture. Different methods have been developed for the detection and quantification of STH eggs in environmental samples. However, the lack of a universally accepted technique creates a challenge for comparative assessments of helminth egg concentrations both in different sample matrices and between locations. This review presents a comparison of reported methodologies for the detection of STH eggs, an assessment of the relative performance of available detection methods and a discussion of new emerging techniques that could be applied for detection and quantification. It is based on a literature search using PubMed and Science Direct considering all geographical locations. Original research articles were selected based on their methodology and results sections. Methods reported in these articles were grouped into conventional, molecular and emerging techniques, and the main steps in each method were then compared and discussed. The inclusion of a dissociation step aimed at detaching helminth eggs from particulate matter was found to improve the recovery of eggs. Additionally, the selection and application of flotation solutions that take into account the relative densities of the eggs of different STH species also result in higher egg recovery. Generally, the use of conventional methods was shown to be laborious, time consuming and prone to human error. The alternative use of nucleic acid-based techniques has improved the sensitivity of detection and made species-specific identification possible. However, these nucleic acid-based methods are expensive and less suitable in regions with limited resources and skills. The loop-mediated isothermal amplification method shows promise for application in these settings due to its simplicity and use of basic equipment. In addition, the development of imaging software for the detection and quantification of STHs shows promise to further reduce the human error associated with the analysis of environmental samples. It may be concluded that there is a need to comparatively assess the performance of different methods to determine their applicability in different settings as well as for use with different sample matrices (wastewater, sludge, compost, soil, vegetables, etc.). Copyright © 2017 The Authors. Published by Elsevier B.V. All rights reserved.
Scott, Sarah Nicole; Templeton, Jeremy Alan; Hough, Patricia Diane; ...
2014-01-01
This study details a methodology for quantification of errors and uncertainties of a finite element heat transfer model applied to a Ruggedized Instrumentation Package (RIP). The proposed verification and validation (V&V) process includes solution verification to examine errors associated with the code's solution techniques, and model validation to assess the model's predictive capability for quantities of interest. The model was subjected to mesh resolution and numerical parameters sensitivity studies to determine reasonable parameter values and to understand how they change the overall model response and performance criteria. To facilitate quantification of the uncertainty associated with the mesh, automatic meshing and mesh refining/coarsening algorithms were created and implemented on the complex geometry of the RIP. Automated software to vary model inputs was also developed to determine the solution's sensitivity to numerical and physical parameters. The model was compared with an experiment to demonstrate its accuracy and determine the importance of both modelled and unmodelled physics in quantifying the results' uncertainty. An emphasis is placed on automating the V&V process to enable uncertainty quantification within tight development schedules.
Vecchione, Gennaro; Casetta, Bruno; Chiapparino, Antonella; Bertolino, Alessandro; Tomaiuolo, Michela; Cappucci, Filomena; Gatta, Raffaella; Margaglione, Maurizio; Grandone, Elvira
2012-01-01
A simple liquid chromatography-tandem mass spectrometry (LC-MS/MS) method has been developed for simultaneous analysis of 17 basic and one acidic psychotropic drug in human plasma. The method relies on a protein precipitation step for sample preparation and offers high sensitivity and wide linearity, without interferences from endogenous matrix components. Chromatography was run on a reversed-phase column with an acetonitrile-H₂O mixture. The quantification of target compounds was performed in multiple reaction monitoring (MRM) mode, switching the ionization polarity within the analytical run. A further sensitivity increase was obtained by implementing the "scheduled multiple reaction monitoring" (sMRM) functionality offered by the recent version of the software package managing the instrument. The overall injection interval was less than 5.5 min. Regression coefficients of the calibration curves and limits of quantification (LOQ) showed good coverage of over-therapeutic, therapeutic and sub-therapeutic ranges. Recovery rates, measured as the percentage recovery from spiked plasma samples, were ≥ 94%. Precision and accuracy data were satisfactory for a therapeutic drug monitoring (TDM) service managing plasma samples from patients receiving psychopharmacological treatment. Copyright © 2012 Elsevier B.V. All rights reserved.
Zhou, Juntuo; Liu, Huiying; Liu, Yang; Liu, Jia; Zhao, Xuyang; Yin, Yuxin
2016-04-19
Recent advances in mass spectrometers, which have yielded higher resolution and faster scanning speeds, have expanded their application in metabolomics of diverse diseases. Using a quadrupole-Orbitrap LC-MS system, we developed an efficient large-scale quantitative method targeting 237 metabolites involved in various metabolic pathways using scheduled parallel reaction monitoring (PRM). We assessed the dynamic range, linearity, reproducibility, and system suitability of the PRM assay by measuring concentration curves, biological samples, and clinical serum samples. The quantification performance of the PRM assay was compared with that of MS1-based quantification on the Q-Exactive and with the MRM assay on the QTRAP 6500. The PRM assay monitoring 237 polar metabolites showed greater reproducibility and quantitative accuracy than MS1-based quantification, and greater flexibility in post-acquisition assay refinement than the MRM assay on the QTRAP 6500. We present a workflow for convenient PRM data processing using the freely available Skyline software. In this study we have established a reliable PRM methodology on a quadrupole-Orbitrap platform for large-scale targeted metabolomics, which provides a new choice for basic and clinical metabolomics studies.
An Excel‐based implementation of the spectral method of action potential alternans analysis
Pearman, Charles M.
2014-01-01
Abstract Action potential (AP) alternans has been well established as a mechanism of arrhythmogenesis and sudden cardiac death. Proper interpretation of AP alternans requires a robust method of alternans quantification. Traditional methods of alternans analysis neglect higher order periodicities that may have greater pro‐arrhythmic potential than classical 2:1 alternans. The spectral method of alternans analysis, already widely used in the related study of microvolt T‐wave alternans, has also been used to study AP alternans. Software to meet the specific needs of AP alternans analysis is not currently available in the public domain. An AP analysis tool is implemented here, written in Visual Basic for Applications and using Microsoft Excel as a shell. This performs a sophisticated analysis of alternans behavior allowing reliable distinction of alternans from random fluctuations, quantification of alternans magnitude, and identification of which phases of the AP are most affected. In addition, the spectral method has been adapted to allow detection and quantification of higher order regular oscillations. Analysis of action potential morphology is also performed. A simple user interface enables easy import, analysis, and export of collated results. PMID:25501439
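At its core, the spectral method examines the beat-by-beat power spectrum at 0.5 cycles/beat relative to a nearby noise band. The sketch below is a bare-bones illustration of that calculation on a synthetic beat series; the noise-band limits and the k-score threshold are common conventions from the T-wave alternans literature, assumed rather than taken from the tool described.

```python
import numpy as np

def spectral_alternans(beat_series, noise_band=(0.43, 0.48)):
    """Alternans power and k-score from a beat-by-beat measurement series."""
    x = np.asarray(beat_series, dtype=float)
    x = x - x.mean()                                   # remove the DC component before the FFT
    power = np.abs(np.fft.rfft(x)) ** 2 / len(x)
    freqs = np.fft.rfftfreq(len(x), d=1.0)             # cycles per beat
    alternans_power = power[np.argmin(np.abs(freqs - 0.5))]
    noise = power[(freqs >= noise_band[0]) & (freqs <= noise_band[1])]
    k_score = (alternans_power - noise.mean()) / noise.std()
    return alternans_power, k_score

# Synthetic APD series: 128 beats with a small alternating component plus noise
rng = np.random.default_rng(4)
apd = 200 + 2.0 * (-1) ** np.arange(128) + rng.normal(0, 1.0, 128)
alt_power, k = spectral_alternans(apd)
print(f"alternans power = {alt_power:.1f}, k-score = {k:.1f} (k > 3 is often taken as significant)")
```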
Trypanosoma cruzi infectivity assessment in "in vitro" culture systems by automated cell counting.
Liempi, Ana; Castillo, Christian; Cerda, Mauricio; Droguett, Daniel; Duaso, Juan; Barahona, Katherine; Hernández, Ariane; Díaz-Luján, Cintia; Fretes, Ricardo; Härtel, Steffen; Kemmerling, Ulrike
2015-03-01
Chagas disease is an endemic, neglected tropical disease in Latin America caused by the protozoan parasite Trypanosoma cruzi. In vitro models constitute the first experimental approach to study the physiopathology of the disease and to assay potential new trypanocidal agents. Here, we describe the use of commercial software (MATLAB®) to quantify T. cruzi amastigotes and infected mammalian cells (BeWo) and compare this analysis with manual counting. There was no statistically significant difference between the manual and the automatic quantification of the parasite; the two methods showed a correlation r² value of 0.9159. The most significant advantage of the automatic quantification was the efficiency of the analysis. The drawback of this automated cell counting method was that some parasites were assigned to the wrong BeWo cell; however, this error did not exceed 5% when adequate experimental conditions were chosen. We conclude that this quantification method constitutes an excellent tool for evaluating the parasite load in cells and therefore an easy and reliable way to study parasite infectivity. Copyright © 2014 Elsevier B.V. All rights reserved.
Sfetsas, Themistoklis; Michailof, Chrysa; Lappas, Angelos; Li, Qiangyi; Kneale, Brian
2011-05-27
Pyrolysis oils have attracted a lot of interest, as they are liquid energy carriers and general sources of chemicals. In this work, gas chromatography with flame ionization detector (GC-FID) and two-dimensional gas chromatography with time-of-flight mass spectrometry (GC×GC-TOFMS) techniques were used to provide both qualitative and quantitative results of the analysis of three different pyrolysis oils. The chromatographic methods and parameters were optimized and solvent choice and separation restrictions are discussed. Pyrolysis oil samples were diluted in suitable organic solvent and were analyzed by GC×GC-TOFMS. An average of 300 compounds were detected and identified in all three samples using the ChromaToF (Leco) software. The deconvoluted spectra were compared with the NIST software library for correct matching. Group type classification was performed by use of the ChromaToF software. The quantification of 11 selected compounds was performed by means of a multiple-point external calibration curve. Afterwards, the pyrolysis oils were extracted with water, and the aqueous phase was analyzed both by GC-FID and, after proper change of solvent, by GC×GC-TOFMS. As previously, the selected compounds were quantified by both techniques, by means of multiple point external calibration curves. The parameters of the calibration curves were calculated by weighted linear regression analysis. The limit of detection, limit of quantitation and linearity range for each standard compound with each method are presented. The potency of GC×GC-TOFMS for an efficient mapping of the pyrolysis oil is undisputable, and the possibility of using it for quantification as well has been demonstrated. On the other hand, the GC-FID analysis provides reliable results that allow for a rapid screening of the pyrolysis oil. To the best of our knowledge, very few papers have been reported with quantification attempts on pyrolysis oil samples using GC×GC-TOFMS most of which make use of the internal standard method. This work provides the ground for further analysis of pyrolysis oils of diverse sources for a rational design of both their production and utilization process. Copyright © 2010 Elsevier B.V. All rights reserved.
P-MartCancer–Interactive Online Software to Enable Analysis of Shotgun Cancer Proteomic Datasets
DOE Office of Scientific and Technical Information (OSTI.GOV)
Webb-Robertson, Bobbie-Jo M.; Bramer, Lisa M.; Jensen, Jeffrey L.
P-MartCancer is a new interactive web-based software environment that enables biomedical and biological scientists to perform in-depth analyses of global proteomics data without requiring direct interaction with the data or with statistical software. P-MartCancer offers a series of statistical modules associated with quality assessment, peptide and protein statistics, protein quantification and exploratory data analyses driven by the user via customized workflows and interactive visualization. Currently, P-MartCancer offers access to multiple cancer proteomic datasets generated through the Clinical Proteomics Tumor Analysis Consortium (CPTAC) at the peptide, gene and protein levels. P-MartCancer is deployed using Azure technologies (http://pmart.labworks.org/cptac.html), the web-service is alternatively available via Docker Hub (https://hub.docker.com/r/pnnl/pmart-web/) and many statistical functions can be utilized directly from an R package available on GitHub (https://github.com/pmartR).
Macedo, Nayana Damiani; Buzin, Aline Rodrigues; de Araujo, Isabela Bastos Binotti Abreu; Nogueira, Breno Valentim; de Andrade, Tadeu Uggere; Endringer, Denise Coutinho; Lenz, Dominik
2017-02-01
The current study proposes an automated machine learning approach for the quantification of cells in cell death pathways according to DNA fragmentation. A total of 17 images of kidney histological slide samples from male Wistar rats were used. The slides were photographed using an Axio Zeiss Vert.A1 microscope with a 40x objective lens coupled with an Axio Cam MRC Zeiss camera and Zen 2012 software. The images were analyzed using CellProfiler (version 2.1.1) and CellProfiler Analyst open-source software. Out of the 10,378 objects, 4970 (47.9%) were identified as TUNEL positive, and 5408 (52.1%) were identified as TUNEL negative. On average, the sensitivity and specificity values of the machine learning approach were 0.80 and 0.77, respectively. Image cytometry provides a quantitative analytical alternative to the more traditional qualitative methods more commonly used in studies. Copyright © 2016 Elsevier Ltd. All rights reserved.
Dasa, Siva Sai Krishna; Kelly, Kimberly A.
2016-01-01
Next-generation sequencing has enhanced the phage display process, allowing for the quantification of millions of sequences resulting from the biopanning process. In response, many valuable analysis programs focused on specificity and finding targeted motifs or consensus sequences were developed. For targeted drug delivery and molecular imaging, it is also necessary to find peptides that are selective—targeting only the cell type or tissue of interest. We present a new analysis strategy and accompanying software, PHage Analysis for Selective Targeted PEPtides (PHASTpep), which identifies highly specific and selective peptides. Using this process, we discovered and validated, both in vitro and in vivo in mice, two sequences (HTTIPKV and APPIMSV) targeted to pancreatic cancer-associated fibroblasts that escaped identification using previously existing software. Our selectivity analysis makes it possible to discover peptides that target a specific cell type and avoid other cell types, enhancing clinical translatability by circumventing complications with systemic use. PMID:27186887
Development of image analysis software for quantification of viable cells in microchips.
Georg, Maximilian; Fernández-Cabada, Tamara; Bourguignon, Natalia; Karp, Paola; Peñaherrera, Ana B; Helguera, Gustavo; Lerner, Betiana; Pérez, Maximiliano S; Mertelsmann, Roland
2018-01-01
Over the past few years, image analysis has emerged as a powerful tool for analyzing various cell biology parameters in an unprecedented and highly specific manner. The amount of data generated requires automated methods for processing and analyzing all the resulting information. The software packages available so far are suitable for processing fluorescence and phase contrast images, but often do not provide good results for transmission light microscopy images, owing to the intrinsic variation of the image acquisition technique itself (adjustment of brightness/contrast, for instance) and the variability in image acquisition introduced by operators and equipment. In this contribution, we present an image processing software package, Python-based image analysis for cell growth (PIACG), that is able to calculate, in a highly efficient way, the total area of the well occupied by cells with fusiform and rounded morphology in response to different concentrations of fetal bovine serum in microfluidic chips, from transmission light microscopy images.
Nema, Shubham; Hasan, Whidul; Bhargava, Anamika; Bhargava, Yogesh
2016-09-15
Behavioural neuroscience relies on software-driven methods for behavioural assessment, but the field lacks cost-effective, robust, open-source software for behavioural analysis. Here we propose a novel method, which we call ZebraTrack. It includes a cost-effective imaging setup for distraction-free behavioural acquisition, automated tracking using the open-source ImageJ software and a workflow for extraction of behavioural endpoints. Our ImageJ algorithm is capable of giving users control at key steps while maintaining automation in tracking, without the need to install external plugins. We validated this method by testing novelty-induced anxiety behaviour in adult zebrafish. Our results, in agreement with established findings, showed that during state anxiety, zebrafish exhibited reduced distance travelled, increased thigmotaxis and more freezing events. Furthermore, we propose a method to represent both the spatial and temporal distribution of choice-based behaviour, which is currently not possible with simple videograms. The ZebraTrack method is simple and economical, yet robust enough to give results comparable with those obtained from costly proprietary software such as EthoVision XT. We have developed and validated a novel cost-effective method for behavioural analysis of adult zebrafish using open-source ImageJ software. Copyright © 2016 Elsevier B.V. All rights reserved.
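Once x,y coordinates have been extracted by the tracker, the behavioural endpoints mentioned here (distance travelled, thigmotaxis and freezing) are simple geometry. The sketch below computes them from a toy track; the arena size, border width and freezing criterion are assumed values, not those used in the paper.

```python
import numpy as np

def track_endpoints(xy, arena_size=(20.0, 20.0), border_cm=2.5,
                    fps=30, freeze_speed_cm_s=0.5):
    """Distance travelled, thigmotaxis fraction and freezing fraction from a track.

    xy: array of shape (n_frames, 2) with positions in cm.
    """
    xy = np.asarray(xy, dtype=float)
    steps = np.linalg.norm(np.diff(xy, axis=0), axis=1)
    distance = steps.sum()

    w, h = arena_size                 # distance of each point to the nearest wall
    wall_dist = np.minimum.reduce([xy[:, 0], w - xy[:, 0], xy[:, 1], h - xy[:, 1]])
    thigmotaxis = float(np.mean(wall_dist < border_cm))

    speed = steps * fps               # cm/s between consecutive frames
    freezing = float(np.mean(speed < freeze_speed_cm_s))
    return distance, thigmotaxis, freezing

# Toy track: a fish mostly hugging one wall of a 20 x 20 cm arena
rng = np.random.default_rng(5)
xy = np.column_stack([rng.uniform(0.5, 19.5, 900), rng.uniform(0.5, 2.0, 900)])
print(track_endpoints(xy))
```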
van Hamersvelt, Robbert W; Willemink, Martin J; de Jong, Pim A; Milles, Julien; Vlassenbroek, Alain; Schilham, Arnold M R; Leiner, Tim
2017-09-01
The aim of this study was to evaluate the feasibility and accuracy of dual-layer spectral detector CT (SDCT) for the quantification of clinically encountered gadolinium concentrations. The cardiac chamber of an anthropomorphic thoracic phantom was equipped with 14 tubular inserts containing different gadolinium concentrations, ranging from 0 to 26.3 mg/mL (0.0, 0.1, 0.2, 0.4, 0.5, 1.0, 2.0, 3.0, 4.0, 5.1, 10.6, 15.7, 20.7 and 26.3 mg/mL). Images were acquired using a novel 64-detector row SDCT system at 120 and 140 kVp. Acquisitions were repeated five times to assess reproducibility. Regions of interest (ROIs) were drawn on three slices per insert. A spectral plot was extracted for every ROI and mean attenuation profiles were fitted to known attenuation profiles of water and pure gadolinium using in-house-developed software to calculate gadolinium concentrations. At both 120 and 140 kVp, excellent correlations between scan repetitions and true and measured gadolinium concentrations were found (R > 0.99, P < 0.001; ICCs > 0.99, CI 0.99-1.00). Relative mean measurement errors stayed below 10% down to 2.0 mg/mL true gadolinium concentration at 120 kVp and below 5% down to 1.0 mg/mL true gadolinium concentration at 140 kVp. SDCT allows for accurate quantification of gadolinium at both 120 and 140 kVp. Lowest measurement errors were found for 140 kVp acquisitions. • Gadolinium quantification may be useful in patients with contraindication to iodine. • Dual-layer spectral detector CT allows for overall accurate quantification of gadolinium. • Interscan variability of gadolinium quantification using SDCT material decomposition is excellent.
quantGenius: implementation of a decision support system for qPCR-based gene quantification.
Baebler, Špela; Svalina, Miha; Petek, Marko; Stare, Katja; Rotter, Ana; Pompe-Novak, Maruša; Gruden, Kristina
2017-05-25
Quantitative molecular biology remains a challenge for researchers due to inconsistent approaches for control of errors in the final results. Due to several factors that can influence the final result, quantitative analysis and interpretation of qPCR data are still not trivial. Together with the development of high-throughput qPCR platforms, there is a need for a tool allowing for robust, reliable and fast nucleic acid quantification. We have developed "quantGenius" ( http://quantgenius.nib.si ), an open-access web application for a reliable qPCR-based quantification of nucleic acids. The quantGenius workflow interactively guides the user through data import, quality control (QC) and calculation steps. The input is machine- and chemistry-independent. Quantification is performed using the standard curve approach, with normalization to one or several reference genes. The special feature of the application is the implementation of user-guided QC-based decision support system, based on qPCR standards, that takes into account pipetting errors, assay amplification efficiencies, limits of detection and quantification of the assays as well as the control of PCR inhibition in individual samples. The intermediate calculations and final results are exportable in a data matrix suitable for further statistical analysis or visualization. We additionally compare the most important features of quantGenius with similar advanced software tools and illustrate the importance of proper QC system in the analysis of qPCR data in two use cases. To our knowledge, quantGenius is the only qPCR data analysis tool that integrates QC-based decision support and will help scientists to obtain reliable results which are the basis for biologically meaningful data interpretation.
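The standard-curve quantification with reference-gene normalization described here can be summarized in a few lines: fit Cq against log10 quantity for the standards, back-calculate sample quantities, and divide target by reference. The sketch below is a stripped-down illustration with invented Cq values and none of the quality-control or decision-support logic that the web application adds.

```python
import numpy as np

def fit_standard_curve(log10_qty, cq):
    """Linear standard curve: Cq = slope * log10(quantity) + intercept."""
    slope, intercept = np.polyfit(log10_qty, cq, 1)
    efficiency = 10 ** (-1.0 / slope) - 1.0          # ~1.0 corresponds to 100% efficiency
    return slope, intercept, efficiency

def quantity_from_cq(cq, slope, intercept):
    return 10 ** ((np.asarray(cq) - intercept) / slope)

# Invented dilution series (copies) and measured Cq values for two assays
std_qty = np.log10([1e6, 1e5, 1e4, 1e3, 1e2])
slope_t, int_t, eff_t = fit_standard_curve(std_qty, [17.1, 20.5, 23.9, 27.2, 30.6])
slope_r, int_r, eff_r = fit_standard_curve(std_qty, [15.0, 18.4, 21.7, 25.1, 28.4])

target_qty = quantity_from_cq(24.8, slope_t, int_t)      # gene of interest in a sample
ref_qty = quantity_from_cq(19.9, slope_r, int_r)         # reference gene in the same sample
print(f"efficiencies: {eff_t:.2f}, {eff_r:.2f}; normalized expression = {target_qty / ref_qty:.3f}")
```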
Federal Register 2010, 2011, 2012, 2013, 2014
2013-05-28
... Market and Planning Efficiency Through Improved Software; Supplemental Agenda Notice Take notice that... for increasing real-time and day-ahead market efficiency through improved software. A detailed agenda..., the software industry, government, research centers and academia and is intended to build on the...
The Improvement Cycle: Analyzing Our Experience
NASA Technical Reports Server (NTRS)
Pajerski, Rose; Waligora, Sharon
1996-01-01
NASA's Software Engineering Laboratory (SEL), one of the earliest pioneers in the areas of software process improvement and measurement, has had a significant impact on the software business at NASA Goddard. At the heart of the SEL's improvement program is a belief that software products can be improved by optimizing the software engineering process used to develop them and a long-term improvement strategy that facilitates small incremental improvements that accumulate into significant gains. As a result of its efforts, the SEL has incrementally reduced development costs by 60%, decreased error rates by 85%, and reduced cycle time by 25%. In this paper, we analyze the SEL's experiences on three major improvement initiatives to better understand the cyclic nature of the improvement process and to understand why some improvements take much longer than others.
Chew, Avenell L.; Lamey, Tina; McLaren, Terri; De Roach, John
2016-01-01
Purpose To present en face optical coherence tomography (OCT) images generated by graph-search theory algorithm-based custom software and examine correlation with other imaging modalities. Methods En face OCT images derived from high density OCT volumetric scans of 3 healthy subjects and 4 patients using a custom algorithm (graph-search theory) and commercial software (Heidelberg Eye Explorer software (Heidelberg Engineering)) were compared and correlated with near infrared reflectance, fundus autofluorescence, adaptive optics flood-illumination ophthalmoscopy (AO-FIO) and microperimetry. Results Commercial software was unable to generate accurate en face OCT images in eyes with retinal pigment epithelium (RPE) pathology due to segmentation error at the level of Bruch’s membrane (BM). Accurate segmentation of the basal RPE and BM was achieved using custom software. The en face OCT images from eyes with isolated interdigitation or ellipsoid zone pathology were of similar quality between custom software and Heidelberg Eye Explorer software in the absence of any other significant outer retinal pathology. En face OCT images demonstrated angioid streaks, lesions of acute macular neuroretinopathy, hydroxychloroquine toxicity and Bietti crystalline deposits that correlated with other imaging modalities. Conclusions Graph-search theory algorithm helps to overcome the limitations of outer retinal segmentation inaccuracies in commercial software. En face OCT images can provide detailed topography of the reflectivity within a specific layer of the retina which correlates with other forms of fundus imaging. Our results highlight the need for standardization of image reflectivity to facilitate quantification of en face OCT images and longitudinal analysis. PMID:27959968
Dahab, Gamal M; Kheriza, Mohamed M; El-Beltagi, Hussien M; Fouda, Abdel-Motaal M; El-Din, Osama A Sharaf
2004-01-01
The precise quantification of fibrous tissue in liver biopsy sections is extremely important in the classification, diagnosis and grading of chronic liver disease, as well as in evaluating the response to antifibrotic therapy. Because the recently described methods of digital image analysis of fibrosis in liver biopsy sections have major flaws, including the use of outdated image processing techniques, inadequate precision and an inability to detect and quantify perisinusoidal fibrosis, we developed a new technique for computerized image analysis of liver biopsy sections based on Adobe Photoshop software. We prepared an experimental model of liver fibrosis involving treatment of rats with oral CCl4 for 6 weeks. After staining liver sections with Masson's trichrome, a series of computer operations was performed, including (i) reconstitution of seamless widefield images from a number of acquired fields of liver sections; (ii) image size and resolution adjustment; (iii) color correction; (iv) digital selection of a specified color range representing all fibrous tissue in the image; and (v) extraction and calculation. This technique is fully computerized, with no manual interference at any step, and thus could be very reliable for objectively quantifying any pattern of fibrosis in liver biopsy sections and for assessing the response to antifibrotic therapy. It could also be a valuable tool for the precise assessment of antifibrotic therapy in other tissues, regardless of the pattern of tissue or fibrosis.
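The "digital selection of a specified color range" step can be approximated outside Photoshop with a simple per-pixel color rule; in Masson's trichrome, collagen stains blue relative to the red cytoplasm. The sketch below is a generic, hedged illustration on a synthetic RGB image; the color margins are arbitrary and would need tuning on real slides.

```python
import numpy as np

def fibrosis_area_fraction(rgb, blue_margin=30, tissue_threshold=220):
    """Fraction of tissue pixels classified as blue-stained (collagen-like).

    rgb: uint8 image of shape (h, w, 3). Pixels brighter than `tissue_threshold`
    in all channels are treated as background (no tissue).
    """
    r, g, b = (rgb[..., i].astype(int) for i in range(3))
    tissue = ~((r > tissue_threshold) & (g > tissue_threshold) & (b > tissue_threshold))
    blue = (b > r + blue_margin) & (b > g + blue_margin) & tissue
    return blue.sum() / max(tissue.sum(), 1)

# Synthetic slide: white background, a red "parenchyma" block, a blue "fibrosis" band
img = np.full((100, 100, 3), 255, dtype=np.uint8)
img[20:80, 20:80] = (200, 80, 90)       # red-ish tissue
img[45:55, 20:80] = (90, 90, 200)       # blue-ish fibrous band
print(f"fibrosis area fraction = {fibrosis_area_fraction(img):.2f}")
```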
Swiderska, Zaneta; Korzynska, Anna; Markiewicz, Tomasz; Lorent, Malgorzata; Zak, Jakub; Wesolowska, Anna; Roszkowiak, Lukasz; Slodkowska, Janina; Grala, Bartlomiej
2015-01-01
Background. This paper presents the study concerning hot-spot selection in the assessment of whole slide images of tissue sections collected from meningioma patients. The samples were immunohistochemically stained to determine the Ki-67/MIB-1 proliferation index used for prognosis and treatment planning. Objective. The observer performance was examined by comparing results of the proposed method of automatic hot-spot selection in whole slide images, results of traditional scoring under a microscope, and results of a pathologist's manual hot-spot selection. Methods. The results of scoring the Ki-67 index using optical scoring under a microscope, software for Ki-67 index quantification based on hot spots selected by two pathologists (resp., once and three times), and the same software but on hot spots selected by proposed automatic methods were compared using Kendall's tau-b statistics. Results. Results show intra- and interobserver agreement. The agreement between Ki-67 scoring with manual and automatic hot-spot selection is high, while agreement between Ki-67 index scoring results in whole slide images and traditional microscopic examination is lower. Conclusions. The agreement observed for the three scoring methods shows that automation of area selection is an effective tool in supporting physicians and in increasing the reliability of Ki-67 scoring in meningioma. PMID:26240787
Comprehensive Design Reliability Activities for Aerospace Propulsion Systems
NASA Technical Reports Server (NTRS)
Christenson, R. L.; Whitley, M. R.; Knight, K. C.
2000-01-01
This technical publication describes the methodology, model, software tool, input data, and analysis results that support aerospace design reliability studies. The focus of these activities is on the mechanical design reliability of propulsion systems. The goal is to support design from a reliability perspective. Paralleling performance analyses in schedule and method, this requires the proper use of metrics in a validated reliability model useful for design, sensitivity, and trade studies. Design reliability analysis in this view is one of several critical design functions. A design reliability method is detailed and two example analyses are provided: one qualitative and the other quantitative. The use of aerospace and commercial data sources for quantification is discussed and sources are listed. A tool that was developed to support both types of analyses is presented. Finally, special topics discussed include the development of design criteria, issues of reliability quantification, quality control, and reliability verification.
Practical considerations of image analysis and quantification of signal transduction IHC staining.
Grunkin, Michael; Raundahl, Jakob; Foged, Niels T
2011-01-01
The dramatic increase in computer processing power in combination with the availability of high-quality digital cameras during the last 10 years has fertilized the grounds for quantitative microscopy based on digital image analysis. With the present introduction of robust scanners for whole slide imaging in both research and routine, the benefits of automation and objectivity in the analysis of tissue sections will be even more obvious. For in situ studies of signal transduction, the combination of tissue microarrays, immunohistochemistry, digital imaging, and quantitative image analysis will be central operations. However, immunohistochemistry is a multistep procedure including a lot of technical pitfalls leading to intra- and interlaboratory variability of its outcome. The resulting variations in staining intensity and disruption of original morphology are an extra challenge for the image analysis software, which therefore preferably should be dedicated to the detection and quantification of histomorphometrical end points.
MaxReport: An Enhanced Proteomic Result Reporting Tool for MaxQuant.
Zhou, Tao; Li, Chuyu; Zhao, Wene; Wang, Xinru; Wang, Fuqiang; Sha, Jiahao
2016-01-01
MaxQuant is a proteomics software package widely used for processing large-scale tandem mass spectrometry data. We have designed and developed an enhanced result reporting tool for MaxQuant, named MaxReport. This tool can optimize the results of MaxQuant and provide additional functions for result interpretation. MaxReport can generate report tables for protein N-terminal modifications. It also supports isobaric labelling-based relative quantification at the protein, peptide or site level. To obtain an overview of the results, MaxReport performs general descriptive statistical analyses for both identification and quantification results. The output results of MaxReport are well organized and therefore helpful for proteomic users to better understand and share their data. The script of MaxReport, which is freely available at http://websdoor.net/bioinfo/maxreport/, is developed using Python code and is compatible across multiple systems including Windows and Linux.
Knudsen, A.C.; Gunter, M.E.; Herring, J.R.; Grauch, R.I.
2002-01-01
The Permian Phosphoria Formation of southeastern Idaho hosts one of the largest phosphate deposits in the world. Despite the economic significance of this Formation, the fine-grained nature of the phosphorite has discouraged detailed mineralogical characterization and quantification studies. Recently, selenium and other potentially toxic trace elements in mine wastes have drawn increased attention to this formation, and motivated additional study. This study uses powder X-ray diffraction (XRD), with Rietveld quantification software, to quantify and characterize the mineralogy of composite channel samples and individual samples collected from the stratigraphic sections measured by the U.S. Geological Survey in the Meade Peak Member of the Permian Phosphoria Formation at the Enoch Valley mine on Rasmussen Ridge, approximately 15 miles northeast of Soda Springs, Idaho.
Fritzsche, Marco; Fernandes, Ricardo A; Colin-York, Huw; Santos, Ana M; Lee, Steven F; Lagerholm, B Christoffer; Davis, Simon J; Eggeling, Christian
2015-11-13
Detecting intracellular calcium signaling with fluorescent calcium indicator dyes is often coupled with microscopy techniques to follow the activation state of non-excitable cells, including lymphocytes. However, the analysis of global intracellular calcium responses both at the single-cell level and in large ensembles simultaneously has yet to be automated. Here, we present a new software package, CalQuo (Calcium Quantification), which allows the automated analysis and simultaneous monitoring of global fluorescent calcium reporter-based signaling responses in up to 1000 single cells per experiment, at temporal resolutions of sub-seconds to seconds. CalQuo quantifies the number and fraction of responding cells, the temporal dependence of calcium signaling and provides global and individual calcium-reporter fluorescence intensity profiles. We demonstrate the utility of the new method by comparing the calcium-based signaling responses of genetically manipulated human lymphocytic cell lines.
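As a hedged illustration of the per-cell response calling that CalQuo automates (this is not CalQuo's code), the sketch below thresholds each cell's baseline-normalised calcium-reporter trace to count responding cells; the baseline window and threshold are assumptions.

```python
# Sketch of per-cell calcium response calling. Each trace is a reporter
# intensity time series for one cell; a cell is "responding" if its
# baseline-normalised signal exceeds an assumed threshold at any time point.
import numpy as np

def responding_fraction(traces: np.ndarray, baseline_frames: int = 20,
                        threshold: float = 1.5) -> float:
    """traces: array of shape (n_cells, n_frames)."""
    baseline = traces[:, :baseline_frames].mean(axis=1, keepdims=True)
    normalised = traces / baseline                      # F / F0 per cell
    responders = normalised.max(axis=1) >= threshold    # peak above threshold
    return float(responders.mean())

# Example with synthetic data: 40% of cells "respond" after stimulation.
rng = np.random.default_rng(0)
fake = rng.normal(100, 3, size=(1000, 300))
fake[:400, 150:] += 120
print(f"responding fraction = {responding_fraction(fake):.2f}")
```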
NASA Technical Reports Server (NTRS)
Colwell, R. N. (Principal Investigator); Wall, S. L.; Beck, L. H.; Degloria, S. D.; Ritter, P. R.; Thomas, R. W.; Travlos, A. J.; Fakhoury, E.
1984-01-01
Materials and methods used to characterize selected soil properties and agricultural crops in San Joaquin County, California are described. Results show that: (1) the location and widths of TM bands are suitable for detecting differences in selected soil properties; (2) the number of TM spectral bands allows the quantification of soil spectral curve form and magnitude; and (3) the spatial and geometric quality of TM data allows for the discrimination and quantification of within field variability of soil properties. The design of the LANDSAT based multiple crop acreage estimation experiment for the Idaho Department of Water Resources is described including the use of U.C. Berkeley's Survey Modeling Planning Model. Progress made on Peditor software development on MIDAS, and cooperative computing using local and remote systems is reported as well as development of MIDAS microcomputer systems.
NASA Technical Reports Server (NTRS)
Basili, Victor R.
1992-01-01
The concepts of quality improvements have permeated many businesses. It is clear that the nineties will be the quality era for software and there is a growing need to develop or adapt quality improvement approaches to the software business. Thus we must understand software as an artifact and software as a business. Since the business we are dealing with is software, we must understand the nature of software and software development. The software discipline is evolutionary and experimental; it is a laboratory science. Software is development, not production. The technologies of the discipline are human based. There is a lack of models that allow us to reason about the process and the product. All software is not the same; process is a variable, goals are variable, etc. Packaged, reusable experiences require additional resources in the form of organization, processes, people, etc. There have been a variety of organizational frameworks proposed to improve quality for various businesses. The ones discussed in this presentation include: Plan-Do-Check-Act, a quality improvement process based upon a feedback cycle for optimizing a single process model/production line; the Experience Factory/Quality Improvement Paradigm, continuous improvements through the experimentation, packaging, and reuse of experiences based upon a business's needs; Total Quality Management, a management approach to long term success through customer satisfaction based on the participation of all members of an organization; the SEI capability maturity model, a staged process improvement based upon assessment with regard to a set of key process areas until level 5 is reached, which represents continuous process improvement; and Lean (software) Development, a principle supporting the concentration of production on 'value-added' activities and the elimination or reduction of 'non-value-added' activities.
Mapping modern software process engineering techniques onto an HEP development environment
NASA Astrophysics Data System (ADS)
Wellisch, J. P.
2003-04-01
One of the most challenging issues faced in HEP in recent years is the question of how to capitalise on software development and maintenance experience in a continuous manner. To capitalise means in our context to evaluate and apply new process technologies as they arise, and to further evolve technologies already widely in use. It also implies the definition and adoption of standards. The CMS off-line software improvement effort aims at continual software quality improvement, and continual improvement in the efficiency of the working environment with the goal to facilitate doing great new physics. To achieve this, we followed a process improvement program based on ISO-15504 and the Rational Unified Process. This experiment in software process improvement in HEP has been progressing now for a period of 3 years. Taking previous experience from ATLAS and SPIDER into account, we used a soft approach of continuous change within the limits of current culture to create de facto software process standards within the CMS off-line community as the only viable route to a successful software process improvement program in HEP. We will present the CMS approach to software process improvement in this process R&D, describe lessons learned and mistakes made. We will demonstrate the benefits gained, and the current status of the software processes established in CMS off-line software.
Model-based software process improvement
NASA Technical Reports Server (NTRS)
Zettervall, Brenda T.
1994-01-01
The activities of a field test site for the Software Engineering Institute's software process definition project are discussed. Products tested included the improvement model itself, descriptive modeling techniques, the CMM level 2 framework document, and the use of process definition guidelines and templates. The software process improvement model represents a five stage cyclic approach for organizational process improvement. The cycles consist of the initiating, diagnosing, establishing, acting, and leveraging phases.
A software tool for automatic classification and segmentation of 2D/3D medical images
NASA Astrophysics Data System (ADS)
Strzelecki, Michal; Szczypinski, Piotr; Materka, Andrzej; Klepaczko, Artur
2013-02-01
Modern medical diagnosis utilizes techniques of visualization of human internal organs (CT, MRI) or of its metabolism (PET). However, evaluation of acquired images made by human experts is usually subjective and qualitative only. Quantitative analysis of MR data, including tissue classification and segmentation, is necessary to perform e.g. attenuation compensation, motion detection, and correction of partial volume effect in PET images, acquired with PET/MR scanners. This article presents briefly a MaZda software package, which supports 2D and 3D medical image analysis aiming at quantification of image texture. MaZda implements procedures for evaluation, selection and extraction of highly discriminative texture attributes combined with various classification, visualization and segmentation tools. Examples of MaZda application in medical studies are also provided.
Orczyk, C; Rusinek, H; Rosenkrantz, A B; Mikheev, A; Deng, F-M; Melamed, J; Taneja, S S
2013-12-01
To assess a novel method of three-dimensional (3D) co-registration of prostate cancer digital histology and in-vivo multiparametric magnetic resonance imaging (mpMRI) image sets for clinical usefulness. A software platform was developed to achieve 3D co-registration. This software was prospectively applied to three patients who underwent radical prostatectomy. Data comprised in-vivo mpMRI [T2-weighted, dynamic contrast-enhanced weighted images (DCE); apparent diffusion coefficient (ADC)], ex-vivo T2-weighted imaging, 3D-rebuilt pathological specimen, and digital histology. Internal landmarks from zonal anatomy served as reference points for assessing co-registration accuracy and precision. Applying a method of deformable transformation based on 22 internal landmarks, a 1.6 mm accuracy was reached to align T2-weighted images and the 3D-rebuilt pathological specimen, an improvement of 32% over rigid transformation (p = 0.003). The 22 zonal anatomy landmarks were more accurately mapped using deformable transformation than rigid transformation (p = 0.0008). An automatic method based on mutual information enabled automation of the process and the inclusion of perfusion and diffusion MRI images. Evaluation of co-registration accuracy using the volume overlap index (Dice index) met clinically relevant requirements, ranging from 0.81 to 0.96 for the sequences tested. Ex-vivo images of the specimen did not significantly improve co-registration accuracy. This preliminary analysis suggests that deformable transformation based on zonal anatomy landmarks is accurate in the co-registration of mpMRI and histology. Including diffusion and perfusion sequences in the same 3D space as histology provides essential further clinical information. The ability to localize cancer in 3D space may improve targeting for image-guided biopsy, focal therapy, and disease quantification in surveillance protocols. Copyright © 2013 The Royal College of Radiologists. Published by Elsevier Ltd. All rights reserved.
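The Dice (volume overlap) index cited above has a simple definition. Here is a hedged sketch of how it can be computed for two binary segmentation volumes; this is illustrative only, not the authors' platform.

```python
# Dice index between two 3D binary masks of the same shape, e.g. a prostate
# zone on MRI and the same zone on the co-registered histology volume.
import numpy as np

def dice_index(mask_a: np.ndarray, mask_b: np.ndarray) -> float:
    a = mask_a.astype(bool)
    b = mask_b.astype(bool)
    intersection = np.logical_and(a, b).sum()
    return 2.0 * intersection / (a.sum() + b.sum())

# Toy example: a sphere and a slightly shifted copy overlap strongly,
# giving a Dice value in the range reported by the abstract.
zz, yy, xx = np.mgrid[:64, :64, :64]
sphere_a = (zz - 32)**2 + (yy - 32)**2 + (xx - 32)**2 < 20**2
sphere_b = (zz - 32)**2 + (yy - 32)**2 + (xx - 36)**2 < 20**2
print(f"Dice = {dice_index(sphere_a, sphere_b):.2f}")
```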
Krishnaiah, Yellela S R; Katragadda, Usha; Khan, Mansoor A
2014-05-01
Cold flow is a phenomenon occurring in drug-in-adhesive type of transdermal drug delivery systems (DIA-TDDS) because of the migration of DIA coat beyond the edge. Excessive cold flow can affect their therapeutic effectiveness, make removal of DIA-TDDS difficult from the pouch, and potentially decrease available dose if any drug remains adhered to pouch. There are no compendial or noncompendial methods available for quantification of this critical quality attribute. The objective was to develop a method for quantification of cold flow using stereomicroscopic imaging technique. Cold flow was induced by applying 1 kg force on punched-out samples of marketed estradiol DIA-TDDS (model product) stored at 25°C, 32°C, and 40°C/60% relative humidity (RH) for 1, 2, or 3 days. At the end of testing period, dimensional change in the area of DIA-TDDS samples was measured using image analysis software, and expressed as percent of cold flow. The percent of cold flow significantly decreased (p < 0.001) with increase in size of punched-out DIA-TDDS samples and increased (p < 0.001) with increase in cold flow induction temperature and time. This first ever report suggests that dimensional change in the area of punched-out samples stored at 32°C/60%RH for 2 days applied with 1 kg force could be used for quantification of cold flow in DIA-TDDS. © 2014 Wiley Periodicals, Inc. and the American Pharmacists Association.
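As a hedged illustration of the dimensional-change quantification described above (the function and example values are assumptions, not data from the study), percent cold flow can be expressed as the relative increase in punched-out sample area measured by image analysis:

```python
# Percent cold flow from the sample area before and after the induction
# period, as measured by image analysis. Names and numbers are illustrative.
def percent_cold_flow(area_initial_mm2: float, area_after_mm2: float) -> float:
    return 100.0 * (area_after_mm2 - area_initial_mm2) / area_initial_mm2

# Example: a 20 mm diameter punch (314.2 mm^2) spreading to 330 mm^2
print(f"{percent_cold_flow(314.2, 330.0):.1f}% cold flow")  # about 5.0%
```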
Radio-frequency energy quantification in magnetic resonance imaging
NASA Astrophysics Data System (ADS)
Alon, Leeor
Mapping of radio frequency (RF) energy deposition has been challenging for 50+ years, especially when scanning patients in the magnetic resonance imaging (MRI) environment. As a result, electromagnetic simulation software is often used for estimating the specific absorption rate (SAR), the rate of RF energy deposition in tissue. The thesis work presents challenges associated with aligning information provided by electromagnetic simulation and MRI experiments. As a result of the limitations of simulations, experimental methods for the quantification of SAR were established. A system for quantification of the total RF energy deposition was developed for parallel transmit MRI (a system that uses multiple antennas to excite and image the body). The system is capable of monitoring and predicting channel-by-channel RF energy deposition and whole-body SAR, and of tracking potential hardware failures that occur in the transmit chain and may cause the deposition of excessive energy into patients. Similarly, we demonstrated that local RF power deposition can be mapped and predicted for parallel transmit systems based on a series of MRI temperature mapping acquisitions. Resulting from this work, we developed tools for optimal reconstruction of temperature maps from MRI acquisitions. The tools developed for temperature mapping paved the way for utilizing MRI as a diagnostic tool for the evaluation of RF/microwave emitting device safety. Quantification of the RF energy was demonstrated for both MRI-compatible and non-MRI-compatible devices (such as cell phones), while having the advantage of being noninvasive, of providing millimeter resolution and high accuracy.
Reiman, Mario; Laan, Maris; Rull, Kristiina; Sõber, Siim
2017-08-01
RNA degradation is a ubiquitous process that occurs in living and dead cells, as well as during handling and storage of extracted RNA. Reduced RNA quality caused by degradation is an established source of uncertainty for all RNA-based gene expression quantification techniques. RNA sequencing is an increasingly preferred method for transcriptome analyses, and dependence of its results on input RNA integrity is of significant practical importance. This study aimed to characterize the effects of varying input RNA integrity [estimated as RNA integrity number (RIN)] on transcript level estimates and delineate the characteristic differences between transcripts that differ in degradation rate. The study used ribodepleted total RNA sequencing data from a real-life clinically collected set ( n = 32) of human solid tissue (placenta) samples. RIN-dependent alterations in gene expression profiles were quantified by using DESeq2 software. Our results indicate that small differences in RNA integrity affect gene expression quantification by introducing a moderate and pervasive bias in expression level estimates that significantly affected 8.1% of studied genes. The rapidly degrading transcript pool was enriched in pseudogenes, short noncoding RNAs, and transcripts with extended 3' untranslated regions. Typical slowly degrading transcripts (median length, 2389 nt) represented protein coding genes with 4-10 exons and high guanine-cytosine content.-Reiman, M., Laan, M., Rull, K., Sõber, S. Effects of RNA integrity on transcript quantification by total RNA sequencing of clinically collected human placental samples. © FASEB.
Federal Register 2010, 2011, 2012, 2013, 2014
2010-08-03
... Market and Planning Efficiency Through Improved Software; Notice Establishing Date for Comments July 27... software related to wholesale electricity markets and planning: Notice of Technical Conference to Discuss Increasing Market and Planning Efficiency Through Improved Software, 75 FR 27,341 (2010). June 2-3...
Al Feteisi, Hajar; Achour, Brahim; Rostami-Hodjegan, Amin; Barber, Jill
2015-01-01
Drug-metabolizing enzymes and transporters play an important role in drug absorption, distribution, metabolism and excretion and, consequently, they influence drug efficacy and toxicity. Quantification of drug-metabolizing enzymes and transporters in various tissues is therefore essential for comprehensive elucidation of drug absorption, distribution, metabolism and excretion. Recent advances in liquid chromatography coupled with tandem mass spectrometry (LC-MS/MS) have improved the quantification of pharmacologically relevant proteins. This report presents an overview of mass spectrometry-based methods currently used for the quantification of drug-metabolizing enzymes and drug transporters, mainly focusing on applications and cost associated with various quantitative strategies based on stable isotope-labeled standards (absolute quantification peptide standards, quantification concatemers, protein standards for absolute quantification) and label-free analysis. In mass spectrometry, there is no simple relationship between signal intensity and analyte concentration. Proteomic strategies are therefore complex and several factors need to be considered when selecting the most appropriate method for an intended application, including the number of proteins and samples. Quantitative strategies require appropriate mass spectrometry platforms, yet choice is often limited by the availability of appropriate instrumentation. Quantitative proteomics research requires specialist practical skills and there is a pressing need to dedicate more effort and investment to training personnel in this area. Large-scale multicenter collaborations are also needed to standardize quantitative strategies in order to improve physiologically based pharmacokinetic models.
Software Design Improvements. Part 2; Software Quality and the Design and Inspection Process
NASA Technical Reports Server (NTRS)
Lalli, Vincent R.; Packard, Michael H.; Ziemianski, Tom
1997-01-01
The application of assurance engineering techniques improves the duration of failure-free performance of software. The totality of features and characteristics of a software product are what determine its ability to satisfy customer needs. Software in safety-critical systems is very important to NASA. We follow the System Safety Working Group's definition of system safety software: 'the optimization of system safety in the design, development, use and maintenance of software and its integration with safety-critical systems in an operational environment.' 'If it is not safe, say so' has become our motto. This paper goes over methods that have been used by NASA to make software design improvements by focusing on software quality and the design and inspection process.
Existing Fortran interfaces to Trilinos in preparation for exascale ForTrilinos development
DOE Office of Scientific and Technical Information (OSTI.GOV)
Evans, Katherine J.; Young, Mitchell T.; Collins, Benjamin S.
This report summarizes the current state of Fortran interfaces to the Trilinos library within several key applications of the Exascale Computing Program (ECP), with the aim of informing developers about strategies to develop ForTrilinos, an exascale-ready, Fortran interface software package within Trilinos. The two software projects assessed within are the DOE Office of Science's Accelerated Climate Model for Energy (ACME) atmosphere component, CAM, and the DOE Office of Nuclear Energy's core-simulator portion of VERA, a nuclear reactor simulation code. Trilinos is an object-oriented, C++ based software project, and spans a collection of algorithms and other enabling technologies such as uncertainty quantification and mesh generation. To date, Trilinos has enabled these codes to achieve large-scale simulation results, however the simulation needs of CAM and VERA-CS will approach exascale over the next five years. A Fortran interface to Trilinos that enables efficient use of programming models and more advanced algorithms is necessary. Where appropriate, the needs of the CAM and VERA-CS software to achieve their simulation goals are called out specifically. With this report, a design document and execution plan for ForTrilinos development can proceed.
Data processing has major impact on the outcome of quantitative label-free LC-MS analysis.
Chawade, Aakash; Sandin, Marianne; Teleman, Johan; Malmström, Johan; Levander, Fredrik
2015-02-06
High-throughput multiplexed protein quantification using mass spectrometry is steadily increasing in popularity, with the two major techniques being data-dependent acquisition (DDA) and targeted acquisition using selected reaction monitoring (SRM). However, both techniques involve extensive data processing, which can be performed by a multitude of different software solutions. Analysis of quantitative LC-MS/MS data is mainly performed in three major steps: processing of raw data, normalization, and statistical analysis. To evaluate the impact of data processing steps, we developed two new benchmark data sets, one each for DDA and SRM, with samples consisting of a long-range dilution series of synthetic peptides spiked in a total cell protein digest. The generated data were processed by eight different software workflows and three postprocessing steps. The results show that the choice of the raw data processing software and the postprocessing steps play an important role in the final outcome. Also, the linear dynamic range of the DDA data could be extended by an order of magnitude through feature alignment and a charge state merging algorithm proposed here. Furthermore, the benchmark data sets are made publicly available for further benchmarking and software developments.
Kimura, Sumito; Streiff, Cole; Zhu, Meihua; Shimada, Eriko; Datta, Saurabh; Ashraf, Muhammad; Sahn, David J
2014-02-01
The aim of this study was to assess the accuracy, feasibility, and reproducibility of determining stroke volume from a novel 3-dimensional (3D) color Doppler flow quantification method for mitral valve (MV) inflow and left ventricular outflow tract (LVOT) outflow at different stroke volumes when compared with the actual flow rate in a pumped porcine cardiac model. Thirteen freshly harvested pig hearts were studied in a water tank. We inserted a latex balloon into each left ventricle from the MV annulus to the LVOT, which were passively pumped at different stroke volumes (30-80 mL) using a calibrated piston pump at increments of 10 mL. Four-dimensional flow volumes were obtained without electrocardiographic gating. The digital imaging data were analyzed offline using prototype software. Two hemispheric flow-sampling planes for color Doppler velocity measurements were placed at the MV annulus and LVOT. The software computed the flow volumes at the MV annulus and LVOT within the user-defined volume and cardiac cycle. This novel 3D Doppler flow quantification method detected incremental increases in MV inflow and LVOT outflow in close agreement with pumped stroke volumes (MV inflow, r = 0.96; LVOT outflow, r = 0.96; P < .01). Bland-Altman analysis demonstrated overestimation of both (MV inflow, 5.42 mL; LVOT outflow, 4.46 mL) with 95% of points within 95% limits of agreement. Interobserver variability values showed good agreement for all stroke volumes at both the MV annulus and LVOT. This study has shown that the 3D color Doppler flow quantification method we used is able to compute stroke volumes accurately at the MV annulus and LVOT in the same cardiac cycle without electrocardiographic gating. This method may be valuable for assessment of cardiac output in clinical studies.
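The Bland-Altman comparison referred to above reduces to a bias and 95% limits of agreement between two measurements of the same quantity. Below is a hedged sketch of that calculation; the stroke volume values are placeholders, not study data.

```python
# Bland-Altman bias and 95% limits of agreement between two measurement
# methods, e.g. 3D colour Doppler stroke volume vs the pumped reference.
import numpy as np

def bland_altman(method_a, method_b):
    a, b = np.asarray(method_a, float), np.asarray(method_b, float)
    diff = a - b
    bias = diff.mean()
    sd = diff.std(ddof=1)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

pump_sv   = [30, 40, 50, 60, 70, 80]   # mL, reference
mv_inflow = [36, 44, 55, 66, 75, 86]   # mL, Doppler estimate (made-up values)
bias, loa = bland_altman(mv_inflow, pump_sv)
print(f"bias = {bias:.1f} mL, 95% limits of agreement = {loa[0]:.1f} to {loa[1]:.1f} mL")
```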
Lowering the quantification limit of the Qubit™ RNA HS assay using RNA spike-in.
Li, Xin; Ben-Dov, Iddo Z; Mauro, Maurizio; Williams, Zev
2015-05-06
RNA quantification is often a prerequisite for most RNA analyses such as RNA sequencing. However, the relatively low sensitivity and large sample consumption of traditional RNA quantification methods such as UV spectrophotometry and even the much more sensitive fluorescence-based RNA quantification assays, such as the Qubit™ RNA HS Assay, are often inadequate for measuring minute levels of RNA isolated from limited cell and tissue samples and biofluids. Thus, there is a pressing need for a more sensitive method to reliably and robustly detect trace levels of RNA without interference from DNA. To improve the quantification limit of the Qubit™ RNA HS Assay, we spiked-in a known quantity of RNA to achieve the minimum reading required by the assay. Samples containing trace amounts of RNA were then added to the spike-in and measured as a reading increase over RNA spike-in baseline. We determined the accuracy and precision of reading increases between 1 and 20 pg/μL as well as RNA-specificity in this range, and compared to those of RiboGreen(®), another sensitive fluorescence-based RNA quantification assay. We then applied Qubit™ Assay with RNA spike-in to quantify plasma RNA samples. RNA spike-in improved the quantification limit of the Qubit™ RNA HS Assay 5-fold, from 25 pg/μL down to 5 pg/μL while maintaining high specificity to RNA. This enabled quantification of RNA with original concentration as low as 55.6 pg/μL compared to 250 pg/μL for the standard assay and decreased sample consumption from 5 to 1 ng. Plasma RNA samples that were not measurable by the Qubit™ RNA HS Assay were measurable by our modified method. The Qubit™ RNA HS Assay with RNA spike-in is able to quantify RNA with high specificity at 5-fold lower concentration and uses 5-fold less sample quantity than the standard Qubit™ Assay.
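A hedged sketch of the spike-in arithmetic described above: the sample contribution is the reading increase over the RNA spike-in baseline, scaled back to the original sample by the assay dilution. The volumes and readings below are illustrative assumptions, not calibration data from the paper.

```python
# Sample concentration recovered as the reading increase over the spike-in
# baseline, scaled by the assay dilution. All numbers are illustrative.
def original_concentration_pg_ul(reading_pg_ul: float, spike_baseline_pg_ul: float,
                                 sample_volume_ul: float, assay_volume_ul: float) -> float:
    increase_in_tube = reading_pg_ul - spike_baseline_pg_ul
    return increase_in_tube * assay_volume_ul / sample_volume_ul

# e.g. a 5.6 pg/uL increase over baseline with 20 uL of sample in a 200 uL assay
print(original_concentration_pg_ul(30.6, 25.0, 20.0, 200.0))   # 56 pg/uL
```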
Federal Register 2010, 2011, 2012, 2013, 2014
2010-05-28
... Market and Planning Efficiency Through Improved Software; Notice of Agenda and Procedures for Staff... conference to be held on June 2, 2010 and June 3, 2010, to discuss issues related to unit commitment software... Unit Commitment Software Federal Energy Regulatory Commission June 2, 2010 8 a.m Richard O'Neill, FERC...
Federal Register 2010, 2011, 2012, 2013, 2014
2010-06-01
... Market and Planning Efficiency Through Improved Software; Notice of Agenda and Procedures for Staff... planning models and software. The technical conference will be held from 8 a.m. to 5:30 p.m. (EDT) on June.... Agenda for AD10-12 Staff Technical Conference on Planning Models and Software Federal Energy Regulatory...
Using uncertainty quantification, we aim to improve the quality of modeling data from high throughput screening assays for use in risk assessment. ToxCast is a large-scale screening program that analyzes thousands of chemicals using over 800 assays representing hundreds of bioche...
Organizational management practices for achieving software process improvement
NASA Technical Reports Server (NTRS)
Kandt, Ronald Kirk
2004-01-01
The crisis in developing software has been known for over thirty years. Problems that existed in developing software in the early days of computing still exist today. These problems include the delivery of low-quality products, actual development costs that exceed expected development costs, and actual development time that exceeds expected development time. Several solutions have been offered to overcome our inability to deliver high-quality software, on-time and within budget. One of these solutions involves software process improvement. However, such efforts often fail because of organizational management issues. This paper discusses business practices that organizations should follow to improve their chances of initiating and sustaining successful software process improvement efforts.
Crossing the chasm: how to develop weather and climate models for next generation computers?
NASA Astrophysics Data System (ADS)
Lawrence, Bryan N.; Rezny, Michael; Budich, Reinhard; Bauer, Peter; Behrens, Jörg; Carter, Mick; Deconinck, Willem; Ford, Rupert; Maynard, Christopher; Mullerworth, Steven; Osuna, Carlos; Porter, Andrew; Serradell, Kim; Valcke, Sophie; Wedi, Nils; Wilson, Simon
2018-05-01
Weather and climate models are complex pieces of software which include many individual components, each of which is evolving under pressure to exploit advances in computing to enhance some combination of a range of possible improvements (higher spatio-temporal resolution, increased fidelity in terms of resolved processes, more quantification of uncertainty, etc.). However, after many years of a relatively stable computing environment with little choice in processing architecture or programming paradigm (basically X86 processors using MPI for parallelism), the existing menu of processor choices includes significant diversity, and more is on the horizon. This computational diversity, coupled with ever increasing software complexity, leads to the very real possibility that weather and climate modelling will arrive at a chasm which will separate scientific aspiration from our ability to develop and/or rapidly adapt codes to the available hardware. In this paper we review the hardware and software trends which are leading us towards this chasm, before describing current progress in addressing some of the tools which we may be able to use to bridge the chasm. This brief introduction to current tools and plans is followed by a discussion outlining the scientific requirements for quality model codes which have satisfactory performance and portability, while simultaneously supporting productive scientific evolution. We assert that the existing method of incremental model improvements employing small steps which adjust to the changing hardware environment is likely to be inadequate for crossing the chasm between aspiration and hardware at a satisfactory pace, in part because institutions cannot have all the relevant expertise in house. Instead, we outline a methodology based on large community efforts in engineering and standardisation, which will depend on identifying a taxonomy of key activities - perhaps based on existing efforts to develop domain-specific languages, identify common patterns in weather and climate codes, and develop community approaches to commonly needed tools and libraries - and then collaboratively building up those key components. Such a collaborative approach will depend on institutions, projects, and individuals adopting new interdependencies and ways of working.
Absolute quantification by droplet digital PCR versus analog real-time PCR
Hindson, Christopher M; Chevillet, John R; Briggs, Hilary A; Gallichotte, Emily N; Ruf, Ingrid K; Hindson, Benjamin J; Vessella, Robert L; Tewari, Muneesh
2014-01-01
Nanoliter-sized droplet technology paired with digital PCR (ddPCR) holds promise for highly precise, absolute nucleic acid quantification. Our comparison of microRNA quantification by ddPCR and real-time PCR revealed greater precision (coefficients of variation decreased by 37–86%) and improved day-to-day reproducibility (by a factor of seven) of ddPCR but with comparable sensitivity. When we applied ddPCR to serum microRNA biomarker analysis, this translated to superior diagnostic performance for identifying individuals with cancer. PMID:23995387
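For context on the "absolute quantification" claim above, the sketch below shows the standard Poisson correction used in droplet digital PCR; the droplet volume is an assumed nominal value and this is not any vendor's code.

```python
# Poisson correction behind ddPCR absolute quantification: from the fraction
# of positive droplets, estimate mean copies per droplet, then concentration.
import math

def ddpcr_copies_per_ul(positive_droplets: int, total_droplets: int,
                        droplet_volume_nl: float = 0.85) -> float:
    p = positive_droplets / total_droplets
    lam = -math.log(1.0 - p)                 # mean copies per droplet (Poisson)
    return lam / (droplet_volume_nl * 1e-3)  # copies per microlitre

print(f"{ddpcr_copies_per_ul(4500, 15000):.0f} copies/uL")
```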
Levy, Franck; Marechaux, Sylvestre; Iacuzio, Laura; Schouver, Elie Dan; Castel, Anne Laure; Toledano, Manuel; Rusek, Stephane; Dor, Vincent; Tribouilloy, Christophe; Dreyfus, Gilles
2018-03-30
Quantitative assessment of primary mitral regurgitation (MR) using left ventricular (LV) volumes obtained with three-dimensional transthoracic echocardiography (3D TTE) recently showed encouraging results. Nevertheless, 3D TTE is not incorporated into everyday practice, as current LV chamber quantification software products are time consuming. To investigate the accuracy and reproducibility of new automated fast 3D TTE software (HeartModel A.I. ; Philips Healthcare, Andover, MA, USA) for the quantification of LV volumes and MR severity in patients with isolated degenerative primary MR; and to compare regurgitant volume (RV) obtained with 3D TTE with a cardiac magnetic resonance (CMR) reference. Fifty-three patients (37 men; mean age 64±12 years) with at least mild primary isolated MR, and having comprehensive 3D TTE and CMR studies within 24h, were eligible for inclusion. MR RV was calculated using the proximal isovelocity surface area (PISA) method and the volumetric method (total LV stroke volume minus aortic stroke volume) with either CMR or 3D TTE. Inter- and intraobserver reproducibility of 3D TTE was excellent (coefficient of variation≤10%) for LV volumes. MR RV was similar using CMR and 3D TTE (57±23mL vs 56±28mL; P=0.22), but was significantly higher using the PISA method (69±30mL; P<0.05 compared with CMR and 3D TTE). The PISA method consistently overestimated MR RV compared with CMR (bias 12±21mL), while no significant bias was found between 3D TTE and CMR (bias 2±14mL). Concordance between echocardiography and CMR was higher using 3D TTE MR grading (intraclass correlation coefficient [ICC]=0.89) than with PISA MR grading (ICC=0.78). Complete agreement with CMR grading was more frequent with 3D TTE than with the PISA method (76% vs 63%). 3D TTE RV assessment using the new generation of automated software correlates well with CMR in patients with isolated degenerative primary MR. Copyright © 2018 Elsevier Masson SAS. All rights reserved.
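The abstract compares two ways of computing the mitral regurgitant volume. Here is a hedged sketch of both calculations with placeholder inputs, not patient data; the PISA formula shown is the standard proximal isovelocity surface area relation, not the study's software.

```python
# Two regurgitant-volume (RV) calculations for mitral regurgitation.
import math

def rv_volumetric(lv_total_stroke_volume_ml: float, aortic_stroke_volume_ml: float) -> float:
    # Volumetric method: total LV stroke volume minus forward (aortic) stroke volume.
    return lv_total_stroke_volume_ml - aortic_stroke_volume_ml

def rv_pisa(radius_cm: float, aliasing_velocity_cm_s: float,
            peak_mr_velocity_cm_s: float, mr_vti_cm: float) -> float:
    # PISA method: EROA = 2*pi*r^2 * Va / Vmax ; RV = EROA * VTI of the MR jet.
    eroa_cm2 = 2.0 * math.pi * radius_cm**2 * aliasing_velocity_cm_s / peak_mr_velocity_cm_s
    return eroa_cm2 * mr_vti_cm   # cm^3 == mL

print(rv_volumetric(110.0, 55.0))                 # 55 mL
print(round(rv_pisa(0.9, 38.0, 550.0, 160.0), 1)) # about 56 mL
```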
Software Design Improvements. Part 1; Software Benefits and Limitations
NASA Technical Reports Server (NTRS)
Lalli, Vincent R.; Packard, Michael H.; Ziemianski, Tom
1997-01-01
Computer hardware and associated software have been used for many years to process accounting information, to analyze test data and to perform engineering analysis. Now computers and software also control everything from automobiles to washing machines, and the number and type of applications are growing at an exponential rate. The size of individual programs has shown similar growth. Furthermore, software and hardware are used to monitor and/or control potentially dangerous products and safety-critical systems. These uses include everything from airplanes and braking systems to medical devices and nuclear plants. The question is: how can this hardware and software be made more reliable? Also, how can software quality be improved? What methodology needs to be provided on large and small software products to improve the design and how can software be verified?
Titan Science Return Quantification
NASA Technical Reports Server (NTRS)
Weisbin, Charles R.; Lincoln, William
2014-01-01
Each proposal for a NASA mission concept includes a Science Traceability Matrix (STM), intended to show that what is being proposed would contribute to satisfying one or more of the agency's top-level science goals. But the information traditionally provided cannot be used directly to quantitatively compare anticipated science return. We added numerical elements to NASA's STM and developed a software tool to process the data. We then applied this methodology to evaluate a group of competing concepts for a proposed mission to Saturn's moon, Titan.
2011-01-01
Shotgun lipidome profiling relies on direct mass spectrometric analysis of total lipid extracts from cells, tissues or organisms and is a powerful tool to elucidate the molecular composition of lipidomes. We present a novel informatics concept of the molecular fragmentation query language implemented within the LipidXplorer open source software kit that supports accurate quantification of individual species of any ionizable lipid class in shotgun spectra acquired on any mass spectrometry platform. PMID:21247462
Rapid development of Proteomic applications with the AIBench framework.
López-Fernández, Hugo; Reboiro-Jato, Miguel; Glez-Peña, Daniel; Méndez Reboredo, José R; Santos, Hugo M; Carreira, Ricardo J; Capelo-Martínez, José L; Fdez-Riverola, Florentino
2011-09-15
In this paper we present two case studies of Proteomics applications development using the AIBench framework, a Java desktop application framework mainly focused in scientific software development. The applications presented in this work are Decision Peptide-Driven, for rapid and accurate protein quantification, and Bacterial Identification, for Tuberculosis biomarker search and diagnosis. Both tools work with mass spectrometry data, specifically with MALDI-TOF spectra, minimizing the time required to process and analyze the experimental data. Copyright 2011 The Author(s). Published by Journal of Integrative Bioinformatics.
NASA Astrophysics Data System (ADS)
Downs, R. R.; Lenhardt, W. C.; Robinson, E.
2014-12-01
Science software is integral to the scientific process and must be developed and managed in a sustainable manner to ensure future access to scientific data and related resources. Organizations that are part of the scientific enterprise, as well as members of the scientific community who work within these entities, can contribute to the sustainability of science software and to practices that improve scientific community capabilities for science software sustainability. As science becomes increasingly digital and therefore, dependent on software, improving community practices for sustainable science software will contribute to the sustainability of science. Members of the Earth science informatics community, including scientific data producers and distributers, end-user scientists, system and application developers, and data center managers, use science software regularly and face the challenges and the opportunities that science software presents for the sustainability of science. To gain insight on practices needed for the sustainability of science software from the science software experiences of the Earth science informatics community, an interdisciplinary group of 300 community members were asked to engage in simultaneous roundtable discussions and report on their answers to questions about the requirements for improving scientific software sustainability. This paper will present an analysis of the issues reported and the conclusions offered by the participants. These results provide perspectives for science software sustainability practices and have implications for actions that organizations and their leadership can initiate to improve the sustainability of science software.
Collender, Philip A.; Kirby, Amy E.; Addiss, David G.; Freeman, Matthew C.; Remais, Justin V.
2015-01-01
Limiting the environmental transmission of soil-transmitted helminths (STH), which infect 1.5 billion people worldwide, will require sensitive, reliable, and cost effective methods to detect and quantify STH in the environment. We review the state of the art of STH quantification in soil, biosolids, water, produce, and vegetation with respect to four major methodological issues: environmental sampling; recovery of STH from environmental matrices; quantification of recovered STH; and viability assessment of STH ova. We conclude that methods for sampling and recovering STH require substantial advances to provide reliable measurements for STH control. Recent innovations in the use of automated image identification and developments in molecular genetic assays offer considerable promise for improving quantification and viability assessment. PMID:26440788
SAGA: A project to automate the management of software production systems
NASA Technical Reports Server (NTRS)
Campbell, Roy H.; Beckman-Davies, C. S.; Benzinger, L.; Beshers, G.; Laliberte, D.; Render, H.; Sum, R.; Smith, W.; Terwilliger, R.
1986-01-01
Research into software development is required to reduce its production cost and to improve its quality. Modern software systems, such as the embedded software required for NASA's space station initiative, stretch current software engineering techniques. The requirement to build large, reliable, and maintainable software systems increases with time. Much theoretical and practical research is in progress to improve software engineering techniques. One such technique is to build a software system or environment which directly supports the software engineering process, i.e., the SAGA project, comprising the research necessary to design and build a software development environment which automates the software engineering process. Progress under SAGA is described.
quanTLC, an online open-source solution for videodensitometric quantification.
Fichou, Dimitri; Morlock, Gertrud E
2018-07-27
The image is the key feature of planar chromatography. Videodensitometry by digital image conversion is the fastest way of its evaluation. Instead of scanning single sample tracks one after the other, only a few clicks are needed to convert all tracks at one go. A minimalistic software tool, termed quanTLC, was newly developed that allows the quantitative evaluation of samples in a few minutes. quanTLC includes important assets such as being open-source, online, free of charge, intuitive to use and tailored to planar chromatography, as none of the nine existing software packages for image evaluation covered these aspects altogether. quanTLC supports common image file formats for chromatogram upload. All necessary steps are included, i.e., videodensitogram extraction, preprocessing, automatic peak integration, calibration, statistical data analysis, reporting and data export. The default options for each step are suitable for most analyses while still being tunable, if needed. A one-minute video was recorded to serve as a user manual. The software capabilities are shown on the example of a lipophilic dye mixture separation. The quantitative results were verified by comparison with those obtained by commercial videodensitometry software and opto-mechanical slit-scanning densitometry. The data can be exported at each step to be processed in further software, if required. The code was released open-source to be exploited even further. The software itself is usable online without installation and directly accessible at http://shinyapps.ernaehrung.uni-giessen.de/quanTLC. Copyright © 2018 Elsevier B.V. All rights reserved.
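A hedged sketch of the videodensitogram-extraction step listed above: average the pixel intensity across one track of the chromatogram image and invert it so that dark bands become peaks. This illustrates the idea only; it is not quanTLC's code, and the column indices are assumptions.

```python
# Extract a videodensitogram for one track of a TLC/HPTLC image: mean pixel
# intensity across the track width at each migration position, inverted so
# that dark bands appear as peaks.
import numpy as np
from PIL import Image

def videodensitogram(image_path: str, track_x0: int, track_x1: int) -> np.ndarray:
    img = np.asarray(Image.open(image_path).convert("L"), dtype=float)
    track = img[:, track_x0:track_x1]        # columns covering one track
    profile = 255.0 - track.mean(axis=1)     # one value per migration row
    return profile - profile.min()           # crude baseline offset removal

# Peak areas for quantification could then be obtained by integrating the
# profile over each detected band, e.g. with scipy.signal.find_peaks.
```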
PET/MRI for neurologic applications.
Catana, Ciprian; Drzezga, Alexander; Heiss, Wolf-Dieter; Rosen, Bruce R
2012-12-01
PET and MRI provide complementary information in the study of the human brain. Simultaneous PET/MRI data acquisition allows the spatial and temporal correlation of the measured signals, creating opportunities impossible to realize using stand-alone instruments. This paper reviews the methodologic improvements and potential neurologic and psychiatric applications of this novel technology. We first present methods for improving the performance and information content of each modality by using the information provided by the other technique. On the PET side, we discuss methods that use the simultaneously acquired MRI data to improve the PET data quantification. On the MRI side, we present how improved PET quantification can be used to validate several MRI techniques. Finally, we describe promising research, translational, and clinical applications that can benefit from these advanced tools.
Välikangas, Tommi; Suomi, Tomi; Elo, Laura L
2017-05-31
Label-free mass spectrometry (MS) has developed into an important tool applied in various fields of biological and life sciences. Several software packages exist to process the raw MS data into quantified protein abundances, including open source and commercial solutions. Each software package includes a set of unique algorithms for different tasks of the MS data processing workflow. While many of these algorithms have been compared separately, a thorough and systematic evaluation of their overall performance is missing. Moreover, systematic information is lacking about the amount of missing values produced by the different proteomics software and the capabilities of different data imputation methods to account for them. In this study, we evaluated the performance of five popular quantitative label-free proteomics software workflows using four different spike-in data sets. Our extensive testing included the number of proteins quantified and the number of missing values produced by each workflow, the accuracy of detecting differential expression and logarithmic fold change, and the effect of different imputation and filtering methods on the differential expression results. We found that the Progenesis software performed consistently well in the differential expression analysis and produced few missing values. The missing values produced by the other software decreased their performance, but this difference could be mitigated using proper data filtering or imputation methods. Among the imputation methods, we found that the local least squares (lls) regression imputation consistently increased the performance of the software in the differential expression analysis, and a combination of both data filtering and local least squares imputation increased performance the most in the tested data sets. © The Author 2017. Published by Oxford University Press.
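A rough sketch of the local least squares (lls) imputation idea highlighted above, applied to a protein-by-sample intensity matrix with NaN marking missing values. This simplified illustration is an assumption about the approach and is not the implementation benchmarked in the study.

```python
# Simplified local least squares imputation: for each protein with missing
# values, regress it on its k most correlated fully observed proteins and
# predict the missing samples.
import numpy as np

def lls_impute(X: np.ndarray, k: int = 10) -> np.ndarray:
    X = X.copy()
    complete = ~np.isnan(X).any(axis=1)          # proteins with no missing values
    donors = X[complete]
    for i in np.where(~complete)[0]:
        obs = ~np.isnan(X[i])
        if obs.sum() < 2 or len(donors) == 0:
            continue                             # too little information; leave NaN
        # pick k donors most correlated with protein i on the observed samples
        corr = np.nan_to_num(
            [abs(np.corrcoef(X[i, obs], d[obs])[0, 1]) for d in donors])
        idx = np.argsort(-corr)[:k]
        A = donors[idx][:, obs].T                # observed samples x k donors
        coef, *_ = np.linalg.lstsq(A, X[i, obs], rcond=None)
        X[i, ~obs] = donors[idx][:, ~obs].T @ coef   # predict missing samples
    return X
```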
Statistical Validation for Clinical Measures: Repeatability and Agreement of Kinect™-Based Software.
Lopez, Natalia; Perez, Elisa; Tello, Emanuel; Rodrigo, Alejandro; Valentinuzzi, Max E
2018-01-01
The rehabilitation process is a fundamental stage for recovery of people's capabilities. However, the evaluation of the process is performed by physiatrists and medical doctors, mostly based on their observations, that is, a subjective appreciation of the patient's evolution. This paper proposes a platform for tracking the movement of an individual's upper limb using Kinect sensor(s), to be applied to the patient during the rehabilitation process. The main contribution is the development of quantifying software and the statistical validation of its performance, repeatability, and clinical use in the rehabilitation process. The software determines joint angles and upper limb trajectories for the construction of a specific rehabilitation protocol and quantifies the treatment evolution. In turn, the information is presented via a graphical interface that allows the recording, storage, and reporting of the patient's data. For clinical purposes, the software information is statistically validated with three different methodologies, comparing the measures with a goniometer in terms of agreement and repeatability. The agreement of joint angles measured with the proposed software and a goniometer is evaluated with Bland-Altman plots; all measurements fell well within the limits of agreement, meaning interchangeability of both techniques. Additionally, the results of Bland-Altman analysis of repeatability show 95% confidence. Finally, the physiotherapists' qualitative assessment shows encouraging results for the clinical use. The main conclusion is that the software is capable of offering a clinical history of the patient and is useful for quantification of the rehabilitation success. The simplicity, low cost, and visualization possibilities enhance the use of the Kinect-based software for rehabilitation and other applications, and the expert's opinion endorses the choice of our approach for clinical practice. Comparison of the new measurement technique with established goniometric methods shows that the proposed software agrees sufficiently to be used interchangeably.
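A hedged sketch of the kind of joint-angle computation such Kinect-based software performs, here the elbow angle from three 3D joint positions; the coordinates and function name are illustrative.

```python
# Angle at a joint from 3D positions of the joint and its two neighbours.
import numpy as np

def joint_angle(proximal, joint, distal) -> float:
    """Angle (degrees) at `joint` formed by the segments to `proximal` and `distal`."""
    u = np.asarray(proximal, float) - np.asarray(joint, float)
    v = np.asarray(distal, float) - np.asarray(joint, float)
    cos_theta = np.dot(u, v) / (np.linalg.norm(u) * np.linalg.norm(v))
    return float(np.degrees(np.arccos(np.clip(cos_theta, -1.0, 1.0))))

shoulder, elbow, wrist = (0.0, 0.4, 2.0), (0.05, 0.1, 2.0), (0.30, 0.05, 1.9)
print(f"elbow angle = {joint_angle(shoulder, elbow, wrist):.1f} deg")
```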
USDA-ARS?s Scientific Manuscript database
The ability to rapidly screen a large number of individuals is the key to any successful plant breeding program. One of the primary bottlenecks in high throughput screening is the preparation of DNA samples, particularly the quantification and normalization of samples for downstream processing. A ...
How to Improve Your Impact Factor: Questioning the Quantification of Academic Quality
ERIC Educational Resources Information Center
Smeyers, Paul; Burbules, Nicholas C.
2011-01-01
A broad-scale quantification of the measure of quality for scholarship is under way. This trend has fundamental implications for the future of academic publishing and employment. In this essay we want to raise questions about these burgeoning practices, particularly how they affect philosophy of education and similar sub-disciplines. First,…
Zhang, Xuezhu; Peng, Qiyu; Zhou, Jian; Huber, Jennifer S; Moses, William W; Qi, Jinyi
2018-03-16
The first generation Tachyon PET (Tachyon-I) is a demonstration single-ring PET scanner that reaches a coincidence timing resolution of 314 ps using LSO scintillator crystals coupled to conventional photomultiplier tubes. The objective of this study was to quantify the improvement in both lesion detection and quantification performance resulting from the improved time-of-flight (TOF) capability of the Tachyon-I scanner. We developed a quantitative TOF image reconstruction method for the Tachyon-I and evaluated its TOF gain for lesion detection and quantification. Scans of either a standard NEMA torso phantom or healthy volunteers were used as the normal background data. Separately scanned point source and sphere data were superimposed onto the phantom or human data after accounting for the object attenuation. We used the bootstrap method to generate multiple independent noisy datasets with and without a lesion present. The signal-to-noise ratio (SNR) of a channelized hotelling observer (CHO) was calculated for each lesion size and location combination to evaluate the lesion detection performance. The bias versus standard deviation trade-off of each lesion uptake was also calculated to evaluate the quantification performance. The resulting CHO-SNR measurements showed improved performance in lesion detection with better timing resolution. The detection performance was also dependent on the lesion size and location, in addition to the background object size and shape. The results of bias versus noise trade-off showed that the noise (standard deviation) reduction ratio was about 1.1-1.3 over the TOF 500 ps and 1.5-1.9 over the non-TOF modes, similar to the SNR gains for lesion detection. In conclusion, this Tachyon-I PET study demonstrated the benefit of improved time-of-flight capability on lesion detection and ROI quantification for both phantom and human subjects.
NASA Astrophysics Data System (ADS)
Zhang, Xuezhu; Peng, Qiyu; Zhou, Jian; Huber, Jennifer S.; Moses, William W.; Qi, Jinyi
2018-03-01
The first generation Tachyon PET (Tachyon-I) is a demonstration single-ring PET scanner that reaches a coincidence timing resolution of 314 ps using LSO scintillator crystals coupled to conventional photomultiplier tubes. The objective of this study was to quantify the improvement in both lesion detection and quantification performance resulting from the improved time-of-flight (TOF) capability of the Tachyon-I scanner. We developed a quantitative TOF image reconstruction method for the Tachyon-I and evaluated its TOF gain for lesion detection and quantification. Scans of either a standard NEMA torso phantom or healthy volunteers were used as the normal background data. Separately scanned point source and sphere data were superimposed onto the phantom or human data after accounting for the object attenuation. We used the bootstrap method to generate multiple independent noisy datasets with and without a lesion present. The signal-to-noise ratio (SNR) of a channelized hotelling observer (CHO) was calculated for each lesion size and location combination to evaluate the lesion detection performance. The bias versus standard deviation trade-off of each lesion uptake was also calculated to evaluate the quantification performance. The resulting CHO-SNR measurements showed improved performance in lesion detection with better timing resolution. The detection performance was also dependent on the lesion size and location, in addition to the background object size and shape. The results of bias versus noise trade-off showed that the noise (standard deviation) reduction ratio was about 1.1–1.3 over the TOF 500 ps and 1.5–1.9 over the non-TOF modes, similar to the SNR gains for lesion detection. In conclusion, this Tachyon-I PET study demonstrated the benefit of improved time-of-flight capability on lesion detection and ROI quantification for both phantom and human subjects.
HTAPP: High-Throughput Autonomous Proteomic Pipeline
Yu, Kebing; Salomon, Arthur R.
2011-01-01
Recent advances in the speed and sensitivity of mass spectrometers and in analytical methods, the exponential acceleration of computer processing speeds, and the availability of genomic databases from an array of species and protein information databases have led to a deluge of proteomic data. The development of a lab-based automated proteomic software platform for the automated collection, processing, storage, and visualization of expansive proteomic datasets is critically important. The high-throughput autonomous proteomic pipeline (HTAPP) described here is designed from the ground up to provide critically important flexibility for diverse proteomic workflows and to streamline the total analysis of a complex proteomic sample. This tool is comprised of software that controls the acquisition of mass spectral data along with automation of post-acquisition tasks such as peptide quantification, clustered MS/MS spectral database searching, statistical validation, and data exploration within a user-configurable lab-based relational database. The software design of HTAPP focuses on accommodating diverse workflows and providing missing software functionality to a wide range of proteomic researchers to accelerate the extraction of biological meaning from immense proteomic data sets. Although individual software modules in our integrated technology platform may have some similarities to existing tools, the true novelty of the approach described here is in the synergistic and flexible combination of these tools to provide an integrated and efficient analysis of proteomic samples. PMID:20336676
Boiler: lossy compression of RNA-seq alignments using coverage vectors.
Pritt, Jacob; Langmead, Ben
2016-09-19
We describe Boiler, a new software tool for compressing and querying large collections of RNA-seq alignments. Boiler discards most per-read data, keeping only a genomic coverage vector plus a few empirical distributions summarizing the alignments. Since most per-read data is discarded, storage footprint is often much smaller than that achieved by other compression tools. Despite this, the most relevant per-read data can be recovered; we show that Boiler compression has only a slight negative impact on results given by downstream tools for isoform assembly and quantification. Boiler also allows the user to pose fast and useful queries without decompressing the entire file. Boiler is free open source software available from github.com/jpritt/boiler. © The Author(s) 2016. Published by Oxford University Press on behalf of Nucleic Acids Research.
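A hedged sketch of the central data structure named above, a genomic coverage vector built from aligned read intervals; this illustrates the concept only and is not Boiler's file format or code.

```python
# Build a per-base coverage vector from read alignment intervals
# (0-based, half-open) using a difference array and a cumulative sum.
import numpy as np

def coverage_vector(read_intervals, chrom_length: int) -> np.ndarray:
    cov = np.zeros(chrom_length + 1, dtype=np.int32)
    for start, end in read_intervals:   # mark interval boundaries
        cov[start] += 1
        cov[end] -= 1
    return np.cumsum(cov)[:-1]          # per-base read depth

reads = [(0, 50), (10, 60), (10, 60), (100, 150)]
cov = coverage_vector(reads, 200)
print(cov[0], cov[20], cov[55], cov[120])   # 1 3 2 1
```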
An Excel-based implementation of the spectral method of action potential alternans analysis.
Pearman, Charles M
2014-12-01
Action potential (AP) alternans has been well established as a mechanism of arrhythmogenesis and sudden cardiac death. Proper interpretation of AP alternans requires a robust method of alternans quantification. Traditional methods of alternans analysis neglect higher order periodicities that may have greater pro-arrhythmic potential than classical 2:1 alternans. The spectral method of alternans analysis, already widely used in the related study of microvolt T-wave alternans, has also been used to study AP alternans. Software to meet the specific needs of AP alternans analysis is not currently available in the public domain. An AP analysis tool is implemented here, written in Visual Basic for Applications and using Microsoft Excel as a shell. This performs a sophisticated analysis of alternans behavior allowing reliable distinction of alternans from random fluctuations, quantification of alternans magnitude, and identification of which phases of the AP are most affected. In addition, the spectral method has been adapted to allow detection and quantification of higher order regular oscillations. Analysis of action potential morphology is also performed. A simple user interface enables easy import, analysis, and export of collated results. © 2014 The Author. Physiological Reports published by Wiley Periodicals, Inc. on behalf of the American Physiological Society and The Physiological Society.
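As a schematic illustration of the spectral approach described above, beat-to-beat alternans appears as spectral power at 0.5 cycles/beat in a per-beat series of AP measurements. The short sketch below is written in Python rather than the VBA implementation described, uses an illustrative k-score-style statistic, and is not the published tool.

import numpy as np

def alternans_spectrum(beat_series):
    """Power spectrum of a beat-to-beat series (e.g. APD90 per beat).

    Classical 2:1 alternans shows up at 0.5 cycles/beat; higher-order
    periodicities (e.g. 3:1, 4:1) appear at lower spectral frequencies.
    """
    x = np.asarray(beat_series, dtype=float)
    x = x - x.mean()                        # remove the DC component
    power = np.abs(np.fft.rfft(x)) ** 2 / len(x)
    freqs = np.fft.rfftfreq(len(x), d=1.0)  # cycles per beat
    return freqs, power

def alternans_k_score(freqs, power):
    """k-score-style statistic: alternans peak relative to a noise band."""
    noise = power[(freqs > 0.33) & (freqs < 0.48)]
    peak = power[np.argmin(np.abs(freqs - 0.5))]
    return (peak - noise.mean()) / (noise.std() + 1e-12)

# Synthetic example: 128 beats of APD with 2:1 alternans plus noise.
rng = np.random.default_rng(1)
apd = 250 + 5 * (-1) ** np.arange(128) + rng.normal(0, 2, 128)
f, p = alternans_spectrum(apd)
print(alternans_k_score(f, p))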
Matthews, Mike B; Kearns, Stuart L; Buse, Ben
2018-04-01
The accuracy with which Cu and Al coating thicknesses can be determined, and the effect this has on the quantification of the substrate, is investigated. Cu and Al coatings of nominally 5, 10, 15, and 20 nm were sputter coated onto polished Bi using two configurations of coater: one with the film thickness monitor (FTM) sensor colocated with the samples, and one where the sensor is located to one side. The FTM thicknesses are compared against those calculated from measured Cu Lα and Al Kα k-ratios using PENEPMA, GMRFilm, and DTSA-II. Selected samples were also cross-sectioned using a focused ion beam. Both systems produced repeatable coatings, the thickest coating being approximately four times the thinnest. The side-located FTM sensor indicated thicknesses less than half those of the software-modeled results, propagating into 70% errors in substrate quantification at 5 kV. The colocated FTM sensor produced errors in film thickness and substrate quantification of 10-20%. Over the range of film thicknesses and accelerating voltages modeled, both the substrate and coating k-ratios can be approximated by linear trends as functions of film thickness. The Al films were found to have a reduced density of ~2 g/cm3.
A proven approach for more effective software development and maintenance
NASA Technical Reports Server (NTRS)
Pajerski, Rose; Hall, Dana; Sinclair, Craig
1994-01-01
Modern space flight mission operations and associated ground data systems are increasingly dependent upon reliable, quality software. Critical functions such as command load preparation, health and status monitoring, communications link scheduling and conflict resolution, and transparent gateway protocol conversion are routinely performed by software. Given budget constraints and the ever-increasing capabilities of processor technology, the next generation of control centers and data systems will be even more dependent upon software across all aspects of performance. A key challenge now is to implement improved engineering, management, and assurance processes for the development and maintenance of that software; processes that cost less, yield higher quality products, and self-correct for continual, evolutionary improvement. The NASA Goddard Space Flight Center has a unique experience base that can be readily tapped to help solve the software challenge. Over the past eighteen years, the Software Engineering Laboratory within the Code 500 Flight Dynamics Division has evolved a software development and maintenance methodology that accommodates the unique characteristics of an organization while optimizing and continually improving the organization's software capabilities. This methodology relies upon measurement, analysis, and feedback, much like a control loop system. It is an approach with a time-tested track record, proven through repeated applications across a broad range of operational software development and maintenance projects. This paper describes the software improvement methodology employed by the Software Engineering Laboratory and how it has been exploited within the Flight Dynamics Division of GSFC Code 500. Examples of specific improvements in the software itself and in its processes are presented to illustrate the effectiveness of the methodology. Finally, initial findings are given from applying this methodology across the mission operations and ground data systems software domains throughout Code 500.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kronewitter, Scott R.; Marginean, Ioan; Cox, Jonathan T.
The N-glycan diversity of human serum glycoproteins, i.e. the human blood serum N-glycome, is complex due to the range of glycan structures potentially synthesizable by human glycosylation enzymes. The reported glycome, however, is limited by methods of sample preparation, available analytical platforms, e.g., based upon electrospray ionization-mass spectrometry (ESI-MS), and software tools for data analysis. In this report, several improvements have been implemented in sample preparation and analysis to extend ESI-MS glycan characterization and to provide an improved view of glycan diversity. Sample preparation improvements include acidified, microwave-accelerated PNGase F N-glycan release and sodium borohydride reduction, which were optimized to improve quantitative yields and conserve the number of glycoforms detected. Two-stage desalting (during solid phase extraction and on the analytical column) increased the sensitivity by reducing analyte signal division between multiple reducing-end forms or cation adducts. On-line separations were improved by using extended-length graphitized carbon columns and adding TFA as an acid modifier to a formic acid/reversed phase gradient, which provides additional resolving power and significantly improved desorption of both large and heavily sialylated glycans. To improve MS sensitivity and provide gentler ionization conditions at the source-MS interface, subambient pressure ionization with nanoelectrospray (SPIN) has been utilized. When these method improvements are combined with the recently described Glycomics Quintavariate Informed Quantification (GlyQ-IQ), they demonstrate the ability to significantly extend glycan detection sensitivity and provide expanded glycan coverage. We demonstrate application of these advances in the context of the human serum glycome; our initial observations include detection of a new class of heavily sialylated N-glycans, including polysialylated N-glycans.
Capability Maturity Model (CMM) for Software Process Improvements
NASA Technical Reports Server (NTRS)
Ling, Robert Y.
2000-01-01
This slide presentation reviews the Avionic Systems Division's implementation of the Capability Maturity Model (CMM) for improvements in the software development process. The presentation reviews the process involved in implementing the model and the benefits of using CMM to improve the software development process.
Automated quantification of myocardial perfusion SPECT using simplified normal limits.
Slomka, Piotr J; Nishina, Hidetaka; Berman, Daniel S; Akincioglu, Cigdem; Abidov, Aiden; Friedman, John D; Hayes, Sean W; Germano, Guido
2005-01-01
To simplify development of normal limits for myocardial perfusion SPECT (MPS), we implemented a quantification scheme in which normal limits are derived without visual scoring of abnormal scans or optimization of regional thresholds. Normal limits were derived from same-day Tl-201 rest/Tc-99m-sestamibi stress scans of male (n = 40) and female (n = 40) low-likelihood patients. Defect extent, total perfusion deficit (TPD), and regional perfusion extents were derived by comparison to normal limits in polar-map coordinates. MPS scans from 256 consecutive patients without known coronary artery disease, who underwent coronary angiography, were analyzed. The new method of quantification (TPD) was compared with our previously developed quantification system and visual scoring. The receiver operating characteristic area under the curve for detection of 50% or greater stenoses by TPD (0.88 +/- 0.02) was higher than by visual scoring (0.83 +/- 0.03) (P = .039) or standard quantification (0.82 +/- 0.03) (P = .004). For detection of 70% or greater stenoses, it was higher for TPD (0.89 +/- 0.02) than for standard quantification (0.85 +/- 0.02) (P = .014). Sensitivity and specificity were 93% and 79%, respectively, for TPD; 81% and 85%, respectively, for visual scoring; and 80% and 73%, respectively, for standard quantification. The use of stress mode-specific normal limits did not improve performance. Simplified quantification achieves performance better than or equivalent to visual scoring or quantification based on per-segment visual optimization of abnormality thresholds.
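The total perfusion deficit idea, comparing a patient polar map against normal limits and weighting hypoperfused pixels by severity, can be sketched roughly as follows. The 3-SD cutoff, the severity weighting and the synthetic data are illustrative assumptions, not the published algorithm.

import numpy as np

def total_perfusion_deficit(polar_map, normal_mean, normal_sd, z_cut=3.0):
    """Severity-weighted deficit relative to normal limits (illustrative).

    polar_map, normal_mean, normal_sd : arrays over polar-map coordinates.
    A pixel contributes when it falls more than z_cut SDs below the normal
    mean; its contribution is scaled by how far below the cutoff it falls.
    """
    z = (normal_mean - polar_map) / normal_sd          # positive = hypoperfused
    deficit = np.clip((z - z_cut) / z_cut, 0.0, 1.0)   # 0..1 severity weight
    return 100.0 * deficit.mean()                       # percent of the map

# Synthetic example with a simulated defect in one corner of the polar map.
rng = np.random.default_rng(2)
normal_mean = np.full((36, 36), 75.0)
normal_sd = np.full((36, 36), 6.0)
study = normal_mean + rng.normal(0, 6, (36, 36))
study[:8, :8] -= 40.0
print(total_perfusion_deficit(study, normal_mean, normal_sd))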
Prioritization of factors impacting on performance of power looms using AHP
NASA Astrophysics Data System (ADS)
Dulange, S. R.; Pundir, A. K.; Ganapathy, L.
2014-08-01
The purpose of this paper is to identify the critical success factors influencing the performance of power loom textiles, to evaluate their impact on organizational performance, and to determine the effect of these factors on the organizational performance of small and medium-sized enterprises (SMEs) in the Solapur (Maharashtra) industrial sector using AHP. In the methodology adopted, factors are identified through a literature survey and finalized by taking the opinion of experts in the Indian context. Using a cognitive map, the relations between these factors (direct and indirect effects) are determined and a cause-and-effect diagram is prepared. The factors are then arranged hierarchically and a tree diagram is prepared. A questionnaire was designed and distributed among the experts, and data were collected. Using Expert Choice software, the data were quantified through pairwise comparison of the factors, which were then prioritized. The weights demonstrate several key findings: the local and global priorities reveal that human resource, product style, and volume have a substantial effect on organizational performance; skills and technology upgradation also impact organizational performance; and maintenance plays an important role in improving the organizational performance of the SMEs. Overall, the results showed that operational factors play a central role. The research is subject to the normal limitations of AHP. The study uses perceptual data provided by experts, which may not provide clear measures of the impact factors; however, this can be overcome by using more experts to collect data in future studies. Interestingly, the findings may be generalisable to power loom centers outside Solapur, such as Ichalkarnji, Malegaon, and Bhiwadi (Maharashtra). Solapur power loom SMEs should consider AHP as an innovative tool for quantifying factors impacting performance and for improving operational and organizational performance in today's dynamic manufacturing environment. The findings suggest that these critical success factors (CSFs) should be studied carefully and that an improvement strategy should be developed. Moreover, the study emphasizes the need to link the priority of factors to organizational performance and improvement. The study integrates the CSFs of performance, their quantification using AHP, and their effect on the performance of power loom textiles. The indirect impacts of underlying and fundamental factors are considered. Very few studies have been performed to investigate and understand this issue; therefore, the research can make a useful contribution.
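The AHP weighting step described above can be illustrated with a minimal sketch: a hypothetical 3x3 pairwise comparison matrix, the principal eigenvector as the priority weights, and Saaty's consistency ratio as a check on judgement coherence. The values are illustrative only, not the study's data.

import numpy as np

def ahp_weights(pairwise):
    """Priority weights and consistency ratio from a pairwise comparison matrix."""
    a = np.asarray(pairwise, dtype=float)
    n = a.shape[0]
    eigvals, eigvecs = np.linalg.eig(a)
    k = np.argmax(eigvals.real)
    w = np.abs(eigvecs[:, k].real)
    w = w / w.sum()                                  # normalized priorities
    lam_max = eigvals[k].real
    ci = (lam_max - n) / (n - 1)                     # consistency index
    ri = {1: 0.0, 2: 0.0, 3: 0.58, 4: 0.90, 5: 1.12, 6: 1.24, 7: 1.32}[n]
    cr = ci / ri if ri else 0.0                      # consistency ratio (< 0.1 is usually acceptable)
    return w, cr

# Hypothetical comparison of three factors, e.g. human resource, technology, maintenance.
pairwise = [[1,   3,   5],
            [1/3, 1,   2],
            [1/5, 1/2, 1]]
w, cr = ahp_weights(pairwise)
print(np.round(w, 3), round(cr, 3))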
Bedoya, Cesar; Cardona, Andrés; Galeano, July; Cortés-Mancera, Fabián; Sandoz, Patrick; Zarzycki, Artur
2017-12-01
The wound healing assay is widely used for the quantitative analysis of highly regulated cellular events. In this assay, a wound is deliberately produced on a confluent cell monolayer, and the rate of wound reduction (WR) is then characterized by processing images of the same regions of interest (ROIs) recorded at different time intervals. In this method, sharp-image ROI recovery is indispensable to compensate for displacements of the cell cultures, due either to the exploration of multiple sites of the same culture or to transfers from the microscope stage to a cell incubator. ROI recovery is usually done manually and, although a low-magnification microscope objective (10x) is generally used, repositioning imperfections constitute a major source of error detrimental to WR measurement accuracy. We address this ROI recovery issue by using pseudoperiodic patterns fixed onto the cell culture dishes, allowing easy localization of ROIs and accurate quantification of positioning errors. The method is applied to a tumor-derived cell line, and the WR rates are measured by means of two different image-processing software packages. Sharp ROI recovery based on the proposed method is found to significantly improve the accuracy of the WR measurement and of the positioning under the microscope.
WISE: Automated support for software project management and measurement. M.S. Thesis
NASA Technical Reports Server (NTRS)
Ramakrishnan, Sudhakar
1995-01-01
One important aspect of software development and IV&V is measurement. Unless a software development effort is measured in some way, it is difficult to judge the effectiveness of current efforts and predict future performance. Collection of metrics and adherence to a process are difficult tasks in a software project. Change activity is a powerful indicator of project status. Automated systems that can handle change requests, issues, and other process documents provide an excellent platform for tracking the status of the project. A World Wide Web based architecture is developed for (a) making metrics collection an implicit part of the software process, (b) providing metric analysis dynamically, (c) supporting automated tools that can complement current practices of in-process improvement, and (d) overcoming geographical barriers. An operational system (WISE) instantiates this architecture, allowing for the improvement of software process in a realistic environment. The tool tracks issues in the software development process, provides informal communication among users with different roles, supports to-do lists (TDL), and helps in software process improvement. WISE minimizes the time devoted to metrics collection and analysis, and captures software change data. Automated tools like WISE focus on understanding and managing the software process. The goal is improvement through measurement.
VESGEN Software for Mapping and Quantification of Vascular Regulators
NASA Technical Reports Server (NTRS)
Parsons-Wingerter, Patricia A.; Vickerman, Mary B.; Keith, Patricia A.
2012-01-01
VESsel GENeration (VESGEN) Analysis is automated software that maps and quantifies the effects of vascular regulators on vascular morphology by analyzing important vessel parameters. Quantification parameters include vessel diameter, length, branch points, density, and fractal dimension. For vascular trees, measurements are reported as dependent functions of vessel branching generation. VESGEN maps and quantifies vascular morphological events according to fractal-based vascular branching generation. It also relies on careful imaging of branching and networked vascular form. It was developed as a plug-in for ImageJ (National Institutes of Health, USA). VESGEN uses the image-processing concepts of 8-neighbor pixel connectivity, skeleton, and distance map to analyze 2D, black-and-white (binary) images of vascular trees, networks, and tree-network composites. VESGEN maps typically 5 to 12 (or more) generations of vascular branching, starting from a single parent vessel. These generations are tracked and measured for critical vascular parameters that include vessel diameter, length, density and number, and tortuosity per branching generation. The effects of vascular therapeutics and regulators on vascular morphology and branching, tested in human clinical or laboratory animal experimental studies, are quantified by comparing vascular parameters with control groups. VESGEN provides a user interface to both guide and allow control over the user's vascular analysis process. An option is provided to select a morphological tissue type of vascular trees, networks, or tree-network composites, which determines the general collections of algorithms, intermediate images, and output images and measurements that will be produced.
Development and validation of techniques for improving software dependability
NASA Technical Reports Server (NTRS)
Knight, John C.
1992-01-01
A collection of document abstracts is presented on the topic of improving software dependability under NASA grant NAG-1-1123. Specific topics include: modeling of error detection; software inspection; test cases; Magnetic Stereotaxis System safety specifications and fault trees; and injection of synthetic faults into software.
van Mourik, Louise M; Leonards, Pim E G; Gaus, Caroline; de Boer, Jacob
2015-10-01
Concerns about the high production volumes, persistency, bioaccumulation potential and toxicity of chlorinated paraffin (CP) mixtures, especially short-chain CPs (SCCPs), are rising. However, information on their levels and fate in the environment is still insufficient, impeding international classifications and regulations. This knowledge gap is mainly due to the difficulties that arise with CP analysis, in particular the chromatographic separation within CPs and between CPs and other compounds. No fully validated routine analytical method is available yet and only semi-quantitative analysis is possible, although the number of studies reporting new and improved methods has rapidly increased since 2010. Better cleanup procedures that remove interfering compounds, and new instrumental techniques, which distinguish between medium-chain CPs (MCCPs) and SCCPs, have been developed. While gas chromatography coupled to electron capture negative ionisation mass spectrometry (GC/ECNI-MS) remains the most commonly applied technique, novel and promising use of high-resolution time-of-flight MS (TOF-MS) has also been reported. We expect that recent developments in high-resolution TOF-MS and Orbitrap technologies will further improve the detection of CPs, including long-chain CPs (LCCPs), and the group separation and quantification of CP homologues. Also, new CP quantification methods have emerged, including the use of mathematical algorithms, multiple linear regression and principal component analysis. These quantification advancements are also reflected in considerably improved interlaboratory agreements since 2010. Analysis of lower chlorinated paraffins (
TRIC: an automated alignment strategy for reproducible protein quantification in targeted proteomics
Röst, Hannes L.; Liu, Yansheng; D’Agostino, Giuseppe; Zanella, Matteo; Navarro, Pedro; Rosenberger, George; Collins, Ben C.; Gillet, Ludovic; Testa, Giuseppe; Malmström, Lars; Aebersold, Ruedi
2016-01-01
Large scale, quantitative proteomic studies have become essential for the analysis of clinical cohorts, large perturbation experiments and systems biology studies. While next-generation mass spectrometric techniques such as SWATH-MS have substantially increased throughput and reproducibility, ensuring consistent quantification of thousands of peptide analytes across multiple LC-MS/MS runs remains a challenging and laborious manual process. To produce highly consistent and quantitatively accurate proteomics data matrices in an automated fashion, we have developed the TRIC software which utilizes fragment ion data to perform cross-run alignment, consistent peak-picking and quantification for high throughput targeted proteomics. TRIC uses a graph-based alignment strategy based on non-linear retention time correction to integrate peak elution information from all LC-MS/MS runs acquired in a study. When compared to state-of-the-art SWATH-MS data analysis, the algorithm was able to reduce the identification error by more than 3-fold at constant recall, while correcting for highly non-linear chromatographic effects. On a pulsed-SILAC experiment performed on human induced pluripotent stem (iPS) cells, TRIC was able to automatically align and quantify thousands of light and heavy isotopic peak groups and substantially increased the quantitative completeness and biological information in the data, providing insights into protein dynamics of iPS cells. Overall, this study demonstrates the importance of consistent quantification in highly challenging experimental setups, and proposes an algorithm to automate this task, constituting the last missing piece in a pipeline for automated analysis of massively parallel targeted proteomics datasets. PMID:27479329
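The cross-run retention-time correction that TRIC automates can be shown in miniature: map one run's RT scale onto a reference using peptides identified in both runs. The piecewise-linear fit below is a simplified stand-in for TRIC's graph-based non-linear alignment, with made-up anchor values.

import numpy as np

def align_rt(anchor_ref, anchor_run, query_rt):
    """Map retention times from one run onto a reference RT scale.

    anchor_ref, anchor_run : RTs of peptides confidently identified in both
        the reference run and the run to be aligned.
    query_rt : RTs (in the aligned run) to transfer onto the reference scale.
    A monotone piecewise-linear fit through the anchors stands in for the
    non-linear correction; TRIC itself builds a graph of pairwise alignments.
    """
    order = np.argsort(anchor_run)
    x = np.asarray(anchor_run, dtype=float)[order]
    y = np.asarray(anchor_ref, dtype=float)[order]
    return np.interp(query_rt, x, y)

# Toy example: the second run drifts "slow" in a non-linear way.
ref = np.linspace(10, 100, 20)
run = ref + 0.002 * ref**2 + np.random.default_rng(3).normal(0, 0.3, 20)
print(align_rt(ref, run, [run[5], run[12]]))   # should land near ref[5], ref[12]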
Accurate proteome-wide protein quantification from high-resolution 15N mass spectra
2011-01-01
In quantitative mass spectrometry-based proteomics, the metabolic incorporation of a single source of 15N-labeled nitrogen has many advantages over using stable isotope-labeled amino acids. However, the lack of a robust computational framework for analyzing the resulting spectra has impeded wide use of this approach. We have addressed this challenge by introducing a new computational methodology for analyzing 15N spectra in which quantification is integrated with identification. Application of this method to an Escherichia coli growth transition reveals significant improvement in quantification accuracy over previous methods. PMID:22182234
Collender, Philip A; Kirby, Amy E; Addiss, David G; Freeman, Matthew C; Remais, Justin V
2015-12-01
Limiting the environmental transmission of soil-transmitted helminths (STHs), which infect 1.5 billion people worldwide, will require sensitive, reliable, and cost-effective methods to detect and quantify STHs in the environment. We review the state-of-the-art of STH quantification in soil, biosolids, water, produce, and vegetation with regard to four major methodological issues: environmental sampling; recovery of STHs from environmental matrices; quantification of recovered STHs; and viability assessment of STH ova. We conclude that methods for sampling and recovering STHs require substantial advances to provide reliable measurements for STH control. Recent innovations in the use of automated image identification and developments in molecular genetic assays offer considerable promise for improving quantification and viability assessment. Copyright © 2015 Elsevier Ltd. All rights reserved.
Missile signal processing common computer architecture for rapid technology upgrade
NASA Astrophysics Data System (ADS)
Rabinkin, Daniel V.; Rutledge, Edward; Monticciolo, Paul
2004-10-01
Interceptor missiles process IR images to locate an intended target and guide the interceptor towards it. Signal processing requirements have increased as the sensor bandwidth increases and interceptors operate against more sophisticated targets. A typical interceptor signal processing chain comprises two parts. Front-end video processing operates on all pixels of the image and performs such operations as non-uniformity correction (NUC), image stabilization, frame integration and detection. Back-end target processing, which tracks and classifies targets detected in the image, performs such algorithms as Kalman tracking, spectral feature extraction and target discrimination. In the past, video processing was implemented using ASIC components or FPGAs because computation requirements exceeded the throughput of general-purpose processors. Target processing was performed using hybrid architectures that included ASICs, DSPs and general-purpose processors. The resulting systems tended to be function-specific, and required custom software development. They were developed using non-integrated toolsets and test equipment was developed along with the processor platform. The lifespan of a system utilizing the signal processing platform often spans decades, while the specialized nature of processor hardware and software makes it difficult and costly to upgrade. As a result, the signal processing systems often run on outdated technology, algorithms are difficult to update, and system effectiveness is impaired by the inability to rapidly respond to new threats. A new design approach is made possible by three developments: Moore's Law-driven improvement in computational throughput; a newly introduced vector computing capability in general-purpose processors; and a modern set of open interface software standards. Today's multiprocessor commercial-off-the-shelf (COTS) platforms have sufficient throughput to support interceptor signal processing requirements. This application may be programmed under existing real-time operating systems using parallel processing software libraries, resulting in highly portable code that can be rapidly migrated to new platforms as processor technology evolves. Use of standardized development tools and third-party software upgrades is enabled, as well as rapid upgrades of processing components as improved algorithms are developed. The resulting weapon system will have a superior processing capability over a custom approach at the time of deployment as a result of shorter development cycles and the use of newer technology. The signal processing computer may be upgraded over the lifecycle of the weapon system, and can migrate between weapon system variants, enabled by the simplicity of modification. This paper presents a reference design using the new approach that utilizes an Altivec PowerPC parallel COTS platform. It uses a VxWorks-based real-time operating system (RTOS), and application code developed using an efficient parallel vector library (PVL). A quantification of computing requirements and a demonstration of an interceptor algorithm operating on this real-time platform are provided.
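As a small illustration of the front-end video processing step mentioned above, a two-point non-uniformity correction derives per-pixel gain and offset from two uniform calibration scenes. This generic sketch is not the paper's PVL-based implementation.

import numpy as np

def two_point_nuc(raw, cal_low, cal_high, t_low=0.0, t_high=1.0):
    """Two-point non-uniformity correction for an IR focal-plane array.

    cal_low, cal_high : per-pixel responses to uniform scenes at two known
    reference levels t_low and t_high (e.g. two blackbody temperatures).
    Per-pixel gain and offset flatten the array response.
    """
    gain = (t_high - t_low) / (cal_high - cal_low)
    offset = t_low - gain * cal_low
    return gain * raw + offset

# Simulated array with pixel-to-pixel gain and offset spread.
rng = np.random.default_rng(9)
true_gain = rng.normal(1.0, 0.1, (16, 16))
true_off = rng.normal(0.0, 0.05, (16, 16))
scene = 0.6 * np.ones((16, 16))
raw = true_gain * scene + true_off
cal_low = true_gain * 0.0 + true_off
cal_high = true_gain * 1.0 + true_off
print(np.allclose(two_point_nuc(raw, cal_low, cal_high), scene))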
NASA Technical Reports Server (NTRS)
1976-01-01
Only a few efforts are currently underway to develop an adequate technology base for the various themes. Particular attention must be given to software commonality and evolutionary capability, to increased system integrity and autonomy, and to improved communications among the program users, the program developers, and the programs themselves. There is a need for a quantum improvement in software development methods and for increasing the awareness of software by all concerned. Major thrusts identified include: (1) data and systems management; (2) software technology for autonomous systems; (3) technology and methods for improving the software development process; (4) advances related to systems of software elements, including their architecture, their attributes as systems, and their interfaces with users and other systems; and (5) applications of software, including both the basic algorithms used in a number of applications and the software specific to a particular theme or discipline area. The impact of each theme on software is assessed.
Reuse Metrics for Object Oriented Software
NASA Technical Reports Server (NTRS)
Bieman, James M.
1998-01-01
One way to increase the quality of software products and the productivity of software development is to reuse existing software components when building new software systems. In order to monitor improvements in reuse, the level of reuse must be measured. In this NASA supported project we (1) derived a suite of metrics which quantify reuse attributes for object oriented, object based, and procedural software, (2) designed prototype tools to take these measurements in Ada, C++, Java, and C software, (3) evaluated the reuse in available software, (4) analyzed the relationship between coupling, cohesion, inheritance, and reuse, (5) collected object oriented software systems for our empirical analyses, and (6) developed quantitative criteria and methods for restructuring software to improve reusability.
PET/MRI for Neurological Applications
Catana, Ciprian; Drzezga, Alexander; Heiss, Wolf-Dieter; Rosen, Bruce R.
2013-01-01
PET and MRI provide complementary information in the study of the human brain. Simultaneous PET/MR data acquisition allows the spatial and temporal correlation of the measured signals, opening up opportunities impossible to realize using stand-alone instruments. This paper reviews the methodological improvements and potential neurological and psychiatric applications of this novel technology. We first present methods for improving the performance and information content of each modality by using the information provided by the other technique. On the PET side, we discuss methods that use the simultaneously acquired MR data to improve the PET data quantification. On the MR side, we present how improved PET quantification could be used to validate a number of MR techniques. Finally, we describe promising research, translational and clinical applications that could benefit from these advanced tools. PMID:23143086
Utilization of remote sensing techniques for the quantification of fire behavior in two pine stands
Eric V. Mueller; Nicholas Skowronski; Kenneth Clark; Michael Gallagher; Robert Kremens; Jan C. Thomas; Mohamad El Houssami; Alexander Filkov; Rory M. Hadden; William Mell; Albert Simeoni
2017-01-01
Quantification of field-scale fire behavior is necessary to improve the current scientific understanding of wildland fires and to develop and test relevant, physics-based models. In particular, detailed descriptions of individual fires are required, for which the available literature is limited. In this work, two such field-scale experiments, carried out in pine stands...
HPLC Quantification of astaxanthin and canthaxanthin in Salmonidae eggs.
Tzanova, Milena; Argirova, Mariana; Atanasov, Vasil
2017-04-01
Astaxanthin and canthaxanthin are naturally occurring antioxidants referred to as xanthophylls. They are used as food additives in fish farms to improve the organoleptic qualities of salmonid products and to prevent reproductive diseases. This study reports the development and single-laboratory validation of a rapid method for quantification of astaxanthin and canthaxanthin in eggs of rainbow trout (Oncorhynchus mykiss) and brook trout (Salvelinus fontinalis М.). An advantage of the proposed method is the perfect combination of selective extraction of the xanthophylls and analysis of the extract by high-performance liquid chromatography and photodiode array detection. The method validation was carried out in terms of linearity, accuracy, precision, recovery and limits of detection and quantification. The method was applied for simultaneous quantification of the two xanthophylls in eggs of rainbow trout and brook trout after their selective extraction. The results show that astaxanthin accumulations in salmonid fish eggs are larger than those of canthaxanthin. As the levels of these two xanthophylls affect fish fertility, this method can be used to improve the nutritional quality and to minimize the occurrence of the M74 syndrome in fish populations. Copyright © 2016 John Wiley & Sons, Ltd.
Huillet, Céline; Adrait, Annie; Lebert, Dorothée; Picard, Guillaume; Trauchessec, Mathieu; Louwagie, Mathilde; Dupuis, Alain; Hittinger, Luc; Ghaleh, Bijan; Le Corvoisier, Philippe; Jaquinod, Michel; Garin, Jérôme; Bruley, Christophe; Brun, Virginie
2012-01-01
Development of new biomarkers needs to be significantly accelerated to improve diagnostic, prognostic, and toxicity monitoring as well as therapeutic follow-up. Biomarker evaluation is the main bottleneck in this development process. Selected Reaction Monitoring (SRM) combined with stable isotope dilution has emerged as a promising option to speed this step, particularly because of its multiplexing capacities. However, analytical variability caused by upstream sample handling or incomplete trypsin digestion still needs to be resolved. In 2007, we developed the PSAQ™ method (Protein Standard Absolute Quantification), which uses full-length isotope-labeled protein standards to quantify target proteins. In the present study we used clinically validated cardiovascular biomarkers (LDH-B, CKMB, myoglobin, and troponin I) to demonstrate that the combination of PSAQ and SRM (PSAQ-SRM) allows highly accurate biomarker quantification in serum samples. A multiplex PSAQ-SRM assay was used to quantify these biomarkers in clinical samples from myocardial infarction patients. Good correlation between PSAQ-SRM and ELISA assay results was found and demonstrated the consistency between these analytical approaches. Thus, PSAQ-SRM has the capacity to improve both accuracy and reproducibility in protein analysis. This will be a major contribution to efficient biomarker development strategies. PMID:22080464
Kobayashi, Yuto; Kamishima, Tamotsu; Sugimori, Hiroyuki; Ichikawa, Shota; Noguchi, Atsushi; Kono, Michihito; Iiyama, Toshitake; Sutherland, Kenneth; Atsumi, Tatsuya
2018-03-01
Synovitis, which is a hallmark of rheumatoid arthritis (RA), needs to be precisely quantified to determine the treatment plan. Time-intensity curve (TIC) shape analysis is an objective assessment method for characterizing pixels as artery, inflamed synovium, or other tissues using dynamic contrast-enhanced MRI (DCE-MRI). The aim was to assess the feasibility of our original arterial mask subtraction method (AMSM) with mutual information (MI) for quantification of synovitis in RA. In this prospective study, ten RA patients (nine women and one man; mean age 56.8 years; range 38-67 years) underwent 3T DCE-MRI. After optimization of TIC shape analysis for the hand region, a combination of TIC shape analysis and AMSM was applied to synovial quantification. The MI between pre- and post-contrast images was utilized to determine the arterial mask phase objectively, and this objective choice was compared with human subjective selection. The volume of synovitis measured objectively by the software was compared with that obtained by manual outlining by an experienced radiologist. Simple TIC shape analysis and TIC shape analysis combined with AMSM were compared in slices without synovitis according to subjective evaluation. Statistical tests included Pearson's correlation coefficient, the paired t-test, and the intraclass correlation coefficient (ICC). TIC shape analysis was successfully optimized for the hand region with a correlation coefficient of 0.725 (P < 0.01) against manual assessment regarded as ground truth. Objective selection utilizing MI had substantial agreement (ICC = 0.734) with subjective selection. Correlation of synovial volumetry combining TIC shape analysis and AMSM with manual assessment was excellent (r = 0.922, P < 0.01). In addition, negative predictive ability in slices without synovitis pixels was significantly increased (P < 0.01). The combination of TIC shape analysis and image subtraction reinforced with MI can accurately quantify synovitis of RA in the hand by eliminating arterial pixels. Level of Evidence: 2. Technical Efficacy: Stage 2. J. Magn. Reson. Imaging 2018. © 2018 International Society for Magnetic Resonance in Medicine.
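The mutual-information criterion used to pick the arterial mask phase can be sketched generically from a joint intensity histogram of two frames. The code below is a plain MI estimate with toy images, not the study's pipeline (which also involves phase selection rules and masking).

import numpy as np

def mutual_information(img_a, img_b, bins=32):
    """Mutual information between two images from their joint histogram."""
    joint, _, _ = np.histogram2d(img_a.ravel(), img_b.ravel(), bins=bins)
    pxy = joint / joint.sum()
    px = pxy.sum(axis=1, keepdims=True)
    py = pxy.sum(axis=0, keepdims=True)
    nz = pxy > 0                      # avoid log(0) for empty histogram cells
    return float(np.sum(pxy[nz] * np.log(pxy[nz] / (px @ py)[nz])))

# Toy usage: a "post-contrast" frame that is a noisy copy of the pre-contrast one.
rng = np.random.default_rng(4)
pre = rng.random((64, 64))
post = pre + rng.normal(0, 0.05, pre.shape)
print(mutual_information(pre, post))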
FracPaQ: a MATLAB™ toolbox for the quantification of fracture patterns
NASA Astrophysics Data System (ADS)
Healy, David; Rizzo, Roberto; Farrell, Natalie; Watkins, Hannah; Cornwell, David; Gomez-Rivas, Enrique; Timms, Nick
2017-04-01
The patterns of fractures in deformed rocks are rarely uniform or random. Fracture orientations, sizes, shapes and spatial distributions often exhibit some kind of order. In detail, there may be relationships among the different fracture attributes e.g. small fractures dominated by one orientation, larger fractures by another. These relationships are important because the mechanical (e.g. strength, anisotropy) and transport (e.g. fluids, heat) properties of rock depend on these fracture patterns and fracture attributes. This presentation describes an open source toolbox to quantify fracture patterns, including distributions in fracture attributes and their spatial variation. Software has been developed to quantify fracture patterns from 2-D digital images, such as thin section micrographs, geological maps, outcrop or aerial photographs or satellite images. The toolbox comprises a suite of MATLAB™ scripts based on published quantitative methods for the analysis of fracture attributes: orientations, lengths, intensity, density and connectivity. An estimate of permeability in 2-D is made using a parallel plate model. The software provides an objective and consistent methodology for quantifying fracture patterns and their variations in 2-D across a wide range of length scales. Our current focus for the application of the software is on quantifying crack and fracture patterns in and around fault zones. There is a large body of published work on the quantification of relatively simple joint patterns, but fault zones present a bigger, and arguably more important, challenge. The methods presented are inherently scale independent, and a key task will be to analyse and integrate quantitative fracture pattern data from micro- to macro-scales. New features in this release include multi-scale analyses based on a wavelet method to look for scale transitions, support for multi-colour traces in the input file processed as separate fracture sets, and combining fracture traces from multiple 2-D images to derive the statistically equivalent 3-D fracture pattern expressed as a 2nd rank crack tensor.
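Two of the quantities FracPaQ reports can be written down compactly: trace density and intensity from mapped trace segments, and a parallel-plate (cubic-law) permeability estimate. The sketch below uses standard textbook definitions in Python rather than the MATLAB toolbox itself, with made-up trace coordinates.

import numpy as np

def fracture_stats(traces, area):
    """Basic 2-D fracture-trace statistics from line segments.

    traces : list of ((x1, y1), (x2, y2)) trace segments.
    area   : area of the mapped region (same length units squared).
    Returns trace density P20 (count/area), intensity P21 (length/area)
    and the mean trace length.
    """
    lengths = np.array([np.hypot(x2 - x1, y2 - y1)
                        for (x1, y1), (x2, y2) in traces])
    return {"P20": len(lengths) / area,
            "P21": lengths.sum() / area,
            "mean_length": lengths.mean()}

def parallel_plate_permeability(aperture, spacing):
    """Cubic-law estimate for a set of parallel fractures: k = b^3 / (12 s)."""
    return aperture**3 / (12.0 * spacing)

traces = [((0, 0), (3, 4)), ((1, 1), (4, 1)), ((2, 0), (2, 5))]
print(fracture_stats(traces, area=25.0))
print(parallel_plate_permeability(aperture=1e-4, spacing=0.5))   # metres -> m^2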
DOE Office of Scientific and Technical Information (OSTI.GOV)
Fang, Y; Huang, H; Su, T
Purpose: Texture-based quantification of image heterogeneity has been a popular topic for imaging studies in recent years. As previous studies mainly focus on oncological applications, we report our recent efforts of applying such techniques to cardiac perfusion imaging. A fully automated procedure has been developed to perform texture analysis for measuring image heterogeneity. Clinical data were used to evaluate the preliminary performance of such methods. Methods: Myocardial perfusion images of Thallium-201 scans were collected from 293 patients with suspected coronary artery disease. Each subject underwent a Tl-201 scan and a percutaneous coronary intervention (PCI) within three months. The PCI result was used as the gold standard of coronary ischemia, defined as more than 70% stenosis. Each Tl-201 scan was spatially normalized to an image template for fully automatic segmentation of the left ventricle (LV). The segmented voxel intensities were then carried into the texture analysis with our open-source software Chang Gung Image Texture Analysis toolbox (CGITA). To evaluate the clinical performance of the image heterogeneity for detecting coronary stenosis, receiver operating characteristic (ROC) analysis was used to compute the overall accuracy, sensitivity and specificity as well as the area under the curve (AUC). Those indices were compared to those obtained from the commercially available semi-automatic software QPS. Results: With the fully automatic procedure to quantify heterogeneity from Tl-201 scans, we were able to achieve good discrimination with good accuracy (74%), sensitivity (73%), specificity (77%) and an AUC of 0.82. Such performance is similar to that obtained from the semi-automatic QPS software, which gives a sensitivity of 71% and specificity of 77%. Conclusion: Based on fully automatic procedures of data processing, our preliminary data indicate that the image heterogeneity of myocardial perfusion imaging can provide useful information for automatic determination of myocardial ischemia.
Lavallée-Adam, Mathieu; Rauniyar, Navin; McClatchy, Daniel B; Yates, John R
2014-12-05
The majority of large-scale proteomics quantification methods yield long lists of quantified proteins that are often difficult to interpret and poorly reproduced. Computational approaches are required to analyze such intricate quantitative proteomics data sets. We propose a statistical approach to computationally identify protein sets (e.g., Gene Ontology (GO) terms) that are significantly enriched with abundant proteins with reproducible quantification measurements across a set of replicates. To this end, we developed PSEA-Quant, a protein set enrichment analysis algorithm for label-free and label-based protein quantification data sets. It offers an alternative approach to classic GO analyses, models protein annotation biases, and allows the analysis of samples originating from a single condition, unlike analogous approaches such as GSEA and PSEA. We demonstrate that PSEA-Quant produces results complementary to GO analyses. We also show that PSEA-Quant provides valuable information about the biological processes involved in cystic fibrosis using label-free protein quantification of a cell line expressing a CFTR mutant. Finally, PSEA-Quant highlights the differences in the mechanisms taking place in the human, rat, and mouse brain frontal cortices based on tandem mass tag quantification. Our approach, which is available online, will thus improve the analysis of proteomics quantification data sets by providing meaningful biological insights.
Geological modelling of mineral deposits for prediction in mining
NASA Astrophysics Data System (ADS)
Sides, E. J.
Accurate prediction of the shape, location, size and properties of the solid rock materials to be extracted during mining is essential for reliable technical and financial planning. This is achieved through geological modelling of the three-dimensional (3D) shape and properties of the materials present in mineral deposits, and the presentation of results in a form which is accessible to mine planning engineers. In recent years the application of interactive graphics software, offering 3D database handling, modelling and visualisation, has greatly enhanced the options available for predicting the subsurface limits and characteristics of mineral deposits. A review of conventional 3D geological interpretation methods, and the model structures and modelling methods used in reserve estimation and mine planning software packages, illustrates the importance of such approaches in the modern mining industry. Despite the widespread introduction and acceptance of computer hardware and software in mining applications, in recent years, there has been little fundamental change in the way in which geology is used in orebody modelling for predictive purposes. Selected areas of current research, aimed at tackling issues such as the use of orientation data, quantification of morphological differences, incorporation of geological age relationships, multi-resolution models and the application of virtual reality hardware and software, are discussed.
Using iRT, a normalized retention time for more targeted measurement of peptides.
Escher, Claudia; Reiter, Lukas; MacLean, Brendan; Ossola, Reto; Herzog, Franz; Chilton, John; MacCoss, Michael J; Rinner, Oliver
2012-04-01
Multiple reaction monitoring (MRM) has recently become the method of choice for targeted quantitative measurement of proteins using mass spectrometry. The method, however, is limited in the number of peptides that can be measured in one run. This number can be markedly increased by scheduling the acquisition if the accurate retention time (RT) of each peptide is known. Here we present iRT, an empirically derived dimensionless peptide-specific value that allows for highly accurate RT prediction. The iRT of a peptide is a fixed number relative to a standard set of reference iRT-peptides that can be transferred across laboratories and chromatographic systems. We show that iRT facilitates the setup of multiplexed experiments with acquisition windows more than four times smaller compared to in silico RT predictions resulting in improved quantification accuracy. iRTs can be determined by any laboratory and shared transparently. The iRT concept has been implemented in Skyline, the most widely used software for MRM experiments. © 2012 WILEY-VCH Verlag GmbH & Co. KGaA, Weinheim.
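The iRT concept reduces to a linear calibration: regress the fixed iRT values of the reference peptides on their observed retention times in each run, then map all other peptides through that fit. The sketch below uses hypothetical reference values (not the actual reference-kit numbers) to show the conversion.

import numpy as np

def fit_irt_calibration(reference_rt, reference_irt):
    """Fit the linear mapping from observed RT to iRT using reference peptides."""
    slope, intercept = np.polyfit(reference_rt, reference_irt, deg=1)
    return slope, intercept

def rt_to_irt(observed_rt, slope, intercept):
    """Convert observed retention times to the dimensionless iRT scale."""
    return slope * np.asarray(observed_rt, dtype=float) + intercept

# Hypothetical reference peptides: observed RTs (min) vs. their fixed iRT values.
ref_rt  = [5.2, 12.8, 19.5, 27.1, 33.9, 41.0]
ref_irt = [0.0, 25.0, 50.0, 75.0, 100.0, 125.0]
m, b = fit_irt_calibration(ref_rt, ref_irt)
print(rt_to_irt([15.0, 30.0], m, b))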
Cellular neural networks, the Navier-Stokes equation, and microarray image reconstruction.
Zineddin, Bachar; Wang, Zidong; Liu, Xiaohui
2011-11-01
Although the last decade has witnessed a great deal of improvement in microarray technology, major developments are still needed in all the main stages of this technology, including image processing. Some hardware implementations of microarray image processing have been proposed in the literature and proved to be promising alternatives to the currently available software systems. However, the main drawback of those proposed approaches is that they do not address quantification of the gene spot in a realistic way, i.e., without any assumption about the image surface. Our aim in this paper is to present a new image-reconstruction algorithm using a cellular neural network that solves the Navier-Stokes equation. This algorithm offers a robust method for estimating the background signal within the gene-spot region. The MATCNN toolbox for Matlab is used to test the proposed method. Quantitative comparisons, in terms of objective criteria, are carried out between our approach and some other available methods. It is shown that the proposed algorithm gives highly accurate and realistic measurements in a fully automated manner and in a remarkably short time.
Taylor, C R
2014-08-01
The traditional microscope, together with the "routine" hematoxylin and eosin (H & E) stain, remains the "gold standard" for diagnosis of cancer and other diseases; remarkably, it and the majority of associated biological stains are more than 150 years old. Immunohistochemistry has added to the repertoire of "stains" available. Because of the need for specific identification and even measurement of "biomarkers," immunohistochemistry has increased the demand for consistency of performance and interpretation of staining results. Rapid advances in the capabilities of digital imaging hardware and software now offer a realistic route to improved reproducibility, accuracy and quantification by utilizing whole slide digital images for diagnosis, education and research. There also are potential efficiencies in work flow and the promise of powerful new analytical methods; however, there also are challenges with respect to validation of the quality and fidelity of digital images, including the standard H & E stain, so that diagnostic performance by pathologists is not compromised when they rely on whole slide images instead of traditional stained tissues on glass slides.
Huebsch, Nathaniel; Loskill, Peter; Mandegar, Mohammad A.; Marks, Natalie C.; Sheehan, Alice S.; Ma, Zhen; Mathur, Anurag; Nguyen, Trieu N.; Yoo, Jennie C.; Judge, Luke M.; Spencer, C. Ian; Chukka, Anand C.; Russell, Caitlin R.; So, Po-Lin
2015-01-01
Contractile motion is the simplest metric of cardiomyocyte health in vitro, but unbiased quantification is challenging. We describe a rapid automated method, requiring only standard video microscopy, to analyze the contractility of human-induced pluripotent stem cell-derived cardiomyocytes (iPS-CM). New algorithms for generating and filtering motion vectors combined with a newly developed isogenic iPSC line harboring genetically encoded calcium indicator, GCaMP6f, allow simultaneous user-independent measurement and analysis of the coupling between calcium flux and contractility. The relative performance of these algorithms, in terms of improving signal to noise, was tested. Applying these algorithms allowed analysis of contractility in iPS-CM cultured over multiple spatial scales from single cells to three-dimensional constructs. This open source software was validated with analysis of isoproterenol response in these cells, and can be applied in future studies comparing the drug responsiveness of iPS-CM cultured in different microenvironments in the context of tissue engineering. PMID:25333967
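A generic way to obtain motion vectors from a pair of video frames is block matching. The sketch below is a simplified stand-in for the motion-vector generation and filtering algorithms described above (not the published open-source code), and reports a mean motion magnitude as a crude contractility proxy.

import numpy as np

def block_motion(frame_a, frame_b, block=8, search=4):
    """Dense block-matching motion vectors between two video frames.

    For every block in frame_a, find the displacement (within +/- search
    pixels) that minimizes the sum of absolute differences in frame_b.
    Returns an array of (dy, dx) vectors, one per block.
    """
    h, w = frame_a.shape
    vecs = []
    for y in range(0, h - block + 1, block):
        for x in range(0, w - block + 1, block):
            ref = frame_a[y:y + block, x:x + block]
            best, best_v = np.inf, (0, 0)
            for dy in range(-search, search + 1):
                for dx in range(-search, search + 1):
                    yy, xx = y + dy, x + dx
                    if 0 <= yy <= h - block and 0 <= xx <= w - block:
                        cand = frame_b[yy:yy + block, xx:xx + block]
                        sad = np.abs(ref - cand).sum()
                        if sad < best:
                            best, best_v = sad, (dy, dx)
            vecs.append(best_v)
    return np.array(vecs)

# Toy usage: a frame shifted by (1, 2) pixels stands in for tissue motion.
rng = np.random.default_rng(5)
f0 = rng.random((32, 32))
f1 = np.roll(f0, shift=(1, 2), axis=(0, 1))
v = block_motion(f0, f1)
print(np.mean(np.linalg.norm(v, axis=1)))   # mean motion magnitude per block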
Droplet Digital™ PCR Next-Generation Sequencing Library QC Assay.
Heredia, Nicholas J
2018-01-01
Digital PCR is a valuable tool to quantify next-generation sequencing (NGS) libraries precisely and accurately. Accurately quantifying NGS libraries enables accurate loading of the libraries onto the sequencer and thus improves sequencing performance by reducing underloading and overloading errors. Accurate quantification also benefits users by enabling uniform loading of indexed/barcoded libraries, which in turn greatly improves sequencing uniformity of the indexed/barcoded samples. The advantages gained by employing the Droplet Digital PCR (ddPCR™) library QC assay include precise and accurate quantification in addition to size quality assessment, enabling users to QC their sequencing libraries with confidence.
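The underlying droplet-count arithmetic in digital PCR is a Poisson estimate. The sketch below shows the standard calculation with an assumed nominal droplet volume; it is generic and not specific to the library QC assay described above.

import math

def ddpcr_concentration(n_positive, n_total, droplet_volume_nl=0.85):
    """Poisson estimate of target concentration from droplet counts.

    lambda = -ln(fraction of negative droplets) is the mean copies per
    droplet; dividing by droplet volume converts to copies per microliter.
    The 0.85 nL droplet volume is an assumed, instrument-dependent value.
    """
    neg_fraction = (n_total - n_positive) / n_total
    lam = -math.log(neg_fraction)                 # mean copies per droplet
    copies_per_ul = lam / (droplet_volume_nl * 1e-3)
    return copies_per_ul

# Example: 4,500 positive droplets out of 18,000 accepted droplets.
print(round(ddpcr_concentration(4500, 18000), 1), "copies/uL")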
Current trends in quantitative proteomics - an update.
Li, H; Han, J; Pan, J; Liu, T; Parker, C E; Borchers, C H
2017-05-01
Proteins can provide insights into biological processes at the functional level, so they are very promising biomarker candidates. The quantification of proteins in biological samples has been routinely used for the diagnosis of diseases and monitoring the treatment. Although large-scale protein quantification in complex samples is still a challenging task, a great amount of effort has been made to advance the technologies that enable quantitative proteomics. Seven years ago, in 2009, we wrote an article about the current trends in quantitative proteomics. In writing this current paper, we realized that, today, we have an even wider selection of potential tools for quantitative proteomics. These tools include new derivatization reagents, novel sampling formats, new types of analyzers and scanning techniques, and recently developed software to assist in assay development and data analysis. In this review article, we will discuss these innovative methods, and their current and potential applications in proteomics. Copyright © 2017 John Wiley & Sons, Ltd.
Tolivia, Jorge; Navarro, Ana; del Valle, Eva; Perez, Cristina; Ordoñez, Cristina; Martínez, Eva
2006-02-01
We describe a simple method to achieve the differential selection and subsequent quantification of the chromogen signal strength using only one section. Several methods for performing quantitative histochemistry, immunocytochemistry, or hybridocytochemistry without specific commercial image analysis systems rely on pixel-counting algorithms, which do not provide information on the amount of chromogen present in the section. Other techniques use complex algorithms to calculate the cumulative signal strength using two consecutive sections. To separate the chromogen signal, we used the "Color range" option of the Adobe Photoshop program, which provides a specific file for a particular chromogen selection that can be applied to similar sections. The measurement of the chromogen signal strength of the specific staining is achieved with the Scion Image software program. The method described in this paper can also be applied to the simultaneous detection of different signals on the same section, or to other parameters (area of particles, number of particles, etc.) when the "Analyze particles" tool of the Scion program is used.
Porto, Alessandra Beggiato; Okazaki, Victor Hugo Alves
2017-10-01
Thoracic kyphosis and lumbar lordosis can be quantified in different ways, among them radiography and photogrammetry. However, the assessment procedures described in the literature are not consistent for either method. The objective of this study was to conduct a literature review of postural assessment through radiography and photogrammetry in order to delineate the procedures for both methods. In total, 38 studies were selected by an online search of the MEDLINE and LILACS databases with the keywords: radiograph and posture, postural alignment, photogrammetry or photometry or biophotogrammetry. For the radiographic method, the results showed divergences in arm positioning and in the calculation of thoracic and lumbar angles. Photogrammetry demonstrated differences in relation to camera, tripod, plumb line, and feet positioning, angle calculation, software utilization, and the use of footwear. Standardization is proposed for both methods to help establish normative values and comparisons between diagnoses. Copyright © 2017 Elsevier Ltd. All rights reserved.
Large-scale time-lapse microscopy of Oct4 expression in human embryonic stem cell colonies.
Bhadriraju, Kiran; Halter, Michael; Amelot, Julien; Bajcsy, Peter; Chalfoun, Joe; Vandecreme, Antoine; Mallon, Barbara S; Park, Kye-Yoon; Sista, Subhash; Elliott, John T; Plant, Anne L
2016-07-01
Identification and quantification of the characteristics of stem cell preparations is critical for understanding stem cell biology and for the development and manufacturing of stem cell based therapies. We have developed image analysis and visualization software that allows effective use of time-lapse microscopy to provide spatial and dynamic information from large numbers of human embryonic stem cell colonies. To achieve statistically relevant sampling, we examined >680 colonies from 3 different preparations of cells over 5 days each, generating a total experimental dataset of 0.9 terabytes (TB). The 0.5-gigapixel images at each time point were represented by multi-resolution pyramids and visualized using the Deep Zoom JavaScript library, extended to support viewing gigapixel images over time and extracting data on individual colonies. We present a methodology that enables quantification of variations in nominally identical preparations and between colonies, correlation of colony characteristics with Oct4 expression, and identification of rare events. Copyright © 2016. Published by Elsevier B.V.
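A multi-resolution pyramid of the kind served by Deep Zoom viewers can be sketched by repeated 2x2 block averaging. This toy example (grayscale array, no tiling or file output) only illustrates the idea; it is not the authors' pipeline or the Deep Zoom library.

import numpy as np

def build_pyramid(image, min_size=64):
    """Multi-resolution pyramid by repeated 2x2 block averaging.

    Viewers can serve such levels as tiles so that gigapixel images are
    browsable without loading full resolution at once.
    """
    levels = [image]
    while min(levels[-1].shape[:2]) > min_size:
        a = levels[-1]
        h, w = (a.shape[0] // 2) * 2, (a.shape[1] // 2) * 2
        a = a[:h, :w]                                        # crop to even size
        down = a.reshape(h // 2, 2, w // 2, 2).mean(axis=(1, 3))
        levels.append(down)
    return levels

rng = np.random.default_rng(6)
img = rng.random((1024, 768))
pyr = build_pyramid(img)
print([lvl.shape for lvl in pyr])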
Bergmeister, Konstantin D; Gröger, Marion; Aman, Martin; Willensdorfer, Anna; Manzano-Szalai, Krisztina; Salminger, Stefan; Aszmann, Oskar C
2016-08-01
Skeletal muscle consists of different fiber types which adapt to exercise, aging, disease, or trauma. Here we present a protocol for fast staining, automatic acquisition, and quantification of fiber populations with ImageJ. Biceps and lumbrical muscles were harvested from Sprague-Dawley rats. Quadruple immunohistochemical staining was performed on single sections using antibodies against myosin heavy chains and secondary fluorescent antibodies. Slides were scanned automatically with a slide scanner. Manual and automatic analyses were performed and compared statistically. The protocol provided rapid and reliable staining for automated image acquisition. Analyses between manual and automatic data indicated Pearson correlation coefficients for biceps of 0.645-0.841 and 0.564-0.673 for lumbrical muscles. Relative fiber populations were accurate to a degree of ± 4%. This protocol provides a reliable tool for quantification of muscle fiber populations. Using freely available software, it decreases the required time to analyze whole muscle sections. Muscle Nerve 54: 292-299, 2016. © 2016 Wiley Periodicals, Inc.
NASA Astrophysics Data System (ADS)
Hadjidoukas, P. E.; Angelikopoulos, P.; Papadimitriou, C.; Koumoutsakos, P.
2015-03-01
We present Π4U, an extensible framework for non-intrusive Bayesian Uncertainty Quantification and Propagation (UQ+P) of complex and computationally demanding physical models that can exploit massively parallel computer architectures. The framework incorporates Laplace asymptotic approximations as well as stochastic algorithms, along with distributed numerical differentiation and task-based parallelism for heterogeneous clusters. Sampling is based on the Transitional Markov Chain Monte Carlo (TMCMC) algorithm and its variants. The optimization tasks associated with the asymptotic approximations are treated via the Covariance Matrix Adaptation Evolution Strategy (CMA-ES). A modified subset simulation method is used for posterior reliability measurements of rare events. The framework accommodates scheduling of multiple physical model evaluations based on an adaptive load balancing library and shows excellent scalability. In addition to the software framework, we also provide guidelines as to the applicability and efficiency of Bayesian tools when applied to computationally demanding physical models. Theoretical and computational developments are demonstrated with applications drawn from molecular dynamics, structural dynamics and granular flow.
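TMCMC moves through a sequence of tempered posteriors whose likelihood exponent grows from 0 to 1. Purely as an illustration of that idea, and not the Π4U implementation, the sketch below shows one tempering stage: the next exponent is chosen so that the coefficient of variation of the incremental importance weights hits a target, and the samples (assumed to be a NumPy array) are then resampled. The per-chain Metropolis perturbation step of the full algorithm is omitted.

```python
import numpy as np

def tmcmc_stage(samples, loglik, p_old, target_cov=1.0, rng=None):
    """One tempering stage: pick the next exponent by bisection so the
    incremental weights have a target coefficient of variation, then resample.
    (The MCMC perturbation of each resampled chain is omitted here.)"""
    rng = np.random.default_rng() if rng is None else rng
    lo, hi = p_old, 1.0
    for _ in range(60):                                  # bisection on the exponent
        mid = 0.5 * (lo + hi)
        w = np.exp((mid - p_old) * (loglik - loglik.max()))
        if w.std() / w.mean() > target_cov:
            hi = mid
        else:
            lo = mid
    p_new = hi
    w = np.exp((p_new - p_old) * (loglik - loglik.max()))
    idx = rng.choice(len(samples), size=len(samples), p=w / w.sum())
    return samples[idx], p_new
```

Repeating this stage, with a few Metropolis steps applied to the resampled chains each time, until the exponent reaches 1 yields posterior samples; it is this embarrassingly parallel per-chain work that frameworks of this kind distribute across a cluster.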
Space Flight Software Development Software for Intelligent System Health Management
NASA Technical Reports Server (NTRS)
Trevino, Luis C.; Crumbley, Tim
2004-01-01
The slide presentation examines the Marshall Space Flight Center Flight Software Branch, including software development projects, mission critical space flight software development, software technical insight, advanced software development technologies, and continuous improvement in the software development processes and methods.
Validation of a free software for unsupervised assessment of abdominal fat in MRI.
Maddalo, Michele; Zorza, Ivan; Zubani, Stefano; Nocivelli, Giorgio; Calandra, Giulio; Soldini, Pierantonio; Mascaro, Lorella; Maroldi, Roberto
2017-05-01
To demonstrate the accuracy of unsupervised (fully automated) software for fat segmentation in magnetic resonance imaging. The proposed software is a freeware solution developed in ImageJ that enables the quantification of metabolically different adipose tissues in large cohort studies. The lumbar part of the abdomen (19 cm in craniocaudal direction, centered at L3) of eleven healthy volunteers (age range: 21-46 years, BMI range: 21.7-31.6 kg/m²) was examined in a breath hold on expiration with a GE T1 Dixon sequence. Single-slice and volumetric data were considered for each subject. The results of the visceral and subcutaneous adipose tissue assessments obtained by the unsupervised software were compared to reference supervised segmentations. The associated statistical analysis included Pearson correlations, Bland-Altman plots and volumetric differences (VD%). Values calculated by the unsupervised software significantly correlated with the corresponding reference supervised segmentations for both subcutaneous adipose tissue - SAT (R=0.9996, p<0.001) and visceral adipose tissue - VAT (R=0.995, p<0.001). Bland-Altman plots showed the absence of systematic errors and a limited spread of the differences. In the single-slice analysis, VD% was (1.6±2.9)% for SAT and (4.9±6.9)% for VAT. In the volumetric analysis, VD% was (1.3±0.9)% for SAT and (2.9±2.7)% for VAT. The developed software is capable of segmenting the metabolically different adipose tissues with a high degree of accuracy. This free add-on software for ImageJ can easily achieve widespread use and enable large-scale population studies regarding the adipose tissue and its related diseases. Copyright © 2017 Associazione Italiana di Fisica Medica. Published by Elsevier Ltd. All rights reserved.
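A minimal sketch of the agreement statistics named above (Bland-Altman bias with 95% limits of agreement, and a per-subject volumetric difference) is shown below. It assumes paired automated and reference volumes are available as arrays and that VD% is a signed relative difference; it is not the authors' ImageJ code.

```python
import numpy as np

def agreement_stats(auto_vol, ref_vol):
    """Bland-Altman bias, 95% limits of agreement, and mean/SD of VD%."""
    auto_vol = np.asarray(auto_vol, float)
    ref_vol = np.asarray(ref_vol, float)
    diff = auto_vol - ref_vol
    bias, sd = diff.mean(), diff.std(ddof=1)
    loa = (bias - 1.96 * sd, bias + 1.96 * sd)   # limits of agreement
    vd_pct = 100.0 * diff / ref_vol              # signed volumetric difference (%)
    return bias, loa, vd_pct.mean(), vd_pct.std(ddof=1)

# e.g. agreement_stats([1510, 980, 2210], [1490, 1005, 2180]) on SAT volumes (cc)
```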
Conte, Gian Marco; Castellano, Antonella; Altabella, Luisa; Iadanza, Antonella; Cadioli, Marcello; Falini, Andrea; Anzalone, Nicoletta
2017-04-01
Dynamic susceptibility contrast MRI (DSC) and dynamic contrast-enhanced MRI (DCE) are useful tools in the diagnosis and follow-up of brain gliomas; nevertheless, both techniques leave the open issue of data reproducibility. We evaluated the reproducibility of data obtained using two different commercial software packages for perfusion map calculation and analysis, as one of the potential sources of variability can be the software itself. DSC and DCE analyses from 20 patients with gliomas were tested for both the intrasoftware (as intraobserver and interobserver reproducibility) and the intersoftware reproducibility, as well as the impact of different postprocessing choices [vascular input function (VIF) selection and deconvolution algorithms] on the quantification of the perfusion biomarkers plasma volume (Vp), volume transfer constant (Ktrans) and rCBV. Data reproducibility was evaluated with the intraclass correlation coefficient (ICC) and Bland-Altman analysis. For all the biomarkers, the intra- and interobserver reproducibility resulted in almost perfect agreement within each software package, whereas for the intersoftware reproducibility the values ranged from 0.311 to 0.577, suggesting fair to moderate agreement; Bland-Altman analysis showed high dispersion of the data, thus confirming these findings. Comparisons of different VIF estimation methods for DCE biomarkers resulted in an ICC of 0.636 for Ktrans and 0.662 for Vp; comparison of two deconvolution algorithms in DSC resulted in an ICC of 0.999. The use of a single software package ensures very good intraobserver and interobserver reproducibility. Caution should be taken when comparing data obtained using different software or different postprocessing within the same software, as reproducibility is no longer guaranteed.
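The intraclass correlation coefficients above correspond to a two-way random-effects, absolute-agreement, single-measures model, commonly written ICC(2,1). As a hedged illustration rather than the packages' own statistics, it can be computed from a subjects-by-software matrix of biomarker values as sketched below; the array names in the usage comment are placeholders.

```python
import numpy as np

def icc_2_1(ratings):
    """ICC(2,1): two-way random effects, absolute agreement, single measures.

    ratings: array of shape (n_subjects, k_raters), e.g. one perfusion
    biomarker (rCBV, Vp or Ktrans) per patient, computed by each software.
    """
    x = np.asarray(ratings, float)
    n, k = x.shape
    grand = x.mean()
    ss_rows = k * ((x.mean(axis=1) - grand) ** 2).sum()      # subjects
    ss_cols = n * ((x.mean(axis=0) - grand) ** 2).sum()      # raters/software
    ss_err = ((x - grand) ** 2).sum() - ss_rows - ss_cols
    msr = ss_rows / (n - 1)
    msc = ss_cols / (k - 1)
    mse = ss_err / ((n - 1) * (k - 1))
    return (msr - mse) / (msr + (k - 1) * mse + k * (msc - mse) / n)

# e.g. icc_2_1(np.column_stack([ktrans_software_A, ktrans_software_B]))  # placeholder arrays
```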
Diagnostic evaluation of three cardiac software packages using a consecutive group of patients
2011-01-01
Purpose The aim of this study was to compare the diagnostic performance of the three software packages 4DMSPECT (4DM), Emory Cardiac Toolbox (ECTb), and Cedars Quantitative Perfusion SPECT (QPS) for quantification of myocardial perfusion scintigrams (MPS) using a large group of consecutive patients. Methods We studied 1,052 consecutive patients who underwent 2-day stress/rest 99mTc-sestamibi MPS studies. The reference/gold-standard classifications for the MPS studies were obtained from three physicians, with more than 25 years each of experience in nuclear cardiology, who re-evaluated all MPS images. Automatic processing was carried out using the 4DM, ECTb, and QPS software packages. Total stress defect extent (TDE) and summed stress score (SSS) based on a 17-segment model were obtained from the software packages. Receiver-operating characteristic (ROC) analysis was performed. Results A total of 734 patients were classified as normal and the remaining 318 were classified as having infarction and/or ischemia. The performance of the software packages, calculated as the area under the SSS ROC curve, was 0.87 for 4DM, 0.80 for QPS, and 0.76 for ECTb (QPS vs. ECTb p = 0.03; other differences p < 0.0001). The areas under the TDE ROC curve were 0.87 for 4DM, 0.82 for QPS, and 0.76 for ECTb (QPS vs. ECTb p = 0.0005; other differences p < 0.0001). Conclusion There are considerable differences in performance between the three software packages, with 4DM showing the best performance and ECTb the worst. These differences in performance should be taken into consideration when software packages are used in clinical routine or in clinical studies. PMID:22214226
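For readers wanting to reproduce this kind of comparison on their own data, the sketch below shows an area-under-the-ROC-curve calculation with scikit-learn. The score and label arrays are synthetic placeholders standing in for summed stress scores and reference classifications; they are not the study data.

```python
import numpy as np
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(0)
n = 200
y_true = rng.integers(0, 2, n)                 # 1 = infarction and/or ischemia (toy labels)

# toy summed stress scores for three hypothetical packages
scores = {"pkg_A": y_true * 4 + rng.normal(0, 3, n),
          "pkg_B": y_true * 3 + rng.normal(0, 3, n),
          "pkg_C": y_true * 2 + rng.normal(0, 3, n)}

for name, sss in scores.items():
    print(name, round(roc_auc_score(y_true, sss), 3))   # higher AUC = better separation
```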
Software Process Improvement: Supporting the Linking of the Software and the Business Strategies
NASA Astrophysics Data System (ADS)
Albuquerque, Adriano Bessa; Rocha, Ana Regina; Lima, Andreia Cavalcanti
The market is becoming more and more competitive, many products and services depend on software products, and software is one of the most important assets influencing organizations’ businesses. In this context, we can observe that companies must deal carefully with software, whether developing or acquiring it. One of the perspectives that can help to take advantage of software, effectively supporting the business, is to invest in the organization’s software processes. This paper presents an approach to evaluate and improve the process assets of software organizations, based on internationally well-known standards and process models. This approach is supported by automated tools from the TABA Workstation and is part of a wider improvement strategy constituted of three layers (organizational layer, process execution layer and external entity layer). Moreover, this paper presents the experience of using the approach and the results obtained.
Improving a data-acquisition software system with abstract data type components
NASA Technical Reports Server (NTRS)
Howard, S. D.
1990-01-01
Abstract data types and object-oriented design are active research areas in computer science and software engineering. Much of the interest is aimed at new software development. Abstract data type packages developed for a discontinued software project were used to improve a real-time data-acquisition system under maintenance. The result saved effort and contributed to a significant improvement in the performance, maintainability, and reliability of the Goldstone Solar System Radar Data Acquisition System.
Kehoe, Helen
2017-01-01
Changes to the software used in general practice could improve the collection of the Aboriginal and Torres Strait Islander status of all patients, and boost access to healthcare measures specifically for Aboriginal and Torres Strait Islander peoples provided directly or indirectly by general practitioners (GPs). Despite longstanding calls for improvements to general practice software to better support Aboriginal and Torres Strait Islander health, little change has been made. The aim of this article is to promote software improvements by identifying desirable software attributes and encouraging GPs to promote their adoption. Establishing strong links between collecting Aboriginal and Torres Strait Islander status, clinical decision supports, and uptake of GP-mediated health measures specifically for Aboriginal and Torres Strait Islander peoples - and embedding these links in GP software - is a long overdue reform. In the absence of government initiatives in this area, GPs are best placed to advocate for software changes, using the model described here as a starting point for action.
Brown, Jason L; Bennett, Joseph R; French, Connor M
2017-01-01
SDMtoolbox 2.0 is a software package for spatial studies of ecology, evolution, and genetics. The release of SDMtoolbox 2.0 allows researchers to use the most current ArcGIS software and MaxEnt software, and reduces the amount of time that would be spent developing common solutions. The central aim of this software is to automate complicated and repetitive spatial analyses in an intuitive graphical user interface. One core tenet is to facilitate careful parameterization of species distribution models (SDMs) to maximize each model's discriminatory ability and minimize overfitting. This includes careful processing of occurrence data and environmental data, and careful model parameterization. This program directly interfaces with MaxEnt, one of the most powerful and widely used species distribution modeling software programs, although SDMtoolbox 2.0 is not limited to species distribution modeling or restricted to modeling in MaxEnt. Many of the SDM pre- and post-processing tools have 'universal' analogs for use with any modeling software. The current version contains a total of 79 scripts that harness the power of ArcGIS for macroecology, landscape genetics, and evolutionary studies. For example, these tools allow for biodiversity quantification (such as species richness or corrected weighted endemism), generation of least-cost paths and corridors among shared haplotypes, assessment of the significance of spatial randomizations, and enforcement of dispersal limitations of SDMs projected into future climates, to name only a few of the functions contained in SDMtoolbox 2.0. Lastly, dozens of generalized tools exist for batch processing and conversion of GIS data types or formats, which are broadly useful to any ArcMap user.
De Spiegelaere, Ward; Malatinkova, Eva; Lynch, Lindsay; Van Nieuwerburgh, Filip; Messiaen, Peter; O'Doherty, Una; Vandekerckhove, Linos
2014-06-01
Quantification of integrated proviral HIV DNA by repetitive-sampling Alu-HIV PCR is a candidate virological tool to monitor the HIV reservoir in patients. However, the experimental procedures and data analysis of the assay are complex and hinder its widespread use. Here, we provide an improved and simplified data analysis method by adopting binomial and Poisson statistics. A modified analysis method based on Poisson statistics was used to analyze the binomial data of positive and negative reactions from a 42-replicate Alu-HIV PCR using dilutions of an integration standard and samples from 57 HIV-infected patients. Results were compared with the quantitative output of the previously described Alu-HIV PCR method. Poisson-based quantification of the Alu-HIV PCR was linearly correlated with the standard dilution series, indicating that absolute quantification with the Poisson method is a valid alternative for data analysis of repetitive-sampling Alu-HIV PCR data. Quantitative outputs of patient samples assessed by the Poisson method correlated with the previously described Alu-HIV PCR analysis, indicating that this method is a valid alternative for quantifying integrated HIV DNA. Poisson-based analysis of the Alu-HIV PCR data enables absolute quantification without the need for a standard dilution curve. Implementation of confidence interval (CI) estimation permits improved qualitative analysis of the data and provides a statistical basis for the required minimal number of technical replicates. © 2014 The American Association for Clinical Chemistry.
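The zero-class Poisson estimator underlying this kind of analysis is simple: if a fraction f of replicate reactions is negative, the mean number of targets per reaction is -ln f. The sketch below illustrates that statistic with a Clopper-Pearson interval on the negative fraction propagated to the estimate; it is an illustration of the mathematics, not the authors' analysis code, and the example counts are invented.

```python
import numpy as np
from scipy import stats

def poisson_quantify(n_positive, n_replicates, conf=0.95):
    """Mean targets per reaction from the fraction of negative replicates."""
    n_neg = n_replicates - n_positive
    assert 0 < n_neg < n_replicates, "all-positive or all-negative runs need re-dilution"
    lam = -np.log(n_neg / n_replicates)                          # zero-class estimator
    a = (1 - conf) / 2
    p_lo = stats.beta.ppf(a, n_neg, n_replicates - n_neg + 1)    # Clopper-Pearson bounds
    p_hi = stats.beta.ppf(1 - a, n_neg + 1, n_replicates - n_neg)
    return lam, -np.log(p_hi), -np.log(p_lo)                     # estimate, low, high

print(poisson_quantify(n_positive=17, n_replicates=42))          # toy numbers
```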
Meunier, Carl J; Roberts, James G; McCarty, Gregory S; Sombers, Leslie A
2017-02-15
Background-subtracted fast-scan cyclic voltammetry (FSCV) has emerged as a powerful analytical technique for monitoring subsecond molecular fluctuations in live brain tissue. Despite increasing utilization of FSCV, efforts to improve the accuracy of quantification have been limited due to the complexity of the technique and the dynamic recording environment. It is clear that variable electrode performance renders calibration necessary for accurate quantification; however, the nature of in vivo measurements can make conventional postcalibration difficult, or even impossible. Analyte-specific voltammograms and scaling factors that are critical for quantification can shift or fluctuate in vivo. This is largely due to impedance changes, and the effects of impedance on these measurements have not been characterized. We have previously reported that the background current can be used to predict electrode-specific scaling factors in situ. In this work, we employ model circuits to investigate the impact of impedance on FSCV measurements. Additionally, we take another step toward in situ electrode calibration by using the oxidation potential of quinones on the electrode surface to accurately predict the oxidation potential for dopamine at any point in an electrochemical experiment, as both are dependent on impedance. The model, validated both in adrenal slice and live brain tissue, enables information encoded in the shape of the background voltammogram to determine electrochemical parameters that are critical for accurate quantification. This improves data interpretation and provides a significant next step toward more automated methods for in vivo data analysis.
Leveraging OpenStudio's Application Programming Interfaces: Preprint
DOE Office of Scientific and Technical Information (OSTI.GOV)
Long, N.; Ball, B.; Goldwasser, D.
2013-11-01
OpenStudio development efforts have been focused on providing Application Programming Interfaces (APIs) where users are able to extend OpenStudio without the need to compile the open source libraries. This paper will discuss the basic purposes and functionalities of the core libraries that have been wrapped with APIs, including the Building Model, Results Processing, Advanced Analysis, Uncertainty Quantification, and Data Interoperability through Translators. Several building energy modeling applications have been produced using OpenStudio's API and Software Development Kits (SDK), including the United States Department of Energy's Asset Score Calculator, a mobile-based audit tool, an energy design assistance reporting protocol, and a portfolio-scale incentive optimization analysis methodology. Each of these software applications will be discussed briefly and will describe how the APIs were leveraged for various uses including high-level modeling, data transformations from detailed building audits, error checking/quality assurance of models, and use of high-performance computing for mass simulations.
MRMPROBS suite for metabolomics using large-scale MRM assays.
Tsugawa, Hiroshi; Kanazawa, Mitsuhiro; Ogiwara, Atsushi; Arita, Masanori
2014-08-15
We developed a new software environment for the metabolome analysis of large-scale multiple reaction monitoring (MRM) assays. It supports the data formats of four major mass spectrometer vendors and the mzML common data format. This program provides a processing pipeline from raw-format import to high-dimensional statistical analyses. The novel aspect is graphical user interface-based visualization to perform peak quantification, to interpolate missing values and to normalize peaks interactively based on quality control samples. Together with the software platform, an MRM standard library of 301 metabolites with 775 transitions is also available, which contributes to reliable peak identification by using retention time and ion abundances. MRMPROBS is available for Windows OS under the creative-commons by-attribution license at http://prime.psc.riken.jp. © The Author 2014. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
Evangelista, Dennis J.; Ray, Dylan D.; Hedrick, Tyson L.
2016-01-01
Ecological, behavioral and biomechanical studies often need to quantify animal movement and behavior in three dimensions. In laboratory studies, a common tool to accomplish these measurements is the use of multiple, calibrated high-speed cameras. Until very recently, the complexity, weight and cost of such cameras have made their deployment in field situations risky; furthermore, such cameras are not affordable to many researchers. Here, we show how inexpensive, consumer-grade cameras can adequately accomplish these measurements both within the laboratory and in the field. Combined with our methods and open source software, the availability of inexpensive, portable and rugged cameras will open up new areas of biological study by providing precise 3D tracking and quantification of animal and human movement to researchers in a wide variety of field and laboratory contexts. PMID:27444791
The Rényi divergence enables accurate and precise cluster analysis for localisation microscopy.
Staszowska, Adela D; Fox-Roberts, Patrick; Hirvonen, Liisa M; Peddie, Christopher J; Collinson, Lucy M; Jones, Gareth E; Cox, Susan
2018-06-01
Clustering analysis is a key technique for quantitatively characterising structures in localisation microscopy images. To build up accurate information about biological structures, it is critical that the quantification is both accurate (close to the ground truth) and precise (has small scatter and is reproducible). Here we describe how the Rényi divergence can be used for cluster radius measurements in localisation microscopy data. We demonstrate that the Rényi divergence can operate with high levels of background and provides results which are more accurate than Ripley's functions, Voronoi tessellation or DBSCAN. Data supporting this research will be made accessible via a web link. Software codes developed for this work can be accessed via http://coxphysics.com/Renyi_divergence_software.zip. Implemented in C++. Correspondence and requests for materials can also be addressed to the corresponding author: adela.staszowska@gmail.com or susan.cox@kcl.ac.uk. Supplementary data are available at Bioinformatics online.
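For reference, the Rényi divergence of order alpha between two discrete distributions P and Q is D_alpha(P||Q) = 1/(alpha - 1) * log( sum_i p_i^alpha * q_i^(1-alpha) ), reducing to the Kullback-Leibler divergence as alpha approaches 1. A minimal implementation of the divergence itself is sketched below; the cluster-radius estimator described in the paper is built on top of this quantity and is not reproduced here.

```python
import numpy as np

def renyi_divergence(p, q, alpha):
    """Rényi divergence D_alpha(P||Q) between two discrete distributions.

    Assumes q > 0 wherever p > 0; p and q are normalised internally.
    """
    p = np.asarray(p, float); q = np.asarray(q, float)
    p, q = p / p.sum(), q / q.sum()
    if np.isclose(alpha, 1.0):                       # limit -> Kullback-Leibler
        mask = p > 0
        return float(np.sum(p[mask] * np.log(p[mask] / q[mask])))
    return float(np.log(np.sum(p ** alpha * q ** (1.0 - alpha))) / (alpha - 1.0))

print(renyi_divergence([0.5, 0.3, 0.2], [0.4, 0.4, 0.2], alpha=2.0))
```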
P-MartCancer-Interactive Online Software to Enable Analysis of Shotgun Cancer Proteomic Datasets.
Webb-Robertson, Bobbie-Jo M; Bramer, Lisa M; Jensen, Jeffrey L; Kobold, Markus A; Stratton, Kelly G; White, Amanda M; Rodland, Karin D
2017-11-01
P-MartCancer is an interactive web-based software environment that enables statistical analyses of peptide or protein data, quantitated from mass spectrometry-based global proteomics experiments, without requiring in-depth knowledge of statistical programming. P-MartCancer offers a series of statistical modules associated with quality assessment, peptide and protein statistics, protein quantification, and exploratory data analyses driven by the user via customized workflows and interactive visualization. Currently, P-MartCancer offers access and the capability to analyze multiple cancer proteomic datasets generated through the Clinical Proteomics Tumor Analysis Consortium at the peptide, gene, and protein levels. P-MartCancer is deployed as a web service (https://pmart.labworks.org/cptac.html), alternatively available via Docker Hub (https://hub.docker.com/r/pnnl/pmart-web/). Cancer Res; 77(21); e47-50. ©2017 American Association for Cancer Research.
TASI: A software tool for spatial-temporal quantification of tumor spheroid dynamics.
Hou, Yue; Konen, Jessica; Brat, Daniel J; Marcus, Adam I; Cooper, Lee A D
2018-05-08
Spheroid cultures derived from explanted cancer specimens are an increasingly utilized resource for studying complex biological processes like tumor cell invasion and metastasis, representing an important bridge between the simplicity and practicality of 2-dimensional monolayer cultures and the complexity and realism of in vivo animal models. Temporal imaging of spheroids can capture the dynamics of cell behaviors and microenvironments, and when combined with quantitative image analysis methods, enables deep interrogation of biological mechanisms. This paper presents a comprehensive open-source software framework for Temporal Analysis of Spheroid Imaging (TASI) that allows investigators to objectively characterize spheroid growth and invasion dynamics. TASI performs spatiotemporal segmentation of spheroid cultures, extraction of features describing spheroid morpho-phenotypes, mathematical modeling of spheroid dynamics, and statistical comparisons of experimental conditions. We demonstrate the utility of this tool in an analysis of non-small cell lung cancer spheroids that exhibit variability in metastatic and proliferative behaviors.
Mapping RNA-seq Reads with STAR
Dobin, Alexander; Gingeras, Thomas R.
2015-01-01
Mapping of large sets of high-throughput sequencing reads to a reference genome is one of the foundational steps in RNA-seq data analysis. The STAR software package performs this task with high levels of accuracy and speed. In addition to detecting annotated and novel splice junctions, STAR is capable of discovering more complex RNA sequence arrangements, such as chimeric and circular RNA. STAR can align spliced sequences of any length with moderate error rates providing scalability for emerging sequencing technologies. STAR generates output files that can be used for many downstream analyses such as transcript/gene expression quantification, differential gene expression, novel isoform reconstruction, signal visualization, and so forth. In this unit we describe computational protocols that produce various output files, use different RNA-seq datatypes, and utilize different mapping strategies. STAR is Open Source software that can be run on Unix, Linux or Mac OS X systems. PMID:26334920
Mapping RNA-seq Reads with STAR.
Dobin, Alexander; Gingeras, Thomas R
2015-09-03
Mapping of large sets of high-throughput sequencing reads to a reference genome is one of the foundational steps in RNA-seq data analysis. The STAR software package performs this task with high levels of accuracy and speed. In addition to detecting annotated and novel splice junctions, STAR is capable of discovering more complex RNA sequence arrangements, such as chimeric and circular RNA. STAR can align spliced sequences of any length with moderate error rates, providing scalability for emerging sequencing technologies. STAR generates output files that can be used for many downstream analyses such as transcript/gene expression quantification, differential gene expression, novel isoform reconstruction, and signal visualization. In this unit, we describe computational protocols that produce various output files, use different RNA-seq datatypes, and utilize different mapping strategies. STAR is open source software that can be run on Unix, Linux, or Mac OS X systems. Copyright © 2015 John Wiley & Sons, Inc.
Dual Approach To Superquantile Estimation And Applications To Density Fitting
2016-06-01
We incorporate additional constraints to improve the fidelity of density estimates in tail regions. We limit our investigation to data with heavy tails, where risk quantification is typically the most difficult. Demonstrations are provided using samples from various heavy-tailed distributions. Subject terms: probability density estimation, epi-splines, optimization, risk quantification.
NASA Astrophysics Data System (ADS)
Valaparla, Sunil K.; Peng, Qi; Gao, Feng; Clarke, Geoffrey D.
2014-03-01
Accurate measurements of human body fat distribution are desirable because excessive body fat is associated with impaired insulin sensitivity, type 2 diabetes mellitus (T2DM) and cardiovascular disease. In this study, we hypothesized that the performance of water suppressed (WS) MRI is superior to non-water suppressed (NWS) MRI for volumetric assessment of abdominal subcutaneous (SAT), intramuscular (IMAT), visceral (VAT), and total (TAT) adipose tissues. We acquired T1-weighted images on a 3T MRI system (TIM Trio, Siemens), which was analyzed using semi-automated segmentation software that employs a fuzzy c-means (FCM) clustering algorithm. Sixteen contiguous axial slices, centered at the L4-L5 level of the abdomen, were acquired in eight T2DM subjects with water suppression (WS) and without (NWS). Histograms from WS images show improved separation of non-fatty tissue pixels from fatty tissue pixels, compared to NWS images. Paired t-tests of WS versus NWS showed a significantly lower lipid volume in the WS images for VAT (145.3 cc less, p=0.006) and IMAT (305 cc less, p<0.001), but not SAT (14.1 cc more, NS). WS measurements of TAT also resulted in lower fat volumes (436.1 cc less, p=0.002). There was a strong correlation between WS and NWS quantification methods for SAT measurements (r=0.999), but poorer correlation for VAT studies (r=0.845). These results suggest that NWS pulse sequences may overestimate adipose tissue volumes and that WS pulse sequences are more desirable due to the higher contrast generated between fatty and non-fatty tissues.
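The fuzzy c-means step mentioned above alternates between membership and centroid updates. A minimal one-dimensional sketch on voxel intensities is shown below, purely to illustrate the algorithm; the actual segmentation software's implementation (including any spatial regularisation or semi-automated steps) is not reproduced, and the `image` variable in the usage comment is hypothetical.

```python
import numpy as np

def fuzzy_cmeans_1d(x, c=2, m=2.0, iters=100, tol=1e-5, seed=0):
    """Minimal fuzzy c-means on a 1-D intensity vector x of shape (n,)."""
    x = np.asarray(x, float)
    rng = np.random.default_rng(seed)
    u = rng.dirichlet(np.ones(c), size=len(x))            # memberships, shape (n, c)
    for _ in range(iters):
        um = u ** m
        centers = (um * x[:, None]).sum(0) / um.sum(0)    # fuzzily weighted centroids
        d = np.abs(x[:, None] - centers[None, :]) + 1e-12 # distances, shape (n, c)
        p = 2.0 / (m - 1.0)
        u_new = 1.0 / (d ** p * (d ** -p).sum(axis=1, keepdims=True))
        if np.abs(u_new - u).max() < tol:
            u = u_new
            break
        u = u_new
    return centers, u

# e.g. centers, u = fuzzy_cmeans_1d(image.ravel())   # `image` is a hypothetical 2-D slice
#      labels = u.argmax(1).reshape(image.shape)     # fat = cluster with the higher center
```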
Sheikhzadeh, Fahime; Ward, Rabab K; Carraro, Anita; Chen, Zhao Yang; van Niekerk, Dirk; Miller, Dianne; Ehlen, Tom; MacAulay, Calum E; Follen, Michele; Lane, Pierre M; Guillaud, Martial
2015-10-24
Cervical cancer remains a major health problem, especially in developing countries. Colposcopic examination is used to detect high-grade lesions in patients with a history of abnormal pap smears. New technologies are needed to improve the sensitivity and specificity of this technique. We propose to test the potential of fluorescence confocal microscopy to identify high-grade lesions. We examined whether quantification of ex vivo confocal fluorescence microscopy images can differentiate among normal cervical tissue, low-grade Cervical Intraepithelial Neoplasia (CIN), and high-grade CIN. We sought to (1) quantify nuclear morphology and tissue architecture features by analyzing images of cervical biopsies; and (2) determine the accuracy of high-grade CIN detection via confocal microscopy relative to the accuracy of detection by colposcopic impression. Forty-six biopsies obtained from colposcopically normal and abnormal cervical sites were evaluated. Confocal images were acquired at different depths from the epithelial surface and histological images were analyzed using in-house software. The features calculated from the confocal images compared well with those features obtained from the histological images and histopathological reviews of the specimens (obtained by a gynecologic pathologist). The correlations between two of these features (the nuclear-cytoplasmic ratio and the average of three nearest Delaunay-neighbors distance) and the grade of dysplasia were higher than those of colposcopic impression. The sensitivities of detecting high-grade dysplasia by analysing images collected at the surface of the epithelium, and at 15 and 30 μm below the epithelial surface, were 100, 100, and 92%, respectively. Quantitative analysis of confocal fluorescence images showed its capacity for discriminating high-grade CIN lesions vs. low-grade CIN lesions and normal tissues, at different depths of imaging. This approach could be used to help clinicians identify high-grade CIN in clinical settings.
Woliner-van der Weg, Wietske; Deden, Laura N; Meeuwis, Antoi P W; Koenrades, Maaike; Peeters, Laura H C; Kuipers, Henny; Laanstra, Geert Jan; Gotthardt, Martin; Slump, Cornelis H; Visser, Eric P
2016-12-01
Quantitative single photon emission computed tomography (SPECT) is challenging, especially for pancreatic beta cell imaging with 111In-exendin due to high uptake in the kidneys versus much lower uptake in the nearby pancreas. Therefore, we designed a three-dimensionally (3D) printed phantom representing the pancreas and kidneys to mimic the human situation in beta cell imaging. The phantom was used to assess the effect of different reconstruction settings on the quantification of the pancreas uptake for two different, commercially available software packages. 3D-printed, hollow pancreas and kidney compartments were inserted into the National Electrical Manufacturers Association (NEMA) NU2 image quality phantom casing. These organs and the background compartment were filled with activities simulating relatively high and low pancreatic 111In-exendin uptake for, respectively, healthy humans and type 1 diabetes patients. Images were reconstructed using Siemens Flash 3D and Hermes Hybrid Recon, with varying numbers of iterations and subsets and corrections. Images were visually assessed on homogeneity and artefacts, and quantitatively by the pancreas-to-kidney activity concentration ratio. Phantom images were similar to clinical images and showed comparable artefacts. All corrections were required to clearly visualize the pancreas. Increased numbers of subsets and iterations improved the quantitative performance but decreased homogeneity both in the pancreas and the background. Based on the phantom analyses, the Hybrid Recon reconstruction with 6 iterations and 16 subsets was found to be most suitable for clinical use. This work strongly contributed to quantification of pancreatic 111In-exendin uptake. It showed how clinical images of 111In-exendin can be interpreted and enabled selection of the most appropriate protocol for clinical use.
ESA space spin-offs benefits for the health sector
NASA Astrophysics Data System (ADS)
Szalai, Bianca; Detsis, Emmanouil; Peeters, Walter
2012-11-01
Humanity will be faced with an important number of future challenges, including an expansion of the lifespan, a considerable increase of the population (estimated 9 billion by 2050) and a depletion of resources. These factors could trigger an increase of chronic diseases and various other health concerns that would bear a heavy weight on finances worldwide. Scientific advances can play an important role in solving a number of these problems; space technology, in general, can propose a panoply of possible solutions and applications that can make life on Earth easier and better for everyone. Satellites, Earth Observation, the International Space Station (ISS) and the European Space Agency (ESA) may not be the first tools that come to mind when thinking of improving health, yet there are many ways in which ESA and its programmes contribute to the health care arena. The research focuses on quantifying two ESA spin-offs to provide an initial view on how space can contribute to worldwide health. This quantification is part of the present strategy not only to show macroeconomic return factors for space in general, but also to identify and describe samples of 'best practice' type of examples close to the general public's interest. For each of the 'best practices' the methodology takes into account the cost of the space hardware/software, a number of tangible and intangible benefits, as well as some logical assumptions in order to determine the potential overall returns. Some of the hindering factors for a precise quantification are also highlighted. In conclusion, the study recommends a way in which ESA's spin-offs can be taken into account early on in the development process of space programmes in order to generate higher awareness with the general public and also to provide measurable returns.
Uncertainty Quantification for Robust Control of Wind Turbines using Sliding Mode Observer
NASA Astrophysics Data System (ADS)
Schulte, Horst
2016-09-01
A new quantification method of uncertain models for robust wind turbine control using sliding-mode techniques is presented with the objective of improving active load mitigation. This approach is based on the so-called equivalent output injection signal, which corresponds to the average behavior of the discontinuous switching term that establishes and maintains a motion on a so-called sliding surface. The injection signal is directly evaluated to obtain estimates of the uncertainty bounds of external disturbances and parameter uncertainties. The applicability of the proposed method is illustrated by the quantification of a four-degree-of-freedom model of the NREL 5MW reference turbine containing uncertainties.
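To make the equivalent-output-injection idea concrete, the toy sketch below runs a first-order sliding-mode observer against a scalar plant with an "unknown" bounded disturbance and recovers a bound on that disturbance by low-pass filtering the switching term. It is only an illustration of the principle; the plant, gains and filter constant are invented, and this is not the wind turbine model or the paper's implementation.

```python
import numpy as np

# Toy plant: dx/dt = a*x + u + d(t), with d unknown but bounded (|d| <= 0.5).
a, dt, T = -1.0, 1e-3, 5.0
t = np.arange(0.0, T, dt)
d = 0.5 * np.sin(2 * np.pi * 0.5 * t)            # "unknown" disturbance
u = np.zeros_like(t)

x = np.zeros_like(t)                              # plant state
xh = np.zeros_like(t)                             # observer state
nu_f = np.zeros_like(t)                           # low-pass filtered injection
L, tau = 1.0, 0.02                                # gain L > sup|d|; filter constant

for k in range(len(t) - 1):
    e = x[k] - xh[k]                              # output estimation error
    nu = L * np.sign(e)                           # discontinuous injection term
    x[k + 1] = x[k] + dt * (a * x[k] + u[k] + d[k])
    xh[k + 1] = xh[k] + dt * (a * xh[k] + u[k] + nu)
    nu_f[k + 1] = nu_f[k] + dt / tau * (nu - nu_f[k])   # equivalent output injection

d_bound = np.abs(nu_f[t > 1.0]).max()             # estimated disturbance bound
print(f"estimated |d| bound ~ {d_bound:.2f} (true value 0.5)")
```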
Improving the Effectiveness of Program Managers
2006-05-03
Presentation at the Systems and Software Technology Conference, Salt Lake City, Utah, May 3, 2006, presented by GAO. Topics include companies' best practices (Motorola, Caterpillar, Toyota, FedEx, NCR Teradata, Boeing, Hughes Space and Communications), disciplined software and management practices, total ownership costs, collection of metrics data to improve software reliability, and technology readiness levels and design maturity.
Almalki, Manal; Gray, Kathleen; Sanchez, Fernando Martin
2015-01-01
Self-quantification is seen as an emerging paradigm for health care self-management. Self-quantification systems (SQS) can be used for tracking, monitoring, and quantifying health aspects including mental, emotional, physical, and social aspects in order to gain self-knowledge. However, there has been a lack of a systematic approach for conceptualising and mapping the essential activities that are undertaken by individuals who are using SQS in order to improve health outcomes. In this paper, we propose a new model of personal health information self-quantification systems (PHI-SQS). PHI-SQS model describes two types of activities that individuals go through during their journey of health self-managed practice, which are 'self-quantification' and 'self-activation'. In this paper, we aimed to examine thoroughly the first type of activity in PHI-SQS which is 'self-quantification'. Our objectives were to review the data management processes currently supported in a representative set of self-quantification tools and ancillary applications, and provide a systematic approach for conceptualising and mapping these processes with the individuals' activities. We reviewed and compared eleven self-quantification tools and applications (Zeo Sleep Manager, Fitbit, Actipressure, MoodPanda, iBGStar, Sensaris Senspod, 23andMe, uBiome, Digifit, BodyTrack, and Wikilife), that collect three key health data types (Environmental exposure, Physiological patterns, Genetic traits). We investigated the interaction taking place at different data flow stages between the individual user and the self-quantification technology used. We found that these eleven self-quantification tools and applications represent two major tool types (primary and secondary self-quantification systems). In each type, the individuals experience different processes and activities which are substantially influenced by the technologies' data management capabilities. Self-quantification in personal health maintenance appears promising and exciting. However, more studies are needed to support its use in this field. The proposed model will in the future lead to developing a measure for assessing the effectiveness of interventions to support using SQS for health self-management (e.g., assessing the complexity of self-quantification activities, and activation of the individuals).
2015-01-01
Background Self-quantification is seen as an emerging paradigm for health care self-management. Self-quantification systems (SQS) can be used for tracking, monitoring, and quantifying health aspects including mental, emotional, physical, and social aspects in order to gain self-knowledge. However, there has been a lack of a systematic approach for conceptualising and mapping the essential activities that are undertaken by individuals who are using SQS in order to improve health outcomes. In this paper, we propose a new model of personal health information self-quantification systems (PHI-SQS). PHI-SQS model describes two types of activities that individuals go through during their journey of health self-managed practice, which are 'self-quantification' and 'self-activation'. Objectives In this paper, we aimed to examine thoroughly the first type of activity in PHI-SQS which is 'self-quantification'. Our objectives were to review the data management processes currently supported in a representative set of self-quantification tools and ancillary applications, and provide a systematic approach for conceptualising and mapping these processes with the individuals' activities. Method We reviewed and compared eleven self-quantification tools and applications (Zeo Sleep Manager, Fitbit, Actipressure, MoodPanda, iBGStar, Sensaris Senspod, 23andMe, uBiome, Digifit, BodyTrack, and Wikilife), that collect three key health data types (Environmental exposure, Physiological patterns, Genetic traits). We investigated the interaction taking place at different data flow stages between the individual user and the self-quantification technology used. Findings We found that these eleven self-quantification tools and applications represent two major tool types (primary and secondary self-quantification systems). In each type, the individuals experience different processes and activities which are substantially influenced by the technologies' data management capabilities. Conclusions Self-quantification in personal health maintenance appears promising and exciting. However, more studies are needed to support its use in this field. The proposed model will in the future lead to developing a measure for assessing the effectiveness of interventions to support using SQS for health self-management (e.g., assessing the complexity of self-quantification activities, and activation of the individuals). PMID:26019809
Ganapathy, Sreelatha; Muraleedharan, Aparna; Sathidevi, Puthumangalathu Savithri; Chand, Parkash; Rajkumar, Ravi Philip
2016-09-01
DNA damage analysis plays an important role in determining the approaches for treatment and prevention of various diseases like cancer, schizophrenia and other heritable diseases. Comet assay is a sensitive and versatile method for DNA damage analysis. The main objective of this work is to implement a fully automated tool for the detection and quantification of DNA damage by analysing comet assay images. The comet assay image analysis consists of four stages: (1) classifier (2) comet segmentation (3) comet partitioning and (4) comet quantification. Main features of the proposed software are the design and development of four comet segmentation methods, and the automatic routing of the input comet assay image to the most suitable one among these methods depending on the type of the image (silver stained or fluorescent stained) as well as the level of DNA damage (heavily damaged or lightly/moderately damaged). A classifier stage, based on a support vector machine (SVM), is designed and implemented at the front end, to categorise the input image into one of the above four groups to ensure proper routing. Comet segmentation is followed by comet partitioning which is implemented using a novel technique coined as modified fuzzy clustering. Comet parameters are calculated in the comet quantification stage and are saved in an Excel file. Our dataset consists of 600 silver stained images obtained from 40 Schizophrenia patients with different levels of severity, admitted to a tertiary hospital in South India and 56 fluorescent stained images obtained from different internet sources. The performance of "CometQ", the proposed standalone application for automated analysis of comet assay images, is evaluated by a clinical expert and is also compared with that of a recent, closely related software tool, OpenComet. CometQ gave 90.26% positive predictive value (PPV) and 93.34% sensitivity which are much higher than those of OpenComet, especially in the case of silver stained images. The results are validated using confusion matrix and Jaccard index (JI). Comet assay images obtained after DNA damage repair by incubation in the nutrient medium were also analysed, and CometQ showed a significant change in all the comet parameters in most of the cases. Results show that CometQ is an accurate and efficient tool with good sensitivity and PPV for DNA damage analysis using comet assay images. Copyright © 2016 Elsevier Ireland Ltd. All rights reserved.
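A hedged sketch of the front-end routing idea: an SVM trained on simple per-image features assigns each input to one of the four groups (silver/fluorescent stain crossed with heavy/light damage), which then selects the segmentation method. The features, data and pipeline below are placeholders for illustration, not the CometQ implementation.

```python
import numpy as np
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import SVC

# Placeholder training data: per-image feature vectors (e.g. histogram statistics)
# and labels 0-3 for the four routing classes described above.
rng = np.random.default_rng(0)
X_train = rng.normal(size=(200, 8))
y_train = rng.integers(0, 4, size=200)

router = make_pipeline(StandardScaler(), SVC(kernel="rbf", C=1.0))
router.fit(X_train, y_train)

X_new = rng.normal(size=(1, 8))                    # features of a new comet image
method_index = int(router.predict(X_new)[0])       # which segmentation method to run
print(method_index)
```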
Quantitative CMMI Assessment for Offshoring through the Analysis of Project Management Repositories
NASA Astrophysics Data System (ADS)
Sunetnanta, Thanwadee; Nobprapai, Ni-On; Gotel, Olly
The nature of distributed teams and the existence of multiple sites in offshore software development projects pose a challenging setting for software process improvement. Often, the improvement and appraisal of software processes is achieved through a turnkey solution where best practices are imposed or transferred from a company’s headquarters to its offshore units. In so doing, successful project health checks and monitoring for quality on software processes requires strong project management skills, well-built onshore-offshore coordination, and often needs regular onsite visits by software process improvement consultants from the headquarters’ team. This paper focuses on software process improvement as guided by the Capability Maturity Model Integration (CMMI) and proposes a model to evaluate the status of such improvement efforts in the context of distributed multi-site projects without some of this overhead. The paper discusses the application of quantitative CMMI assessment through the collection and analysis of project data gathered directly from project repositories to facilitate CMMI implementation and reduce the cost of such implementation for offshore-outsourced software development projects. We exemplify this approach to quantitative CMMI assessment through the analysis of project management data and discuss the future directions of this work in progress.
Kumar, Avs Anil; Kumar, P G; Swami, Ajay; Dinker, Yateendra
2018-01-01
After percutaneous transluminal coronary angioplasty (PTCA) following AMI (acute myocardial infarction), the perfusion defect and LV (left ventricular) function recover/change over a period of time. Analysis immediately after the procedure may not be a true depiction of the exact success of the procedure. There is varying and scanty information available on the natural course of changes in these parameters after a successful PTCA. We hypothesized that the majority of change occurs in the 3-4 month period. Hence, we undertook this study on the natural course of recovery/changes occurring in perfusion defect size and LV function in the first 3 months after primary angioplasty. MATERIAL AND METHODS: 30 consecutive cases of first AMI who were taken up for primary angioplasty were enrolled into the study. Resting MPI (myocardial perfusion imaging) was done within 24-72 hrs of admission using Tc-99m-Tetrofosmin and again after 10-14 weeks. Analysis of LVEF (left ventricular ejection fraction), summed segmental score and extent of perfusion defect was done. Images were processed using the autocardiac software of the Emory Toolbox, and quantification was done using the QPS (quantitative perfusion SPECT) and QGS (quantitative gated SPECT) software packages. A 20-segment scoring method was used for quantification on bull's eye images. Student's t test (two tailed, dependent) was used to find the significance of study parameters on a continuous scale within each group. Effect size was computed to find the effect. Pearson correlation between perfusion defect and LVEF was performed at the acute stage and after 10-14 weeks. The average acute perfusion defect extent was 19.76 ± 12.89%, which after 3 months became 16.79 ± 12.61%. The summed segmental score changed from 14.31 ± 10.58 to 11.38 ± 10.03 and LVEF improved from 48.40 ± 13.15% to 53.37 ± 12.8%. There was significant improvement in LVEF from the acute setting to 10-14 weeks (p = 0.001). There was significant lowering of the summed score (p = 0.007). Perfusion defect size showed significant reduction (p = 0.030). Three patients showed deterioration in perfusion defect size and in summed score with reduction in LVEF. Four patients had no change in any of the parameters. Correlation between perfusion defect and LVEF was strong both at baseline (r = -0.705, p < 0.001) and after 10-18 weeks (r = -0.766, p < 0.001). The changes we found at 3 months are similar to those in earlier studies, including studies with follow-up at 6 months to 1 year. We feel that 3 months is a sufficient time to accurately assess the success of primary angioplasty.
Software process improvement in the NASA software engineering laboratory
NASA Technical Reports Server (NTRS)
Mcgarry, Frank; Pajerski, Rose; Page, Gerald; Waligora, Sharon; Basili, Victor; Zelkowitz, Marvin
1994-01-01
The Software Engineering Laboratory (SEL) was established in 1976 for the purpose of studying and measuring software processes with the intent of identifying improvements that could be applied to the production of ground support software within the Flight Dynamics Division (FDD) at the National Aeronautics and Space Administration (NASA)/Goddard Space Flight Center (GSFC). The SEL has three member organizations: NASA/GSFC, the University of Maryland, and Computer Sciences Corporation (CSC). The concept of process improvement within the SEL focuses on the continual understanding of both process and product as well as goal-driven experimentation and analysis of process change within a production environment.
Ramezani, Alireza; Ahmadieh, Hamid; Azarmina, Mohsen; Soheilian, Masoud; Dehghan, Mohammad H; Mohebbi, Mohammad R
2009-12-01
To evaluate the validity of a new method for the quantitative analysis of fundus or angiographic images using Photoshop 7.0 (Adobe, USA) software by comparing it with clinical evaluation. Four hundred and eighteen fundus and angiographic images of diabetic patients were evaluated by three retina specialists and then by computer analysis using Photoshop 7.0 software. Four variables were selected for comparison: amount of hard exudates (HE) on color pictures, amount of HE on red-free pictures, severity of leakage, and the size of the foveal avascular zone (FAZ). The coefficients of agreement (kappa) between the two methods for the amount of HE on color and red-free photographs were 85% (0.69) and 79% (0.59), respectively. The agreement for severity of leakage was 72% (0.46). For the evaluation of the FAZ size using the magic wand and magnetic lasso tools, the agreement was 54% (0.09) and 89% (0.77), respectively. Agreement in the estimation of the FAZ size by the magnetic lasso tool was excellent and was almost as good in the quantification of HE on color and on red-free images. Considering the agreement of this new technique for the measurement of variables in fundus images using Photoshop software with the clinical evaluation, this method seems to have sufficient validity to be used for the quantitative analysis of HE, leakage, and FAZ size on the angiograms of diabetic patients.
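The coefficient of agreement reported above is Cohen's kappa. A minimal sketch of its computation for two graders is shown below with placeholder ordinal labels, not the study data.

```python
from sklearn.metrics import cohen_kappa_score

# Placeholder grades: clinical evaluation vs. Photoshop-based measurement,
# both mapped to the same ordinal categories.
clinical = ["mild", "moderate", "severe", "mild", "moderate", "moderate", "severe"]
software = ["mild", "moderate", "moderate", "mild", "moderate", "severe", "severe"]

kappa = cohen_kappa_score(clinical, software)
print(f"kappa = {kappa:.2f}")
```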
The research and practice of spacecraft software engineering
NASA Astrophysics Data System (ADS)
Chen, Chengxin; Wang, Jinghua; Xu, Xiaoguang
2017-06-01
In order to ensure the safety and reliability of spacecraft software products, it is necessary to apply engineering management. First, the paper introduces the problems of unsystematic planning, unclear classified management and the lack of a continuous improvement mechanism in domestic and foreign spacecraft software engineering management. It then proposes a solution for software engineering management based on a system-integrated ideology from the perspective of the spacecraft system. Finally, an application to a spacecraft is given as an example. The research provides a reference for carrying out spacecraft software engineering management and improving software product quality.
Software IV and V Research Priorities and Applied Program Accomplishments Within NASA
NASA Technical Reports Server (NTRS)
Blazy, Louis J.
2000-01-01
The mission of this research is to be world-class creators and facilitators of innovative, intelligent, high performance, reliable information technologies that enable NASA missions to (1) increase software safety and quality through error avoidance, early detection and resolution of errors, by utilizing and applying empirically based software engineering best practices; (2) ensure customer software risks are identified and/or that requirements are met and/or exceeded; (3) research, develop, apply, verify, and publish software technologies for competitive advantage and the advancement of science; and (4) facilitate the transfer of science and engineering data, methods, and practices to NASA, educational institutions, state agencies, and commercial organizations. The goals are to become a national Center Of Excellence (COE) in software and system independent verification and validation, and to become an international leading force in the field of software engineering for improving the safety, quality, reliability, and cost performance of software systems. This project addresses the following problems: ensuring the safety of NASA missions, ensuring requirements are met, minimizing programmatic and technological risks of software development and operations, improving software quality, reducing costs and time to delivery, and improving the science of software engineering.
Weather Augmented Risk Determination (WARD) System
NASA Astrophysics Data System (ADS)
Niknejad, M.; Mazdiyasni, O.; Momtaz, F.; AghaKouchak, A.
2017-12-01
Extreme climatic events have direct and indirect impacts on society, economy and the environment. Based on the United States Bureau of Economic Analysis (BEA) data, over one third of the U.S. GDP can be considered weather-sensitive, involving some degree of weather risk. This ranges from local-scale concrete foundation construction to large-scale transportation systems. Extreme and unexpected weather conditions have always been considered one of the probable risks to human health, productivity and activities. The construction industry is a large sector of the economy, and is also greatly influenced by weather-related risks including work stoppage and low labor productivity. Identification and quantification of these risks, and mitigation of their effects, are constant concerns of construction project managers. In addition to severe weather conditions' destructive effects, seasonal changes in weather conditions can also have negative impacts on human health. Work stoppage and reduced labor productivity can be caused by precipitation, wind, temperature, relative humidity and other weather conditions. Historical and project-specific weather information can support better project management and mitigation planning, and ultimately reduce the risk of weather-related conditions. This paper proposes new software for project-specific user-defined data analysis that offers (a) probability of work stoppage and the estimated project length considering weather conditions; (b) information on reduced labor productivity and its impacts on project duration; and (c) probabilistic information on the project timeline based on both weather-related work stoppage and labor productivity. The software (WARD System) is designed such that it can be integrated into already available project management tools. While the system and the presented application focus on the construction industry, the developed software is general and can be used for any application that involves labor productivity (e.g., farming) and work stoppage due to weather conditions (e.g., transportation, agriculture industry).
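As an illustration of the kind of calculation such a tool performs, and not the WARD System's implementation, the sketch below estimates monthly work-stoppage probabilities from a synthetic daily weather history under a user-defined stoppage rule and converts them into an expected calendar duration; the thresholds and distributions are invented.

```python
import numpy as np
import pandas as pd

# Synthetic daily weather history; a real analysis would load station records instead.
days = pd.date_range("2000-01-01", "2019-12-31", freq="D")
rng = np.random.default_rng(1)
wx = pd.DataFrame({
    "precip_mm": rng.gamma(0.4, 5.0, len(days)),
    "tmax_c": 15 + 12 * np.sin(2 * np.pi * days.dayofyear / 365.25)
              + rng.normal(0, 4, len(days)),
}, index=days)

# User-defined stoppage rule, e.g. outdoor concrete work halted by rain or cold.
stopped = (wx["precip_mm"] > 10) | (wx["tmax_c"] < 5)
p_stop_by_month = stopped.groupby(wx.index.month).mean()

# Expected calendar days for a task needing 20 workable days, starting in November.
work_days, month = 20, 11
expected_calendar_days = work_days / (1 - p_stop_by_month[month])
print(p_stop_by_month.round(2))
print(round(float(expected_calendar_days), 1))
```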
DOE Office of Scientific and Technical Information (OSTI.GOV)
Kronewitter, Scott R.; Slysz, Gordon W.; Marginean, Ioan
2014-05-31
Dense LC-MS datasets have convoluted extracted ion chromatograms with multiple chromatographic peaks that cloud the differentiation between intact compounds with their overlapping isotopic distributions, peaks due to in-source ion fragmentation, and noise. Making this differentiation is critical in glycomics datasets because chromatographic peaks correspond to different intact glycan structural isomers. GlyQ-IQ is targeted, chromatography-centric software designed for chromatogram and mass spectral data processing and subsequent glycan composition annotation. The targeted analysis approach offers several key advantages for LC-MS data processing and annotation over traditional algorithms. A priori information about the individual target’s elemental composition allows for exact isotope profile modeling for improved feature detection and increased sensitivity by focusing chromatogram generation and peak fitting on the isotopic species in the distribution having the highest intensity and data quality. Glycan target annotation is corroborated by glycan family relationships and in-source fragmentation detection. The GlyQ-IQ software is developed in this work (Part 1) and was used to profile N-glycan compositions from human serum LC-MS datasets. The companion manuscript GlyQ-IQ Part 2 discusses developments in human serum N-glycan sample preparation, glycan isomer separation, and glycan electrospray ionization. A case study is presented to demonstrate how GlyQ-IQ identifies and removes confounding chromatographic peaks from high mannose glycan isomers from human blood serum. In addition, GlyQ-IQ was used to generate a broad N-glycan profile from a high resolution (100K/60K) nESI-LC-MS/MS dataset including CID and HCD fragmentation acquired on a Velos Pro mass spectrometer. 101 glycan compositions and 353 isomer peaks were detected from a single sample. 99% of the GlyQ-IQ glycan-feature assignments passed manual validation and are backed with high resolution mass spectra and mass accuracies less than 7 ppm.
PyQuant: A Versatile Framework for Analysis of Quantitative Mass Spectrometry Data
Mitchell, Christopher J.; Kim, Min-Sik; Na, Chan Hyun; Pandey, Akhilesh
2016-01-01
Quantitative mass spectrometry data necessitates an analytical pipeline that captures the accuracy and comprehensiveness of the experiments. Currently, data analysis is often coupled to specific software packages, which restricts the analysis to a given workflow and precludes a more thorough characterization of the data by other complementary tools. To address this, we have developed PyQuant, a cross-platform mass spectrometry data quantification application that is compatible with existing frameworks and can be used as a stand-alone quantification tool. PyQuant supports most types of quantitative mass spectrometry data including SILAC, NeuCode, 15N, 13C, or 18O and chemical methods such as iTRAQ or TMT and provides the option of adding custom labeling strategies. In addition, PyQuant can perform specialized analyses such as quantifying isotopically labeled samples where the label has been metabolized into other amino acids and targeted quantification of selected ions independent of spectral assignment. PyQuant is capable of quantifying search results from popular proteomic frameworks such as MaxQuant, Proteome Discoverer, and the Trans-Proteomic Pipeline in addition to several standalone search engines. We have found that PyQuant routinely quantifies a greater proportion of spectral assignments, with increases ranging from 25–45% in this study. Finally, PyQuant is capable of complementing spectral assignments between replicates to quantify ions missed because of lack of MS/MS fragmentation or that were omitted because of issues such as spectra quality or false discovery rates. This results in an increase of biologically useful data available for interpretation. In summary, PyQuant is a flexible mass spectrometry data quantification platform that is capable of interfacing with a variety of existing formats and is highly customizable, which permits easy configuration for custom analysis. PMID:27231314
Song, Jiao; Liu, Xuejun; Wu, Jiejun; Meehan, Michael J; Blevitt, Jonathan M; Dorrestein, Pieter C; Milla, Marcos E
2013-02-15
We have developed an ultra-performance liquid chromatography-multiple reaction monitoring/mass spectrometry (UPLC-MRM/MS)-based, high-content, high-throughput platform that enables simultaneous profiling of multiple lipids produced ex vivo in human whole blood (HWB) on treatment with calcium ionophore and its modulation with pharmacological agents. HWB samples were processed in a 96-well plate format compatible with high-throughput sample processing instrumentation. We employed a scheduled MRM (sMRM) method, with a triple-quadrupole mass spectrometer coupled to a UPLC system, to measure absolute amounts of 122 distinct eicosanoids using deuterated internal standards. In a 6.5-min run, we resolved and detected with high sensitivity (lower limit of quantification in the range of 0.4-460 pg) all targeted analytes from a very small HWB sample (2.5 μl). Approximately 90% of the analytes exhibited a dynamic range exceeding 1000. We also developed a tailored software package that dramatically sped up the overall data quantification and analysis process with superior consistency and accuracy. Matrix effects from HWB and precision of the calibration curve were evaluated using this newly developed automation tool. This platform was successfully applied to the global quantification of changes on all 122 eicosanoids in HWB samples from healthy donors in response to calcium ionophore stimulation. Copyright © 2012 Elsevier Inc. All rights reserved.
Biofilm Quantification on Nasolacrimal Silastic Stents After Dacryocystorhinostomy.
Murphy, Jae; Ali, Mohammed Javed; Psaltis, Alkis James
2015-01-01
Biofilms are now recognized as potential factors in the pathogenesis of chronic inflammatory and infective diseases. The aim of this study was to examine the presence of biofilms and quantify their biomass on silastic nasolacrimal duct stents inserted after dacryocystorhinostomy (DCR). A prospective study was performed on a series of patients undergoing DCR with O'Donoghue stent insertion. After removal, the stents were subjected to biofilm analysis using standard protocols of confocal laser scanning microscopy (CLSM) and scanning electron microscopy. These stents were compared against negative controls and positive in vitro ones established using Staphylococcus aureus strain ATCC 25923. Biofilm quantification was performed using the COMSTAT2 software and the total biofilm biomass was calculated. A total of nine consecutive patient samples were included in this prospective study. None of the patients had any evidence of postoperative infection. All the stents demonstrated evidence of biofilm formation using both imaging modalities. The presence of various different sized organisms within a common exopolysaccharide matrix on CLSM suggested the existence of polymicrobial communities. The mean biomass of patient samples was 0.9385 μm³/μm² (range: 0.3901-1.9511 μm³/μm²). This is the first study to report the quantification of biomass on lacrimal stents. The presence of biofilms on lacrimal stents after DCR is a common finding but this need not necessarily translate to postoperative clinical infection.
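The COMSTAT-style biomass metric reported above (μm³/μm²) is, in essence, the thresholded biovolume divided by the substratum area. A minimal sketch of that calculation, assuming a boolean confocal stack and known voxel dimensions (all names and values here are hypothetical, not the study's data):

```python
import numpy as np

def biomass_um3_per_um2(stack: np.ndarray, dx_um: float, dy_um: float, dz_um: float) -> float:
    """COMSTAT-style biomass: biovolume of thresholded voxels per substratum area.

    stack: 3D boolean array (z, y, x) of voxels classified as biofilm.
    dx_um, dy_um, dz_um: voxel dimensions in micrometres.
    """
    biovolume = stack.sum() * dx_um * dy_um * dz_um                     # um^3
    substratum_area = stack.shape[1] * stack.shape[2] * dx_um * dy_um   # um^2
    return biovolume / substratum_area

# hypothetical thresholded CLSM stack: 20 slices of 256 x 256 voxels
rng = np.random.default_rng(0)
stack = rng.random((20, 256, 256)) > 0.95
print(f"biomass ~ {biomass_um3_per_um2(stack, 0.4, 0.4, 1.0):.3f} um^3/um^2")
```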
Krishnamurthy, Gerbail T; Krishnamurthy, Shakuntala; Gambhir, Sanjiv Sam; Rodrigues, Cesar; Rosenberg, Jarrett; Schiepers, Christiaan; Buxton-Thomas, Muriel
2009-12-01
To develop a software tool for quantification of liver and gallbladder function, and to assess the repeatability and reproducibility of measurements made with it. The software tool developed with the JAVA programming language uses the JAVA2 Standard Edition framework. After manual selection of the regions of interest on a 99mTc hepatic iminodiacetic acid study, the program calculates differential hepatic bile flow, basal duodeno-gastric bile reflux (B-DGBR), hepatic extraction fraction (HEF) of both lobes with deconvolutional analysis, and excretion half-time with a nonlinear least squares fit. Gallbladder ejection fraction, ejection period (EP), ejection rate (ER), and postcholecystokinin (CCK) DGBR are calculated after stimulation with CCK-8. To assess intra-observer repeatability and intra-observer reproducibility, measurements from 10 normal participants were analyzed twice by three nuclear medicine technologists at the primary center. To assess inter-site reproducibility, measurements from a superset of 24 normal participants were also assessed once by three observers at the primary center and a single observer at three other sites. For the 24 control participants, mean+/-SD of hepatic bile flow into the gallbladder was 63.87+/-28.7%, HEF of the right lobe 100+/-0%, left lobe 99.43+/-2.63%, excretion half-time of the right lobe 21.50+/-6.98 min, left lobe 28.3+/-11.3 min. Basal DGBR was 1.2+/-1.0%. Gallbladder ejection fraction was 80+/-11%, EP 15.0+/-3.0 min, ER 5.8+/-1.6%/min, and DGBR-CCK 1.3+/-2.3%. Left and right lobe HEF was virtually identical across readers. All measures showed high repeatability except for gallbladder bile flow, basal DGBR, and EP, which exhibited marginal repeatability. Ejection fraction exhibited high reproducibility. There was high concordance among the three primary center observers except for basal DGBR, EP, and ER. Concordance between the primary site and one of the other sites was high, one was fair, and one was poor. The new United States Food and Drug Administration-approved personal computer-based Krishnamurthy Hepato-Biliary Software for quantification of liver and gallbladder function shows promise for consistently repeatable and reproducible results both within and between institutions, and may help to promote universal standardization of data acquisition and analysis in nuclear hepatology.
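The gallbladder ejection indices mentioned above follow standard definitions: the ejection fraction is the fractional drop in background-corrected gallbladder counts after CCK stimulation, and the ejection rate is that fraction divided by the ejection period. A minimal sketch with hypothetical count values (not the authors' software):

```python
def gallbladder_ejection_fraction(counts_pre: float, counts_post: float) -> float:
    """Percent ejection fraction from background-corrected gallbladder counts
    before and after CCK stimulation."""
    return 100.0 * (counts_pre - counts_post) / counts_pre

def ejection_rate(ef_percent: float, ejection_period_min: float) -> float:
    """Mean ejection rate in %/min over the ejection period."""
    return ef_percent / ejection_period_min

# hypothetical counts
ef = gallbladder_ejection_fraction(counts_pre=18500, counts_post=3700)
print(f"EF = {ef:.0f}%, ER = {ejection_rate(ef, 15.0):.1f} %/min")
```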
Fully Employing Software Inspections Data
NASA Technical Reports Server (NTRS)
Shull, Forrest; Feldmann, Raimund L.; Seaman, Carolyn; Regardie, Myrna; Godfrey, Sally
2009-01-01
Software inspections provide a proven approach to quality assurance for software products of all kinds, including requirements, design, code, and test plans, among others. Common to all inspections is the aim of finding and fixing defects as early as possible, thereby providing cost savings by minimizing the amount of rework necessary later in the lifecycle. Measurement data, such as the number and type of defects found and the effort spent by the inspection team, provide not only direct feedback about the software product to the project team but are also valuable for process improvement activities. In this paper, we discuss NASA's use of software inspections and the rich set of data that has resulted. In particular, we present results from analysis of inspection data that illustrate the benefits of fully utilizing that data for process improvement at several levels. Examining such data across multiple inspections or projects allows team members to monitor and trigger cross-project improvements. Such improvements may focus on the software development processes of the whole organization as well as improvements to the applied inspection process itself.
The Challenges of Credible Thermal Protection System Reliability Quantification
NASA Technical Reports Server (NTRS)
Green, Lawrence L.
2013-01-01
The paper discusses several of the challenges associated with developing a credible reliability estimate for a human-rated crew capsule thermal protection system. The process of developing such a credible estimate is subject to the quantification, modeling and propagation of numerous uncertainties within a probabilistic analysis. The development of specific investment recommendations, to improve the reliability prediction, among various potential testing and programmatic options is then accomplished through Bayesian analysis.
Ortega, Nàdia; Macià, Alba; Romero, Maria-Paz; Trullols, Esther; Morello, Jose-Ramón; Anglès, Neus; Motilva, Maria-Jose
2009-08-26
An improved chromatographic method was developed using ultra-performance liquid chromatography-tandem mass spectrometry to identify and quantify phenolic compounds and the alkaloids theobromine and caffeine in carob flour samples. The developed method was validated in terms of speed, sensitivity, selectivity, peak efficiency, linearity, reproducibility, limits of detection, and limits of quantification. The chromatographic method allows the identification and quantification of 20 phenolic compounds, that is, phenolic acids, flavonoids, and their aglycone and glucoside forms, together with the determination of the alkaloids caffeine and theobromine, at low concentration levels, all in a short analysis time of less than 20 min.
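For context, limits of detection and quantification of the kind validated above are commonly estimated from a linear calibration curve using the ICH convention LOD = 3.3σ/S and LOQ = 10σ/S, where S is the slope and σ the residual standard deviation. A hedged sketch with hypothetical calibration data (not the published method's numbers):

```python
import numpy as np

def lod_loq(conc: np.ndarray, response: np.ndarray) -> tuple[float, float]:
    """ICH-style estimates: LOD = 3.3*sigma/slope, LOQ = 10*sigma/slope, with sigma
    taken as the residual standard deviation of a linear calibration fit."""
    slope, intercept = np.polyfit(conc, response, 1)
    residuals = response - (slope * conc + intercept)
    sigma = residuals.std(ddof=2)   # two fitted parameters
    return 3.3 * sigma / slope, 10.0 * sigma / slope

# hypothetical calibration for one phenolic compound (ug/mL vs peak area)
conc = np.array([0.05, 0.1, 0.5, 1.0, 5.0, 10.0])
area = np.array([410, 830, 4100, 8300, 41500, 82000], dtype=float)
lod, loq = lod_loq(conc, area)
print(f"LOD ~ {lod:.3f} ug/mL, LOQ ~ {loq:.3f} ug/mL")
```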
Antibiotic Resistome: Improving Detection and Quantification Accuracy for Comparative Metagenomics.
Elbehery, Ali H A; Aziz, Ramy K; Siam, Rania
2016-04-01
The unprecedented rise of life-threatening antibiotic resistance (AR), combined with the unparalleled advances in DNA sequencing of genomes and metagenomes, has pushed the need for in silico detection of the resistance potential of clinical and environmental metagenomic samples through the quantification of AR genes (i.e., genes conferring antibiotic resistance). Therefore, determining an optimal methodology to quantitatively and accurately assess AR genes in a given environment is pivotal. Here, we optimized and improved existing AR detection methodologies from metagenomic datasets to properly consider AR-generating mutations in antibiotic target genes. Through comparative metagenomic analysis of previously published AR gene abundance in three publicly available metagenomes, we illustrate how mutation-generated resistance genes are either falsely assigned or neglected, which alters the detection and quantification of the antibiotic resistome. In addition, we inspected factors influencing the outcome of AR gene quantification using metagenome simulation experiments, and identified that genome size, AR gene length, total number of metagenomic reads and selected sequencing platforms had pronounced effects on the level of detected AR. In conclusion, our proposed improvements in the current methodologies for accurate AR detection and resistome assessment show reliable results when tested on real and simulated metagenomic datasets.
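One common way to make AR gene counts comparable across metagenomes of different sequencing depth, reflecting the gene-length and read-count effects noted above, is an RPKM-like normalization. The sketch below is illustrative only (hypothetical read counts; not the authors' exact pipeline):

```python
def ar_gene_abundance(ar_reads: int, ar_gene_len_bp: int, total_reads: int) -> float:
    """RPKM-like abundance: reads mapped to an AR gene, normalised by gene length (kb)
    and by sequencing depth (millions of reads), so that counts can be compared
    across metagenomes of different size."""
    return ar_reads / (ar_gene_len_bp / 1e3) / (total_reads / 1e6)

# hypothetical metagenome: 120 reads on a 1,200 bp beta-lactamase gene, 8 M reads total
print(f"{ar_gene_abundance(120, 1200, 8_000_000):.2f} reads per kb per million reads")
```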
Jeudy, Jeremy; Salvador, Arnaud; Simon, Romain; Jaffuel, Aurore; Fonbonne, Catherine; Léonard, Jean-François; Gautier, Jean-Charles; Pasquier, Olivier; Lemoine, Jerome
2014-02-01
Targeted mass spectrometry in the so-called multiple reaction monitoring mode (MRM) is certainly a promising way for the precise, accurate, and multiplexed measurement of proteins and their genetic or posttranslationally modified isoforms. MRM carried out on a low-resolution triple quadrupole instrument faces a lack of specificity when addressing the quantification of weakly concentrated proteins. In this case, extensive sample fractionation or immunoenrichment alleviates signal contamination by interferences, but in turn decreases assay performance and throughput. Recently, MRM(3) was introduced as an alternative to MRM to improve the limit of quantification of weakly concentrated protein biomarkers. In the present work, we compare MRM and MRM(3) modes for the detection of biomarkers in plasma and urine. Calibration curves drawn with MRM and MRM(3) showed a similar range of linearity (R(2) > 0.99 for both methods) with protein concentrations above 1 μg/mL in plasma and a few nanogram per milliliter in urine. In contrast, optimized MRM(3) methods improve the limits of quantification by a factor of 2 to 4 depending on the targeted peptide. This gain arises from the additional MS(3) fragmentation step, which significantly removes or decreases interfering signals within the targeted transition channels.
Barricklow, Jason; Ryder, Tim F; Furlong, Michael T
2009-08-01
During LC-MS/MS quantification of a small molecule in human urine samples from a clinical study, an unexpected peak was observed to nearly co-elute with the analyte of interest in many study samples. Improved chromatographic resolution revealed the presence of at least 3 non-analyte peaks, which were identified as cysteine metabolites and N-acetyl (mercapturic acid) derivatives thereof. These metabolites produced artifact responses in the parent compound MRM channel due to decomposition in the ionization source of the mass spectrometer. Quantitative comparison of the analyte concentrations in study samples using the original chromatographic method and the improved chromatographic separation method demonstrated that the original method substantially over-estimated the analyte concentration in many cases. The substitution of electrospray ionization (ESI) for atmospheric pressure chemical ionization (APCI) nearly eliminated the source instability of these metabolites, which would have mitigated their interference in the quantification of the analyte, even without chromatographic separation. These results 1) demonstrate the potential for thiol metabolite interferences during the quantification of small molecules in pharmacokinetic samples, and 2) underscore the need to carefully evaluate LC-MS/MS methods for molecules that can undergo metabolism to thiol adducts to ensure that they are not susceptible to such interferences during quantification.
Systematic Errors in Peptide and Protein Identification and Quantification by Modified Peptides*
Bogdanow, Boris; Zauber, Henrik; Selbach, Matthias
2016-01-01
The principle of shotgun proteomics is to use peptide mass spectra in order to identify corresponding sequences in a protein database. The quality of peptide and protein identification and quantification critically depends on the sensitivity and specificity of this assignment process. Many peptides in proteomic samples carry biochemical modifications, and a large fraction of unassigned spectra arise from modified peptides. Spectra derived from modified peptides can erroneously be assigned to wrong amino acid sequences. However, the impact of this problem on proteomic data has not yet been investigated systematically. Here we use combinations of different database searches to show that modified peptides can be responsible for 20–50% of false positive identifications in deep proteomic data sets. These false positive hits are particularly problematic as they have significantly higher scores and higher intensities than other false positive matches. Furthermore, these wrong peptide assignments lead to hundreds of false protein identifications and systematic biases in protein quantification. We devise a “cleaned search” strategy to address this problem and show that this considerably improves the sensitivity and specificity of proteomic data. In summary, we show that modified peptides cause systematic errors in peptide and protein identification and quantification and should therefore be considered to further improve the quality of proteomic data annotation. PMID:27215553
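A minimal sketch of the idea behind a "cleaned search" (not the authors' implementation; spectrum IDs, peptides, and scores are hypothetical): spectra whose unmodified-peptide assignment is out-scored by a modified-peptide assignment from a modification-tolerant search are removed before protein inference and quantification.

```python
def cleaned_search(standard_psms: dict, open_psms: dict) -> dict:
    """Drop spectra whose standard-search assignment is out-scored by a modified-peptide
    assignment from a modification-tolerant ("open") search.

    Both inputs map spectrum_id -> (peptide, score); higher score = better match.
    """
    kept = {}
    for spectrum_id, (peptide, score) in standard_psms.items():
        alt = open_psms.get(spectrum_id)
        if alt is not None and alt[0] != peptide and alt[1] > score:
            continue  # likely a modified peptide mis-assigned to an unmodified sequence
        kept[spectrum_id] = (peptide, score)
    return kept

# hypothetical results
standard = {"s1": ("PEPTIDER", 35.0), "s2": ("LVNELTEFAK", 52.0)}
open_search = {"s1": ("PEPT[+80]IDEK", 61.0)}   # a phosphopeptide explains s1 better
print(cleaned_search(standard, open_search))    # s1 removed, s2 kept
```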
North, Frederick; Varkey, Prathiba; Caraballo, Pedro; Vsetecka, Darlene; Bartel, Greg
2007-10-11
Complex decision support software can require significant effort in maintenance and enhancement. A quality improvement tool, the prioritization matrix, was successfully used to guide software enhancement of algorithms in a symptom assessment call center.
Process Correlation Analysis Model for Process Improvement Identification
Park, Sooyong
2014-01-01
Software process improvement aims at improving the development process of software systems. It is initiated by process assessment identifying strengths and weaknesses and based on the findings, improvement plans are developed. In general, a process reference model (e.g., CMMI) is used throughout the process of software process improvement as the base. CMMI defines a set of process areas involved in software development and what to be carried out in process areas in terms of goals and practices. Process areas and their elements (goals and practices) are often correlated due to the iterative nature of software development process. However, in the current practice, correlations of process elements are often overlooked in the development of an improvement plan, which diminishes the efficiency of the plan. This is mainly attributed to significant efforts and the lack of required expertise. In this paper, we present a process correlation analysis model that helps identify correlations of process elements from the results of process assessment. This model is defined based on CMMI and empirical data of improvement practices. We evaluate the model using industrial data. PMID:24977170
Process correlation analysis model for process improvement identification.
Choi, Su-jin; Kim, Dae-Kyoo; Park, Sooyong
2014-01-01
Software process improvement aims at improving the development process of software systems. It is initiated by process assessment identifying strengths and weaknesses and based on the findings, improvement plans are developed. In general, a process reference model (e.g., CMMI) is used throughout the process of software process improvement as the base. CMMI defines a set of process areas involved in software development and what to be carried out in process areas in terms of goals and practices. Process areas and their elements (goals and practices) are often correlated due to the iterative nature of software development process. However, in the current practice, correlations of process elements are often overlooked in the development of an improvement plan, which diminishes the efficiency of the plan. This is mainly attributed to significant efforts and the lack of required expertise. In this paper, we present a process correlation analysis model that helps identify correlations of process elements from the results of process assessment. This model is defined based on CMMI and empirical data of improvement practices. We evaluate the model using industrial data.
The TAME Project: Towards improvement-oriented software environments
NASA Technical Reports Server (NTRS)
Basili, Victor R.; Rombach, H. Dieter
1988-01-01
Experience from a dozen years of analyzing software engineering processes and products is summarized as a set of software engineering and measurement principles that argue for software engineering process models that integrate sound planning and analysis into the construction process. In the TAME (Tailoring A Measurement Environment) project at the University of Maryland, such an improvement-oriented software engineering process model was developed that uses the goal/question/metric paradigm to integrate the constructive and analytic aspects of software development. The model provides a mechanism for formalizing the characterization and planning tasks, controlling and improving projects based on quantitative analysis, learning in a deeper and more systematic way about the software process and product, and feeding the appropriate experience back into the current and future projects. The TAME system is an instantiation of the TAME software engineering process model as an ISEE (integrated software engineering environment). The first in a series of TAME system prototypes has been developed. An assessment of experience with this first limited prototype is presented including a reassessment of its initial architecture.
IPLaminator: an ImageJ plugin for automated binning and quantification of retinal lamination.
Li, Shuai; Woodfin, Michael; Long, Seth S; Fuerst, Peter G
2016-01-16
Information in the brain is often segregated into spatially organized layers that reflect the function of the embedded circuits. This is perhaps best exemplified in the layering, or lamination, of the retinal inner plexiform layer (IPL). The neurites of the retinal ganglion, amacrine and bipolar cell subtypes that form synapses in the IPL are precisely organized in highly refined strata within the IPL. Studies focused on developmental organization and cell morphology often use this layered stratification to characterize cells and identify the function of genes in development of the retina. A current limitation to such analysis is the lack of standardized tools to quantitatively analyze this complex structure. Most previous work on neuron stratification in the IPL is qualitative and descriptive. In this study we report the development of an intuitive platform to rapidly and reproducibly assay IPL lamination. The novel ImageJ-based software plugin we developed, IPLaminator, rapidly analyzes neurite stratification patterns in the retina and other neural tissues. A range of user options allows researchers to bin IPL stratification based on fixed points, such as the neurites of cholinergic amacrine cells, or to define a number of bins into which the IPL will be divided. Options to analyze tissues such as cortex were also added. Statistical analysis of the output then allows a quantitative value to be assigned to differences in laminar patterning observed in different models, genotypes or across developmental time. IPLaminator is an easy-to-use software application that will greatly speed and standardize quantification of neuron organization.
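The core binning operation such a plugin performs can be illustrated as follows: normalized IPL depth values are divided into equal-width bins and the fluorescence signal is summed per bin. This is a hedged sketch, not IPLaminator's code; the depth profile below is synthetic:

```python
import numpy as np

def bin_ipl_signal(depths: np.ndarray, intensities: np.ndarray, n_bins: int = 10) -> np.ndarray:
    """Sum fluorescence intensity into equal-width bins of normalised IPL depth.

    depths: values in [0, 1] across the IPL; intensities: fluorescence at each depth.
    Returns per-bin intensity as a fraction of the total signal.
    """
    edges = np.linspace(0.0, 1.0, n_bins + 1)
    idx = np.clip(np.digitize(depths, edges) - 1, 0, n_bins - 1)
    binned = np.bincount(idx, weights=intensities, minlength=n_bins)
    return binned / binned.sum()

# hypothetical depth profile of a labelled amacrine cell population
depths = np.linspace(0, 1, 200)
signal = np.exp(-((depths - 0.62) / 0.05) ** 2)   # stratum centred at ~62% depth
print(np.round(bin_ipl_signal(depths, signal), 3))
```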
Polarization sensitive camera for the in vitro diagnostic and monitoring of dental erosion
NASA Astrophysics Data System (ADS)
Bossen, Anke; Rakhmatullina, Ekaterina; Lussi, Adrian; Meier, Christoph
Due to the frequent consumption of acidic food and beverages, the prevalence of dental erosion is increasing worldwide. In the initial erosion stage, the hard dental tissue is softened by acidic demineralization. As erosion progresses, gradual tissue wear occurs, resulting in thinning of the enamel. Complete loss of the enamel tissue can be observed in severe clinical cases. It is therefore essential to provide a diagnostic tool for accurate detection and monitoring of dental erosion already at early stages. In this manuscript, we present the development of a polarization sensitive imaging camera for the visualization and quantification of dental erosion. The system consists of two CMOS cameras mounted on two sides of a polarizing beamsplitter. A horizontally linearly polarized light source is positioned orthogonal to the camera to ensure illumination incidence and detection angles of 45°. The specular reflected light from the enamel surface is collected with an objective lens mounted on the beamsplitter and divided into horizontal (H) and vertical (V) components on the two associated cameras. Images of non-eroded and eroded enamel surfaces at different erosion degrees were recorded and assessed with diagnostic software. The software was designed to generate and display two types of images: the distribution of the reflection intensity (V) and the polarization ratio (H-V)/(H+V) throughout the analyzed tissue area. The measurement and visualization of these two optical parameters, i.e. specular reflection intensity and polarization ratio, allowed detection and quantification of enamel erosion at early stages in vitro.
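The two displayed quantities reduce to simple per-pixel arithmetic on the co-registered camera images. A minimal sketch (hypothetical image arrays; not the authors' acquisition software):

```python
import numpy as np

def polarization_maps(h_img: np.ndarray, v_img: np.ndarray, eps: float = 1e-9):
    """Per-pixel specular reflection intensity (V) and polarization ratio (H-V)/(H+V)
    from the horizontally and vertically polarized camera images."""
    h = h_img.astype(float)
    v = v_img.astype(float)
    ratio = (h - v) / (h + v + eps)
    return v, ratio

# hypothetical 8-bit images of an enamel surface
rng = np.random.default_rng(1)
h_img = rng.integers(50, 200, size=(480, 640))
v_img = rng.integers(10, 120, size=(480, 640))
intensity, ratio = polarization_maps(h_img, v_img)
print(f"mean polarization ratio ~ {ratio.mean():.2f}")
```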
NASA Astrophysics Data System (ADS)
Marchadier, A.; Vidal, C.; Ordureau, S.; Lédée, R.; Léger, C.; Young, M.; Goldberg, M.
2011-03-01
Research on bone and teeth mineralization in animal models is critical for understanding human pathologies. Genetically modified mice represent highly valuable models for the study of osteo/dentinogenesis defects and osteoporosis. Current investigations of the mouse dental and skeletal phenotype use destructive and time-consuming methods such as histology and scanning microscopy. Micro-CT imaging is quicker and provides a high resolution qualitative phenotypic description. However, reliable quantification of mineralization processes in mouse bone and teeth is still lacking. We have established novel CT imaging-based software for accurate qualitative and quantitative analysis of mouse mandibular bone and molars. Data were obtained from mandibles of mice lacking the Fibromodulin gene, which is involved in mineralization processes. Mandibles were imaged with a micro-CT originally devoted to industrial applications (Viscom, X8060 NDT). 3D advanced visualization was performed using the VoxBox software (UsefulProgress) with ray casting algorithms. Comparison between control and defective mouse mandibles was made by applying the same transfer function to each 3D dataset, thus allowing the detection of shape, colour and density discrepancies. The 2D images of transverse slices of mandible and teeth were similar to, and even more accurate than, those obtained with scanning electron microscopy. Image processing of the molars allowed the 3D reconstruction of the pulp chamber, providing a unique tool for the quantitative evaluation of dentinogenesis. This new method is highly powerful for the study of oro-facial mineralization defects in mouse models, complementary and even competitive to current histological and scanning microscopy approaches.
Abbatiello, Susan E; Mani, D R; Keshishian, Hasmik; Carr, Steven A
2010-02-01
Multiple reaction monitoring mass spectrometry (MRM-MS) of peptides with stable isotope-labeled internal standards (SISs) is increasingly being used to develop quantitative assays for proteins in complex biological matrices. These assays can be highly precise and quantitative, but the frequent occurrence of interferences requires that MRM-MS data be manually reviewed, a time-intensive process subject to human error. We developed an algorithm that identifies inaccurate transition data based on the presence of interfering signal or inconsistent recovery among replicate samples. The algorithm objectively evaluates MRM-MS data with 2 orthogonal approaches. First, it compares the relative product ion intensities of the analyte peptide to those of the SIS peptide and uses a t-test to determine if they are significantly different. A CV is then calculated from the ratio of the analyte peak area to the SIS peak area from the sample replicates. The algorithm identified problematic transitions and achieved accuracies of 94%-100%, with a sensitivity and specificity of 83%-100% for correct identification of errant transitions. The algorithm was robust when challenged with multiple types of interferences and problematic transitions. This algorithm for automated detection of inaccurate and imprecise transitions (AuDIT) in MRM-MS data reduces the time required for manual and subjective inspection of data, improves the overall accuracy of data analysis, and is easily implemented into the standard data-analysis work flow. AuDIT currently works with results exported from MRM-MS data-processing software packages and may be implemented as an analysis tool within such software.
Abbatiello, Susan E.; Mani, D. R.; Keshishian, Hasmik; Carr, Steven A.
2010-01-01
BACKGROUND Multiple reaction monitoring mass spectrometry (MRM-MS) of peptides with stable isotope–labeled internal standards (SISs) is increasingly being used to develop quantitative assays for proteins in complex biological matrices. These assays can be highly precise and quantitative, but the frequent occurrence of interferences requires that MRM-MS data be manually reviewed, a time-intensive process subject to human error. We developed an algorithm that identifies inaccurate transition data based on the presence of interfering signal or inconsistent recovery among replicate samples. METHODS The algorithm objectively evaluates MRM-MS data with 2 orthogonal approaches. First, it compares the relative product ion intensities of the analyte peptide to those of the SIS peptide and uses a t-test to determine if they are significantly different. A CV is then calculated from the ratio of the analyte peak area to the SIS peak area from the sample replicates. RESULTS The algorithm identified problematic transitions and achieved accuracies of 94%–100%, with a sensitivity and specificity of 83%–100% for correct identification of errant transitions. The algorithm was robust when challenged with multiple types of interferences and problematic transitions. CONCLUSIONS This algorithm for automated detection of inaccurate and imprecise transitions (AuDIT) in MRM-MS data reduces the time required for manual and subjective inspection of data, improves the overall accuracy of data analysis, and is easily implemented into the standard data-analysis work flow. AuDIT currently works with results exported from MRM-MS data-processing software packages and may be implemented as an analysis tool within such software. PMID:20022980
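The two AuDIT checks described above can be illustrated with a short sketch: a Welch t-test comparing relative product-ion intensities between analyte and SIS, and a CV computed on the analyte/SIS peak-area ratio across replicates. The thresholds and replicate values below are hypothetical, not the published defaults:

```python
import numpy as np
from scipy import stats

def audit_like_flags(analyte_rel, sis_rel, analyte_area, sis_area,
                     p_cutoff=1e-5, cv_cutoff=0.2):
    """Two AuDIT-style checks for one transition across replicates:
    (1) t-test on relative product-ion intensities of analyte vs SIS peptide
        (a significant difference suggests an interference in the analyte trace);
    (2) CV of the analyte/SIS peak-area ratio (high CV suggests imprecise recovery)."""
    _, p_value = stats.ttest_ind(analyte_rel, sis_rel, equal_var=False)
    ratio = analyte_area / sis_area
    cv = ratio.std(ddof=1) / ratio.mean()
    return {"interference_flag": p_value < p_cutoff, "imprecise_flag": cv > cv_cutoff,
            "p_value": p_value, "cv": cv}

# hypothetical replicate data for one transition
analyte_rel = np.array([0.41, 0.43, 0.40, 0.44])   # relative intensity in analyte
sis_rel = np.array([0.25, 0.26, 0.24, 0.25])       # relative intensity in SIS standard
analyte_area = np.array([1.10e5, 1.20e5, 1.00e5, 1.15e5])
sis_area = np.array([2.00e5, 2.10e5, 1.90e5, 2.05e5])
print(audit_like_flags(analyte_rel, sis_rel, analyte_area, sis_area))
```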
Software Engineering Improvement Plan
NASA Technical Reports Server (NTRS)
2006-01-01
In performance of this task order, bd Systems personnel provided support to the Flight Software Branch and the Software Working Group through multiple tasks related to software engineering improvement and to activities of the independent Technical Authority (iTA) Discipline Technical Warrant Holder (DTWH) for software engineering. To ensure that the products, comments, and recommendations complied with customer requirements and the statement of work, bd Systems personnel maintained close coordination with the customer. These personnel performed work in areas such as update of agency requirements and directives database, software effort estimation, software problem reports, a web-based process asset library, miscellaneous documentation review, software system requirements, issue tracking software survey, systems engineering NPR, and project-related reviews. This report contains a summary of the work performed and the accomplishments in each of these areas.
Statistical Validation for Clinical Measures: Repeatability and Agreement of Kinect™-Based Software
Tello, Emanuel; Rodrigo, Alejandro; Valentinuzzi, Max E.
2018-01-01
Background The rehabilitation process is a fundamental stage for recovery of people's capabilities. However, the evaluation of the process is performed by physiatrists and medical doctors, mostly based on their observations, that is, a subjective appreciation of the patient's evolution. This paper proposes a platform for tracking the movement of an individual's upper limb using Kinect sensor(s), to be applied to patients during the rehabilitation process. The main contribution is the development of quantifying software and the statistical validation of its performance, repeatability, and clinical use in the rehabilitation process. Methods The software determines joint angles and upper limb trajectories for the construction of a specific rehabilitation protocol and quantifies the treatment evolution. In turn, the information is presented via a graphical interface that allows the recording, storage, and reporting of the patient's data. For clinical purposes, the software information is statistically validated with three different methodologies, comparing the measures with a goniometer in terms of agreement and repeatability. Results The agreement of joint angles measured with the proposed software and a goniometer is evaluated with Bland-Altman plots; all measurements fell well within the limits of agreement, meaning interchangeability of both techniques. Additionally, the results of Bland-Altman analysis of repeatability show 95% confidence. Finally, the physiotherapists' qualitative assessment shows encouraging results for clinical use. Conclusion The main conclusion is that the software is capable of offering a clinical history of the patient and is useful for quantification of the rehabilitation success. The simplicity, low cost, and visualization possibilities enhance the use of the Kinect-based software for rehabilitation and other applications, and the experts' opinion endorses the choice of our approach for clinical practice. Comparison of the new measurement technique with established goniometric methods determines that the proposed software agrees sufficiently to be used interchangeably. PMID:29750166
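The Bland-Altman agreement analysis used above amounts to computing the mean difference (bias) between the two techniques and the 95% limits of agreement (bias ± 1.96 SD of the differences). A minimal sketch with hypothetical angle measurements:

```python
import numpy as np

def bland_altman(method_a: np.ndarray, method_b: np.ndarray):
    """Bias and 95% limits of agreement between two measurement methods
    (e.g. sensor-derived joint angles vs goniometer)."""
    diff = method_a - method_b
    bias = diff.mean()
    sd = diff.std(ddof=1)
    return bias, (bias - 1.96 * sd, bias + 1.96 * sd)

# hypothetical elbow flexion angles (degrees) from the two techniques
kinect = np.array([92.0, 101.5, 88.3, 110.2, 95.7, 105.1])
gonio = np.array([90.5, 103.0, 87.0, 111.0, 94.0, 106.0])
bias, loa = bland_altman(kinect, gonio)
print(f"bias = {bias:.2f} deg, limits of agreement = ({loa[0]:.2f}, {loa[1]:.2f}) deg")
```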
The Software Engineering Laboratory: An operational software experience factory
NASA Technical Reports Server (NTRS)
Basili, Victor R.; Caldiera, Gianluigi; Mcgarry, Frank; Pajerski, Rose; Page, Gerald; Waligora, Sharon
1992-01-01
For 15 years, the Software Engineering Laboratory (SEL) has been carrying out studies and experiments for the purpose of understanding, assessing, and improving software and software processes within a production software development environment at NASA/GSFC. The SEL comprises three major organizations: (1) NASA/GSFC, Flight Dynamics Division; (2) University of Maryland, Department of Computer Science; and (3) Computer Sciences Corporation, Flight Dynamics Technology Group. These organizations have jointly carried out several hundred software studies, producing hundreds of reports, papers, and documents, all of which describe some aspect of the software engineering technology that was analyzed in the flight dynamics environment at NASA. The studies range from small, controlled experiments (such as analyzing the effectiveness of code reading versus that of functional testing) to large, multiple project studies (such as assessing the impacts of Ada on a production environment). The organization's driving goal is to improve the software process continually, so that sustained improvement may be observed in the resulting products. This paper discusses the SEL as a functioning example of an operational software experience factory and summarizes the characteristics of and major lessons learned from 15 years of SEL operations.
NASA Astrophysics Data System (ADS)
Restaino, Stephen M.; White, Ian M.
2017-03-01
Surface Enhanced Raman spectroscopy (SERS) provides significant improvements over conventional methods for single and multianalyte quantification. Specifically, the spectroscopic fingerprint provided by Raman scattering allows for a direct multiplexing potential far beyond that of fluorescence and colorimetry. Additionally, SERS generates a comparatively low financial and spatial footprint compared with common fluorescence based systems. Despite the advantages of SERS, it has remained largely an academic pursuit. In the field of biosensing, techniques to apply SERS to molecular diagnostics are constantly under development but, most often, assay protocols are redesigned around the use of SERS as a quantification method and ultimately complicate existing protocols. Our group has sought to rethink common SERS methodologies in order to produce translational technologies capable of allowing SERS to compete in the evolving, yet often inflexible biosensing field. This work will discuss the development of two techniques for quantification of microRNA, a promising biomarker for homeostatic and disease conditions ranging from cancer to HIV. First, an inkjet-printed paper SERS sensor has been developed to allow on-demand production of a customizable and multiplexable single-step lateral flow assay for miRNA quantification. Second, as miRNA concentrations commonly exist in relatively low concentrations, amplification methods (e.g. PCR) are therefore required to facilitate quantification. This work presents a novel miRNA assay alongside a novel technique for quantification of nuclease driven nucleic acid amplification strategies that will allow SERS to be used directly with common amplification strategies for quantification of miRNA and other nucleic acid biomarkers.
Statistical modeling of isoform splicing dynamics from RNA-seq time series data.
Huang, Yuanhua; Sanguinetti, Guido
2016-10-01
Isoform quantification is an important goal of RNA-seq experiments, yet it remains problematic for genes with low expression or several isoforms. These difficulties may in principle be ameliorated by exploiting correlated experimental designs, such as time series or dosage response experiments. Time series RNA-seq experiments, in particular, are becoming increasingly popular, yet there are no methods that explicitly leverage the experimental design to improve isoform quantification. Here, we present DICEseq, the first isoform quantification method tailored to correlated RNA-seq experiments. DICEseq explicitly models the correlations between different RNA-seq experiments to aid the quantification of isoforms across experiments. Numerical experiments on simulated datasets show that DICEseq yields more accurate results than state-of-the-art methods, an advantage that can become considerable at low coverage levels. On real datasets, our results show that DICEseq provides substantially more reproducible and robust quantifications, increasing the correlation of estimates from replicate datasets by up to 10% on genes with low or moderate expression levels (bottom third of all genes). Furthermore, DICEseq permits quantification of the trade-off between temporal sampling of RNA and depth of sequencing, frequently an important choice when planning experiments. Our results have strong implications for the design of RNA-seq experiments, and offer a novel tool for improved analysis of such datasets. Python code is freely available at http://diceseq.sf.net. Contact: G.Sanguinetti@ed.ac.uk. Supplementary data are available at Bioinformatics online. © The Author 2016. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
Tsugawa, Hiroshi; Ohta, Erika; Izumi, Yoshihiro; Ogiwara, Atsushi; Yukihira, Daichi; Bamba, Takeshi; Fukusaki, Eiichiro; Arita, Masanori
2014-01-01
Based on theoretically calculated comprehensive lipid libraries, in lipidomics as many as 1000 multiple reaction monitoring (MRM) transitions can be monitored for each single run. On the other hand, lipid analysis from each MRM chromatogram requires tremendous manual efforts to identify and quantify lipid species. Isotopic peaks differing by up to a few atomic masses further complicate analysis. To accelerate the identification and quantification process we developed novel software, MRM-DIFF, for the differential analysis of large-scale MRM assays. It supports a correlation optimized warping (COW) algorithm to align MRM chromatograms and utilizes quality control (QC) sample datasets to automatically adjust the alignment parameters. Moreover, user-defined reference libraries that include the molecular formula, retention time, and MRM transition can be used to identify target lipids and to correct peak abundances by considering isotopic peaks. Here, we demonstrate the software pipeline and introduce key points for MRM-based lipidomics research to reduce the mis-identification and overestimation of lipid profiles. The MRM-DIFF program, example data set and the tutorials are downloadable at the "Standalone software" section of the PRIMe (Platform for RIKEN Metabolomics, http://prime.psc.riken.jp/) database website.
Informed-Proteomics: open-source software package for top-down proteomics
DOE Office of Scientific and Technical Information (OSTI.GOV)
Park, Jungkap; Piehowski, Paul D.; Wilkins, Christopher
Top-down proteomics involves the analysis of intact proteins. This approach is very attractive as it allows for analyzing proteins in their endogenous form without proteolysis, preserving valuable information about post-translational modifications, isoforms, proteolytic processing or their combinations, collectively called proteoforms. Moreover, the quality of top-down LC-MS/MS datasets is rapidly increasing due to advances in liquid chromatography and mass spectrometry instrumentation and sample processing protocols. However, top-down mass spectra are substantially more complex compared to the more conventional bottom-up data. To take full advantage of the increasing quality of top-down LC-MS/MS datasets there is an urgent need to develop algorithms and software tools for confident proteoform identification and quantification. In this study we present a new open source software suite for top-down proteomics analysis consisting of an LC-MS feature finding algorithm, a database search algorithm, and an interactive results viewer. The presented tool, along with several other popular tools, was evaluated using human-in-mouse xenograft luminal and basal breast tumor samples that are known to have significant differences in protein abundance based on bottom-up analysis.
Selecting information technology for physicians' practices: a cross-sectional study.
Eden, Karen Beekman
2002-04-05
Many physicians are transitioning from paper to electronic formats for billing, scheduling, medical charts, communications, etc. The primary objective of this research was to identify the relationship (if any) between the software selection process and the office staff's perceptions of the software's impact on practice activities. A telephone survey was conducted with office representatives of 407 physician practices in Oregon who had purchased information technology. The respondents, usually office managers, answered scripted questions about their selection process and their perceptions of the software after implementation. Multiple logistic regression revealed that software type, selection steps, and certain factors influencing the purchase were related to whether the respondents felt the software improved the scheduling and financial analysis practice activities. Specifically, practices that selected electronic medical record or practice management software, that made software comparisons, or that considered prior user testimony as important were more likely to have perceived improvements in the scheduling process than were other practices. Practices that considered value important, that did not consider compatibility important, that selected managed care software, that spent less than 10,000 dollars, or that provided learning time (most dramatic increase in odds ratio, 8.2) during implementation were more likely to perceive that the software had improved the financial analysis process than were other practices. Perhaps one of the most important predictors of improvement was providing learning time during implementation, particularly when the software involves several practice activities. Despite this importance, less than half of the practices reported performing this step.
A performance improvement plan to increase nurse adherence to use of medication safety software.
Gavriloff, Carrie
2012-08-01
Nurses can protect patients receiving intravenous (IV) medication by using medication safety software to program "smart" pumps to administer IV medications. After a patient safety event identified inconsistent use of medication safety software by nurses, a performance improvement team implemented the Deming Cycle performance improvement methodology. The combined use of improved direct care nurse communication, programming strategies, staff education, medication safety champions, adherence monitoring, and technology acquisition resulted in a statistically significant (p < .001) increase in nurse adherence to using medication safety software from 28% to above 85%, exceeding national benchmark adherence rates (Cohen, Cooke, Husch & Woodley, 2007; Carefusion, 2011). Copyright © 2012 Elsevier Inc. All rights reserved.
Process improvement as an investment: Measuring its worth
NASA Technical Reports Server (NTRS)
Mcgarry, Frank; Jeletic, Kellyann
1993-01-01
This paper discusses return on investment (ROI) generated from software process improvement programs. It details the steps needed to compute ROI and compares these steps from the perspective of two process improvement approaches: the widely known Software Engineering Institute's capability maturity model and the approach employed by NASA's Software Engineering Laboratory (SEL). The paper then describes the specific investments made in the SEL over the past 18 years and discusses the improvements gained from this investment by the production organization in the SEL.
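At its simplest, the ROI figure discussed here is the net benefit of the improvement program divided by its cost. A hedged sketch with hypothetical dollar figures (not SEL or CMMI data):

```python
def roi_percent(benefit: float, investment: float) -> float:
    """Return on investment expressed as a percentage of the investment."""
    return 100.0 * (benefit - investment) / investment

# hypothetical figures: $1.5M in avoided rework and defect costs for a $0.5M program
print(f"ROI = {roi_percent(1_500_000, 500_000):.0f}%")
```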
Mediratta, Anuj; Addetia, Karima; Medvedofsky, Diego; Schneider, Robert J; Kruse, Eric; Shah, Atman P; Nathan, Sandeep; Paul, Jonathan D; Blair, John E; Ota, Takeyoshi; Balkhy, Husam H; Patel, Amit R; Mor-Avi, Victor; Lang, Roberto M
2017-05-01
With the increasing use of transcatheter aortic valve replacement (TAVR) in patients with aortic stenosis (AS), computed tomography (CT) remains the standard for annulus sizing. However, 3D transesophageal echocardiography (TEE) has been an alternative in patients with contraindications to CT. We sought to (1) test the feasibility, accuracy, and reproducibility of prototype 3DTEE analysis software (Philips) for aortic annular measurements and (2) compare the new approach to the existing echocardiographic techniques. We prospectively studied 52 patients who underwent gated contrast CT, procedural 3DTEE, and TAVR. 3DTEE images were analyzed using novel semi-automated software designed for 3D measurements of the aortic root, which uses multiplanar reconstruction, similar to CT analysis. Aortic annulus measurements included area, perimeter, and diameters calculated from these measurements. The results were compared to CT-derived values. Additionally, 3D echocardiographic measurements (3D planimetry and mitral valve analysis software adapted for the aortic valve) were also compared to the CT reference values. 3DTEE image quality was sufficient for aortic annulus measurement with the new software in 90% of patients; the measurements were in good agreement with CT (r-values: .89-.91), with small (<4%), nonsignificant inter-modality biases. Repeated measurements showed <10% measurement variability. The new 3D analysis was more accurate and reproducible than the existing echocardiographic techniques. Novel semi-automated 3DTEE analysis software can accurately measure the aortic annulus in patients with severe AS undergoing TAVR, in better agreement with CT than the existing methodology. Accordingly, intra-procedural TEE could potentially replace CT in patients for whom CT carries significant risk. © 2017, Wiley Periodicals, Inc.
Bennett, Joseph R.; French, Connor M.
2017-01-01
SDMtoolbox 2.0 is a software package for spatial studies of ecology, evolution, and genetics. The release of SDMtoolbox 2.0 allows researchers to use the most current ArcGIS and MaxEnt software, and reduces the amount of time that would be spent developing common solutions. The central aim of this software is to automate complicated and repetitive spatial analyses in an intuitive graphical user interface. One core tenet is the careful parameterization of species distribution models (SDMs) to maximize each model's discriminatory ability and minimize overfitting. This includes careful processing of occurrence data, environmental data, and model parameterization. This program directly interfaces with MaxEnt, one of the most powerful and widely used species distribution modeling software programs, although SDMtoolbox 2.0 is not limited to species distribution modeling or restricted to modeling in MaxEnt. Many of the SDM pre- and post-processing tools have 'universal' analogs for use with any modeling software. The current version contains a total of 79 scripts that harness the power of ArcGIS for macroecology, landscape genetics, and evolutionary studies. For example, these tools allow for biodiversity quantification (such as species richness or corrected weighted endemism), generation of least-cost paths and corridors among shared haplotypes, assessment of the significance of spatial randomizations, and enforcement of dispersal limitations of SDMs projected into future climates, to name only a few of the functions contained in SDMtoolbox 2.0. Lastly, dozens of generalized tools exist for batch processing and conversion of GIS data types or formats, which are broadly useful to any ArcMap user. PMID:29230356
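As an example of the biodiversity metrics mentioned above, weighted endemism (WE) sums, for each grid cell, the inverse range size of every species present, and corrected weighted endemism (CWE) divides WE by the cell's species richness. A minimal sketch on a hypothetical presence/absence matrix (not SDMtoolbox code):

```python
import numpy as np

def weighted_endemism(presence: np.ndarray):
    """Weighted endemism (WE) and corrected WE per grid cell from a
    cells x species presence/absence matrix.
    WE weights each species by the inverse of its range size; CWE = WE / richness."""
    presence = presence.astype(float)
    range_size = presence.sum(axis=0)            # cells occupied per species
    we = (presence / range_size).sum(axis=1)     # per-cell weighted endemism
    richness = presence.sum(axis=1)
    cwe = np.divide(we, richness, out=np.zeros_like(we), where=richness > 0)
    return we, cwe

# hypothetical 4 cells x 3 species matrix
presence = np.array([[1, 1, 0],
                     [1, 0, 0],
                     [1, 1, 1],
                     [0, 0, 1]])
we, cwe = weighted_endemism(presence)
print(np.round(we, 2), np.round(cwe, 2))
```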
Husen, Peter; Tarasov, Kirill; Katafiasz, Maciej; Sokol, Elena; Vogt, Johannes; Baumgart, Jan; Nitsch, Robert; Ekroos, Kim; Ejsing, Christer S
2013-01-01
Global lipidomics analysis across large sample sizes produces high-content datasets that require dedicated software tools supporting lipid identification and quantification, efficient data management and lipidome visualization. Here we present a novel software-based platform for streamlined data processing, management and visualization of shotgun lipidomics data acquired using high-resolution Orbitrap mass spectrometry. The platform features the ALEX framework designed for automated identification and export of lipid species intensity directly from proprietary mass spectral data files, and an auxiliary workflow using database exploration tools for integration of sample information, computation of lipid abundance and lipidome visualization. A key feature of the platform is the organization of lipidomics data in "database table format" which provides the user with an unsurpassed flexibility for rapid lipidome navigation using selected features within the dataset. To demonstrate the efficacy of the platform, we present a comparative neurolipidomics study of cerebellum, hippocampus and somatosensory barrel cortex (S1BF) from wild-type and knockout mice devoid of the putative lipid phosphate phosphatase PRG-1 (plasticity related gene-1). The presented framework is generic, extendable to processing and integration of other lipidomic data structures, can be interfaced with post-processing protocols supporting statistical testing and multivariate analysis, and can serve as an avenue for disseminating lipidomics data within the scientific community. The ALEX software is available at www.msLipidomics.info.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Li, Baoqiang; Berti, Romain; Abran, Maxime
2014-05-15
Ultrasound imaging, having the advantages of low cost and non-invasiveness over MRI and X-ray CT, has been reported by several studies as an adequate complement to fluorescence molecular tomography, with the perspective of improving localization and quantification of fluorescent molecular targets in vivo. Building on previous work, an improved dual-modality fluorescence-ultrasound imaging system was developed and then validated in an imaging study with a preclinical tumor model. Ultrasound imaging and a profilometer were used to obtain the anatomical prior information and the 3D surface, respectively, to precisely extract the tissue boundary on both sides of the sample in order to achieve improved fluorescence reconstruction. Furthermore, a pattern-based fluorescence reconstruction on the detection side was incorporated to enable dimensional reduction of the dataset while keeping the information useful for reconstruction. Because of its putative role in the current imaging geometry and the chosen reconstruction technique, we developed an attenuation-compensated Born-normalization method to reduce attenuation effects and cancel out experimental factors when collecting quantitative fluorescence datasets over a large area. Results of both simulation and phantom studies demonstrated that fluorescent targets could be recovered accurately and quantitatively using this reconstruction mechanism. Finally, an in vivo experiment confirmed that the imaging system, together with the proposed image reconstruction approach, was able to extract both functional and anatomical information, thereby improving quantification and localization of molecular targets.
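Born normalization, the starting point of the attenuation-compensated approach described above, divides each fluorescence measurement by the excitation measurement for the same source-detector pair so that source, detector and coupling factors largely cancel. The sketch below is illustrative only; the optional attenuation factor is a stand-in for the paper's compensation term, and all values are hypothetical:

```python
import numpy as np

def born_normalized(fluorescence, excitation, attenuation=None, eps=1e-12):
    """Born-normalized fluorescence: per source-detector pair, divide the measured
    fluorescence signal by the corresponding excitation signal. An optional
    per-measurement attenuation factor (assumed known here) illustrates the idea
    of additionally compensating attenuation effects."""
    ratio = fluorescence / (excitation + eps)
    if attenuation is not None:
        ratio = ratio / attenuation
    return ratio

# hypothetical measurements over 5 source-detector pairs
fluo = np.array([2.1e-3, 1.8e-3, 2.5e-3, 0.9e-3, 1.2e-3])
exc = np.array([1.0e-1, 0.8e-1, 1.1e-1, 0.5e-1, 0.6e-1])
print(born_normalized(fluo, exc))
```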
Improving Software Engineering on NASA Projects
NASA Technical Reports Server (NTRS)
Crumbley, Tim; Kelly, John C.
2010-01-01
The Software Engineering Initiative reduces the risk of software failure and increases mission safety; yields more predictable software cost estimates and delivery schedules; makes NASA a smarter buyer of contracted-out software; finds and removes more defects earlier; reduces duplication of effort between projects; and increases the ability to meet the challenges of evolving software technology.
Software engineering and Ada in design
NASA Technical Reports Server (NTRS)
Oneill, Don
1986-01-01
Modern software engineering promises significant reductions in software costs and improvements in software quality. The Ada language is the focus for these software methodology and tool improvements. The IBM FSD approach is examined, including the software engineering practices that guide the systematic design and development of software products and the management of the software process. The revised Ada design language adaptation is then described. This four-level design methodology is detailed, including the purpose of each level, the management strategy that integrates the software design activity with the program milestones, and the technical strategy that maps the Ada constructs to each level of design. A complete description of each design level is provided along with specific design language recording guidelines for each level. Finally, some testimony is offered on education, tools, architecture, and metrics resulting from project use of the four-level Ada design language adaptation.
A Framework of the Use of Information in Software Testing
ERIC Educational Resources Information Center
Kaveh, Payman
2010-01-01
With the increasing role that software systems play in our daily lives, software quality has become extremely important. Software quality is impacted by the efficiency of the software testing process. There are a growing number of software testing methodologies, models, and initiatives to satisfy the need to improve software quality. The main…
Leveraging Code Comments to Improve Software Reliability
ERIC Educational Resources Information Center
Tan, Lin
2009-01-01
Commenting source code has long been a common practice in software development. This thesis, consisting of three pieces of work, made novel use of the code comments written in natural language to improve software reliability. Our solution combines Natural Language Processing (NLP), Machine Learning, Statistics, and Program Analysis techniques to…
Publishing Platform for Scientific Software - Lessons Learned
NASA Astrophysics Data System (ADS)
Hammitzsch, Martin; Fritzsch, Bernadette; Reusser, Dominik; Brembs, Björn; Deinzer, Gernot; Loewe, Peter; Fenner, Martin; van Edig, Xenia; Bertelmann, Roland; Pampel, Heinz; Klump, Jens; Wächter, Joachim
2015-04-01
Scientific software has become an indispensable commodity for the production, processing and analysis of empirical data but also for modelling and simulation of complex processes. Software has a significant influence on the quality of research results. For strengthening the recognition of the academic performance of scientific software development, for increasing its visibility and for promoting the reproducibility of research results, concepts for the publication of scientific software have to be developed, tested, evaluated, and then transferred into operations. For this, the publication and citability of scientific software have to fulfil scientific criteria by means of defined processes and the use of persistent identifiers, similar to data publications. The SciForge project is addressing these challenges. Based on interviews a blueprint for a scientific software publishing platform and a systematic implementation plan has been designed. In addition, the potential of journals, software repositories and persistent identifiers have been evaluated to improve the publication and dissemination of reusable software solutions. It is important that procedures for publishing software as well as methods and tools for software engineering are reflected in the architecture of the platform, in order to improve the quality of the software and the results of research. In addition, it is necessary to work continuously on improving specific conditions that promote the adoption and sustainable utilization of scientific software publications. Among others, this would include policies for the development and publication of scientific software in the institutions but also policies for establishing the necessary competencies and skills of scientists and IT personnel. To implement the concepts developed in SciForge a combined bottom-up / top-down approach is considered that will be implemented in parallel in different scientific domains, e.g. in earth sciences, climate research and the life sciences. Based on the developed blueprints a scientific software publishing platform will be iteratively implemented, tested, and evaluated. Thus the platform should be developed continuously on the basis of gained experiences and results. The platform services will be extended one by one corresponding to the requirements of the communities. Thus the implemented platform for the publication of scientific software can be improved and stabilized incrementally as a tool with software, science, publishing, and user oriented features.
Image processing tools dedicated to quantification in 3D fluorescence microscopy
NASA Astrophysics Data System (ADS)
Dieterlen, A.; De Meyer, A.; Colicchio, B.; Le Calvez, S.; Haeberlé, O.; Jacquey, S.
2006-05-01
3-D optical fluorescence microscopy has become an efficient tool for the volume investigation of living biological samples. Developments in instrumentation have made it possible to go beyond the conventional Abbe limit. In any case, the recorded image can be described by the convolution equation between the original object and the Point Spread Function (PSF) of the acquisition system. Due to the finite resolution of the instrument, the original object is recorded with distortions and blurring, and contaminated by noise. As a consequence, relevant biological information cannot be extracted directly from raw data stacks. If the goal is 3-D quantitative analysis, system characterization is mandatory to assess the optimal performance of the instrument and to ensure reproducible data acquisition. The PSF represents the properties of the image acquisition system; we have proposed the use of statistical tools and Zernike moments to describe a 3-D PSF and to quantify its variation. This first step toward standardization helps define an acquisition protocol that optimizes exploitation of the microscope for the biological sample under study. Before extracting geometrical information and/or quantifying intensities, data restoration is mandatory. Out-of-focus light is reduced computationally by deconvolution. However, other phenomena occur during acquisition, such as fluorescence photodegradation ("bleaching"), which alters information needed for restoration. We have therefore developed a protocol to pre-process data before applying deconvolution algorithms. A large number of deconvolution methods have been described and are now available in commercial packages. One major difficulty in using this software is that the user must supply the "best" regularization parameters. We have pointed out that automating the choice of the regularization level greatly improves the reliability of the measurements while also making the software easier to use. Furthermore, pre-filtering the images before deconvolution increases the quality and repeatability of quantitative measurements by improving the stability of the deconvolution process. In the same way, pre-filtering the PSF stabilizes the deconvolution process. We have shown that Zernike polynomials can be used to reconstruct an experimental PSF, preserving the system characteristics while removing the noise contained in the PSF.
Improved Ant Algorithms for Software Testing Cases Generation
Yang, Shunkun; Xu, Jiaqi
2014-01-01
Ant colony optimization (ACO) for software test case generation is a very popular topic in software testing engineering. However, traditional ACO has flaws: pheromone is relatively scarce in the early search, search efficiency is low, the search model is too simple, and the positive-feedback mechanism easily produces stagnation and premature convergence. This paper introduces improved ACO variants for software test case generation: an improved local pheromone update strategy for ant colony optimization, an improved pheromone volatilization coefficient for ant colony optimization (IPVACO), and an improved global path pheromone update strategy for ant colony optimization (IGPACO). Finally, we put forward a comprehensively improved ant colony optimization (ACIACO) based on all three of the above methods. The proposed technique is compared with a random algorithm (RND) and a genetic algorithm (GA) in terms of both efficiency and coverage. The results indicate that the improved method can effectively increase search efficiency, restrain premature convergence, promote case coverage, and reduce the number of iterations. PMID:24883391
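For readers unfamiliar with ACO internals, the sketch below shows the standard pheromone update with an evaporation (volatilization) coefficient, the mechanism that the IPVACO and IGPACO variants above modify; it is a generic illustration under our own naming, not the authors' improved algorithms.

```python
def update_pheromone(tau, best_path, rho=0.1, q=1.0, path_cost=1.0):
    """Standard ACO global pheromone update.

    tau       -- dict mapping edge (i, j) -> pheromone level
    best_path -- list of edges [(i, j), ...] traversed by the best ant
    rho       -- volatilization (evaporation) coefficient, 0 < rho < 1
    q, path_cost -- deposit constant and cost of the best path
    """
    # Evaporation on every edge: tau <- (1 - rho) * tau
    for edge in tau:
        tau[edge] *= (1.0 - rho)
    # Reinforcement on the edges used by the best ant: tau <- tau + q / cost
    for edge in best_path:
        tau[edge] += q / path_cost
    return tau

# Toy usage: three edges, the best ant traversed (0, 1) and (1, 2)
tau = {(0, 1): 1.0, (1, 2): 1.0, (0, 2): 1.0}
print(update_pheromone(tau, best_path=[(0, 1), (1, 2)], rho=0.1, q=1.0, path_cost=2.0))
```

The improvements reported in the paper essentially adjust when and how strongly this evaporation/reinforcement balance is applied, locally and globally, to avoid stagnation.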
SEL's Software Process-Improvement Program
NASA Technical Reports Server (NTRS)
Basili, Victor; Zelkowitz, Marvin; McGarry, Frank; Page, Jerry; Waligora, Sharon; Pajerski, Rose
1995-01-01
The goals and operations of the Software Engineering Laboratory (SEL) are reviewed. For nearly 20 years the SEL has worked to understand, assess, and improve software and the development process within the production environment of the Flight Dynamics Division (FDD) of NASA's Goddard Space Flight Center. The SEL was established in 1976 with the goals of reducing: (1) the defect rate of delivered software, (2) the cost of software to support flight projects, and (3) the average time to produce mission-support software. The results of studying over 125 FDD projects have guided the standards, management practices, technologies, and training within the division. The results of these studies have been a 75 percent reduction in defects, a 50 percent reduction in cost, and a 25 percent reduction in development time. Over time the goals of the SEL have been clarified and are now stated as: (1) understand baseline processes and product characteristics, (2) assess improvements that have been incorporated into the development projects, and (3) package and infuse improvements into the standard SEL process. The SEL improvement goal is to demonstrate continual improvement of the software process by carrying out analysis, measurement and feedback to projects within the FDD environment. The SEL supports understanding of the process by studying several of its aspects, including effort distribution and error detection rates. The SEL then assesses and refines the processes. Once the assessment and refinement of a process is completed, the SEL packages the process by capturing it in standards, tools and training.
The Effect of the Ill-posed Problem on Quantitative Error Assessment in Digital Image Correlation
DOE Office of Scientific and Technical Information (OSTI.GOV)
Lehoucq, R. B.; Reu, P. L.; Turner, D. Z.
2017-11-27
Here, this work explores the effect of the ill-posed problem on uncertainty quantification for motion estimation using digital image correlation (DIC) (Sutton et al. 2009). We develop a correction factor for standard uncertainty estimates based on the cosine of the angle between the true motion and the image gradients, in an integral sense over a subregion of the image. This correction factor accounts for variability in the DIC solution previously unaccounted for when considering only image noise, interpolation bias, contrast, and the software settings such as subset size and spacing.
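To make the quantity concrete, the snippet below computes an intensity-gradient-weighted mean of the cosine of the angle between an assumed motion direction and the image gradients over a subset; the weighting scheme and function names are our assumptions, not the exact correction factor defined in the paper.

```python
import numpy as np

def mean_gradient_motion_cosine(subset, motion):
    """Average |cos(theta)| between a motion direction and the image gradients
    over a subset, weighted by gradient magnitude (assumed weighting)."""
    gy, gx = np.gradient(subset.astype(float))        # image gradients (rows, cols)
    gmag = np.hypot(gx, gy) + 1e-12                   # avoid divide-by-zero
    u = np.asarray(motion, dtype=float)
    u = u / np.linalg.norm(u)                         # unit motion vector (ux, uy)
    cos_theta = np.abs(gx * u[0] + gy * u[1]) / gmag  # |g . u| / |g|
    return np.sum(cos_theta * gmag) / np.sum(gmag)    # integral sense over the subset

# Toy usage: a vertical intensity ramp is only sensitive to vertical motion
img = np.tile(np.arange(32, dtype=float)[:, None], (1, 32))
print(mean_gradient_motion_cosine(img, motion=(0.0, 1.0)))  # close to 1.0
print(mean_gradient_motion_cosine(img, motion=(1.0, 0.0)))  # close to 0.0
```

A value near zero signals that the local texture carries little information about the assumed motion direction, which is exactly the situation in which standard uncertainty estimates become optimistic.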
Demand Response Resource Quantification with Detailed Building Energy Models
DOE Office of Scientific and Technical Information (OSTI.GOV)
Hale, Elaine; Horsey, Henry; Merket, Noel
Demand response is a broad suite of technologies that enables changes in electrical load operations in support of power system reliability and efficiency. Although demand response is not a new concept, there is new appetite for comprehensively evaluating its technical potential in the context of renewable energy integration. The complexity of demand response makes this task difficult -- we present new methods for capturing the heterogeneity of potential responses from buildings, their time-varying nature, and metrics such as thermal comfort that help quantify likely acceptability of specific demand response actions. Computed with an automated software framework, the methods are scalable.
Closing the loop on improvement: Packaging experience in the Software Engineering Laboratory
NASA Technical Reports Server (NTRS)
Waligora, Sharon R.; Landis, Linda C.; Doland, Jerry T.
1994-01-01
As part of its award-winning software process improvement program, the Software Engineering Laboratory (SEL) has developed an effective method for packaging organizational best practices based on real project experience into useful handbooks and training courses. This paper shares the SEL's experience over the past 12 years creating and updating software process handbooks and training courses. It provides cost models and guidelines for successful experience packaging derived from SEL experience.
Assessing students' performance in software requirements engineering education using scoring rubrics
NASA Astrophysics Data System (ADS)
Mkpojiogu, Emmanuel O. C.; Hussain, Azham
2017-10-01
The study investigates how helpful the use of scoring rubrics is in the performance assessment of software requirements engineering students and whether its use can lead to improvements in students' development of software requirements artifacts and models. Scoring rubrics were used by two instructors to assess the cognitive performance of a student in the design and development of software requirements artifacts. The study results indicate that the use of scoring rubrics is very helpful in objectively assessing the performance of software requirements or software engineering students. Furthermore, the results revealed that scoring rubrics can also give a clear direction for achievement assessment, showing whether or not a student is improving over repeated or iterative assessments. In a nutshell, their use leads to performance improvement in students. The results provide some insights for further investigation and will be beneficial to researchers, requirements engineers, system designers, developers and project managers.
2013-01-01
The formalin-fixed, paraffin-embedded (FFPE) biopsy is a challenging sample for molecular assays such as targeted next-generation sequencing (NGS). We compared three methods for FFPE DNA quantification, including a novel PCR assay (‘QFI-PCR’) that measures the absolute copy number of amplifiable DNA, across 165 residual clinical specimens. The results reveal the limitations of commonly used approaches, and demonstrate the value of an integrated workflow using QFI-PCR to improve the accuracy of NGS mutation detection and guide changes in input that can rescue low quality FFPE DNA. These findings address a growing need for improved quality measures in NGS-based patient testing. PMID:24001039
Model Update of a Micro Air Vehicle (MAV) Flexible Wing Frame with Uncertainty Quantification
NASA Technical Reports Server (NTRS)
Reaves, Mercedes C.; Horta, Lucas G.; Waszak, Martin R.; Morgan, Benjamin G.
2004-01-01
This paper describes a procedure to update parameters in the finite element model of a Micro Air Vehicle (MAV) to improve displacement predictions under aerodynamic loads. Because of fabrication, material, and geometric uncertainties, a statistical approach combined with Multidisciplinary Design Optimization (MDO) is used to modify key model parameters. Static test data collected using photogrammetry are used to correlate with model predictions. Results show significant improvements in model predictions after the parameters are updated; however, computed probability values indicate low confidence in the updated values and/or errors in the model structure. Lessons learned in the areas of wing design, test procedures, modeling approaches with geometric nonlinearities, and uncertainty quantification are all documented.
Quality measures and assurance for AI (Artificial Intelligence) software
NASA Technical Reports Server (NTRS)
Rushby, John
1988-01-01
This report is concerned with the application of software quality and evaluation measures to AI software and, more broadly, with the question of quality assurance for AI software. Considered are not only the metrics that attempt to measure some aspect of software quality, but also the methodologies and techniques (such as systematic testing) that attempt to improve some dimension of quality without necessarily quantifying the extent of the improvement. The report is divided into three parts. Part 1 reviews existing software quality measures, i.e., those that have been developed for, and applied to, conventional software. Part 2 considers the characteristics of AI software, the applicability and potential utility of measures and techniques identified in the first part, and reviews those few methods developed specifically for AI software. Part 3 presents an assessment and recommendations for the further exploration of this important area.
Quantification of moving target cyber defenses
NASA Astrophysics Data System (ADS)
Farris, Katheryn A.; Cybenko, George
2015-05-01
Current network and information systems are static, making it simple for attackers to maintain an advantage. Adaptive defenses, such as Moving Target Defenses (MTD), have been developed as potential "game-changers" in an effort to increase the attacker's workload. With many new methods being developed, it is difficult to accurately quantify and compare their overall costs and effectiveness. This paper compares the tradeoffs between current approaches to the quantification of MTDs. We present results from an expert opinion survey on quantifying the overall effectiveness and the upfront and operating costs of a select set of MTD techniques. We find that gathering informed scientific opinions can be advantageous for evaluating such new technologies, as it offers a more comprehensive assessment. We end by presenting a coarse ordering of a set of MTD techniques from most to least dominant. We found that seven out of 23 methods rank as the more dominant techniques, five of which are based on either address space layout randomization or instruction set randomization. The remaining two techniques are applicable to software and computer platforms. Among the techniques that performed the worst are those primarily aimed at network randomization.
Shrestha, Sachin L; Breen, Andrew J; Trimby, Patrick; Proust, Gwénaëlle; Ringer, Simon P; Cairney, Julie M
2014-02-01
The identification and quantification of the different ferrite microconstituents in steels has long been a major challenge for metallurgists. Manual point counting from images obtained by optical and scanning electron microscopy (SEM) is commonly used for this purpose. While classification systems exist, the complexity of steel microstructures means that identifying and quantifying these phases is still a great challenge. Moreover, point counting is extremely tedious, time consuming, and subject to operator bias. This paper presents a new automated identification and quantification technique for the characterisation of complex ferrite microstructures by electron backscatter diffraction (EBSD). This technique takes advantage of the fact that different classes of ferrite exhibit preferential grain boundary misorientations, aspect ratios and mean misorientation, all of which can be detected using current EBSD software. These characteristics are set as criteria for identification and linked to grain size to determine the area fractions. The results of this method were evaluated by comparing the new automated technique with point counting results. The technique could easily be applied to a range of other steel microstructures. © 2013 Published by Elsevier B.V.
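As an illustration of the kind of rule-based classification described above, the sketch below assigns grains to ferrite classes from EBSD-derived criteria and converts the labels into area fractions; the thresholds, class names and dictionary keys are illustrative placeholders, not the criteria calibrated in the paper.

```python
def classify_grains(grains):
    """Assign each grain a ferrite class from simple EBSD-derived criteria
    and return area fractions. Thresholds are illustrative placeholders.

    grains -- list of dicts with keys: 'area', 'aspect_ratio',
              'mean_misorientation' (deg), 'hi_angle_boundary_frac'
    """
    labels = []
    for g in grains:
        if g["aspect_ratio"] > 3.0 and g["mean_misorientation"] > 1.5:
            labels.append("bainitic/acicular ferrite")
        elif g["mean_misorientation"] < 0.7 and g["hi_angle_boundary_frac"] > 0.8:
            labels.append("polygonal ferrite")
        else:
            labels.append("other ferrite")
    total = sum(g["area"] for g in grains)
    fractions = {c: sum(g["area"] for g, l in zip(grains, labels) if l == c) / total
                 for c in set(labels)}
    return labels, fractions

# Toy usage with two fictitious grains
grains = [
    {"area": 120.0, "aspect_ratio": 4.2, "mean_misorientation": 2.1, "hi_angle_boundary_frac": 0.4},
    {"area": 300.0, "aspect_ratio": 1.3, "mean_misorientation": 0.5, "hi_angle_boundary_frac": 0.9},
]
print(classify_grains(grains))
```

Linking the class label to grain area, as in the last step, is what turns a per-grain classification into the area fractions that are compared against manual point counting.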
Armigliato, Aldo; Frabboni, Stefano; Gazzadi, Gian Carlo; Rosa, Rodolfo
2013-02-01
A method for the fabrication of a wedge-shaped thin NiO lamella by focused ion beam is reported. The starting sample is an oxidized bulk single-crystalline, <100>-oriented, commercial Ni standard. The lamella is employed for the determination, by analytical electron microscopy at 200 kV, of the experimental k(O-Ni) Cliff-Lorimer (G. Cliff & G.W. Lorimer, J Microsc 103, 203-207, 1975) coefficient, according to the extrapolation method of Van Cappellen (E. Van Cappellen, Microsc Microstruct Microanal 1, 1-22, 1990). The result thus obtained is compared to the theoretical k(O-Ni) values either implemented in the commercial software for X-ray microanalysis quantification of the scanning transmission electron microscopy/energy dispersive spectrometry equipment or calculated by the Monte Carlo method. Significant differences among the three values are found. This confirms that for a reliable quantification of binary alloys containing light elements, the choice of the Cliff-Lorimer coefficients is crucial and experimental values are recommended.
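For reference, the Cliff-Lorimer thin-film relation underlying this quantification has the standard form below (general expression only; the paper's contribution concerns the value of the coefficient, not the equation):

```latex
% Cliff-Lorimer ratio method for a thin film: concentrations C and measured
% characteristic X-ray intensities I of elements A and B
\frac{C_A}{C_B} \;=\; k_{AB}\,\frac{I_A}{I_B},
\qquad C_A + C_B = 1 \quad \text{(binary system)}
```

An error in k_AB (here k(O-Ni)) therefore propagates directly into the computed composition, which is why an experimentally determined coefficient is preferred over tabulated or Monte Carlo values when light elements are involved.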
Mapping proteins in the presence of paralogs using units of coevolution
2013-01-01
Background: We study the problem of mapping proteins between two protein families in the presence of paralogs. This problem occurs as a difficult subproblem in coevolution-based computational approaches for protein-protein interaction prediction. Results: Similar to prior approaches, our method is based on the idea that coevolution implies equal rates of sequence evolution among the interacting proteins, and we provide a first attempt to quantify this notion in a formal statistical manner. We call the units that are central to this quantification scheme the units of coevolution. A unit consists of two mapped protein pairs and its score quantifies the coevolution of the pairs. This quantification allows us to provide a maximum likelihood formulation of the paralog mapping problem and to cast it into a binary quadratic programming formulation. Conclusion: CUPID, our software tool based on a Lagrangian relaxation of this formulation, makes it, for the first time, possible to compute state-of-the-art quality pairings in a few minutes of runtime. In summary, we suggest a novel alternative to the earlier available approaches, which is statistically sound and computationally feasible. PMID:24564758
Accelerating Project and Process Improvement Using Advanced Software Simulation Technology: From the Office to the Enterprise
Smith, Larry (Software Technology Support Center, Hill AFB, UT)
2010-04-29
Kim, Jong-Seo; Fillmore, Thomas L; Liu, Tao; Robinson, Errol; Hossain, Mahmud; Champion, Boyd L; Moore, Ronald J; Camp, David G; Smith, Richard D; Qian, Wei-Jun
2011-12-01
Selected reaction monitoring (SRM)-MS is an emerging technology for high throughput targeted protein quantification and verification in biomarker discovery studies; however, the cost associated with the application of stable isotope-labeled synthetic peptides as internal standards can be prohibitive for screening a large number of candidate proteins as often required in the preverification phase of discovery studies. Herein we present a proof of concept study using an (18)O-labeled proteome reference as global internal standards (GIS) for SRM-based relative quantification. The (18)O-labeled proteome reference (or GIS) can be readily prepared and contains a heavy isotope ((18)O)-labeled internal standard for every possible tryptic peptide. Our results showed that the percentage of heavy isotope ((18)O) incorporation applying an improved protocol was >99.5% for most peptides investigated. The accuracy, reproducibility, and linear dynamic range of quantification were further assessed based on known ratios of standard proteins spiked into the labeled mouse plasma reference. Reliable quantification was observed with high reproducibility (i.e. coefficient of variance <10%) for analyte concentrations that were set at 100-fold higher or lower than those of the GIS based on the light ((16)O)/heavy ((18)O) peak area ratios. The utility of (18)O-labeled GIS was further illustrated by accurate relative quantification of 45 major human plasma proteins. Moreover, quantification of the concentrations of C-reactive protein and prostate-specific antigen was illustrated by coupling the GIS with standard additions of purified protein standards. Collectively, our results demonstrated that the use of (18)O-labeled proteome reference as GIS provides a convenient, low cost, and effective strategy for relative quantification of a large number of candidate proteins in biological or clinical samples using SRM.
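The arithmetic behind GIS-based relative quantification is simple; the sketch below (hypothetical peak areas and function names, not the authors' processing pipeline) shows how light/heavy (16O/18O) peak-area ratios normalize out run-to-run variation so two samples can be compared.

```python
def relative_abundance(light_area_s1, heavy_area_s1, light_area_s2, heavy_area_s2):
    """Compare a peptide between two samples using a common 18O-labeled
    global internal standard (GIS): ratio of the light/heavy ratios."""
    r1 = light_area_s1 / heavy_area_s1   # sample 1 relative to the GIS
    r2 = light_area_s2 / heavy_area_s2   # sample 2 relative to the GIS
    return r1 / r2                       # sample 1 relative to sample 2

# Hypothetical SRM peak areas for one transition
print(relative_abundance(2.0e5, 1.0e5, 5.0e4, 1.0e5))  # -> 4.0 (4-fold higher in sample 1)
```

Because the same labeled reference is spiked into every run, instrument response differences cancel in each light/heavy ratio before the samples are compared.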
Venhuizen, Freerk G.; van Ginneken, Bram; Liefers, Bart; van Asten, Freekje; Schreur, Vivian; Fauser, Sascha; Hoyng, Carel; Theelen, Thomas; Sánchez, Clara I.
2018-01-01
We developed a deep learning algorithm for the automatic segmentation and quantification of intraretinal cystoid fluid (IRC) in spectral domain optical coherence tomography (SD-OCT) volumes independent of the device used for acquisition. A cascade of neural networks was introduced to include prior information on the retinal anatomy, boosting performance significantly. The proposed algorithm approached human performance reaching an overall Dice coefficient of 0.754 ± 0.136 and an intraclass correlation coefficient of 0.936, for the task of IRC segmentation and quantification, respectively. The proposed method allows for fast quantitative IRC volume measurements that can be used to improve patient care, reduce costs, and allow fast and reliable analysis in large population studies. PMID:29675301
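For context, the Dice coefficient cited above is the standard overlap measure between a predicted and a reference segmentation; a minimal implementation (ours, not the authors' evaluation code) is:

```python
import numpy as np

def dice_coefficient(pred, ref):
    """Dice overlap between two binary masks: 2|A intersect B| / (|A| + |B|)."""
    pred = np.asarray(pred, dtype=bool)
    ref = np.asarray(ref, dtype=bool)
    denom = pred.sum() + ref.sum()
    if denom == 0:
        return 1.0  # both masks empty: define as perfect agreement
    return 2.0 * np.logical_and(pred, ref).sum() / denom

# Toy usage on 1-D masks
print(dice_coefficient([1, 1, 0, 0], [1, 0, 0, 0]))  # 2*1 / (2+1) = 0.667
```

A Dice value of 0.754, as reported, therefore means that roughly three quarters of the combined predicted and reference fluid voxels overlap.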
DOE Office of Scientific and Technical Information (OSTI.GOV)
Peck, T; Sparkman, D; Storch, N
"The LLNL Site-Specific Advanced Simulation and Computing (ASCI) Software Quality Engineering Recommended Practices VI.I" document describes a set of recommended software quality engineering (SQE) practices for ASCI code projects at Lawrence Livermore National Laboratory (LLNL). In this context, SQE is defined as the process of building quality into software products by applying the appropriate guiding principles and management practices. Continual code improvement and ongoing process improvement are expected benefits. Certain practices are recommended, although projects may select the specific activities they wish to improve, and the appropriate timelines for such actions. Additionally, projects can rely on the guidance of this document when generating ASCI Verification and Validation (V&V) deliverables. ASCI program managers will gather information about their software engineering practices and improvement. This information can be shared to leverage the best SQE practices among development organizations. It will further be used to ensure the currency and vitality of the recommended practices. This Overview is intended to provide basic information to the LLNL ASCI software management and development staff from the "LLNL Site-Specific ASCI Software Quality Engineering Recommended Practices VI.I" document. Additionally, the Overview provides steps for using the "LLNL Site-Specific ASCI Software Quality Engineering Recommended Practices VI.I" document. For definitions of terminology and acronyms, refer to the Glossary and Acronyms sections in the "LLNL Site-Specific ASCI Software Quality Engineering Recommended Practices VI.I".
DOE Office of Scientific and Technical Information (OSTI.GOV)
Turgeon, Jennifer L.; Minana, Molly A.; Hackney, Patricia
2009-01-01
The purpose of the Sandia National Laboratories (SNL) Advanced Simulation and Computing (ASC) Software Quality Plan is to clearly identify the practices that are the basis for continually improving the quality of ASC software products. Quality is defined in the US Department of Energy/National Nuclear Security Agency (DOE/NNSA) Quality Criteria, Revision 10 (QC-1) as 'conformance to customer requirements and expectations'. This quality plan defines the SNL ASC Program software quality engineering (SQE) practices and provides a mapping of these practices to the SNL Corporate Process Requirement (CPR) 001.3.6, 'Corporate Software Engineering Excellence'. This plan also identifies ASC management's and the software project teams' responsibilities in implementing the software quality practices and in assessing progress towards achieving their software quality goals. This SNL ASC Software Quality Plan establishes the signatories' commitments to improving software products by applying cost-effective SQE practices. This plan enumerates the SQE practices that comprise the development of SNL ASC's software products and explains the project teams' opportunities for tailoring and implementing the practices.
Institutionalizing urban forestry as a "biotechnology" to improve environmental quality
David J. Nowak
2006-01-01
Urban forests can provide multiple environmental benefits. As urban areas expand, the role of urban vegetation in improving environmental quality will increase in importance. Quantification of these benefits has revealed that urban forests can significantly improve air quality. As a result, national air quality regulations are now willing to potentially credit tree...
Software and electronic developments for TUG - T60 robotic telescope
NASA Astrophysics Data System (ADS)
Parmaksizoglu, M.; Dindar, M.; Kirbiyik, H.; Helhel, S.
2014-12-01
A robotic telescope is a telescope that can make observations without hands-on human control. Its low-level behavior is automatic and computer-controlled. Robotic telescopes usually run under the control of a scheduler, which provides high-level control by selecting astronomical targets for observation. The TUBITAK National Observatory (TUG) T60 Robotic Telescope is controlled by the open-source OCAAS software, formally named TALON. This study introduces improvements to the TALON software together with new electronic and mechanical designs. The designs and software improvements were implemented in the T60 telescope control software and successfully tested on the real system.
Engel, A; Plöger, M; Mulac, D; Langer, K
2014-01-30
Nanoparticles composed of poly(DL-lactide-co-glycolide) (PLGA) represent promising colloidal drug carriers for improved drug targeting. Although most research activities are focused on intravenous application of these carriers the peroral administration is described to improve bioavailability of poorly soluble drugs. Based on these insights the manuscript describes a model tablet formulation for PLGA-nanoparticles and especially its analytical characterisation with regard to a nanosized drug carrier. Besides physico-chemical tablet characterisation according to pharmacopoeias the main goal of the study was the development of a suitable analytical method for the quantification of nanoparticle release from tablets. An analytical flow field-flow fractionation (AF4) method was established and validated which enables determination of nanoparticle content in solid dosage forms as well as quantification of particle release during dissolution testing. For particle detection a multi-angle light scattering (MALS) detector was coupled to the AF4-system. After dissolution testing, the presence of unaltered PLGA-nanoparticles was successfully proved by dynamic light scattering and scanning electron microscopy. Copyright © 2013 Elsevier B.V. All rights reserved.
Improving microstructural quantification in FIB/SEM nanotomography.
Taillon, Joshua A; Pellegrinelli, Christopher; Huang, Yi-Lin; Wachsman, Eric D; Salamanca-Riba, Lourdes G
2018-01-01
FIB/SEM nanotomography (FIB-nt) is a powerful technique for the determination and quantification of the three-dimensional microstructure in subsurface features. Often times, the microstructure of a sample is the ultimate determiner of the overall performance of a system, and a detailed understanding of its properties is crucial in advancing the materials engineering of a resulting device. While the FIB-nt technique has developed significantly in the 15 years since its introduction, advanced nanotomographic analysis is still far from routine, and a number of challenges remain in data acquisition and post-processing. In this work, we present a number of techniques to improve the quality of the acquired data, together with easy-to-implement methods to obtain "advanced" microstructural quantifications. The techniques are applied to a solid oxide fuel cell cathode of interest to the electrochemistry community, but the methodologies are easily adaptable to a wide range of material systems. Finally, results from an analyzed sample are presented as a practical example of how these techniques can be implemented. Copyright © 2017 Elsevier B.V. All rights reserved.
NASA Astrophysics Data System (ADS)
Slamnoiu, Stefan; Vlad, Camelia; Stumbaum, Mihaela; Moise, Adrian; Lindner, Kathrin; Engel, Nicole; Vilanova, Mar; Diaz, Mireia; Karreman, Christiaan; Leist, Marcel; Ciossek, Thomas; Hengerer, Bastian; Vilaseca, Marta; Przybylski, Michael
2014-08-01
Bioaffinity analysis using a variety of biosensors has become an established tool for the detection and quantification of biomolecular interactions. Biosensors, however, are generally limited by the lack of chemical structure information on affinity-bound ligands. On-line bioaffinity-mass spectrometry using a surface-acoustic wave biosensor (SAW-MS) is a new combination providing simultaneous affinity detection, quantification, and mass spectrometric structural characterization of ligands. We describe here an on-line SAW-MS combination for direct identification and affinity determination, using a new interface for MS of the affinity-isolated ligand eluate. The key element of the SAW-MS combination is a microfluidic interface that integrates affinity isolation on a gold chip, in-situ sample concentration, and desalting with a microcolumn for MS of the ligand eluate from the biosensor. Suitable MS-acquisition software has been developed that provides coupling of the SAW-MS interface to Bruker Daltonics ion trap-MS, FTICR-MS, and Waters Synapt-QTOF-MS systems. Applications are presented for mass spectrometric identifications and affinity (KD) determinations of the neurodegenerative polypeptides β-amyloid (Aβ) and the pathophysiological and physiological synucleins (α- and β-synucleins), two key polypeptide systems for Alzheimer's disease and Parkinson's disease, respectively. Moreover, first in vivo applications to αSyn polypeptides from brain homogenate show the feasibility of on-line affinity-MS for the direct analysis of biological material. These results demonstrate on-line SAW-bioaffinity-MS as a powerful tool for structural and quantitative analysis of biopolymer interactions.
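As general background on the affinity constants mentioned (not specific to the SAW-MS read-out), the equilibrium dissociation constant for a 1:1 interaction between an immobilized ligand A and an analyte B is:

```latex
% 1:1 binding equilibrium: A + B <-> AB
K_D \;=\; \frac{[A]\,[B]}{[AB]} \;=\; \frac{k_{\mathrm{off}}}{k_{\mathrm{on}}}
```

Smaller K_D values correspond to tighter binding; biosensor platforms typically extract k_on and k_off from the association and dissociation phases of the measured binding curves.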
NASA Astrophysics Data System (ADS)
Conerty, Michelle D.; Castracane, James; Cacace, Anthony T.; Parnes, Steven M.; Gardner, Glendon M.; Miller, Mitchell B.
1995-05-01
Electronic Speckle Pattern Interferometry (ESPI) is a nondestructive optical evaluation technique that is capable of determining surface and subsurface integrity through the quantitative evaluation of static or vibratory motion. By utilizing state-of-the-art developments in the areas of lasers, fiber optics and solid-state detector technology, this technique has become applicable in medical research and diagnostics. Based on initial support from NIDCD and continued support from InterScience, Inc., we have been developing a range of instruments for improved diagnostic evaluation in otolaryngological applications based on the technique of ESPI. These compact fiber-optic instruments are capable of making real-time interferometric measurements of the target tissue. Ongoing development of image post-processing software is currently capable of extracting the desired quantitative results from the acquired interferometric images. The goal of the research is to develop a fully automated system in which the image processing and quantification will be performed in hardware in near real-time. Subsurface details of both tympanic membrane and vocal cord dynamics could speed the diagnosis of otosclerosis and laryngeal tumors, and aid in the evaluation of surgical procedures.
Zhao, Shanrong; Prenger, Kurt; Smith, Lance
2013-01-01
RNA-Seq is becoming a promising replacement to microarrays in transcriptome profiling and differential gene expression study. Technical improvements have decreased sequencing costs and, as a result, the size and number of RNA-Seq datasets have increased rapidly. However, the increasing volume of data from large-scale RNA-Seq studies poses a practical challenge for data analysis in a local environment. To meet this challenge, we developed Stormbow, a cloud-based software package, to process large volumes of RNA-Seq data in parallel. The performance of Stormbow has been tested by practically applying it to analyse 178 RNA-Seq samples in the cloud. In our test, it took 6 to 8 hours to process an RNA-Seq sample with 100 million reads, and the average cost was $3.50 per sample. Utilizing Amazon Web Services as the infrastructure for Stormbow allows us to easily scale up to handle large datasets with on-demand computational resources. Stormbow is a scalable, cost effective, and open-source based tool for large-scale RNA-Seq data analysis. Stormbow can be freely downloaded and can be used out of box to process Illumina RNA-Seq datasets. PMID:25937948
Hasegawa, Mitsuhiro; Nouri, Mohsen; Fujisawa, Hironori; Hayashi, Yutaka; Inamasu, Joji; Hirose, Yuichi; Yamashita, Junkoh
2015-01-01
There are many reports on position-related complications in neurosurgical literature but so far, continuous quantification of the patient’s position during the surgery has not been reported. This study aims to explore the utility of a new surgical table system and its software in displaying the patient’s body positions during surgery on real-time basis. More than 200 neurosurgical cases were monitored for their positions intra-operatively. The position was digitally recorded and could be seen by all the members in the operating team. It also displayed the three-dimensional relationship between the head and the heart positions. No position-related complications were observed during the study. The system was able to serve as an excellent indicator for monitoring the patient’s position. The recordings were analyzed and even used to reproduce or improve the position in the subsequent operations. The novel technique of monitoring the position of the head and the heart of the patients and the operating table planes are considered to be useful during delicate neurosurgical procedures thereby, preventing inadvertent procedural errors. This can be used to quantify various surgical positions in the future and define safety measures accordingly. PMID:25797776
Signal Partitioning Algorithm for Highly Efficient Gaussian Mixture Modeling in Mass Spectrometry
Polanski, Andrzej; Marczyk, Michal; Pietrowska, Monika; Widlak, Piotr; Polanska, Joanna
2015-01-01
Mixture modeling of mass spectra is an approach with many potential applications, including peak detection and quantification, smoothing, de-noising, feature extraction and spectral signal compression. However, existing algorithms do not allow for automated analyses of whole spectra. Therefore, despite the potential advantages of mixture modeling of mass spectra of peptide/protein mixtures highlighted by preliminary results presented in several papers, the mixture modeling approach has so far not been developed to a stage enabling systematic comparisons with existing software packages for proteomic mass spectra analyses. In this paper we present an efficient algorithm for Gaussian mixture modeling of proteomic mass spectra of different types (e.g., MALDI-ToF profiling, MALDI-IMS). The main idea is automated partitioning of the protein mass spectral signal into fragments. The obtained fragments are separately decomposed into Gaussian mixture models. The parameters of the mixture models of fragments are then aggregated to form the mixture model of the whole spectrum. We compare the elaborated algorithm to existing algorithms for peak detection and demonstrate the improvements in peak detection efficiency obtained by using Gaussian mixture modeling. We also show applications of the elaborated algorithm to real proteomic datasets of low and high resolution. PMID:26230717
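A minimal sketch of the partition-then-fit idea follows (our simplification using scikit-learn; the actual algorithm's partitioning and aggregation rules are more elaborate): split the m/z axis at low-intensity valleys, fit a small Gaussian mixture to each fragment by resampling m/z values in proportion to intensity, then pool the components weighted by the fragment's share of the total signal.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

def fit_spectrum_gmm(mz, intensity, valley_frac=0.02, comps_per_fragment=3,
                     samples_per_fragment=2000, seed=0):
    """Partition a spectrum at low-intensity valleys and fit a GMM per fragment.
    Returns a pooled list of (weight, mean, sigma) components."""
    rng = np.random.default_rng(seed)
    low = intensity < valley_frac * intensity.max()          # candidate valley points
    cuts = np.flatnonzero(np.diff(low.astype(int))) + 1      # fragment boundaries
    components = []
    for frag in np.split(np.arange(mz.size), cuts):
        w = intensity[frag]
        if w.sum() <= 0 or frag.size < comps_per_fragment:
            continue
        # resample m/z positions proportional to intensity (weighted-fit surrogate)
        sample = rng.choice(mz[frag], size=samples_per_fragment, p=w / w.sum())
        gmm = GaussianMixture(n_components=comps_per_fragment, random_state=seed)
        gmm.fit(sample.reshape(-1, 1))
        frag_weight = w.sum() / intensity.sum()
        for wk, mu, var in zip(gmm.weights_, gmm.means_.ravel(), gmm.covariances_.ravel()):
            components.append((frag_weight * wk, mu, np.sqrt(var)))
    return components

# Toy spectrum: two Gaussian peaks near m/z 1000 and 1005
mz = np.linspace(990, 1015, 2000)
intensity = np.exp(-(mz - 1000) ** 2 / 0.5) + 0.6 * np.exp(-(mz - 1005) ** 2 / 0.8)
print(len(fit_spectrum_gmm(mz, intensity)))
```

Fitting each fragment independently keeps the per-fit problem small, which is what makes whole-spectrum decomposition tractable.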
Arkas: Rapid reproducible RNAseq analysis
Colombo, Anthony R.; J. Triche Jr, Timothy; Ramsingh, Giridharan
2017-01-01
The recently introduced Kallisto pseudoaligner has radically simplified the quantification of transcripts in RNA-sequencing experiments. We offer the cloud-scale RNAseq pipelines Arkas-Quantification and Arkas-Analysis, available within Illumina's BaseSpace cloud application platform, which expedite Kallisto preparatory routines, reliably calculate differential expression, and perform gene-set enrichment of REACTOME pathways. Due to inherent inefficiencies of scale, Illumina's BaseSpace computing platform offers a massively parallel distributive environment, improving data management services and data importing. Arkas-Quantification deploys Kallisto for parallel cloud computations and is conveniently integrated downstream from the BaseSpace Sequence Read Archive (SRA) import/conversion application titled SRA Import. Arkas-Analysis annotates the Kallisto results by extracting structured information directly from source FASTA files with per-contig metadata, and calculates differential expression and gene-set enrichment on both coding genes and transcripts. The Arkas cloud pipeline supports ENSEMBL transcriptomes and can be used downstream from SRA Import, facilitating raw sequence importing, SRA FASTQ conversion, RNA quantification and analysis steps. PMID:28868134
Quaternary ammonium isobaric tag for a relative and absolute quantification of peptides.
Setner, Bartosz; Stefanowicz, Piotr; Szewczuk, Zbigniew
2018-02-01
Isobaric labeling quantification of peptides has become a method of choice for mass spectrometry-based proteomics studies. However, despite the wide variety of commercially available isobaric tags, none of the currently available methods offers a significant improvement in sensitivity of detection during the MS experiment. Recently, many strategies have been applied to increase the ionization efficiency of peptides, involving chemical modifications that introduce a quaternary ammonium fixed charge. Here, we present a novel quaternary ammonium-based isobaric tag for relative and absolute quantification of peptides (QAS-iTRAQ 2-plex). Upon collisional activation, the new stable benzylic-type cationic reporter ion is liberated from the tag. Deuterium atoms were used to offset the differential masses of the reporter group. We tested the applicability of the QAS-iTRAQ 2-plex reagent on a series of model peptides as well as a bovine serum albumin tryptic digest. The results obtained suggest the usefulness of this isobaric ionization tag for relative and absolute quantification of peptides. Copyright © 2017 John Wiley & Sons, Ltd.
Fujii, Yosuke; Ding, Yuqi; Umezawa, Taichi; Akimoto, Takafumi; Xu, Jiawei; Uchida, Takashi; Fujino, Tatsuya
2018-01-01
Food additives generally used in carbonated drinks, such as 4-methylimidazole (4MI), caffeine (Caf), citric acid (CA), and aspartame (Apm), were measured by matrix-assisted laser desorption ionization mass spectrometry (MALDI MS) using nanometer-sized particles of iron oxide (Fe2O3 NPs). The quantification of 4MI in Coca Cola (C-cola) was carried out. In order to improve the reproducibility of the peak intensities, Fe2O3 NPs loaded on ZSM5 zeolite were used as the matrix for quantification. Using 2-ethylimidazole (2EI) as the internal standard, the amount of 4MI in C-cola was determined to range from 65 to 88 μg/355 mL. The results agree with the published value (approx. 72 μg/355 mL). It was found that MALDI using Fe2O3 was applicable to the quantification of 4MI in C-cola.
The Virtual Genetics Lab II: Improvements to a Freely Available Software Simulation of Genetics
ERIC Educational Resources Information Center
White, Brian T.
2012-01-01
The Virtual Genetics Lab II (VGLII) is an improved version of the highly successful genetics simulation software, the Virtual Genetics Lab (VGL). The software allows students to use the techniques of genetic analysis to design crosses and interpret data to solve realistic genetics problems involving a hypothetical diploid insect. This is a brief…
The Perseus computational platform for comprehensive analysis of (prote)omics data.
Tyanova, Stefka; Temu, Tikira; Sinitcyn, Pavel; Carlson, Arthur; Hein, Marco Y; Geiger, Tamar; Mann, Matthias; Cox, Jürgen
2016-09-01
A main bottleneck in proteomics is the downstream biological analysis of highly multivariate quantitative protein abundance data generated using mass-spectrometry-based analysis. We developed the Perseus software platform (http://www.perseus-framework.org) to support biological and biomedical researchers in interpreting protein quantification, interaction and post-translational modification data. Perseus contains a comprehensive portfolio of statistical tools for high-dimensional omics data analysis covering normalization, pattern recognition, time-series analysis, cross-omics comparisons and multiple-hypothesis testing. A machine learning module supports the classification and validation of patient groups for diagnosis and prognosis, and it also detects predictive protein signatures. Central to Perseus is a user-friendly, interactive workflow environment that provides complete documentation of computational methods used in a publication. All activities in Perseus are realized as plugins, and users can extend the software by programming their own, which can be shared through a plugin store. We anticipate that Perseus's arsenal of algorithms and its intuitive usability will empower interdisciplinary analysis of complex large data sets.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Adams, Brian M.; Ebeida, Mohamed Salah; Eldred, Michael S.
The Dakota (Design Analysis Kit for Optimization and Terascale Applications) toolkit provides a flexible and extensible interface between simulation codes and iterative analysis methods. Dakota contains algorithms for optimization with gradient and nongradient-based methods; uncertainty quantification with sampling, reliability, and stochastic expansion methods; parameter estimation with nonlinear least squares methods; and sensitivity/variance analysis with design of experiments and parameter study methods. These capabilities may be used on their own or as components within advanced strategies such as surrogate-based optimization, mixed integer nonlinear programming, or optimization under uncertainty. By employing object-oriented design to implement abstractions of the key components required for iterative systems analyses, the Dakota toolkit provides a flexible and extensible problem-solving environment for design and performance analysis of computational models on high performance computers. This report serves as a user's manual for the Dakota software and provides capability overviews and procedures for software execution, as well as a variety of example studies.
NASA Astrophysics Data System (ADS)
Denis, Vincent
2008-09-01
This paper presents a statistical method for determining the dimensions, tolerances and specifications of components for the Laser MegaJoule (LMJ). Numerous constraints inherent to a large facility require specific tolerances: the huge number of optical components; the interdependence of these components between the beams of the same bundle; angular multiplexing for the amplifier section; distinct operating modes between the alignment and firing phases; the definition and use of alignment software in place of classic optimization. This method provides greater flexibility to determine the positioning and manufacturing specifications of the optical components. Given the enormous power of the Laser MegaJoule (over 18 kJ in the infrared and 9 kJ in the ultraviolet), one of the major risks is damage to the optical mounts and pollution of the installation by mechanical ablation. This method enables estimation of the beam occultation probabilities and quantification of the risks for the facility. All the simulations were run using the ZEMAX-EE optical design software.
jmzTab: a java interface to the mzTab data standard.
Xu, Qing-Wei; Griss, Johannes; Wang, Rui; Jones, Andrew R; Hermjakob, Henning; Vizcaíno, Juan Antonio
2014-06-01
mzTab is the most recent standard format developed by the Proteomics Standards Initiative. mzTab is a flexible tab-delimited file that can capture identification and quantification results coming from MS-based proteomics and metabolomics approaches. We here present an open-source Java application programming interface for mzTab called jmzTab. The software allows the efficient processing of mzTab files, providing read and write capabilities, and is designed to be embedded in other software packages. The second key feature of the jmzTab model is that it provides a flexible framework to maintain the logical integrity between the metadata and the table-based sections in the mzTab files. In this article, as two example implementations, we also describe two stand-alone tools that can be used to validate mzTab files and to convert PRIDE XML files to mzTab. The library is freely available at http://mztab.googlecode.com. © 2014 The Authors PROTEOMICS Published by Wiley-VCH Verlag GmbH & Co. KGaA, Weinheim.
Efficient visualization of high-throughput targeted proteomics experiments: TAPIR.
Röst, Hannes L; Rosenberger, George; Aebersold, Ruedi; Malmström, Lars
2015-07-15
Targeted mass spectrometry comprises a set of powerful methods to obtain accurate and consistent protein quantification in complex samples. To fully exploit these techniques, a cross-platform and open-source software stack based on standardized data exchange formats is required. We present TAPIR, a fast and efficient Python visualization software for chromatograms and peaks identified in targeted proteomics experiments. The input formats are open, community-driven standardized data formats (mzML for raw data storage and TraML encoding the hierarchical relationships between transitions, peptides and proteins). TAPIR is scalable to proteome-wide targeted proteomics studies (as enabled by SWATH-MS), allowing researchers to visualize high-throughput datasets. The framework integrates well with existing automated analysis pipelines and can be extended beyond targeted proteomics to other types of analyses. TAPIR is available for all computing platforms under the 3-clause BSD license at https://github.com/msproteomicstools/msproteomicstools. © The Author 2015. Published by Oxford University Press. All rights reserved. For Permissions, please e-mail: journals.permissions@oup.com.
Tracking and Quantifying Developmental Processes in C. elegans Using Open-source Tools.
Dutta, Priyanka; Lehmann, Christina; Odedra, Devang; Singh, Deepika; Pohl, Christian
2015-12-16
Quantitatively capturing developmental processes is crucial to derive mechanistic models and key to identify and describe mutant phenotypes. Here protocols are presented for preparing embryos and adult C. elegans animals for short- and long-term time-lapse microscopy and methods for tracking and quantification of developmental processes. The methods presented are all based on C. elegans strains available from the Caenorhabditis Genetics Center and on open-source software that can be easily implemented in any laboratory independently of the microscopy system used. A reconstruction of a 3D cell-shape model using the modelling software IMOD, manual tracking of fluorescently-labeled subcellular structures using the multi-purpose image analysis program Endrov, and an analysis of cortical contractile flow using PIVlab (Time-Resolved Digital Particle Image Velocimetry Tool for MATLAB) are shown. It is discussed how these methods can also be deployed to quantitatively capture other developmental processes in different models, e.g., cell tracking and lineage tracing, tracking of vesicle flow.
JAva GUi for Applied Research (JAGUAR) v 3.0
DOE Office of Scientific and Technical Information (OSTI.GOV)
JAGUAR is a Java software tool for automatically rendering a graphical user interface (GUI) from a structured input specification. It is designed as a plug-in to the Eclipse workbench to enable users to create, edit, and externally execute analysis application input decks and then view the results. JAGUAR serves as a GUI for Sandia's DAKOTA software toolkit for optimization and uncertainty quantification. It will include problem (input deck) set-up, option specification, analysis execution, and results visualization. Through the use of wizards, templates, and views, JAGUAR helps users navigate the complexity of DAKOTA's complete input specification. JAGUAR is implemented in Java, leveraging Eclipse extension points and the Eclipse user interface. JAGUAR parses a DAKOTA NIDR input specification and presents the user with linked graphical and plain-text representations of problem set-up and option specification for DAKOTA studies. After the data has been input by the user, JAGUAR generates one or more input files for DAKOTA, executes DAKOTA, and captures and interprets the results.
Jacques, Eveline; Lewandowski, Michal; Buytaert, Jan; Fierens, Yves; Verbelen, Jean-Pierre; Vissenberg, Kris
2013-01-01
The plant cytoskeleton plays a crucial role in cell growth and development across different developmental stages and undergoes many rearrangements. In order to describe the arrangements of the F-actin cytoskeleton in root epidermal cells of Arabidopsis thaliana, the recently developed software MicroFilament Analyzer (MFA) was exploited. This software enables high-throughput identification and quantification of the orientation of filamentous structures in digital images in a highly standardized and fast way. Using confocal microscopy and transgenic GFP-FABD2-GFP plants, the actin cytoskeleton was visualized in the root epidermis. MFA analysis revealed that during the early stages of cell development F-actin is organized in a mainly random pattern. As the cells grow, they preferentially adopt a longitudinal organization, a pattern that is also preserved in the largest cells. In the evolution from young to old cells, an approximately even distribution of transverse, oblique or combined orientations is always present besides the switch from a random to a longitudinally oriented actin cytoskeleton. PMID:23656865
NASA Astrophysics Data System (ADS)
Jin, Hyeongmin; Heo, Changyong; Kim, Jong Hyo
2018-02-01
Differing reconstruction kernels are known to strongly affect the variability of imaging biomarkers and thus remain a barrier to translating computer-aided quantification techniques into clinical practice. This study presents a deep learning application for CT kernel conversion, which converts a CT image reconstructed with a sharp kernel into one reconstructed with a standard kernel, and evaluates its impact on reducing the variability of a pulmonary imaging biomarker, the emphysema index (EI). Forty low-dose chest CT exams obtained with 120 kVp, 40 mAs, 1 mm slice thickness and two reconstruction kernels (B30f, B50f) were selected from the low-dose lung cancer screening database of our institution. A fully convolutional network was implemented with the Keras deep learning library. The model consisted of symmetric layers to capture the context and fine-structure characteristics of CT images from the standard and sharp reconstruction kernels. Pairs of the full-resolution CT data set were fed to the input and output nodes to train the convolutional network to learn the appropriate filter kernels for converting CT images of the sharp kernel to the standard kernel, using the mean squared error between the network output and the target images as the training criterion. EIs (RA950 and Perc15) were measured with a software package (ImagePrism Pulmo, Seoul, South Korea) and compared for the B50f, B30f, and converted B50f data sets. The effect of kernel conversion was evaluated with the mean and standard deviation of pair-wise differences in EI. The population mean of RA950 was 27.65 +/- 7.28% for the B50f data set, 10.82 +/- 6.71% for the B30f data set, and 8.87 +/- 6.20% for the converted B50f data set. The mean of pair-wise absolute differences in RA950 between B30f and B50f was reduced from 16.83% to 1.95% using kernel conversion. Our study demonstrates the feasibility of applying the deep learning technique for CT kernel conversion and reducing the kernel-induced variability of EI quantification. The deep learning model has the potential to improve the reliability of imaging biomarkers, especially in evaluating longitudinal changes of EI even when the patient CT scans were performed with different kernels.
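A minimal sketch of such a symmetric, fully convolutional image-to-image network trained with an MSE criterion is shown below; the layer counts, filter sizes and variable names are our assumptions, not the architecture used in the study.

```python
import numpy as np
from tensorflow.keras import layers, Model

def build_kernel_converter(shape=(None, None, 1)):
    """Symmetric FCN mapping a sharp-kernel CT slice to a standard-kernel slice."""
    inp = layers.Input(shape=shape)
    x = layers.Conv2D(32, 3, padding="same", activation="relu")(inp)
    x = layers.Conv2D(64, 3, padding="same", activation="relu")(x)
    x = layers.Conv2D(64, 3, padding="same", activation="relu")(x)
    x = layers.Conv2D(32, 3, padding="same", activation="relu")(x)
    out = layers.Conv2D(1, 3, padding="same", activation="linear")(x)  # converted slice
    model = Model(inp, out)
    model.compile(optimizer="adam", loss="mse")                        # MSE training criterion
    return model

# Toy training call on random data standing in for paired sharp-kernel (input)
# and standard-kernel (target) slices
model = build_kernel_converter()
x_sharp = np.random.rand(4, 64, 64, 1).astype("float32")
y_standard = np.random.rand(4, 64, 64, 1).astype("float32")
model.fit(x_sharp, y_standard, epochs=1, batch_size=2, verbose=0)
```

Because the network is fully convolutional, it can be trained on patches or full-resolution slices and applied to images of arbitrary size at inference time.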
Improved CLARAty Functional-Layer/Decision-Layer Interface
NASA Technical Reports Server (NTRS)
Estlin, Tara; Rabideau, Gregg; Gaines, Daniel; Johnston, Mark; Chouinard, Caroline; Nessnas, Issa; Shu, I-Hsiang
2008-01-01
Improved interface software for communication between the CLARAty Decision and Functional layers has been developed. [The Coupled Layer Architecture for Robotics Autonomy (CLARAty) was described in Coupled-Layer Robotics Architecture for Autonomy (NPO-21218), NASA Tech Briefs, Vol. 26, No. 12 (December 2002), page 48. To recapitulate: the CLARAty architecture was developed to improve the modularity of robotic software while tightening coupling between planning/execution and basic control subsystems. Whereas prior robotic software architectures typically contained three layers, the CLARAty contains two layers: a decision layer (DL) and a functional layer (FL).] Types of communication supported by the present software include sending commands from DL modules to FL modules and sending data updates from FL modules to DL modules. The present software supplants prior interface software that had little error-checking capability, supported data parameters in string form only, supported commanding at only one level of the FL, and supported only limited updates of the state of the robot. The present software offers strong error checking, and supports complex data structures and commanding at multiple levels of the FL, and relative to the prior software, offers a much wider spectrum of state-update capabilities.
Wakasaki, Rumie; Eiwaz, Mahaba; McClellan, Nicholas; Matsushita, Katsuyuki; Golgotiu, Kirsti; Hutchens, Michael P
2018-06-14
A technical challenge in translational models of kidney injury is determination of the extent of cell death. Histologic sections are commonly analyzed by area morphometry or unbiased stereology, but stereology requires specialized equipment. An unbiased stereology tool with reduced equipment dependence would therefore address a barrier to rigorous quantification. We hypothesized that it would be feasible to build a novel software component to facilitate unbiased stereologic quantification on scanned slides, and that unbiased stereology would demonstrate greater precision and decreased bias compared with 2D morphometry. We developed a macro for the widely used image analysis program ImageJ, and performed cardiac arrest with cardiopulmonary resuscitation (CA/CPR, a model of acute cardiorenal syndrome) in mice. Fluorojade-B-stained kidney sections were analyzed using three methods to quantify cell death: gold-standard stereology using a controlled stage and commercially available software, unbiased stereology using the novel ImageJ macro, and quantitative 2D morphometry also using the novel macro. There was strong agreement between the two methods of unbiased stereology (bias -0.004±0.006 with 95% limits of agreement -0.015 to 0.007). 2D morphometry demonstrated poor agreement and significant bias compared with either method of unbiased stereology. Unbiased stereology is facilitated by a novel macro for ImageJ, and its results agree with those obtained using gold-standard methods. Automated 2D morphometry overestimated tubular epithelial cell death and correlated only modestly with values obtained from unbiased stereology. These results support widespread use of unbiased stereology for analysis of histologic outcomes of injury models.
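As a simple illustration of the point-counting principle behind such tools (our toy example, not the macro's implementation): superimpose a systematic grid of points with a random offset on the section and estimate the area fraction of the labeled phase as the fraction of grid points that hit it.

```python
import numpy as np

def point_count_area_fraction(mask, spacing=8, seed=0):
    """Estimate the area fraction of True pixels in a binary mask by
    systematic uniform random point counting (A_A estimated by P_P)."""
    rng = np.random.default_rng(seed)
    oy, ox = rng.integers(0, spacing, size=2)       # random grid offset
    grid = mask[oy::spacing, ox::spacing]           # systematic sample of pixels
    return grid.mean(), grid.size                   # hit fraction, number of points

# Toy usage: a disc occupying ~19.6% of a 100 x 100 field
yy, xx = np.mgrid[0:100, 0:100]
mask = (yy - 50) ** 2 + (xx - 50) ** 2 < 25 ** 2
frac, n_points = point_count_area_fraction(mask)
print(frac, n_points)   # close to pi * 25**2 / 100**2 = 0.196
```

The random offset of the otherwise regular grid is what keeps the estimator unbiased, and the number of points counted controls its precision.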
Frauscher, Birgit; Gabelia, David; Biermayr, Marlene; Stefani, Ambra; Hackner, Heinz; Mitterling, Thomas; Poewe, Werner; Högl, Birgit
2014-10-01
Rapid eye movement sleep without atonia (RWA) is the polysomnographic hallmark of REM sleep behavior disorder (RBD). To partially overcome the disadvantages of manual RWA scoring, which is time consuming but essential for the accurate diagnosis of RBD, we aimed to validate software specifically developed and integrated with polysomnography for RWA detection against the gold standard of manual RWA quantification. The study was conducted in an academic referral center sleep laboratory. Polysomnographic recordings of 20 patients with RBD and 60 healthy volunteers were analyzed. Motor activity during REM sleep was quantified manually and computer assisted (with and without artifact detection) according to Sleep Innsbruck Barcelona (SINBAR) criteria for the mentalis ("any," phasic, tonic electromyographic [EMG] activity) and the flexor digitorum superficialis (FDS) muscle (phasic EMG activity). Computer-derived indices (with and without artifact correction) for "any," phasic, and tonic mentalis EMG activity, phasic FDS EMG activity, and the SINBAR index ("any" mentalis + phasic FDS) correlated well with the manually derived indices (all Spearman rhos 0.66-0.98). Compared with computerized scoring alone, computerized scoring plus manual artifact correction (median duration 5.4 min) led to a significant reduction of false positives for "any" mentalis (40%), phasic mentalis (40.6%), and the SINBAR index (41.2%). Quantification of tonic mentalis and phasic FDS EMG activity was not influenced by artifact correction. The computer algorithm used here appears to be a promising tool for REM sleep behavior disorder detection in both research and clinical routine. A short check for plausibility of automatic detection should be a basic prerequisite for this and all other available computer algorithms. © 2014 Associated Professional Sleep Societies, LLC.
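The published SINBAR scoring rules are more detailed than shown here; the following is a simplified, illustrative mini-epoch scorer with an assumed epoch length, amplitude factor, and simulated EMG, intended only to convey how a phasic activity index can be derived from REM-sleep EMG.

import numpy as np

def phasic_emg_index(emg: np.ndarray, fs: int, baseline_uv: float,
                     mini_epoch_s: float = 3.0, factor: float = 2.0) -> float:
    """Percentage of mini-epochs whose mean rectified amplitude exceeds factor x baseline."""
    samples = int(mini_epoch_s * fs)
    n = len(emg) // samples
    epochs = emg[: n * samples].reshape(n, samples)
    scored = np.abs(epochs).mean(axis=1) > factor * baseline_uv
    return 100.0 * scored.mean()

# Hypothetical 5 minutes of REM mentalis EMG at 256 Hz (microvolts) with one burst.
rng = np.random.default_rng(1)
fs = 256
emg = rng.normal(0.0, 2.0, 5 * 60 * fs)              # ~2 uV relaxed background
emg[10 * fs: 11 * fs] += rng.normal(0.0, 20.0, fs)   # a 1-s phasic burst
print(f"phasic mentalis index: {phasic_emg_index(emg, fs, baseline_uv=2.0):.1f}%")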
Maxeiner, Andreas; Fischer, Thomas; Schwabe, Julia; Baur, Alexander Daniel Jacques; Stephan, Carsten; Peters, Robert; Slowinski, Torsten; von Laffert, Maximilian; Marticorena Garcia, Stephan Rodrigo; Hamm, Bernd; Jung, Ernst-Michael
2018-06-06
The aim of this study was to investigate contrast-enhanced ultrasound (CEUS) parameters acquired by software during magnetic resonance imaging (MRI)/US fusion-guided biopsy for prostate cancer (PCa) detection and discrimination. From 2012 to 2015, 158 out of 165 men with suspicion for PCa and at least one negative biopsy of the prostate were included and consecutively underwent multi-parametric 3 Tesla MRI and MRI/US fusion-guided biopsy. CEUS was conducted during biopsy with an intravenous bolus application of 2.4 mL of SonoVue® (Bracco, Milan, Italy). The CEUS clips were then analyzed using quantitative perfusion analysis software (VueBox, Bracco). The area of strongest enhancement within the region pre-located on MRI was investigated; all available parameters from the quantification toolbox were collected and analyzed for PCa detection, with further differentiation based on the histopathological results. PCa was detected in 74 (47%) of the 158 included patients. Of these 74 PCa cases, 49 (66%) were graded Gleason ≥ 3 + 4 = 7 (ISUP ≥ 2). Across all quantitative perfusion parameters, the best results for cancer detection were obtained for rise time (p = 0.026) and time to peak (p = 0.037). Within the subgroup analysis (> vs ≤ 3 + 4 = 7a (ISUP 2)), peak enhancement (p = 0.012), wash-in rate (p = 0.011), wash-out rate (p = 0.007) and wash-in perfusion index (p = 0.014) also showed statistical significance. The quantification of CEUS parameters was able to discriminate PCa aggressiveness during MRI/US fusion-guided prostate biopsy. © Georg Thieme Verlag KG Stuttgart · New York.
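Dedicated tools such as VueBox typically fit a perfusion model to the time-intensity curve, so the raw-sample calculation below is only an illustration of what peak enhancement, time to peak, rise time, and wash-in rate mean; the synthetic bolus curve and the 10-90% rise-time definition are assumptions.

import numpy as np

def tic_parameters(t: np.ndarray, intensity: np.ndarray) -> dict:
    baseline = intensity[:5].mean()                      # pre-contrast baseline
    signal = intensity - baseline
    pe = signal.max()                                    # peak enhancement
    ttp = t[signal.argmax()]                             # time to peak
    t10 = t[np.argmax(signal >= 0.10 * pe)]              # first crossing of 10% of PE
    t90 = t[np.argmax(signal >= 0.90 * pe)]              # first crossing of 90% of PE
    wash_in_rate = np.gradient(signal, t)[: signal.argmax() + 1].max()   # max up-slope
    return {"PE": pe, "TTP": ttp, "RT": t90 - t10, "WiR": wash_in_rate}

# Hypothetical bolus wash-in/wash-out sampled at 1 Hz, arriving ~10 s after injection.
t = np.arange(0, 60.0, 1.0)
tau = np.clip(t - 10.0, 0.0, None)
intensity = 5 + 40 * (tau / 8.0) * np.exp(1 - tau / 8.0)
print(tic_parameters(t, intensity))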
Contijoch, Francisco; Witschey, Walter R T; Rogers, Kelly; Rears, Hannah; Hansen, Michael; Yushkevich, Paul; Gorman, Joseph; Gorman, Robert C; Han, Yuchi
2015-05-21
Data obtained during arrhythmia are retained in real-time cardiovascular magnetic resonance (rt-CMR), but there is limited and inconsistent evidence that rt-CMR can accurately assess beat-to-beat variation in left ventricular (LV) function or LV function during an arrhythmia. Multi-slice, short-axis cine and real-time golden-angle radial CMR data were collected in 22 clinical patients (18 in sinus rhythm and 4 patients with arrhythmia). User-initialized active contour segmentation (ACS) software was validated via comparison to manual segmentation on clinically accepted software. For each image in the 2D acquisitions, slice volume was calculated, and global LV volumes were estimated via summation across the LV using multiple slices. Real-time imaging data were reconstructed using different image exposure times and frame rates to evaluate the effect of temporal resolution on measured function in each slice via ACS. Finally, global volumetric function of ectopic and non-ectopic beats was measured using ACS in patients with arrhythmias. ACS provides global LV volume measurements that are not significantly different from manual quantification of retrospectively gated cine images in sinus rhythm patients. With an exposure time of 95.2 ms and a frame rate of > 89 frames per second, golden-angle real-time imaging accurately captures hemodynamic function over a range of patient heart rates. In four patients with frequent ectopic contractions, initial quantification of the impact of ectopic beats on hemodynamic function was demonstrated. User-initialized active contours and golden-angle real-time radial CMR can be used to determine time-varying LV function in patients. These methods will be very useful for the assessment of LV function in patients with frequent arrhythmias.
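The abstract states that global LV volumes were estimated by summing slice volumes; the snippet below is a minimal sketch of that disk-summation step with hypothetical segmented slice areas and an assumed 8 mm slice thickness.

import numpy as np

def lv_volume_ml(slice_areas_mm2, slice_thickness_mm):
    """slice_areas_mm2: segmented LV cavity area per short-axis slice for one time frame."""
    volume_mm3 = np.sum(np.asarray(slice_areas_mm2, dtype=float) * slice_thickness_mm)
    return volume_mm3 / 1000.0          # 1 mL = 1000 mm^3

# Hypothetical end-diastolic frame: 10 contiguous slices, 8 mm thick.
areas = [900, 1300, 1600, 1750, 1800, 1750, 1600, 1300, 900, 400]   # mm^2 per slice
print(f"EDV ~ {lv_volume_ml(areas, 8.0):.1f} mL")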
van den Hoven, Allard T; Mc-Ghie, Jackie S; Chelu, Raluca G; Duijnhouwer, Anthonie L; Baggen, Vivan J M; Coenen, Adriaan; Vletter, Wim B; Dijkshoorn, Marcel L; van den Bosch, Annemien E; Roos-Hesselink, Jolien W
2017-12-01
Integration of volumetric heart chamber quantification by 3D echocardiography into clinical practice has been hampered by several factors, which a new fully automated algorithm (Left Heart Model, LHM) may help overcome. This study therefore aims to evaluate the feasibility and accuracy of the LHM software in quantifying left atrial and left ventricular volumes and left ventricular ejection fraction in a cohort of patients with a bicuspid aortic valve. Patients with a bicuspid aortic valve were prospectively included. All patients underwent 2D and 3D transthoracic echocardiography and computed tomography. Left atrial and ventricular volumes were obtained using the automated program, which did not require manual contour detection. For comparison, manual and semi-automated measurements were performed using conventional 2D and 3D datasets. Fifty-three patients were included; in four of them no 3D dataset could be acquired. Additionally, 12 patients were excluded based on poor imaging quality. Left ventricular end-diastolic and end-systolic volumes and ejection fraction calculated by the LHM correlated well with manual 2D and 3D measurements (Pearson's r between 0.43 and 0.97, p < 0.05). Left atrial volume (LAV) also correlated significantly, although the LHM estimated larger LAV compared with both 2DE and 3DE (Pearson's r between 0.61 and 0.81, p < 0.01). The fully automated software works well in a real-world setting and helps to overcome some of the major hurdles in integrating 3D analysis into daily practice, as it is user-independent and highly reproducible in a group of patients with a clearly defined and well-studied valvular abnormality.
Rutten, Iris J G; Ubachs, Jorne; Kruitwagen, Roy F P M; Beets-Tan, Regina G H; Olde Damink, Steven W M; Van Gorp, Toon
2017-08-01
Computed tomography measurements of total skeletal muscle area can detect changes and predict overall survival (OS) in patients with advanced ovarian cancer. This study investigates whether assessment of psoas muscle area reflects total muscle area and can be used to assess sarcopenia in ovarian cancer patients. Ovarian cancer patients (n = 150) treated with induction chemotherapy and interval debulking were enrolled retrospectively in this longitudinal study. Muscle was measured cross-sectionally with computed tomography in three ways: (i) software quantification of total skeletal muscle area (SMA); (ii) software quantification of psoas muscle area (PA); and (iii) manual measurement of length and width of the psoas muscle to derive the psoas surface area (PLW). Pearson correlation between the different methods was studied. Patients were divided into two groups based on the extent of change in muscle area, and agreement was measured with kappa coefficients. Cox regression was used to test predictors of OS. Correlation between SMA and both psoas muscle area measurements was poor (r = 0.52 and 0.39 for PA and PLW, respectively). After categorizing patients into muscle loss or gain, kappa agreement was also poor for all comparisons (all κ < 0.40). In regression analysis, SMA loss was predictive of poor OS (hazard ratio 1.698, 95% CI 1.038-2.778, P = 0.035). No relationship with OS was seen for PA or PLW loss. Change in psoas muscle area is not representative of total muscle area change and should not be used as a substitute for total skeletal muscle to predict survival in patients with ovarian cancer. © 2017 The Authors. Journal of Cachexia, Sarcopenia and Muscle published by John Wiley & Sons Ltd on behalf of the Society on Sarcopenia, Cachexia and Wasting Disorders.
MIQuant – Semi-Automation of Infarct Size Assessment in Models of Cardiac Ischemic Injury
Esteves, Tiago; de Pina, Maria de Fátima; Guedes, Joana G.; Freire, Ana; Quelhas, Pedro; Pinto-do-Ó, Perpétua
2011-01-01
Background The cardiac regenerative potential of newly developed therapies is traditionally evaluated in rodent models of surgically induced myocardial ischemia. A generally accepted key parameter for determining the success of the applied therapy is the infarct size. Although regarded as a gold-standard method for infarct size estimation in heart ischemia, histological planimetry is time-consuming and highly variable amongst studies. The purpose of this work is to contribute towards the standardization and simplification of infarct size assessment by providing free access to a novel semi-automated software tool. The acronym MIQuant was attributed to this application. Methodology/Principal Findings Mice were subjected to permanent coronary artery ligation and the size of chronic infarcts was estimated by area and midline-length methods using manual planimetry and with MIQuant. Repeatability and reproducibility of MIQuant scores were verified. The validation showed high correlation (r = 0.981 for midline length; r = 0.970 for area) and agreement (Bland-Altman analysis), with no bias for midline length and a negligible bias of 1.21% to 3.72% for area quantification. Further analysis demonstrated that MIQuant reduced the time spent on the analysis by 4.5-fold and, importantly, that MIQuant effectiveness is independent of user proficiency. The results indicate that MIQuant can be regarded as a better alternative to manual measurement. Conclusions We conclude that MIQuant is reliable and easy-to-use software for infarct size quantification. The widespread use of MIQuant will contribute towards the standardization of infarct size assessment across studies and, therefore, to the systematization of the evaluation of the cardiac regenerative potential of emerging therapies. PMID:21980376
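MIQuant's own code is not given in the abstract; the functions below only restate the two planimetry read-outs it automates (area-based and midline-length-based infarct size, averaged over the analyzed sections) with hypothetical per-section measurements.

import numpy as np

def infarct_size_area(infarct_areas, lv_areas):
    """Infarcted LV wall area vs. total LV wall area per section, same units (e.g., mm^2)."""
    return 100.0 * np.sum(infarct_areas) / np.sum(lv_areas)

def infarct_size_midline(infarct_lengths, midline_lengths):
    """Infarcted midline length vs. total LV midline length per section (e.g., mm)."""
    return 100.0 * np.sum(infarct_lengths) / np.sum(midline_lengths)

# Hypothetical measurements from four histological sections.
print(f"area method:    {infarct_size_area([1.2, 1.6, 1.1, 0.7], [4.8, 5.1, 4.9, 4.5]):.1f}%")
print(f"midline method: {infarct_size_midline([3.1, 3.8, 2.9, 2.0], [11.5, 12.0, 11.8, 11.0]):.1f}%")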
Teo, Guoshou; Kim, Sinae; Tsou, Chih-Chiang; Collins, Ben; Gingras, Anne-Claude; Nesvizhskii, Alexey I; Choi, Hyungwon
2015-11-03
Data-independent acquisition (DIA) mass spectrometry is an emerging technique that offers more complete detection and quantification of peptides and proteins across multiple samples. DIA allows fragment-level quantification, which can be considered as repeated measurements of the abundance of the corresponding peptides and proteins in the downstream statistical analysis. However, few statistical approaches are available for aggregating these complex fragment-level data into peptide- or protein-level statistical summaries. In this work, we describe a software package, mapDIA, for statistical analysis of differential protein expression using DIA fragment-level intensities. The workflow consists of three major steps: intensity normalization, peptide/fragment selection, and statistical analysis. First, mapDIA offers normalization of fragment-level intensities by total intensity sums as well as a novel alternative normalization by local intensity sums in retention time space. Second, mapDIA removes outlier observations and selects peptides/fragments that preserve the major quantitative patterns across all samples for each protein. Last, using the selected fragments and peptides, mapDIA performs model-based statistical significance analysis of protein-level differential expression between specified groups of samples. Using a comprehensive set of simulation datasets, we show that mapDIA detects differentially expressed proteins with accurate control of the false discovery rates. We also describe the analysis procedure in detail using two recently published DIA datasets generated for the 14-3-3β dynamic interaction network and the prostate cancer glycoproteome. The software was written in C++ and the source code is available for free through the SourceForge website http://sourceforge.net/projects/mapdia/. This article is part of a Special Issue entitled: Computational Proteomics. Copyright © 2015 Elsevier B.V. All rights reserved.
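mapDIA itself is a C++ tool; the Python fragment below is only a sketch of the simpler of its two normalization options (equalizing total intensity sums across samples) applied to a simulated fragment-by-sample matrix. The retention-time-local variant would compute the same scaling within RT windows.

import numpy as np

def normalize_total_intensity(intensity: np.ndarray) -> np.ndarray:
    """intensity: fragments x samples matrix of DIA fragment-level intensities."""
    totals = intensity.sum(axis=0)                 # per-sample total intensity
    target = totals.mean()                         # common scale
    return intensity * (target / totals)           # per-sample scaling factor, broadcast

rng = np.random.default_rng(2)
mat = rng.lognormal(mean=8, sigma=1, size=(1000, 6))     # hypothetical fragment intensities
norm = normalize_total_intensity(mat)
print(norm.sum(axis=0))                                  # totals are now equal across samples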
Cuadros-Inostroza, Alvaro; Caldana, Camila; Redestig, Henning; Kusano, Miyako; Lisec, Jan; Peña-Cortés, Hugo; Willmitzer, Lothar; Hannah, Matthew A
2009-12-16
Metabolite profiling, the simultaneous quantification of multiple metabolites in an experiment, is becoming increasingly popular, particularly with the rise of systems-level biology. The workhorse in this field is gas chromatography hyphenated with mass spectrometry (GC-MS). The high throughput of this technology coupled with a demand for large experiments has led to data pre-processing, i.e. the quantification of metabolites across samples, becoming a major bottleneck. Existing software has several limitations, including restricted maximum sample size, systematic errors and low flexibility. However, the biggest limitation is that the resulting data usually require extensive hand-curation, which is subjective and can typically take several days to weeks. We introduce the TargetSearch package, an open source tool which is a flexible and accurate method for pre-processing even very large numbers of GC-MS samples within hours. We developed a novel strategy to iteratively correct and update retention time indices for searching and identifying metabolites. The package is written in the R programming language with computationally intensive functions written in C for speed and performance. The package includes a graphical user interface to allow easy use by those unfamiliar with R. TargetSearch allows fast and accurate data pre-processing for GC-MS experiments and overcomes the sample number limitations and manual curation requirements of existing software. We validate our method by carrying out an analysis of both a set of known chemical standard mixtures and a biological experiment. In addition, we demonstrate its capabilities and speed by comparing it with other GC-MS pre-processing tools. We believe this package will greatly ease current bottlenecks and facilitate the analysis of metabolic profiling data.
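TargetSearch's iterative correction is more involved than this; the snippet only shows the basic conversion of observed retention times to retention indices by interpolating between markers, which is the quantity the package repeatedly re-estimates per sample. The marker values are hypothetical Kovats-style indices, not TargetSearch defaults.

import numpy as np

def to_retention_index(rt, marker_rt, marker_ri):
    """Linear interpolation of observed retention times (s) onto the marker RI scale."""
    return np.interp(rt, marker_rt, marker_ri)

marker_rt = np.array([252.0, 381.0, 528.0, 672.0])   # observed marker RTs in one sample (s)
marker_ri = np.array([1000.0, 1200.0, 1400.0, 1600.0])   # assumed marker retention indices
peaks_rt = np.array([300.0, 455.0, 610.0])
print(to_retention_index(peaks_rt, marker_rt, marker_ri))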
2009-01-01
Background Metabolite profiling, the simultaneous quantification of multiple metabolites in an experiment, is becoming increasingly popular, particularly with the rise of systems-level biology. The workhorse in this field is gas-chromatography hyphenated with mass spectrometry (GC-MS). The high-throughput of this technology coupled with a demand for large experiments has led to data pre-processing, i.e. the quantification of metabolites across samples, becoming a major bottleneck. Existing software has several limitations, including restricted maximum sample size, systematic errors and low flexibility. However, the biggest limitation is that the resulting data usually require extensive hand-curation, which is subjective and can typically take several days to weeks. Results We introduce the TargetSearch package, an open source tool which is a flexible and accurate method for pre-processing even very large numbers of GC-MS samples within hours. We developed a novel strategy to iteratively correct and update retention time indices for searching and identifying metabolites. The package is written in the R programming language with computationally intensive functions written in C for speed and performance. The package includes a graphical user interface to allow easy use by those unfamiliar with R. Conclusions TargetSearch allows fast and accurate data pre-processing for GC-MS experiments and overcomes the sample number limitations and manual curation requirements of existing software. We validate our method by carrying out an analysis against both a set of known chemical standard mixtures and of a biological experiment. In addition we demonstrate its capabilities and speed by comparing it with other GC-MS pre-processing tools. We believe this package will greatly ease current bottlenecks and facilitate the analysis of metabolic profiling data. PMID:20015393
A process improvement model for software verification and validation
NASA Technical Reports Server (NTRS)
Callahan, John; Sabolish, George
1994-01-01
We describe ongoing work at the NASA Independent Verification and Validation (IV&V) Facility to establish a process improvement model for software verification and validation (V&V) organizations. This model, similar to those used by some software development organizations, uses measurement-based techniques to identify problem areas and introduce incremental improvements. We seek to replicate this model for organizations involved in V&V on large-scale software development projects such as EOS and space station. At the IV&V Facility, a university research group and V&V contractors are working together to collect metrics across projects in order to determine the effectiveness of V&V and improve its application. Since V&V processes are intimately tied to development processes, this paper also examines the repercussions for development organizations in large-scale efforts.
Profile of software engineering within the National Aeronautics and Space Administration (NASA)
NASA Technical Reports Server (NTRS)
Sinclair, Craig C.; Jeletic, Kellyann F.
1994-01-01
This paper presents findings of baselining activities being performed to characterize software practices within the National Aeronautics and Space Administration. It describes how such baseline findings might be used to focus software process improvement activities. Finally, based on the findings to date, it presents specific recommendations in focusing future NASA software process improvement efforts. The findings presented in this paper are based on data gathered and analyzed to date. As such, the quantitative data presented in this paper are preliminary in nature.
Software for simulation of a computed tomography imaging spectrometer using optical design software
NASA Astrophysics Data System (ADS)
Spuhler, Peter T.; Willer, Mark R.; Volin, Curtis E.; Descour, Michael R.; Dereniak, Eustace L.
2000-11-01
Our imaging spectrometer simulation software, known as Eikon, should improve and speed up the design of a computed tomography imaging spectrometer (CTIS). Eikon uses existing raytracing software to simulate a virtual instrument. Eikon enables designers to virtually run through the design, calibration and data acquisition, saving significant cost and time when designing an instrument. We anticipate that Eikon simulations will improve future designs of CTIS by allowing engineers to explore more instrument options.
Thaden, Jeremy J; Tsang, Michael Y; Ayoub, Chadi; Padang, Ratnasari; Nkomo, Vuyisile T; Tucker, Stephen F; Cassidy, Cynthia S; Bremer, Merri; Kane, Garvan C; Pellikka, Patricia A
2017-08-01
It is presumed that echocardiographic laboratory accreditation leads to improved quality, but there are few data. We sought to compare the quality of echocardiographic examinations performed at accredited versus nonaccredited laboratories for the evaluation of valvular heart disease. We enrolled 335 consecutive valvular heart disease subjects who underwent echocardiography at our institution and an external accredited or nonaccredited institution within 6 months. Completeness and quality of echocardiographic reports and images were assessed by investigators blinded to the external laboratory accreditation status and echocardiographic results. Compared with nonaccredited laboratories, accredited sites more frequently reported patient sex (94% versus 78%; P <0.001), height and weight (96% versus 63%; P <0.001), blood pressure (86% versus 39%; P <0.001), left ventricular size (96% versus 83%; P <0.001), right ventricular size (94% versus 80%; P =0.001), and right ventricular function (87% versus 73%; P =0.006). Accredited laboratories had higher rates of complete and diagnostic color (58% versus 35%; P =0.002) and spectral Doppler imaging (45% versus 21%; P <0.0001). Concordance between external and internal grading of external studies was improved when diagnostic quantification was performed (85% versus 69%; P =0.003), and in patients with mitral regurgitation, reproducibility was improved with higher quality color Doppler imaging. Accredited echocardiographic laboratories had more complete reporting and better image quality, while echocardiographic quantification and color Doppler image quality were associated with improved concordance in grading valvular heart disease. Future quality improvement initiatives should highlight the importance of high-quality color Doppler imaging and echocardiographic quantification to improve the accuracy, reproducibility, and quality of echocardiographic studies for valvular heart disease. © 2017 American Heart Association, Inc.
Finding the right wheel when you don't want to reinvent it
NASA Astrophysics Data System (ADS)
Hucka, Michael
2017-01-01
The increasing amount of software being developed in all areas of science brings new capabilities as well as new challenges. Two of these challenges are finding potentially-relevant software, and being able to reuse it. The notion that "surely someone must have written a tool to do XYZ" often runs into the reality of thousands of Google hits with little detail about capabilities and status of different options. Software directories such as ASCL can add tremendous value by helping to improve the signal-to-noise ratio when searching for software; in addition, developers themselves can also act to make their work more easily found and understood. In this context, it can be useful to know what people do in practice when they look for software, and some of the factors that help or hinder their ability to reuse the software they do find. The results point to some simple steps that developers can take. Improved findability and reusability of software has broad potential impact, ranging from improved reproducibility of research results to better return on investment by funding agencies.
Li, Wei; Abram, François; Pelletier, Jean-Pierre; Raynauld, Jean-Pierre; Dorais, Marc; d'Anjou, Marc-André; Martel-Pelletier, Johanne
2010-01-01
Joint effusion is frequently associated with osteoarthritis (OA) flare-up and is an important marker of therapeutic response. This study aimed at developing and validating a fully automated system based on magnetic resonance imaging (MRI) for the quantification of joint effusion volume in knee OA patients. MRI examinations consisted of two axial sequences: a T2-weighted true fast imaging with steady-state precession and a T1-weighted gradient echo. An automated joint effusion volume quantification system using MRI was developed and validated (a) with calibrated phantoms (cylinder and sphere) and effusion from knee OA patients; (b) against manual quantification; and (c) against direct aspiration. Twenty-five knee OA patients with joint effusion were included in the study. The automated joint effusion volume quantification was developed as a four-stage sequential process: bone segmentation, filtering of unrelated structures, segmentation of joint effusion, and subvoxel volume calculation. Validation experiments revealed excellent coefficients of variation with the calibrated cylinder (1.4%) and sphere (0.8%) phantoms. Agreement between the OA knee joint effusion volume assessed by the developed automated system and by manual quantification was also excellent (r = 0.98; P < 0.0001), as was the comparison with direct aspiration (r = 0.88; P = 0.0008). The newly developed fully automated MRI-based system provided precise quantification of OA knee joint effusion volume with excellent correlation with data from phantoms, a manual system, and joint aspiration. Such an automated system will be instrumental in improving the reproducibility/reliability of the evaluation of this marker in clinical application.
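The abstract does not give implementation details of the four stages; the snippet below only illustrates the final subvoxel volume step under the simple assumption that the segmentation stage yields a per-voxel effusion fraction, with assumed voxel dimensions.

import numpy as np

def effusion_volume_ml(membership: np.ndarray, voxel_dims_mm=(0.6, 0.6, 3.0)) -> float:
    """membership: 3D array of per-voxel effusion fractions in [0, 1] from segmentation."""
    voxel_mm3 = float(np.prod(voxel_dims_mm))
    # Boundary voxels contribute fractionally, which is the point of a subvoxel estimate.
    return membership.sum() * voxel_mm3 / 1000.0

# Hypothetical segmentation output: a small blob with one partial-volume boundary slice.
membership = np.zeros((40, 128, 128))
membership[15:25, 40:80, 40:80] = 1.0
membership[14, 40:80, 40:80] = 0.4
print(f"effusion volume ~ {effusion_volume_ml(membership):.1f} mL")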
ERIC Educational Resources Information Center
Clarke, Peter J.; Davis, Debra; King, Tariq M.; Pava, Jairo; Jones, Edward L.
2014-01-01
As software becomes more ubiquitous and complex, the cost of software bugs continues to grow at a staggering rate. To remedy this situation, there needs to be major improvement in the knowledge and application of software validation techniques. Although there are several software validation techniques, software testing continues to be one of the…
2016 KIVA-hpFE Development: A Robust and Accurate Engine Modeling Software
DOE Office of Scientific and Technical Information (OSTI.GOV)
Carrington, David Bradley; Waters, Jiajia
Los Alamos National Laboratory and its collaborators are facilitating engine modeling by improving the accuracy and robustness of the modeling and the robustness of the software. We also continue to improve the physical modeling methods and are developing and implementing new mathematical algorithms that represent the physics within an engine. We provide software that others may use directly or alter with various models, e.g., sophisticated chemical kinetics, different turbulence closure methods, or other fuel injection and spray systems.
2017-01-01
To improve point-of-care quantification using microchip capillary electrophoresis (MCE), the chip-to-chip variabilities inherent in disposable, single-use devices must be addressed. This work proposes to integrate an internal standard (ISTD) into the microchip by adding it to the background electrolyte (BGE) instead of the sample—thus eliminating the need for additional sample manipulation, microchip redesigns, and/or system expansions required for traditional ISTD usage. Cs and Li ions were added as integrated ISTDs to the BGE, and their effects on the reproducibility of Na quantification were explored. Results were then compared to the conclusions of our previous publication which used Cs and Li as traditional ISTDs. The in-house fabricated microchips, electrophoretic protocols, and solution matrixes were kept constant, allowing the proposed method to be reliably compared to the traditional method. Using the integrated ISTDs, both Cs and Li improved the Na peak area reproducibility approximately 2-fold, to final RSD values of 2.2–4.7% (n = 900). In contrast (to previous work), Cs as a traditional ISTD resulted in final RSDs of 2.5–8.8%, while the traditional Li ISTD performed poorly with RSDs of 6.3–14.2%. These findings suggest integrated ISTDs are a viable method to improve the precision of disposable MCE devices—giving matched or superior results to the traditional method in this study while neither increasing system cost nor complexity. PMID:28192985
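To make the precision gain concrete, the simulation below (not the authors' data; the variability magnitudes are assumptions) shows why dividing the Na peak area by the Cs ISTD peak area from the same run cancels chip-to-chip variability.

import numpy as np

rng = np.random.default_rng(4)

def rsd(x):
    """Relative standard deviation in percent."""
    return 100.0 * np.std(x, ddof=1) / np.mean(x)

runs = 900
chip_factor = rng.normal(1.0, 0.08, runs)                    # chip-to-chip/injection variability
na_area = 1000 * chip_factor * rng.normal(1.0, 0.02, runs)   # analyte peak area
cs_area = 400 * chip_factor * rng.normal(1.0, 0.02, runs)    # ISTD sees the same chip factor

print(f"uncorrected Na RSD: {rsd(na_area):.1f}%")
print(f"ISTD-corrected RSD: {rsd(na_area / cs_area):.1f}%")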
Understanding Acceptance of Software Metrics--A Developer Perspective
ERIC Educational Resources Information Center
Umarji, Medha
2009-01-01
Software metrics are measures of software products and processes. Metrics are widely used by software organizations to help manage projects, improve product quality and increase efficiency of the software development process. However, metrics programs tend to have a high failure rate in organizations, and developer pushback is one of the sources…
User Documentation for Multiple Software Releases
NASA Technical Reports Server (NTRS)
Humphrey, R.
1982-01-01
In proposed solution to problems of frequent software releases and updates, documentation would be divided into smaller packages, each of which contains data relating to only one of several software components. Changes would not affect entire document. Concept would improve dissemination of information regarding changes and would improve quality of data supporting packages. Would help to ensure both timeliness and more thorough scrutiny of changes.
ERIC Educational Resources Information Center
Prvinchandar, Sunita; Ayub, Ahmad Fauzi Mohd
2014-01-01
This study compared the effectiveness of two types of computer software for improving the English writing skills of pupils in a Malaysian primary school. Sixty students who participated in the seven-week training course were divided into two groups, with the experimental group using the StyleWriter software and the control group using the…
Defect measurement and analysis of JPL ground software: a case study
NASA Technical Reports Server (NTRS)
Powell, John D.; Spagnuolo, John N., Jr.
2004-01-01
Ground software systems at JPL must meet high assurance standards while remaining on schedule due to relatively immovable launch dates for the spacecraft that such systems will control. Toward this end, the Software Quality Improvement (SQI) project's Measurement and Benchmarking (M&B) team is collecting and analyzing defect data from JPL ground system software projects to build software defect prediction models. The aim of these models is to improve predictability with regard to software quality activities. Predictive models will quantitatively define typical trends for JPL ground systems as well as Critical Discriminators (CDs) to provide explanations for atypical deviations from the norm at JPL. CDs are software characteristics that can be estimated or foreseen early in a software project's planning. Thus, these CDs will assist in planning for the degree to which software quality activities for a project are likely to deviate from the norm for JPL ground systems, based on past experience across the lab.
Kinematic Modeling of Normal Voluntary Mandibular Opening and Closing Velocity-Initial Study.
Gawriołek, Krzysztof; Gawriołek, Maria; Komosa, Marek; Piotrowski, Paweł R; Azer, Shereen S
2015-06-01
Determination and quantification of voluntary mandibular movement velocity has not been a thoroughly studied parameter of masticatory movement. This study attempted to objectively define the kinematics of mandibular movement based on numerical (digital) analysis of the relations and interactions of velocity diagram records in healthy female individuals. Using a computerized mandibular scanner (K7 Evaluation Software), 72 diagrams of voluntary mandibular velocity movements (36 for opening, 36 for closing) were recorded for women with clinically normal motor and functional activities of the masticatory system. Multiple measurements were analyzed focusing on the curve for maximum velocity records. For each movement, the loop of temporary velocities was determined. The diagram was then entered into AutoCAD calculation software where movement analysis was performed. The real maximum velocity values on opening (Vmax), closing (V0), and average velocity values (Vav), as well as movement accelerations (a), were recorded. Additionally, functional (A1-A2) and geometric (P1-P4) analyses of the loop constituent phases were performed, and the relations between the obtained areas were defined. Velocity means and correlation coefficient values for the various velocity phases were calculated. The Wilcoxon test produced the following maximum and average velocity results: Vmax = 394 ± 102 mm/s and Vav = 222 ± 61 mm/s for opening, and Vmax = 409 ± 94 mm/s and Vav = 225 ± 55 mm/s for closing. Both mandibular movement range and velocity change showed significant variability, achieving the highest velocity in the P2 phase. Voluntary mandibular velocity presents significant variations between healthy individuals. Maximum velocity is obtained when incisal separation is between 12.8 and 13.5 mm. An improved understanding of the patterns of normal mandibular movements may provide an invaluable diagnostic aid for pathological changes within the masticatory system. © 2014 by the American College of Prosthodontists.
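As a simple illustration of the velocity metrics named above (Vmax, Vav, acceleration), the snippet computes them from a simulated bell-shaped opening velocity trace; the K7 system's own processing and the loop-phase analysis are not reproduced.

import numpy as np

def velocity_metrics(t_s: np.ndarray, v_mm_s: np.ndarray) -> dict:
    v_max = v_mm_s.max()
    v_av = v_mm_s.mean()                       # time-averaged velocity (uniform sampling)
    accel = np.gradient(v_mm_s, t_s)           # instantaneous acceleration, mm/s^2
    return {"Vmax": v_max, "Vav": v_av, "a_max": accel.max()}

# Hypothetical 0.5-s opening movement sampled at 500 Hz with a ~400 mm/s peak.
t = np.linspace(0.0, 0.5, 251)
v = 400.0 * np.sin(np.pi * t / 0.5)
print(velocity_metrics(t, v))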
A study of quantification of aortic compliance in mice using radial acquisition phase contrast MRI
NASA Astrophysics Data System (ADS)
Zhao, Xuandong
Spatiotemporal changes in blood flow velocity measured using phase contrast magnetic resonance imaging (MRI) can be used to quantify pulse wave velocity (PWV) and wall shear stress (WSS), well-known indices of vessel compliance. A study was conducted to measure the PWV in the aortic arch of young healthy children using conventional phase contrast MRI and a post-processing algorithm that automatically tracks the peak velocity in phase contrast images. It is shown that the PWV calculated using peak velocity-time data has less variability than that calculated using mean velocity and flow. Conventional MR data acquisition techniques lack both the spatial and temporal resolution needed to accurately calculate PWV and WSS in in vivo studies using transgenic animal models of arterial diseases. Radial k-space acquisition can improve both spatial and temporal resolution. A major part of this thesis was devoted to developing technology for radial phase contrast magnetic resonance (RPCMR) cine imaging on a 7 Tesla animal scanner. A pulse sequence with asymmetric radial k-space acquisition was designed and implemented. The software developed to reconstruct the RPCMR images includes gridding, density compensation, and centering of k-space to correct the image ghosting introduced by hardware response time. Image processing software was developed to automatically segment the vessel lumen and correct for the phase offset due to eddy currents. Finally, in vivo and ex vivo aortic compliance measurements were conducted in a well-established mouse model of atherosclerosis: apolipoprotein E-knockout (ApoE-KO). Using the RPCMR technique, a significantly higher PWV value as well as a higher average WSS was detected in 9-month-old ApoE-KO mice compared with wild-type mice. A follow-up ex vivo test of tissue elasticity confirmed the impaired distensibility of the aortic arteries of ApoE-KO mice.
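The thesis's peak-velocity PWV method is summarized rather than specified here; the sketch below shows the basic transit-time calculation it rests on (path length divided by the time shift of the velocity waveform peak between two aortic planes) using simulated waveforms and an assumed 12 mm path length.

import numpy as np

def pwv_from_peaks(t, v_proximal, v_distal, path_length_mm):
    dt = t[np.argmax(v_distal)] - t[np.argmax(v_proximal)]    # transit time, seconds
    return (path_length_mm / 1000.0) / dt                      # m/s

# Hypothetical mouse aortic velocity waveforms sampled at 1 kHz, peaks 4 ms apart.
t = np.arange(0.0, 0.12, 0.001)
wave = lambda delay: np.exp(-((t - 0.03 - delay) / 0.008) ** 2)
print(f"PWV ~ {pwv_from_peaks(t, wave(0.0), wave(0.004), 12.0):.1f} m/s")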
Brainwave Monitoring Software Improves Distracted Minds
NASA Technical Reports Server (NTRS)
2014-01-01
Neurofeedback technology developed at Langley Research Center to monitor pilot awareness inspired Peter Freer to develop software for improving student performance. His company, Fletcher, North Carolina-based Unique Logic and Technology Inc., has gone on to develop technology for improving workplace and sports performance, monitoring drowsiness, and encouraging relaxation.
DOE Office of Scientific and Technical Information (OSTI.GOV)
Minana, Molly A.; Sturtevant, Judith E.; Heaphy, Robert
2005-01-01
The purpose of the Sandia National Laboratories (SNL) Advanced Simulation and Computing (ASC) Software Quality Plan is to clearly identify the practices that are the basis for continually improving the quality of ASC software products. Quality is defined in DOE/AL Quality Criteria (QC-1) as conformance to customer requirements and expectations. This quality plan defines the ASC program software quality practices and provides mappings of these practices to the SNL Corporate Process Requirements (CPR 1.3.2 and CPR 1.3.6) and the Department of Energy (DOE) document, ASCI Software Quality Engineering: Goals, Principles, and Guidelines (GP&G). This quality plan identifies ASC management and software project teams' responsibilities for cost-effective software engineering quality practices. The SNL ASC Software Quality Plan establishes the signatories' commitment to improving software products by applying cost-effective software engineering quality practices. This document explains the project teams' opportunities for tailoring and implementing the practices; enumerates the practices that compose the development of SNL ASC's software products; and includes a sample assessment checklist that was developed based upon the practices in this document.
A workflow learning model to improve geovisual analytics utility
Roth, Robert E; MacEachren, Alan M; McCabe, Craig A
2011-01-01
Introduction This paper describes the design and implementation of the G-EX Portal Learn Module, a web-based, geocollaborative application for organizing and distributing digital learning artifacts. G-EX falls into the broader context of geovisual analytics, a new research area with the goal of supporting visually-mediated reasoning about large, multivariate, spatiotemporal information. Because this information is unprecedented in amount and complexity, GIScientists are tasked with the development of new tools and techniques to make sense of it. Our research addresses the challenge of implementing these geovisual analytics tools and techniques in a useful manner. Objectives The objective of this paper is to develop and implement a method for improving the utility of geovisual analytics software. The success of software is measured by its usability (i.e., how easy the software is to use?) and utility (i.e., how useful the software is). The usability and utility of software can be improved by refining the software, increasing user knowledge about the software, or both. It is difficult to achieve transparent usability (i.e., software that is immediately usable without training) of geovisual analytics software because of the inherent complexity of the included tools and techniques. In these situations, improving user knowledge about the software through the provision of learning artifacts is as important, if not more so, than iterative refinement of the software itself. Therefore, our approach to improving utility is focused on educating the user. Methodology The research reported here was completed in two steps. First, we developed a model for learning about geovisual analytics software. Many existing digital learning models assist only with use of the software to complete a specific task and provide limited assistance with its actual application. To move beyond task-oriented learning about software use, we propose a process-oriented approach to learning based on the concept of scientific workflows. Second, we implemented an interface in the G-EX Portal Learn Module to demonstrate the workflow learning model. The workflow interface allows users to drag learning artifacts uploaded to the G-EX Portal onto a central whiteboard and then annotate the workflow using text and drawing tools. Once completed, users can visit the assembled workflow to get an idea of the kind, number, and scale of analysis steps, view individual learning artifacts associated with each node in the workflow, and ask questions about the overall workflow or individual learning artifacts through the associated forums. An example learning workflow in the domain of epidemiology is provided to demonstrate the effectiveness of the approach. Results/Conclusions In the context of geovisual analytics, GIScientists are not only responsible for developing software to facilitate visually-mediated reasoning about large and complex spatiotemporal information, but also for ensuring that this software works. The workflow learning model discussed in this paper and demonstrated in the G-EX Portal Learn Module is one approach to improving the utility of geovisual analytics software. While development of the G-EX Portal Learn Module is ongoing, we expect to release the G-EX Portal Learn Module by Summer 2009. PMID:21983545
A workflow learning model to improve geovisual analytics utility.
Roth, Robert E; Maceachren, Alan M; McCabe, Craig A
2009-01-01
INTRODUCTION: This paper describes the design and implementation of the G-EX Portal Learn Module, a web-based, geocollaborative application for organizing and distributing digital learning artifacts. G-EX falls into the broader context of geovisual analytics, a new research area with the goal of supporting visually-mediated reasoning about large, multivariate, spatiotemporal information. Because this information is unprecedented in amount and complexity, GIScientists are tasked with the development of new tools and techniques to make sense of it. Our research addresses the challenge of implementing these geovisual analytics tools and techniques in a useful manner. OBJECTIVES: The objective of this paper is to develop and implement a method for improving the utility of geovisual analytics software. The success of software is measured by its usability (i.e., how easy the software is to use?) and utility (i.e., how useful the software is). The usability and utility of software can be improved by refining the software, increasing user knowledge about the software, or both. It is difficult to achieve transparent usability (i.e., software that is immediately usable without training) of geovisual analytics software because of the inherent complexity of the included tools and techniques. In these situations, improving user knowledge about the software through the provision of learning artifacts is as important, if not more so, than iterative refinement of the software itself. Therefore, our approach to improving utility is focused on educating the user. METHODOLOGY: The research reported here was completed in two steps. First, we developed a model for learning about geovisual analytics software. Many existing digital learning models assist only with use of the software to complete a specific task and provide limited assistance with its actual application. To move beyond task-oriented learning about software use, we propose a process-oriented approach to learning based on the concept of scientific workflows. Second, we implemented an interface in the G-EX Portal Learn Module to demonstrate the workflow learning model. The workflow interface allows users to drag learning artifacts uploaded to the G-EX Portal onto a central whiteboard and then annotate the workflow using text and drawing tools. Once completed, users can visit the assembled workflow to get an idea of the kind, number, and scale of analysis steps, view individual learning artifacts associated with each node in the workflow, and ask questions about the overall workflow or individual learning artifacts through the associated forums. An example learning workflow in the domain of epidemiology is provided to demonstrate the effectiveness of the approach. RESULTS/CONCLUSIONS: In the context of geovisual analytics, GIScientists are not only responsible for developing software to facilitate visually-mediated reasoning about large and complex spatiotemporal information, but also for ensuring that this software works. The workflow learning model discussed in this paper and demonstrated in the G-EX Portal Learn Module is one approach to improving the utility of geovisual analytics software. While development of the G-EX Portal Learn Module is ongoing, we expect to release the G-EX Portal Learn Module by Summer 2009.
Mato Abad, Virginia; García-Polo, Pablo; O'Daly, Owen; Hernández-Tamames, Juan Antonio; Zelaya, Fernando
2016-04-01
The method of Arterial Spin Labeling (ASL) has experienced a significant rise in its application to functional imaging, since it is the only technique capable of measuring blood perfusion in a truly non-invasive manner. Currently, there are no commercial packages for processing ASL data and there is no recognized standard for normalizing ASL data to a common frame of reference. This work describes a new Automated Software for ASL Processing (ASAP) that can automatically process several ASL datasets. ASAP includes functions for all stages of image pre-processing: quantification, skull-stripping, co-registration, partial volume correction and normalization. To assess the applicability and validity of the toolbox, this work shows its application in the study of hypoperfusion in a sample of healthy subjects at risk of progressing to Alzheimer's disease. ASAP requires limited user intervention, minimizing the possibility of random and systematic errors, and produces cerebral blood flow maps that are ready for statistical group analysis. The software is easy to operate and results in excellent quality of spatial normalization. The results found in this evaluation study are consistent with previous studies that find decreased perfusion in Alzheimer's patients in similar regions and demonstrate the applicability of ASAP. Copyright © 2015 Elsevier Inc. All rights reserved.
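ASAP's quantification model is not detailed in the abstract; as an illustration of what the quantification stage of an ASL pipeline typically computes, the sketch below applies the widely used single-compartment formula for (pseudo-)continuous ASL with typical literature parameter defaults, none of which are taken from ASAP.

import numpy as np

def cbf_ml_100g_min(delta_m, m0, pld=1.8, tau=1.8, t1_blood=1.65,
                    labeling_eff=0.85, blood_brain_coef=0.9):
    """delta_m: control-label difference image; m0: proton-density (M0) image."""
    num = 6000.0 * blood_brain_coef * delta_m * np.exp(pld / t1_blood)
    den = 2.0 * labeling_eff * t1_blood * m0 * (1.0 - np.exp(-tau / t1_blood))
    return num / den            # perfusion in mL/100 g/min

# Hypothetical perfusion-weighted and M0 images with uniform values.
delta_m = np.full((64, 64), 12.0)
m0 = np.full((64, 64), 1800.0)
print(f"mean CBF ~ {cbf_ml_100g_min(delta_m, m0).mean():.1f} mL/100g/min")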
Mohammed, Yassene; Percy, Andrew J; Chambers, Andrew G; Borchers, Christoph H
2015-02-06
Multiplexed targeted quantitative proteomics typically utilizes multiple reaction monitoring and allows the optimized quantification of a large number of proteins. One challenge, however, is the large amount of data that needs to be reviewed, analyzed, and interpreted. Different vendors provide software for their instruments, which determine the recorded responses of the heavy and endogenous peptides and perform the response-curve integration. Bringing multiplexed data together and generating standard curves is often an off-line step accomplished, for example, with spreadsheet software. This can be laborious, as it requires determining the concentration levels that meet the required accuracy and precision criteria in an iterative process. We present here a computer program, Qualis-SIS, that generates standard curves from multiplexed MRM experiments and determines analyte concentrations in biological samples. Multiple level-removal algorithms and acceptance criteria for concentration levels are implemented. When used to apply the standard curve to new samples, the software flags each measurement according to its quality. From the user's perspective, the data processing is instantaneous due to the reactivity paradigm used, and the user can download the results of the stepwise calculations for further processing, if necessary. This allows for more consistent data analysis and can dramatically accelerate the downstream data analysis.
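Qualis-SIS's exact acceptance criteria and level-removal algorithms are not given in the abstract; the function below is a simplified sketch of the general idea (fit a response curve, back-calculate the calibration levels, and iteratively drop levels that fail an assumed ±20% accuracy criterion) with hypothetical response-ratio data.

import numpy as np

def fit_and_trim(conc, response, max_bias_pct=20.0):
    conc, response = np.asarray(conc, float), np.asarray(response, float)
    while len(conc) >= 3:
        slope, intercept = np.polyfit(conc, response, 1)       # linear standard curve
        back = (response - intercept) / slope                    # back-calculated concentrations
        bias = 100.0 * np.abs(back - conc) / conc
        if bias.max() <= max_bias_pct:
            return slope, intercept, conc                        # all retained levels pass
        keep = np.ones(len(conc), bool)
        keep[np.argmax(bias)] = False                            # drop worst level, refit
        conc, response = conc[keep], response[keep]
    raise ValueError("too few acceptable calibration levels")

levels = [0.5, 1, 5, 10, 50, 100]                                # hypothetical standard amounts
ratios = [0.9, 2.1, 9.8, 21.0, 99.0, 201.0]                      # endogenous/heavy response ratios
slope, intercept, kept = fit_and_trim(levels, ratios)
print(f"slope={slope:.3f}, intercept={intercept:.3f}, kept levels={list(kept)}")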
CognitionMaster: an object-based image analysis framework
2013-01-01
Background Automated image analysis methods are becoming more and more important for extracting and quantifying image features in microscopy-based biomedical studies, and several commercial or open-source tools are available. However, most of the approaches rely on pixel-wise operations, a concept that has limitations when high-level object features and relationships between objects are studied and when user interactivity on the object level is desired. Results In this paper we present open-source software that facilitates the analysis of content features and object relationships by using objects as the basic processing unit instead of individual pixels. Our approach enables even users without programming knowledge to compose "analysis pipelines" that exploit the object-level approach. We demonstrate the design and use of example pipelines for immunohistochemistry-based cell proliferation quantification in breast cancer and two-photon fluorescence microscopy data about bone-osteoclast interaction, which underline the advantages of the object-based concept. Conclusions We introduce an open-source software system that offers object-based image analysis. The object-based concept allows for straightforward development of object-related interactive or fully automated image analysis solutions. The presented software may therefore serve as a basis for various applications in the field of digital image analysis. PMID:23445542
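CognitionMaster's own pipeline framework is not reproduced here; the snippet only illustrates the object-based concept in generic Python/scipy terms: pixels are first grouped into labeled objects, and subsequent steps (feature computation, filtering, counting) operate on those objects rather than on pixels.

import numpy as np
from scipy import ndimage

# Hypothetical microscopy field: sparse seeds blurred into blob-like "cells".
rng = np.random.default_rng(5)
image = (rng.random((256, 256)) > 0.995).astype(float)
image = ndimage.gaussian_filter(image, 3) > 0.005

labels, n = ndimage.label(image)                                 # pixels -> objects
sizes = ndimage.sum_labels(np.ones_like(labels), labels, index=range(1, n + 1))
centroids = ndimage.center_of_mass(image, labels, range(1, n + 1))  # per-object feature

# Object-level step: keep objects above a size threshold and count them, mirroring
# e.g. counting proliferating cells in an IHC section.
large = [i for i, s in enumerate(sizes, start=1) if s > 50]
print(f"{n} objects detected, {len(large)} above 50 px")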